Conversation
```cpp
.usrFrmDepth = 30, // frame buffer depth
#if defined(PLATFORM_T40) || defined(PLATFORM_T41)
.usrFrmDepth = 30,
.aecChn = AUDIO_AEC_CHANNEL_FIRST_LEFT, // T40/T41 require aecChn to be initialized
u_ctx->imaging_dirty = false;
}
}
#else
```
```cpp
Detection::Detection(int osdGrp, uint16_t stream_width, uint16_t stream_height)
    : osdGrp(osdGrp), stream_width(stream_width), stream_height(stream_height),
      enabled(false), initialized(false), lastModTime(0), currentBoxCount(0)
```
Rather than an always-on/off toggle, shouldn't it just run detections when motion is detected?
I mean I wouldn't use it that way, but it could be supported in the future -- if I were using on-device detections, I'd just want it always processing snapshots -- it's not like it's a battery cam.
```cpp
DetectionResult result;
if (!parseDetectionJSON(json_path, result)) return;
```
Shouldn't it also run a script when it detects something, like how Motion runs a script?
Scripts can leverage /tmp/detections.json directly; this feature is for drawing overlays on the RTSP stream using OSD memory. Or mars_detect could trigger a script under certain conditions in that pipeline -- I just feel we shouldn't overly complicate the streamer app. The onvif server could also do something with the detections JSON file.
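For illustration, a standalone script could consume /tmp/detections.json on its own schedule without touching the streamer app. This is only a sketch: the schema assumed here (a top-level `detections` array with `label` and `confidence` fields) and the threshold value are hypothetical, not the actual mars_detect output format.

```python
import json
from pathlib import Path

# Hypothetical schema, for illustration only:
# {"detections": [{"label": "person", "confidence": 0.82}, ...]}
DETECTIONS_PATH = Path("/tmp/detections.json")

def labels_above(threshold, raw_json):
    """Return labels whose confidence meets or exceeds the threshold."""
    data = json.loads(raw_json)
    return [d["label"] for d in data.get("detections", [])
            if d.get("confidence", 0.0) >= threshold]

if __name__ == "__main__":
    # Poll once; a real script might loop or watch the file's mtime.
    if DETECTIONS_PATH.exists():
        hits = labels_above(0.5, DETECTIONS_PATH.read_text())
        if hits:
            print("detected:", ", ".join(hits))
```

A cron job or inotify watcher could run this and fire notifications, which keeps that policy out of the streamer entirely.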
To leverage, add to your prudynt.json config:

Then separately push the mars binary and runtime libs to /opt and run your favorite mars model in daemon mode:

```shell
LD_LIBRARY_PATH=/opt ./mars_detect -t .5 tinydet_best_float.mars -d
```