High CPU usage in the events/timeline views—do they support CUDA/NVDEC? #1235
Hello! I've set up Viseron 3.4.1 as a 24/7 NVR with motion and person detection configured on some cameras. It runs on a server with a Xeon E3-1245 v5 and a GTX 1060 6GB, currently handling 5 cameras. Most of the time, including when the Live view is open on another computer, CPU usage on the server sits around 30%. The installed image is the CUDA-enabled one, and I can confirm it is working with `nvidia-smi`, which shows processes for Viseron, Darknet, and the FFmpeg instances for all cameras. My understanding is that all encoding and decoding should be offloadable to the GPU, and that generally seems to be the case.

However, when I select the Timeline or Events view on a viewing machine, the server's CPU usage spikes to 90-100%, and `nvidia-smi` shows no new FFmpeg processes. Both of these suggest that the Timeline and Events views decode and send frames on the CPU rather than via NVDEC. Is this intended behavior, and is there a reason for it? Since this server also runs file shares and other apps, I'd like to keep CPU resources free. Is there any way to get Timeline and Events playback to offload decoding to the GPU?

Let me know, and thanks! I'm impressed with Viseron overall and want to keep it running smoothly for the long haul.
Frames are not decoded for the Events/Timeline views; they are simply read from disk as-is, so that is not the issue. The Timeline works by utilizing HLS playlists, which are reloaded every ~5 seconds to get new segments. There is something called HLS delta playlists that could be implemented to reduce the need to load all segments on each fetch, but that is complicated business that I have not looked into yet.
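To illustrate why the full-playlist reload adds up: on every ~5 second refresh, an HLS media playlist lists all of its segments again, and the client re-parses the whole thing. The sketch below is a minimal, generic m3u8 media-playlist parser; the segment names and durations are made up for illustration and are not Viseron's actual output.

```python
def parse_media_playlist(text):
    """Return (media_sequence, [(duration, uri), ...]) from an m3u8 media playlist."""
    media_sequence = 0
    segments = []
    pending_duration = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-MEDIA-SEQUENCE:"):
            media_sequence = int(line.split(":", 1)[1])
        elif line.startswith("#EXTINF:"):
            # "#EXTINF:5.0," -> 5.0 (trailing comma separates an optional title)
            pending_duration = float(line.split(":", 1)[1].rstrip(","))
        elif line and not line.startswith("#"):
            segments.append((pending_duration, line))
            pending_duration = None
    return media_sequence, segments


# A live playlist re-lists every segment on each fetch; only
# MEDIA-SEQUENCE and the tail of the list change between refreshes.
playlist = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:5.0,
segment120.ts
#EXTINF:5.0,
segment121.ts
#EXTINF:4.8,
segment122.ts
"""

seq, segs = parse_media_playlist(playlist)
print(seq, len(segs))  # 120 3
```

With long 24/7 recordings the segment list grows large, so serving and re-parsing it every few seconds is where the repeated work comes from, even though no video is decoded.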
> Frames are not decoded for the Events/timeline, they are simply read from disk as is so that is not the issue.
What process is using the CPU? Is it Viseron + PostgreSQL?
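A quick way to answer that on the server while the Timeline view is open, using standard Linux tools (the `viseron` container name in the commented line is an assumption for a Docker setup; adjust it to yours):

```shell
# Show the top CPU consumers at this moment, two ways:
top -b -n 1 -o %CPU | head -n 15
ps -eo pid,pcpu,comm --sort=-pcpu | head -n 10

# If Viseron runs in Docker, per-container usage:
# docker stats --no-stream viseron
```

Comparing a snapshot taken before opening the Timeline with one taken after should make it obvious whether Viseron, PostgreSQL, or something else is responsible.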
> The Timeline works by utilizing HLS playlists, which are reloaded every 5-ish seconds to get new segments.
If you view many cameras at the same time that all record 24/7 with longer retention periods, that is a lot of data to process, and the SQL queries will use the CPU.
> There is something called HLS delta playlists that could be implemented to reduce the need to load all segments on each fetch, but that is complicated business that I have not looked into yet.
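For context on the delta-playlist idea: the second edition of the HLS spec (draft-pantos-hls-rfc8216bis) defines an `EXT-X-SKIP` tag, which lets the server omit segments the client already holds and only re-send the recent tail. The sketch below shows the concept only; Viseron does not implement this today, and the segment names are made up.

```python
def delta_playlist(segments, media_sequence, can_skip_until):
    """Build a delta update that skips all segments except roughly the last
    `can_skip_until` seconds. `segments` is a list of (duration, uri)."""
    total = sum(d for d, _ in segments)
    skipped = 0
    elapsed = 0.0
    # Skip from the front while the remaining tail still covers the window.
    while skipped < len(segments) and total - elapsed - segments[skipped][0] >= can_skip_until:
        elapsed += segments[skipped][0]
        skipped += 1
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:9",
        f"#EXT-X-MEDIA-SEQUENCE:{media_sequence}",
        f"#EXT-X-SKIP:SKIPPED-SEGMENTS={skipped}",
    ]
    for duration, uri in segments[skipped:]:
        lines.append(f"#EXTINF:{duration},")
        lines.append(uri)
    return "\n".join(lines)


# Four 5-second segments, client asks to skip everything older than 10 s:
segs = [(5.0, "s0.ts"), (5.0, "s1.ts"), (5.0, "s2.ts"), (5.0, "s3.ts")]
print(delta_playlist(segs, media_sequence=0, can_skip_until=10))
```

Instead of re-listing all four segments, the response carries `#EXT-X-SKIP:SKIPPED-SEGMENTS=2` plus only `s2.ts` and `s3.ts`, which is why delta updates would shrink both the payload and the per-refresh processing for long 24/7 recordings.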