
Blue Iris Rendering

Posted: Tue May 07, 2024 7:51 am
by idapalmer
We recently switched our camera network from ACC 7 (Avigilon) to Blue Iris due to ballooning licensing costs.

One thing I noticed immediately is that ACC 7's rendering was smooth and efficient when viewing multiple streams. Image quality was consistently good and we never had issues.

Since switching to Blue Iris, rendering with multiple streams isn't great, and I cannot adjust scaling without pinning the server's CPU. No sub streams are set up, and I'd rather not have to use them.

It is my understanding that Blue Iris renders only with the CPU. Our server is 10 years old but supports Quick Sync. Does anyone know if ACC 7 renders with the Nvidia graphics card that's installed? I am trying to figure out the difference here.

Thanks,

Re: Blue Iris Rendering

Posted: Tue May 07, 2024 11:56 pm
by Pogo
Hardware acceleration is selected in the main system setup screen (or per camera if no default is established). Basic 'Intel' is recommended for Quick Sync; an Nvidia selection is also available. If a default is established after cameras have already been installed and configured, it will apply once the cameras are restarted.
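One gotcha worth checking first: Quick Sync needs the Intel iGPU to be enabled, and on servers with a discrete Nvidia card it is often switched off in the BIOS. Here is a rough Python sketch (my own example, nothing Blue Iris ships) that lists the video adapters Windows can see, so you can confirm the Intel GPU is actually present. It assumes Windows with PowerShell available:

# Rough sketch: list the video adapters Windows can see, to confirm the
# Intel iGPU (required for Quick Sync) is enabled alongside the Nvidia card.
# Assumes Windows with PowerShell on the PATH; not Blue Iris specific.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty Name"],
    capture_output=True, text=True, check=True,
)
for name in result.stdout.splitlines():
    if name.strip():
        print(name.strip())  # e.g. "Intel(R) HD Graphics 4600", "NVIDIA ..."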

Depending on the vintage of your Quick Sync, it may or may not support H.265 (Skylake was the first generation that supported it). If it doesn't, H.265 decoding will be handled by the CPU instead. The Blue Iris implementation of H.265 leaves much to be desired and should be avoided if possible, especially with 4K cameras.
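If you want a quick sanity check of whether your iGPU can hardware-decode H.265 at all, FFmpeg makes a decent proxy; it is not the same decode path Blue Iris uses internally, but it exercises the same Quick Sync hardware. This assumes ffmpeg is on your PATH, and "sample.mp4" is a placeholder for an H.265 clip exported from one of your cameras:

# Proxy test for Quick Sync H.265 decode using FFmpeg.
# Assumes ffmpeg is on the PATH; "sample.mp4" is a placeholder, point it
# at a real H.265 clip from one of your cameras.
import subprocess

proc = subprocess.run(
    ["ffmpeg", "-hide_banner", "-v", "error",
     "-hwaccel", "qsv", "-c:v", "hevc_qsv",   # force Quick Sync HEVC decode
     "-i", "sample.mp4", "-f", "null", "-"],  # decode only, discard output
    capture_output=True, text=True,
)
if proc.returncode == 0:
    print("hevc_qsv decoded the clip: Quick Sync H.265 decode works here")
else:
    print("QSV H.265 decode failed (pre-Skylake iGPU?):\n" + proc.stderr)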

Sub streams are your friend unless you are adamant about full-res, multi-camera, real-time monitoring on a very large viewing area. Otherwise you are just wasting processing horsepower: recording direct-to-disk (which you should also be doing) still captures at full resolution, as does full-screen soloing of a camera from a desktop group display that would otherwise run on sub streams.
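To put rough numbers on the wasted horsepower, here is a back-of-the-envelope comparison of decode load per camera. The resolutions and frame rate are just example values, so plug in your own:

# Back-of-the-envelope decode load, main stream vs sub stream, per camera.
# The resolutions and frame rate are example values only.
MAIN_W, MAIN_H = 3840, 2160   # 4K main stream
SUB_W, SUB_H = 1280, 720      # typical sub stream
FPS = 15

main_rate = MAIN_W * MAIN_H * FPS   # pixels decoded per second
sub_rate = SUB_W * SUB_H * FPS

print(f"main stream: {main_rate / 1e6:.0f} Mpix/s")
print(f"sub stream:  {sub_rate / 1e6:.0f} Mpix/s")
print(f"group display on sub streams is ~{main_rate / sub_rate:.0f}x less decode work per camera")

With those example numbers, a nine-camera group display is the difference between decoding roughly 1.1 billion pixels per second and roughly 124 million, which is exactly why the CPU stops pinning.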