Your Satisfaction Level with Blue Iris 4K Performance
Posted: Wed Aug 07, 2024 3:50 pm
The continuing inability of Blue Iris to adequately handle 4K video is an increasing source of dissatisfaction with its development for me.
I couldn't care less whether CPAI can tell the difference between a goat and a bicycle, but I do care about horrendously poor rendering of even an h264 4K live stream at 15fps on a 4th Gen i7, a 6th Gen i5, and lastly a 9th Gen i5 server. 4K stutters, hops, stops on I-frames, tears with h265, and generally sucks on every server I've built.
I've jumped through all the hoops... going on three years now. I've wasted countless hours trying all the bad advice some believe to be gospel on 'optimizing' Blue Iris -- as if there's only their way of using the platform. The first bit of BS advice was that a Skylake processor was required for its h265 support and additional h264 GPU processing. Then an 8th Gen became the magic bullet, so I built a 9th Gen server. Still no joy. I've tried it all multiple times. I've also run the same cameras on the same servers through the VLC viewer with flawless results. Gee, what does that indicate?
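For what it's worth, beyond just eyeballing VLC, here's roughly how I sanity-check decode throughput outside of Blue Iris. This is a minimal sketch assuming Python with opencv-python installed; the RTSP URL is a made-up placeholder, so swap in your own camera's main-stream address and credentials. If something like this reports a steady 15 or 30 fps on the same box, the camera and the decode hardware clearly aren't the bottleneck.

```python
# Rough decode-rate check, independent of Blue Iris, using OpenCV's FFmpeg backend.
# The RTSP URL below is a hypothetical placeholder -- substitute your camera's actual main-stream URL.
import time
import cv2

RTSP_URL = "rtsp://user:pass@192.168.1.100:554/stream1"  # placeholder example

cap = cv2.VideoCapture(RTSP_URL, cv2.CAP_FFMPEG)
if not cap.isOpened():
    raise SystemExit("Could not open stream -- check the URL and credentials.")

frames = 0
start = time.time()
while time.time() - start < 30:      # sample for roughly 30 seconds
    ok, _frame = cap.read()          # pull and decode one frame
    if not ok:
        break
    frames += 1

elapsed = time.time() - start
print(f"Decoded {frames} frames in {elapsed:.1f}s ({frames / elapsed:.1f} fps)")
cap.release()
```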
I personally don't stare at license plates in a parking lot all day at 1fps. My system is primarily for live surveillance in real time. Blue Iris performed that task wonderfully for 3-plus years -- until it was asked to do so with 4K sources. It seems to me that development should be paying at least some attention to this particular issue... unless, of course, it's been quietly fixed in a release that broke something else since my last upgrade.
The BI apologists always claim "we're not making movies here" or "there's really no difference between 30fps and 15fps"... among other shallow lines of bullshit that simply avoid the elephant in the room: Blue Iris can't render 4K streams at anywhere near the capabilities of the cameras people spend hundreds of dollars on. Instead, they're advised to cripple their expensive new schmega-pixel cameras -- cutting their balls off -- to accommodate Blue Iris's shortcomings in 4K decoding. I personally find that absurd.
Then again, maybe it's just me. Regardless, I'm very frustrated by my personal experiences with this issue and am interested in how others perceive their BI system performance with 4K sources.
If you're frustrated too, at least I'm not alone. If, on the other hand, you're pulling 4K 30fps streams like running water, what am I missing and how are you doing it?
TIA for any and all relevant input and/or enlightenment.