There's a lot of info out there about 640kbps 5.1 AC3 being the standard target when transcoding lossless audio, and about EAC3 being more efficient than AC3, so 640kbps 5.1 EAC3 is even better. But there's hardly any info about what bitrate to target for 7.1 EAC3. Is it 128kbps per channel? Does the LFE channel count? And how much does Atmos bloat the bitrate?
Netflix serves 768kbps 5.1 EAC3 Atmos. Assuming 128kbps per channel (with the LFE riding along for free), normal 5.1 AC3 is 640kbps (5 x 128). 768kbps / 128kbps = 6 channels... but it's still 5.1 Atmos. Is Atmos really taking up an entire channel's worth of bandwidth on top of normal 5.1? I can't find much information to clear this up.
Would "pretty much transparent" bitrates for 7.1 EAC3 be 896kbps (7 channels x 128kbps, LFE included) and for 7.1 Atmos 1024kbps (7 channels x 128kbps, plus LFE, plus 128kbps for the Atmos metadata)?
5.1 - 640kbps
5.1 Atmos - 768kbps
7.1 - 896kbps
7.1 Atmos - 1024kbps
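The extrapolation behind that list can be sketched in a few lines. This is just the arithmetic from the post, assuming a flat 128kbps per full-range channel, the LFE riding along for free, and Atmos metadata budgeted as one extra channel's worth of bandwidth; none of those assumptions come from a published spec.

```python
PER_CHANNEL_KBPS = 128  # assumed budget per full-range channel; LFE assumed "free"

def target_kbps(full_channels: int, atmos: bool = False) -> int:
    """Rough EAC3 target: full-range channels (plus, by assumption,
    one channel's worth of bandwidth for Atmos metadata) x 128kbps."""
    units = full_channels + (1 if atmos else 0)
    return units * PER_CHANNEL_KBPS

for ch, atmos in [(5, False), (5, True), (7, False), (7, True)]:
    label = f"{ch}.1" + (" Atmos" if atmos else "")
    print(f"{label}: {target_kbps(ch, atmos)} kbps")
# 5.1: 640 kbps
# 5.1 Atmos: 768 kbps
# 7.1: 896 kbps
# 7.1 Atmos: 1024 kbps
```

Note the model reproduces Netflix's 768kbps 5.1 Atmos figure only if Atmos really does cost a full channel's worth of bandwidth, which is exactly the open question.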
Assume you have production-quality tools capable of encoding 7.1 EAC3, not the currently available open source encoders limited to 5.1.
Anybody got an opinion?
Opinions on this are worthless; people won't agree.
Just extrapolating the ~128kbps/channel of 5.1 would give us 896kbps for 7.1, and probably 1024kbps or more if it included Atmos metadata.
I have been using commercial tooling to encode DD+ 7.1 at 896kbps, and it sounds the same to me as the source, though I have yet to really put it through its paces on a good set of speakers.