VideoHelp Forum
  1. There's a lot of info out there saying that 640kbps 5.1 AC3 is a standard "transparent" target, and that EAC3 is more efficient, so 640kbps 5.1 EAC3 is even better. But there's hardly any info about what bitrate to target for 7.1 EAC3. Is it 128kbps per channel? Does the LFE channel count? And how much does Atmos bloat the bitrate?

    Netflix serves 768kbps 5.1 EAC3 Atmos. Assuming 128kbps per channel (with LFE free), normal AC3 is 640kbps (5 x 128). 768kbps / 128kbps = 6 channels... but it's still 5.1 Atmos. Is Atmos really taking up an entire channel's worth of bandwidth on top of normal 5.1? I can't find much information to clear this up.

    Would "pretty much transparent" bitrates for 7.1 EAC3 be 896kbps (7 channels x 128kbps, plus LFE) and for 7.1 Atmos 1024kbps (7 channels x 128kbps, plus LFE, plus 128kbps for Atmos)?

    EAC3:
    5.1 - 640kbps
    5.1 Atmos - 768kbps
    7.1 - 896kbps
    7.1 Atmos - 1024kbps
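
    The per-channel extrapolation behind that table can be sketched in a few lines of Python. This is only a rule of thumb mirroring the reasoning in this thread (roughly 128kbps per full-range channel, LFE treated as free, plus one channel's worth for Atmos metadata); the function name and the flat per-channel rate are my assumptions, not official Dolby guidance.

    ```python
    def eac3_target_kbps(full_channels: int, atmos: bool = False,
                         per_channel: int = 128) -> int:
        """Rule-of-thumb EAC3 target bitrate: ~128 kbps per full-range
        channel (LFE counted as free), plus one extra channel's worth of
        bandwidth when Atmos metadata is carried."""
        return full_channels * per_channel + (per_channel if atmos else 0)

    # Reproduce the table above
    for ch, atmos in [(5, False), (5, True), (7, False), (7, True)]:
        label = f"{ch}.1" + (" Atmos" if atmos else "")
        print(f"{label}: {eac3_target_kbps(ch, atmos)} kbps")
    ```

    Plugging in 5 full channels with Atmos gives 768kbps, which matches what Netflix serves for 5.1 EAC3 Atmos.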

    Assume you have production-quality tools capable of encoding 7.1, and aren't limited to the 5.1 EAC3 ceiling of currently available open-source encoders.
  2. Anybody got an opinion?
  3. Opinions on this are worthless; people won't agree.
  4. DD+ is lossy, not lossless. Bitrates are listed here: https://en.wikipedia.org/wiki/Dolby_Digital_Plus#Technical_details
    Netflix will go with the lowest bitrate possible.
  5. Originally Posted by 4kblurayguru View Post
    DD+ is lossy, not lossless. Bitrates are listed here: https://en.wikipedia.org/wiki/Dolby_Digital_Plus#Technical_details
    Netflix will go with the lowest bitrate possible.
    I know it's lossy. I'm asking for opinions on what is considered "transparent" for 7.1. 640kbps is widely considered transparent for 5.1 DD/DD+; I wonder what the equivalent is for 7.1.

    Just extrapolating the ~128kbps/channel of 5.1 would give 896kbps for 7.1, and probably 1024kbps or more if it included Atmos metadata.

    I have been using commercial tooling to encode DD+ 7.1 at 896kbps, and it sounds the same to me as the source, though I have yet to really put it through its paces on a good set of speakers.
  6. Originally Posted by Msuix View Post
    I know it's lossy. I'm asking for opinions on what is considered "transparent" for 7.1. 640kbps is widely considered transparent for 5.1 DD/DD+; I wonder what the equivalent is for 7.1.

    Just extrapolating the ~128kbps/channel of 5.1 would give 896kbps for 7.1, and probably 1024kbps or more if it included Atmos metadata.

    I have been using commercial tooling to encode DD+ 7.1 at 896kbps, and it sounds the same to me as the source, though I have yet to really put it through its paces on a good set of speakers.
    Just going off what you said about AC3 in your OP: wouldn't the commercial software you use be able to add the Atmos metadata to DD+? Does it let you set the bitrate manually, or is it chosen automatically during conversion?