VideoHelp Forum
  1. Member
    Join Date
    Nov 2013
    Location
    Western Australia
    Hi all. I apologize if this has been asked before, but I couldn't find the answer after doing a search.

    Long story short: I rip Blu-ray movies and then encode them. I encode the audio as AC3 because I've heard it has the best compatibility with old TVs' USB slots etc., which is important to me.

    I am not an audiophile but at the same time I don't want my audio in movies to sound bad.

    What would be the minimum bitrates for AC3 5.1 and 2.0 stereo tracks without the audio sounding noticeably bad or worse?
  2. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    My minimum bitrate for 5.1 is 448 and for stereo it's 192. Others will give you different bitrates, but I like sound quality and never reduce the original size.
    I think, therefore I am a hamster.
  3. Member
    Join Date
    Nov 2013
    Location
    Western Australia
    Originally Posted by johns0 View Post
    My minimum bitrate for 5.1 is 448 and for stereo it's 192. Others will give you different bitrates, but I like sound quality and never reduce the original size.
    Ok, great - those are the exact numbers I've been using, except for 2.0 stereo commentary tracks, where I use 160. That should be fine, yeah, since it's mostly just voices?
  4. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    160 is good enough for voices, but 192 is better for music.
  5. Mr. Computer Geek dannyboy48888's Avatar
    Join Date
    May 2007
    Location
    Texas, USA
    Via ffmpeg I use 96k for mono, 192 for stereo, and 448 for 5.1. It has never sounded lacking, and in Spek the frequency response goes up to 18-22 kHz, depending on the source and the channel layout I chose.
    If all else fails, read the manual.
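    A minimal sketch of the kind of ffmpeg invocation described above - file names are placeholders, `-map 0` keeps every stream, and the bitrates match the numbers in this thread:

    ```shell
    # Keep video and subtitles as-is; re-encode audio to 5.1 AC3 at 448 kbps.
    # input.mkv / output.mkv are placeholder names.
    ffmpeg -i input.mkv -map 0 -c:v copy -c:s copy -c:a ac3 -b:a 448k output.mkv

    # Stereo variant: downmix to 2 channels and use 192 kbps.
    ffmpeg -i input.mkv -map 0 -c:v copy -c:s copy -c:a ac3 -ac 2 -b:a 192k output.mkv
    ```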
  6. Usually 64 kbps per channel is sufficient.
  7. Member Skiller's Avatar
    Join Date
    Oct 2013
    Location
    Germany
    Originally Posted by pandy View Post
    Usually 64 kbps per channel is sufficient.
    Not for AC3. AC3 is about as efficient as MP2 and thus needs practically the same bitrates (192 kbps minimum for stereo).
  8. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    I think pandy meant 5.1, where that's more like 384.
  9. Member Skiller's Avatar
    Join Date
    Oct 2013
    Location
    Germany
    I see - yes, 384 is what I'd consider the bottom acceptable minimum for 5.1 AC3.
    That works alright for 5.1, but 64 kbit/s per channel is certainly not acceptable for stereo.
  10. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    128 kbit/s is acceptable for AC3 stereo, but that's for basic sound.
  11. Originally Posted by Skiller View Post
    Originally Posted by pandy View Post
    Usually 64 kbps per channel is sufficient.
    Not for AC3. AC3 is about as efficient as MP2 and thus needs practically the same bitrates (192 kbps minimum for stereo).
    Nope - MP2 uses different (simpler) coding technology than AC3. AC3 uses the MDCT, so its spectral efficiency is comparable to MP3's. The OP asked for the minimum AC3 bitrate, and 64 kbps per channel is sufficient from the average customer's perspective.
  12. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    I would put AC3's quality and efficiency somewhere between MP2 and MP3: it has more spectral tools than MP2, but they are not used as aggressively as in MP3, and it is of the same era as those algorithms. I find it laughable that anyone would consider 64 kbps/channel to be sufficient, however; at that rate it is noticeably compromised.
    The OP did say "minimum...WITHOUT it sounding noticeably worse" (CAPS are mine).

    Honestly, I don't understand the need to minimize the size of the audio when audio is already such a small percentage of the overall size. WHY make it worse, just for a few measly MB?
    Example: the difference between 5 Mbps video + 128 kbps AC3 audio and 5 Mbps video + 256 kbps AC3 audio ("double" the audio bitrate) is ~115 MB for a two-hour movie whose video alone will already be at least 4.4 GB. This is goofy nickel-and-dime stuff, at the cost of quality.
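    Scott's back-of-the-envelope number is easy to check - the overhead is just the bitrate difference times the running time (shell arithmetic, decimal megabytes assumed):

    ```shell
    # Extra size from raising stereo AC3 from 128 kbps to 256 kbps over 2 hours:
    # (256 - 128) kbit/s * 1000 bit/kbit * 7200 s / 8 bits-per-byte / 1e6 bytes-per-MB
    echo $(( (256 - 128) * 1000 * 7200 / 8 / 1000000 ))   # prints 115 (MB)
    ```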

    But maybe the OP cannot tell the difference. I can, easily. And I'm no "golden ears", regardless of my Audio background.


    Scott
    Last edited by Cornucopia; 28th Mar 2023 at 08:12.
  13. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    I believe the OP was asking so the AC3 wouldn't be at too low a bitrate, since he needed to encode it for compatibility purposes.
  14. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Yes, that is certainly part of it. But:

    1. Support for AC3 is usually yes or no, not based on bitrate. If the playback system supports AC3, it will support just about ANY of the valid AC3 bitrates (high or low). Of course, giving it an invalid bitrate could do weird things, but I don't know of any common tool that produces invalid bitrates.
    2. If the source is not already in AC3 and needs to be converted, that is a valid point, but there is still no need to skimp unnecessarily. And if the source is ALREADY in AC3, there is IMO no need to even convert (and lose quality), regardless of the bitrate.

    Scott
  15. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    If I had to re-encode the audio for playback issues, I would set the bitrate the same as the original audio's, or even a bit higher, to keep near-original quality.
  16. Member Skiller's Avatar
    Join Date
    Oct 2013
    Location
    Germany
    Originally Posted by pandy View Post
    Nope - MP2 uses different (simpler) coding technology than AC3. AC3 uses the MDCT, so its spectral efficiency is comparable to MP3's.
    That's the theory. But have you actually listened to and compared MP2 vs. AC3? I have. Both MP2 and AC3 start to sound noticeably compromised at 160 kbit/s and below, while both are fine at 192. Hence my conclusion: when it comes to bitrate choice, the two can be treated equally, even if AC3 is technically slightly more advanced.
    128 kbit/s AC3 sounds atrocious to my ears (encoded with a reference encoder, by the way - Sonic Foundry Soft Encode), and I do find MP3 acceptable at 128, so I don't think I'm being too picky.


    I absolutely agree with Cornucopia that this is all minuscule for a video, where the video part is so much larger than the audio anyway. My recommendation: just use a slightly higher bitrate and be happy with good-quality audio.
  17. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    MP3 songs at 128 sound bad to me; they have got to be 192 or higher.
  18. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    I have a lot (10k+) of MP3 songs (in addition to other formats), most of which I acquired a decade or so ago when storage capacity was much smaller than it is now. Every one of the 128 kbps and 160 kbps files, if I still like the song, I am hoping to replace with a newly sourced AAC/MP4 at a similar or higher rate, or an MP3 at 192 or higher (224, 256, 320). Because, uuggghh, you can hear the artifacts, and it really grates after a while.

    Here is a great test track for comparing bitrate vs. codec vs. quality: the Theme from Shaft (1971).
    When the hi-hat and wah-wah rhythm guitar are cycling, it is about as demanding on a codec as raw random noise, which is to say pretty demanding. You can instantly tell when you've lost quality: that rhythm track stops being scintillating and starts mushing together. And 128 does not cut it - far from it.

    Scott
  19. Originally Posted by Skiller View Post
    That's the theory. But have you actually listened to and compared MP2 vs. AC3? I have. Both MP2 and AC3 start to sound noticeably compromised at 160 kbit/s and below, while both are fine at 192. Hence my conclusion: when it comes to bitrate choice, the two can be treated equally, even if AC3 is technically slightly more advanced.
    128 kbit/s AC3 sounds atrocious to my ears (encoded with a reference encoder, by the way - Sonic Foundry Soft Encode), and I do find MP3 acceptable at 128, so I don't think I'm being too picky.
    This is fact, not theory: MP2 uses a much simpler approach and thus achieves a much lower spectral coding gain than AC3. You can find source material where even at the highest possible bitrate MP2/AC3 performs as poorly as any comparable lossy codec. And as we live in a free world, anyone can use any bitrate he likes.
    To be honest, it is hilarious to read such passionate comments at a time when a huge share of the population uses wireless Bluetooth headphones or commonly consumes music from monophonic sources such as smartphones...
  20. Member Skiller's Avatar
    Join Date
    Oct 2013
    Location
    Germany
    Originally Posted by pandy View Post
    This is fact, not theory
    Originally Posted by pandy View Post
    You can find source material where even at the highest possible bitrate MP2/AC3 performs as poorly as any comparable lossy codec.
    I'm not arguing with either, but apparently I did not make my point. My point is: what use is a fact if there is no perceivable effect in the real world? You did not answer my question of whether you can hear a difference between MP2 and AC3 at borderline bitrates. I can't, and I tried hard. Both start to sound unacceptable to me at the very same bitrate.


    Originally Posted by pandy View Post
    To be honest, it is hilarious to read such passionate comments at a time when a huge share of the population uses wireless Bluetooth headphones or commonly consumes music from monophonic sources such as smartphones...
    Well, I use neither of the two. Maybe that's why.
  21. Originally Posted by Skiller View Post
    I'm not arguing with either, but apparently I did not make my point. My point is: what use is a fact if there is no perceivable effect in the real world? You did not answer my question of whether you can hear a difference between MP2 and AC3 at borderline bitrates. I can't, and I tried hard. Both start to sound unacceptable to me at the very same bitrate.
    The problem lies in the threshold of perception, not in a lack of artifacts. And my hearing abilities are not part of this thread; it is quite common to treat 64 kbps per channel as the lowest acceptable per-channel bitrate in AC3 - for example, this Dolby information:
    http://www.ctia.org/docs/default-source/fcc-filings/technical-report-of-the-klcs-kjla-...ring-pilot.pdf

    We asked Dolby Labs how many kbps per channel was actually necessary without any perceptible audio degradation. Dolby's answer was 64 kbps per channel. This response seemed too low based on common practice, so we confirmed this several times with Dolby Labs. The question was also posed to Dolby regarding the encoding of a dual mono pair as one commonly finds in Secondary Audio Program ("SAP") and/or Descriptive Video Information ("DVI"). The answer here was 64 kbps, as the two channels were redundant. They stated that the decoder would reproduce it in dual mono because that is the way it is coded to start, and that since the two channels were essentially repeated, additional bits were not needed. In a 5.1 audio configuration, 384 kbps was the specified number. We tested these settings and could hear no difference between 128 kbps for a stereo pair and 192 kbps on our test television receivers. It certainly is possible that there are sets of ears and hi-fi sound systems with which one could perceive a difference. Since the AC-3 system is proprietary to Dolby, there is no one else to ask about this. Dolby states in the referenced paper: "Single channel services may have a bit-rate as low as 32 kb/s. The overall data stream is defined for bit rates ranging from 32 kb/s up to 640 kb/s. The higher rates are intended to allow the basic audio quality to exceed audiophile expectations."
  22. Member Skiller's Avatar
    Join Date
    Oct 2013
    Location
    Germany
    Originally Posted by pandy View Post
    The problem lies in the threshold of perception, not in a lack of artifacts.
    True - I think I was underestimating this factor.

    But Dolby promoting their own technology as needing only 64 kbit/s per channel is something I am very wary of (especially knowing how 128 kbps stereo AC3 sounds).
    Then again, if this is listened to through the flimsy speakers of a flat-panel TV, possibly in a slightly noisy environment, then yes, of course there is no discernible difference between 128 and 192 for presumably any pair of ears on the planet...
  23. Originally Posted by Skiller View Post
    But Dolby promoting their own technology as needing only 64 kbit/s per channel is something I am very wary of (especially knowing how 128 kbps stereo AC3 sounds).
    Then again, if this is listened to through the flimsy speakers of a flat-panel TV, possibly in a slightly noisy environment, then yes, of course there is no discernible difference between 128 and 192 for presumably any pair of ears on the planet...
    Overall I'm not sure how ffmpeg and other open-source AC-3 encoder implementations behave, but perhaps the original Dolby encoder exploits interchannel correlation extensively, and from Dolby's perspective 64 kbps may be just enough for common material where that correlation is quite high. Dolby had rather little experience with lossy coding in AC3's day, which may explain why they use quite different terminology to describe the codec's inner workings - this seems to be what they call 'coupling'.
    http://www.mp3-tech.org/programmer/docs/ac3-flex.pdf
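    For what it's worth, ffmpeg's built-in AC-3 encoder does expose coupling as an encoder option. The option names below (`channel_coupling`, `cpl_start_band`) are recalled from ffmpeg's encoder documentation and should be verified against your build:

    ```shell
    # List the private options of ffmpeg's AC-3 encoder,
    # including the coupling-related ones.
    ffmpeg -h encoder=ac3

    # Sketch: force channel coupling on and let the encoder pick the
    # coupling start band (verify option names with the command above).
    ffmpeg -i in.mkv -c:v copy -c:a ac3 -b:a 384k \
           -channel_coupling 1 -cpl_start_band auto out.mkv
    ```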


