VideoHelp Forum
  1. Member
    Join Date
    Jan 2007
    Location
    Canada
    I'm wondering if someone can offer some advice/expertise on the effects of encoding Dolby Digital (.ac3) at lower bit rates.

    For instance, aside from bit rate/file size, what's the difference between encoding 5.1 mono WAVs to 384kbps or 448kbps?
    Obviously more compression is applied, but what kind of compression? How are the frequencies affected? I read somewhere that the upper frequencies are attenuated more at a lower bit rate, that dynamic range may be affected, and so on.

    I've been encoding to either 384kbps or 448kbps for most of my DVD work, and 640kbps for Blu-rays.
    I'm wondering if I should forgo the lower bit rates due to the amount of loss incurred, or whether the loss is small enough that it would go unnoticed by most people.

    On a side note, I don't want to hear about Dolby Digital being a lossy format. I already know that; there are hundreds of threads and articles talking about DD being lossy. Yet, despite being lossy, DD has been good enough for DVD and Blu-ray production and has been used in theaters for many, many years. So, really, how bad can it be?

    On to the question at hand...
    Thanx in advance
  2. Dinosaur Supervisor KarMa
    Join Date
    Jul 2015
    Location
    US
    64kbps per channel is the lowest bitrate recommended by Dolby for transparency. But some content is more demanding than others, so take what you will from that.

    We asked Dolby Labs, how many kbps per channel was actually necessary without any perceptible audio degradation. Dolby’s answer was 64 kbps per channel. This response seemed too low based on common practice, so we confirmed this several times with Dolby Labs. The question was also posed to Dolby regarding the encoding of a dual mono pair as one commonly finds in Secondary Audio Program (“SAP”) and/or Descriptive Video Information (“DVI”). The answer here was 64 kbps as the two channels were redundant. They stated that the decoder would reproduce it in dual mono because that is the way it is coded to start and that since the two channels were essentially repeated, additional bits were not needed. In a 5.1 audio configuration, 384 kbps was the specified number. We tested these settings and could hear no difference between 128 kbps for a stereo pair and 192 kbps on our test television receivers.
    Source: Page 12 http://www.ctia.org/docs/default-source/fcc-filings/technical-report-of-the-klcs-kjla-...ring-pilot.pdf
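    Just to spell out the arithmetic behind those figures (a quick Python sketch; whether the LFE counts as a full channel in the 384kbps figure is my assumption, not something the report spells out):

    Code:
        # Per-channel arithmetic behind the figures quoted above.
        per_channel_kbps = 64            # Dolby's quoted transparency figure
        print(5 * per_channel_kbps)      # 320 kbps for five full-range channels (cinema 5.0)
        print(6 * per_channel_kbps)      # 384 kbps if the LFE is counted as a sixth channel
        print(2 * per_channel_kbps)      # 128 kbps for a stereo pair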

    Just about every lossy codec will cut the higher frequencies in order to spend its bits on the lower ones. The lower the bitrate, the lower that cutoff drops, to try to preserve quality in the lower frequencies. Dolby AC3 encoders work the same way.
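    If anyone wants to check that low-pass behaviour for themselves, here's a rough sketch (Python, needs numpy and scipy; the file names are placeholders): encode a short clip at two bitrates, decode both back to WAV, and compare how much spectral energy survives above roughly 15kHz.

    Code:
        # Fraction of spectral energy above ~15 kHz in a decoded WAV file,
        # e.g. the same clip encoded to AC3 at 192 and 448 kbps, then decoded back.
        import numpy as np
        from scipy.io import wavfile

        def high_freq_fraction(path, cutoff_hz=15000):
            rate, data = wavfile.read(path)
            if data.ndim > 1:                         # quick mono mixdown for comparison
                data = data.mean(axis=1)
            spectrum = np.abs(np.fft.rfft(data.astype(np.float64)))
            freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)
            return spectrum[freqs >= cutoff_hz].sum() / spectrum.sum()

        print(high_freq_fraction("decoded_192kbps.wav"))   # placeholder file names
        print(high_freq_fraction("decoded_448kbps.wav"))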
    Last edited by KarMa; 21st Apr 2016 at 00:10.
  3. Although.....
    I suspect those tests would have been carried out using a Dolby AC3 encoder, whereas most mere mortals would be using Aften or FFmpeg.

    Not that I'm saying they're bad AC3 encoders, just that at best they're likely only equivalent in quality to the Dolby encoder at a given bitrate.

    And not that I do much AC3 encoding, but if I did I might find myself not specifying a bitrate and letting the encoder decide, which for Aften is 448kbps for 5.1ch. I'm pretty sure ffmpeg is the same. http://aften.sourceforge.net/longhelp.html
    Mind you, the LFE channel doesn't seem to count, and 448kbps is used for 5.0ch too. Maybe that's something to do with DVD compliant bitrates, or maybe it indicates some bitrate wiggle room, or maybe it just means I have no idea what I'm talking about, given I rarely do any AC3 encoding....
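    For what it's worth, if you did want to pin the bitrate down rather than rely on the default, something along these lines should do it via ffmpeg (a minimal Python sketch; -c:a ac3 and -b:a are standard ffmpeg options, the file names are placeholders):

    Code:
        # Minimal sketch: encode a 5.1 source to AC3 at an explicit 448 kbps with ffmpeg.
        # Leaving out "-b:a" lets the encoder pick its own default (Aften documents
        # 448 kbps for 5.1ch; ffmpeg's AC3 default may differ).
        import subprocess

        subprocess.run([
            "ffmpeg", "-i", "input_5.1.wav",          # placeholder input
            "-c:a", "ac3", "-b:a", "448k",
            "output.ac3",
        ], check=True)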
  4. Dinosaur Supervisor KarMa
    Join Date
    Jul 2015
    Location
    US
    Yeah, using the proprietary Dolby encoder is better. I did my own tests with it (using Adobe Media Encoder) and found it to be obviously better than Aften when doing my own ABX tests in Foobar. So with non-Dolby AC3, just use higher bitrates.
    Last edited by KarMa; 21st Apr 2016 at 04:11.
  5. Member
    Join Date
    Jan 2007
    Location
    Canada
    Thanx for the great info guys.
    Not sure where KarMa found the pdf on how broadcasters determine bitrate usage for metadata, audio, video, etc., but it seems like a good read.

    The 64kbps/channel isn't actually a surprise to me though. According to Wikipedia, 320kbps (64 x 5) is what's common for use in theaters.
    Also, the .1 of 5.1 isn't actually part of the DVD specification (correct me if I'm wrong). It's mostly derived from the center channel and constitutes only about 10% of the total bitrate.

    I really need to keep bookmarks/favorites more often for reference. However, here is an interesting article:
    http://www.practical-home-theater-guide.com/dolby-vs-dts.html

    I currently use Sony Vegas and Adobe for my editing/encoding needs. So, yeah I use the proprietary Dolby encoder.
    Audio transparency is what I'm looking for. Sure there are some people and some home theater systems that can perceive small differences, but I'm looking at the vast majority.

    If I wanted to preserve the ORIGINAL data, I'd be using either Dolby TrueHD or DTS Master Audio.
    Thanks again guys, this is exactly what I was looking for.
  6. Member
    Join Date
    Aug 2010
    Location
    San Francisco, California
    Originally Posted by KarMa
    We tested these settings and could hear no difference between 128 kbps for a stereo pair and 192 kbps on our test television receivers.
    Source: Page 12 http://www.ctia.org/docs/default-source/fcc-filings/technical-report-of-the-klcs-kjla-...ring-pilot.pdf
    Why am I not surprised they couldn't hear the difference on their tiny full-range loudspeakers?
  7. Member
    Join Date
    Jan 2007
    Location
    Canada
    Ahhh, let the negativity begin. What would a thread be without it?

    I haven't read the entire pdf yet, but I kinda doubt that in-depth testing would be done ONLY using small full-range speakers.

    The sheer number of DVDs, Blu-rays and theaters that have used this technology for decades suggests the comment JVRaines made is irrelevant, not to mention that I wanted to keep that sort of thing out of this thread.

    Any ACTUAL advice? Suggestions? Proof? Testing? If not, keep it to yourselves or find another post to harass.

    Thanx again.
  8. Dinosaur Supervisor KarMa
    Join Date
    Jul 2015
    Location
    US
    Originally Posted by JVRaines
    Why am I not surprised they couldn't hear the difference on their tiny full-range loudspeakers?
    They never said what speakers they used nor what type of audio they tested with, other than PBS programming. The audio part was not a major part of their test either; testing channel sharing with H.264 was their main goal.

    The bigger authority I was pointing to was Dolby's own statement of what's needed for transparency: 64kbps per channel. Dolby could be giving their honest position, or could be straight up lying in order to make Dolby look better. If it were up to me, everything would be AAC/Opus/FLAC and AC3 would be left in the past, but Dolby does everything they can to stay relevant and keep collecting their decoder/encoder fees. Like how they paid MIT millions to vote for AC3 in the ATSC standard, over MP2.
  9. Originally Posted by ziggy1971
    Also, the .1 of 5.1 isn't actually part of the DVD specification (correct me if I'm wrong). It's mostly derived from the center channel and constitutes only about 10% of the total bitrate.
    According to Wikipedia here it supports 5.1ch.
    https://en.wikipedia.org/wiki/DVD-Video#Audio_data
    The "point one" is the LFE channel.
    https://en.wikipedia.org/wiki/Surround_sound#Low_Frequency_Effects_.28LFE.29_channel

    The same theory would still apply. Even more so. LFE is supposed to be for bass "enhancement". The left/right channels still have to contain all the "required" low frequencies and LFE can be used to enhance that, but without it you shouldn't miss out on anything. The Dolby recommended method of downmixing 5.1ch to stereo doesn't include the LFE channel.
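    For anyone curious, this is roughly what the usual Lo/Ro downmix looks like, with the centre and surrounds folded in at -3dB and the LFE simply dropped (a Python sketch; real decoders take the mix levels from the AC3 metadata, so treat the 0.707 gains as common defaults rather than gospel):

    Code:
        # Typical Lo/Ro stereo downmix: centre and surrounds at -3 dB (0.707), LFE dropped.
        # Actual decoders read cmixlev/surmixlev from the AC3 metadata, so these gains
        # are just the usual defaults.
        def downmix_lo_ro(L, R, C, Ls, Rs, LFE=0.0):
            lo = L + 0.707 * C + 0.707 * Ls
            ro = R + 0.707 * C + 0.707 * Rs
            return lo, ro                             # LFE intentionally unused

        print(downmix_lo_ro(0.2, 0.1, 0.5, 0.05, 0.05, LFE=0.3))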


    Ballparking it in a very ballparky way, I think for a 2 hour movie with 5.1ch audio the difference in file size between 384kbps and 448kbps is around 50-60 MB.

    350 MB(ish) v 400 MB(ish)

    Given the likely file size of a 2 hour 1080p/720p movie anyway, I wouldn't quibble myself.
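    The back-of-an-envelope math, if anyone wants to check it (AC3 is constant bitrate, so it's just bitrate times duration):

    Code:
        # File size = bitrate x duration for a constant bitrate codec like AC3.
        seconds = 2 * 60 * 60                         # 2 hour movie
        for kbps in (384, 448):
            megabytes = kbps * 1000 * seconds / 8 / 1e6
            print(kbps, "kbps ->", round(megabytes), "MB")
        # 384 kbps -> 346 MB, 448 kbps -> 403 MB, a difference of roughly 58 MB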
  10. Member
    Join Date
    Jan 2007
    Location
    Canada
    My apologies; after some more reading/research, it seems my information was inaccurate.

    Also, the .1 of 5.1 isn't actually part of the DVD specification (correct me if I'm wrong).
    See the Wikipedia page at: https://en.wikipedia.org/wiki/Dolby_Digital

    Under "Channel Configurations", it says the .1 or LFE IS a discrete channel.
    Last edited by ziggy1971; 29th Apr 2016 at 17:39.