Hi all. I apologize if this has been asked before but I actually couldn't find the answer after doing a search.
Long story short, I rip Blu-ray movies and then encode them. I encode the audio as AC3 because I heard it has the broadest compatibility with old TVs' USB ports etc., which is important to me.
I am not an audiophile but at the same time I don't want my audio in movies to sound bad.
What would be the minimum bitrates for AC3 5.1 and 2.0 Stereo tracks without it sounding noticeably bad/worse?
My minimum bitrate for 5.1 is 448 and stereo is 192. Others will give you different bitrates, but I like sound quality and never reduce the original size.
160 is good enough for voices but 192 is better for music.
I think pandy meant 5.1 where that's like 384.
I see, yes, 384 would be what I'd consider the bottom acceptable minimum for 5.1 AC3.
This works alright for 5.1, but certainly 64 KBit/s per channel is not acceptable for stereo.
128 kbit is acceptable for AC3 stereo, but that's for basic sound only.
I would put AC3 quality & efficiency at some midpoint between MP2 and MP3. It has more spectral tools than MP2, but not as aggressively used as in MP3. And it is of the same era as those algorithms. I find it laughable that anyone would consider 64kbps/channel to be sufficient, however. At that rate it is noticeably compromised.
The OP did say "minimum...WITHOUT it sounding noticeably worse" (CAPS are mine).
Honestly, I don't understand the need to minimize the size of audio, when audio is already such a small percentage of the overall size. WHY make it worse, just for a few measly MB?
Example: the difference between 5 Mbps video + 128 kbps AC3 audio vs. 5 Mbps video + 256 kbps AC3 audio ("double" the bitrate) is ~100MB for a 2-hour movie which is already going to be at least 4.4GB for the video alone. This is goofy nickel & dime stuff, at the cost of quality.
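If anyone wants to sanity-check that arithmetic, here is a quick back-of-the-envelope sketch (the runtime and bitrates are just the figures from the example above; decimal MB/GB):

```python
# Rough numbers for the example: 2-hour movie, 5 Mbps video,
# 128 kbps vs. 256 kbps AC3 audio.
runtime_s = 2 * 60 * 60          # 7200 seconds
delta_kbps = 256 - 128           # the extra audio bitrate being argued over
video_kbps = 5000                # 5 Mbps video, for scale

extra_audio_mb = delta_kbps * runtime_s / 8 / 1000    # kbit -> kB -> MB
video_gb = video_kbps * runtime_s / 8 / 1_000_000     # kbit -> kB -> GB

print(f"Extra audio from doubling the bitrate: ~{extra_audio_mb:.0f} MB")  # ~115 MB
print(f"Video alone: ~{video_gb:.1f} GB")                                  # ~4.5 GB
```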
But maybe the OP cannot tell the difference. I can, easily. And I'm no "golden ears", regardless of my Audio background.
Scott
I believe the OP was asking so the AC3 wouldn't be at too low a bitrate, since he needs to encode it for compatibility purposes.
Yes, that is certainly part of it. But
1. Support for AC3 is usually yes or no, not based on bitrate. If the playback system supports AC3, it will support just about ANY of the valid AC3 bitrates, high or low (the spec-defined rates are listed in the sketch below). Of course, giving it an invalid bitrate could do weird things, but I don't know of any common tool that produces invalid bitrates.
2. If the source is not already in AC3 and it needs to be converted, this is a valid point, but one where there is still no need to skimp unnecessarily. If the source is ALREADY in AC3, there is IMO also no need to even convert (and lose quality), regardless of the bitrate.
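To make point 1 concrete, these are the discrete rates the AC-3 spec (ATSC A/52) defines; a throwaway sketch for checking a requested figure against them (purely illustrative, not any particular tool's behavior):

```python
# Bitrates defined by the AC-3 spec (ATSC A/52), in kbps.
AC3_BITRATES_KBPS = [32, 40, 48, 56, 64, 80, 96, 112, 128, 160,
                     192, 224, 256, 320, 384, 448, 512, 576, 640]

def is_valid_ac3_bitrate(kbps: int) -> bool:
    """A player that handles AC3 handles any of these; other values are invalid."""
    return kbps in AC3_BITRATES_KBPS

print(is_valid_ac3_bitrate(448))  # True
print(is_valid_ac3_bitrate(200))  # False - not a rate the spec defines
```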
Scott
If I had to recode the audio for playback issues, I would set the bitrate the same as the original audio or even a bit higher, to keep close to the original quality.
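A rough sketch of that approach, assuming ffprobe/ffmpeg are on the PATH and "input.mkv"/"output.mkv" are placeholder filenames: probe the source's audio bitrate, then pick the nearest spec-defined AC3 rate at or above it.

```python
import subprocess

# Valid AC-3 bitrates per the spec, in kbps (same list as the earlier sketch).
AC3_BITRATES_KBPS = [32, 40, 48, 56, 64, 80, 96, 112, 128, 160,
                     192, 224, 256, 320, 384, 448, 512, 576, 640]

def source_audio_kbps(path: str) -> int:
    """Read the first audio stream's bitrate via ffprobe.
    Assumes the container actually reports one (some print 'N/A')."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "a:0",
         "-show_entries", "stream=bit_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True).stdout.strip()
    return int(out) // 1000   # bits/s -> kbps

def next_ac3_bitrate(kbps: int) -> int:
    """Smallest valid AC-3 rate at or above the source rate (caps at 640)."""
    return next((b for b in AC3_BITRATES_KBPS if b >= kbps), 640)

target = next_ac3_bitrate(source_audio_kbps("input.mkv"))
subprocess.run(["ffmpeg", "-i", "input.mkv", "-c:v", "copy",
                "-c:a", "ac3", "-b:a", f"{target}k", "output.mkv"], check=True)
```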
That's the theory. But have you actually listened and compared MP2 vs. AC3? I have. Both MP2 and AC3 start to sound noticeably compromised at 160 KBit/s and below, while both are fine at 192. Thus my conclusion: when it comes to bitrate choice, both can and should be treated equally, even if AC3 is technically slightly more advanced.
128 KBit/s AC3 sounds atrocious to my ears (encoded with a reference encoder btw, Sonic Foundry Soft Encode), and I do find MP3 to be acceptable at 128, so I don't think I'm being too picky.
I absolutely agree with Cornucopia about this all being minuscule for a video, where the video part is so much larger than the audio anyway. My recommendation: just use a slightly higher bitrate and be happy with good-quality audio.
MP3 songs at 128 sound bad to me; gotta be 192 or higher.
I have a lot (10k+) of MP3 songs (in addition to other formats), most of which I acquired a decade or so ago when storage capacity was much less than it is now. Every one of the 128 kbps and 160 kbps ones, if I still like the song, I am hoping to replace with a newly sourced AAC/MP4 at a similar or higher rate, or an MP3 at 192 or higher (224, 256, 320). Because uuggghh, you can hear the artifacts and it really grates after a while.
Here is a great test track to compare bitrate vs. codec vs. quality: the Theme from Shaft (1977).
When the hi-hat and wah-wah rhythm guitar are going cyclically, it is about as demanding on a codec as raw random noise, which is to say pretty demanding. You can instantly tell when you've lost quality: that rhythm track stops being scintillating and starts mushing together. And 128 does not cut it. Far from it.
Scott
This is fact, not theory - MP2 uses a much simpler approach and as such has much lower spectral coding gain than AC3. You can find source material where even at the highest possible bitrate MP2/AC3 will perform as poorly as any similar lossy codec. And as we live in a free world, anyone can use whatever bitrate he likes.
To be honest, it is hilarious to read such passionate comments in times when a huge share of the population uses wireless Bluetooth headphones or commonly consumes music from monophonic sources such as smartphones...
I'm not arguing with either, but apparently I did not make my point. My point is, what use is a fact if there is no perceivable effect in the real world? You did not answer my question whether you can hear a difference between MP2 and AC3 at borderline bitrates. I can't and I tried hard. Both start to sound unacceptable to me at the very same bitrate.
Well, I don't use either of the two. Maybe that's why.
The problem lies in the threshold of perception, not in a lack of artifacts. And my hearing abilities are not part of this thread; it is quite common to use 64 kbps per channel as the lowest acceptable per-channel bitrate for AC3 - for example, this Dolby information:
http://www.ctia.org/docs/default-source/fcc-filings/technical-report-of-the-klcs-kjla-...ring-pilot.pdf
We asked Dolby Labs, how many kbps per channel was actually necessary without any
perceptible audio degradation. Dolby’s answer was 64 kbps per channel. This response seemed
too low based on common practice, so we confirmed this several times with Dolby Labs. The
question was also posed to Dolby regarding the encoding of a dual mono pair as one commonly
finds in Secondary Audio Program (“SAP”) and/or Descriptive Video Information (“DVI”). The
answer here was 64 kbps as the two channels were redundant. They stated that the decoder
would reproduce it in dual mono because that is the way it is coded to start and that since the two
channels were essentially repeated, additional bits were not needed. In a 5.1 audio configuration,
384 kbps was the specified number. We tested these settings and could hear no difference
between 128 kbps for a stereo pair and 192 kbps on our test television receivers. It certainly is
possible that there are sets of ears and Hi-Fi sound systems that one could perceive a difference.
Since the AC-3 system is proprietary to Dolby, there is no one else to ask about this. Dolby
states in the referenced paper: “Single channel services may have a bit-rate as low as 32 kb/s.
The overall data stream is defined for bit rates ranging from 32 kb/s up to 640 kb/s. The higher
rates are intended to allow the basic audio quality to exceed audiophile expectations.”
True, I was underestimating this factor, I think.
But Dolby promoting their own technology as needing only 64 KBit/s per channel is something I am very wary of (especially knowing how stereo 128 AC3 sounds).
Then again, if this is listened to through the flimsy speakers of a flat TV, possibly in a slightly noisy environment, then yes, of course there is no discernible difference between 128 and 192 for presumably any pair of ears on the planet...
Overall, not sure how ffmpeg and other open-source AC-3 encoder implementations behave, but perhaps the original Dolby encoder extensively exploits interchannel correlation, and from Dolby's perspective 64 kbps per channel can be just enough for common material where interchannel correlation is quite high. Dolby in AC3 times had rather little experience with lossy coding, and this may explain why they use quite different terminology to describe the codec's inner workings - this seems to be what they call 'coupling'.
http://www.mp3-tech.org/programmer/docs/ac3-flex.pdf