VideoHelp Forum




  1. Hi all,

    I often use Sony Vegas to edit/extend songs that I have downloaded. Most of the songs I buy are in WAV format, and according to MediaInfo they are 16-bit with a bitrate of 1411.2 kbps. When I render the audio project I always just choose the 16-bit WAV option, because I don't want to lose any quality: I know that WAV is a lossless format, and it's exactly the same as the lossless files I'm starting from.

    Now, sometimes I edit songs that I have bought from iTunes (I usually buy from other sites), and as most of you probably already know, iTunes audio files are AAC at 256 kbps. The thing I didn't know until recently, though, was that they are 32-bit. Up until now I have been rendering them as 16-bit WAV files because I don't want to lose any quality (I know the resulting file size is larger, but size doesn't bother me).

    So, my question is, am I actually losing quality when converting from 32-bit AAC to 16-bit WAV, even though WAV is technically a lossless format?

    I know that I can't actually hear any difference in quality, but, I'm just asking more out of curiosity than anything, because I don't really understand how bit depth affects the quality of an audio file. I know that bitrate is an important factor when deciding how much quality you want to keep, but bit depth I do not understand...

    Any help would be greatly appreciated.

    Thank you!
  2. AAC isn't 32 bit. Lossy audio formats such as AAC and MP3 etc don't have a fixed bitdepth, so they're not 16 bit or 24 bit or 32 bit.

    When they're decoded, they need to be decoded to a fixed bitdepth. The greater the bitdepth, the more accurately they can (in theory, at least) be decoded, but you'd probably have magic ears if you can hear a difference (below 16 bit you might). It's also common for audio programs to apply dithering (adding a small amount of random noise) when decoding to a fixed bitdepth format such as a wave file, which randomises the rounding errors.
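    Just to make the dithering bit concrete, here's a rough NumPy sketch of the idea (purely an illustration - the sine wave and the scaling are made up for the example, not anything a particular program does):

```python
import numpy as np

# Stand-in for decoded lossy audio: floats in the range -1.0..1.0.
sr = 44100
t = np.arange(sr) / sr
signal = 0.5 * np.sin(2 * np.pi * 440 * t)

def to_16bit(x, dither=False):
    """Scale -1..1 floats to 16-bit samples, optionally with TPDF dither."""
    scaled = x * 32767.0
    if dither:
        # Triangular (TPDF) dither: roughly +/-1 LSB of random noise added
        # before rounding, which turns the rounding error into plain noise
        # instead of distortion correlated with the signal.
        scaled += np.random.random(x.shape) - np.random.random(x.shape)
    return np.clip(np.round(scaled), -32768, 32767).astype(np.int16)

plain = to_16bit(signal)
dithered = to_16bit(signal, dither=True)

# Either way the error is on the order of one LSB (~1/32768 of full scale),
# which is why a 32-bit float decode saved as 16-bit WAV is inaudibly different.
print(np.max(np.abs(plain / 32767.0 - signal)))
print(np.max(np.abs(dithered / 32767.0 - signal)))
```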

    A decent audio program would probably import lossy audio as 32 bit (probably even import 16 bit wave files as 32 bit too, or at least apply 32 bit processing to them), as then you can edit and apply effects etc with minimum quality loss due to rounding errors, but most of the time you'd probably export the finished audio as a 16 bit wave file (the greater the bitdepth, the larger the file size). Or export it while re-encoding to a lossy format again.

    The greater the bitdepth, the greater the number of values that can be applied to any specific sample of the audio. For 16 bit it's 65,536 different values, for 24 bit it's 16,777,216, and for 32 bit it's a really big number (2 to the power of 32, about 4.3 billion). 16 bit is considered enough to encode a waveform accurately, and higher bitdepths are mostly used when mixing/editing and applying digital effects (to minimise rounding errors), or for the placebo effect audiophiles seem to enjoy. You don't lose quality by converting a wave file to a higher bitdepth (ie 16 bit to 24 bit), but there'll be rounding errors when converting down. They're actually called quantisation errors. So technically, going from a higher bitdepth to a lower one isn't lossless in that the audio won't be exactly the same, but it won't be a difference you can hear as long as you don't go lower than 16 bit.
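    To put rough numbers on the value counts, and on what rounding down actually throws away, a throwaway Python snippet (the sample value is just an arbitrary example):

```python
# Number of distinct values a sample can take at each bit depth.
for bits in (16, 24, 32):
    print(bits, "bit:", 2 ** bits, "values")
# 16 bit: 65536 / 24 bit: 16777216 / 32 bit: 4294967296

# Going up (16 -> 24 bit) is lossless: every 16-bit value maps to an exact
# 24-bit value. Going down (24 -> 16 bit) squeezes many 24-bit values onto
# the same 16-bit value - that's the quantisation error.
sample_24bit = 8_388_607             # an arbitrary 24-bit sample value
as_16bit = sample_24bit >> 8         # drop the 8 least significant bits
back_to_24bit = as_16bit << 8
print(sample_24bit - back_to_24bit)  # 255: the detail that can't be recovered
```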

    Different lossy encoders, even different AAC encoders, can accept input audio of various bitdepths for encoding, and some will accept a 32 bit input, but once it's encoded the audio no longer has a fixed bitdepth. It will be decoded to a fixed bitdepth again at some stage though (ie playback).

    By the way, if you're just editing (cutting without any effects), MP3DirectCut will edit AAC audio losslessly. It needs to be a raw AAC file though (not MP4 or M4A etc) which can be a bit of a pain. It's the only program I know of that does. Other programs would decode it, edit, then re-encode it when exporting, which means if you re-encode to a lossy format again, you lose a bit of quality.
    Last edited by hello_hello; 29th Aug 2015 at 10:04.
  3. Member Cornucopia
    AAC and other transform encoding formats don't have a fixed "bit depth", but their inherent bit depth can never be any better than the bit depth of the lossless/uncompressed source file/stream/decoding their samples were based on.

    And technically you could have multiple decoders that each had better or worse decode depth capabilities, so you could decode 24->32*->24 (uncompressed->*transform compressed->uncompressed), or 24->32*->16, etc. (going down this way would be the equivalent of dropping the LSBs from an LPCM stream, say 24->16 - with or without dither, depending). This is one of the areas where the design ($?) of the codec can help determine the final quality.

    Scott
  4. Thanks for the detailed response guys!

    Ok, so you both said that lossy audio formats don't have a fixed bit-depth, so am I safe to assume that I am not losing any quality when converting my AAC files to 16-bit WAV?

    The only reason I thought the AAC files were 32-bit was because when I imported one to Sony Vegas the other day, I noticed that it said Audio: 44,100 Hz, 32 Bit (IEEE Float), Stereo, AAC

    I never knew the bit-depth of the AAC files before because when I checked with MediaInfo it didn't have any information on bit-depth. But I guess that is because, as you say, lossy audio formats have no fixed bit-depth.
  5. Member Cornucopia
    Of course you lose quality in encoding - hence the name "lossy" encoding. It uses complicated math and intelligent rules based on psychoacoustics to give you an educated guess at what something MIGHT sound like using (much?) less data. The less data for the guess, the less close the guess is to the real thing, and the less quality you get.

    Not counting possible very, very minor errors due to rounding in the math (or bad codec design implementation), there is NO loss when DECODING any format to LPCM, however.

    Re: what is the bitdepth?
    Remember that the bit depth of an uncompressed signal is a measure of the number of bits used to code a certain quality for EACH SAMPLE POINT.
    Lossy-compressed signals, however, only store an array that is a transform of a block/group of samples (not the samples themselves, but a transform of them). Apples & oranges.
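    If it helps to picture the "transform" part, here's a toy Python/SciPy illustration. It uses a plain DCT as a stand-in for the filterbank a real AAC encoder uses, and the block size is arbitrary - it's only meant to show that the encoder works on coefficients, not on samples:

```python
import numpy as np
from scipy.fft import dct, idct

# One block of PCM samples - each sample has a bit depth (say 16 or 24 bit).
sr, n = 44100, 1024
t = np.arange(n) / sr
block = 0.5 * np.sin(2 * np.pi * 440 * t)

# The encoder works on a transform of the block. These coefficients are
# frequency-domain values, so "bit depth per sample" doesn't apply to them;
# a lossy codec quantises or discards them based on a psychoacoustic model.
coeffs = dct(block, norm="ortho")

# The decoder inverts the transform and only then writes fixed-bitdepth PCM.
decoded = idct(coeffs, norm="ortho")
print(np.max(np.abs(decoded - block)))  # ~1e-16: the transform itself loses nothing
```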

    Scott
  6. Originally Posted by Ronaldinho
    Ok, so you both said that lossy audio formats don't have a fixed bit-depth, so am I safe to assume that I am not losing any quality when converting my AAC files to 16-bit WAV?
    You won't be losing quality converting to 16 bit wave, as long as you keep the audio as wave (or in a lossless format). If you convert to AAC again at some stage you'll lose a bit more due to the lossy encoding.

    Do you keep your audio as wave files? Most of us would compress wave files with a lossless encoder such as FLAC to reduce the file size. If you're using iTunes you'll probably need a different format (I don't think iTunes supports FLAC), such as Apple Lossless (ALAC). ALAC can be put in an MP4/M4A container, so you can tag the files as you'd tag MP4/M4A files containing AAC audio. Assuming you're into that sort of thing. Being lossless, they can be converted to wave and back as much as you like without losing quality.
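    If you ever want to convince yourself that the lossless round trip really is bit-exact, something like this would do it (Python with the soundfile module; "song.wav" is obviously just a placeholder filename):

```python
import numpy as np
import soundfile as sf

# Read a 16-bit WAV, write it out as FLAC, read the FLAC back in,
# and check the samples are bit-for-bit identical.
data, rate = sf.read("song.wav", dtype="int16")
sf.write("song.flac", data, rate, subtype="PCM_16")   # lossless compression
round_trip, _ = sf.read("song.flac", dtype="int16")

print(np.array_equal(data, round_trip))  # True: wave <-> FLAC loses nothing
```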

    Thinking about it..... can you buy audio in a lossless format from iTunes? I don't know, but if you can it might be a better idea.
  7. Thanks for your reply

    And no, unfortunately iTunes does not offer lossless formats, just AAC. I always choose lossless when available. But most of the time I don't buy from iTunes anyway.
  8. I don't know if this is still accurate, but according to Wikipedia you can encode with ALAC and iTunes if you have Quicktime installed. Unless that only applies to Macs? https://en.wikipedia.org/wiki/Apple_Lossless#History

    My suggestion was more to use a lossless format iTunes supports for playback. I'd be astounded if it didn't support ALAC. I had the impression you were using iTunes as your media player. If not.... I guess it doesn't matter, but QAAC supports ALAC encoding.
  9. 16 bit is not recommended for signal processing (but it is OK as the final target format). If your goal is to edit audio, I would suggest using at least 24-bit PCM (or 32-bit integer or float PCM).
    The dithering and/or noise shaping mentioned earlier may be suboptimal if 16-bit PCM is used for signal processing/editing.
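    A crude Python example of why: one gain change done two ways (real editing chains many more operations, which is where the rounding error piles up; the random data just stands in for a track):

```python
import numpy as np

rng = np.random.default_rng(0)
# Pretend this is a 16-bit track being edited.
original = rng.integers(-32768, 32767, size=44100, dtype=np.int16)

def gain_16bit(x, factor):
    """Apply a gain and round straight back to 16-bit integers."""
    return np.clip(np.round(x.astype(np.float64) * factor), -32768, 32767).astype(np.int16)

# 16-bit workflow: every intermediate step is rounded back to 16 bit.
edited_16bit = gain_16bit(gain_16bit(original, 0.1), 10.0)

# Float workflow: keep intermediate results in floating point and
# convert to 16 bit once, at the very end.
edited_float = np.clip(np.round(original.astype(np.float64) * 0.1 * 10.0),
                       -32768, 32767).astype(np.int16)

print(np.max(np.abs(edited_16bit.astype(int) - original.astype(int))))  # several LSBs of error
print(np.max(np.abs(edited_float.astype(int) - original.astype(int))))  # 0
```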


