VideoHelp Forum
  1. Member
    Join Date: Apr 2020
    Location: NE England
    Hi, this is my first post as a newbie. I am also getting on in years (I can remember WW2), so please be patient.
    I recently downloaded AVC intending to convert a 42 MB file (approx. 1.5 mins) to MP4. The resulting MP4 is 24 MB regardless of what Video size I select. I would have thought that there would be some sort of relationship between file size and video size. Can anyone help?
  2. Member DB83
    Join Date: Jul 2007
    Location: United Kingdom
    No such relationship.

    File size = run length * bitrate

    So your new file has a lower bitrate than the original.
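    To see this with the numbers from the first post (a rough sketch -- I am reading "approx. 1.5 mins" as 90 seconds and treating the sizes as megabytes):

```python
# File size = run length * bitrate, so bitrate = file size / run length.
# The 90-second duration is an approximation of "approx. 1.5 mins".

def average_bitrate_mbps(file_size_mb: float, duration_s: float) -> float:
    """Average bitrate in megabits per second (1 megabyte = 8 megabits)."""
    return file_size_mb * 8 / duration_s

DURATION = 90.0  # seconds

print(f"original: {average_bitrate_mbps(42, DURATION):.2f} Mbps")  # ~3.73 Mbps
print(f"MP4 out:  {average_bitrate_mbps(24, DURATION):.2f} Mbps")  # ~2.13 Mbps

# The MP4 comes out at 24 MB simply because the converter encoded it at a
# lower average bitrate; the chosen video size never enters the calculation.
```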
  3. johnmeyer
    I think every single person has this same question when they first start encoding. After all, if each video frame has more resolution (e.g., HD vs. SD), the file size should be larger, right? I know I initially thought exactly the same thing, but it is wrong.

    However, it turns out that, as DB83 says, the number of bits per second ("bitrate"), multiplied by the running time, is all that determines file size. Once this is pointed out, you will suddenly have an "ah ha" moment where you realize that "bits per second" is telling you -- as the phrase clearly states -- how many bits the file uses for every second of video. You then, as DB83's formula shows, multiply that by how long the video runs, and you get the total number of bits, which is the same thing as the file size.

    The implication of this, of course, is that if you DO have a higher-definition file (1920x1080 vs. 640x480, for instance), then if you keep the bitrate the same for each of them, there will be a lot fewer bits available to encode each frame of the higher-res file, and it will not look very good.
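    To put rough numbers on that (just a sketch -- the 4 Mbps bitrate and 30 fps below are arbitrary picks for illustration):

```python
# At a fixed bitrate, more pixels per frame means fewer bits per pixel.
# The 4 Mbps bitrate and 30 fps are arbitrary illustrative assumptions.

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average bits available per pixel at a given bitrate and frame rate."""
    return bitrate_bps / (width * height * fps)

BITRATE = 4_000_000  # 4 Mbps
FPS = 30

print(f"640x480:   {bits_per_pixel(BITRATE, 640, 480, FPS):.3f} bits/pixel")    # ~0.434
print(f"1920x1080: {bits_per_pixel(BITRATE, 1920, 1080, FPS):.3f} bits/pixel")  # ~0.064

# The HD frame gets roughly 1/6.75 as many bits per pixel, which is why it
# will look worse than the SD frame at the same bitrate.
```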

    What is impossible to say -- and there must be a million posts about this -- is the exact relationship between visual quality, bitrate, and resolution. For instance, if you double the resolution, you don't have to double the bitrate to keep the same quality. It gets even more complicated if you then also factor in frame rate. Once again, your initial thinking is that if you have twice as many frames per second (60 fps instead of 30 fps), you would need to double the bitrate to get the same quality. However, codecs mostly encode the differences between frames, and when you double the frame rate there is much less difference between successive frames (at a trillion frames per second, consecutive frames would be so similar that you probably couldn't tell them apart), so you need very few extra bits to describe those differences.

    So, bitrate (times running time) is the only thing which determines file size, but many things (bitrate, resolution, frames/second, the type of encoder/codec used) interact to determine the quality of the video.
  4. Member DB83
    Join Date: Jul 2007
    Location: United Kingdom
    johnmeyer makes a very good point. I was under the selfsame misunderstanding when I started encoding all those years ago.

    It was a time when HDDs were small and quite expensive, and, since I was only doing this for myself, I thought it would be jolly to halve the frame size and so save half the file size.

    And there was no VideoHelp around then to ask why the video stayed the same size.
  5. sneaker
    Originally Posted by johnmeyer
    The implication of this, of course, is that if you DO have a higher-definition file (1920x1080 vs. 640x480, for instance), then if you keep the bitrate the same for each of them, there will be a lot fewer bits available to encode each frame of the higher-res file, and it will not look very good.
    You mean pixel?
  6. Member
    Join Date: Apr 2020
    Location: NE England
    Hi DB83 and johnmeyer.
    Many thanks for the prompt and informative replies; they were much appreciated. After the third reading, a tiny glimmer of light began to penetrate the darkness, then voila!
    I have been trying various sources for at least two days; some of the replies were dubious at best, but most were downright wrong. So once again, thank you. No doubt you will be hearing from me again - Barry
  7. Originally Posted by sneaker
    You mean pixel?
    Yes, the HD file has more pixels.
  8. Member Cornucopia
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    I think many people assume there is a relationship between rez, framerate, etc., and bitrate.

    And in ONE sense there is: uncompressed video. Just like with uncompressed audio.

    H rez * V rez * color bitdepth * framerate = bitrate

    So:
    1920 * 1080 * 16 bits * 29.97 fps (for NTSC-type YUV 4:2:2) = ~994 Mbps.

    But that's a ridiculous bitrate. Nobody but pros saves HD as uncompressed, much less higher formats.
    And once compression algorithms are involved, there is no longer any fixed relationship to the uncompressed figure. Only the compressed bitrate matters from then on.
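    Here is that arithmetic as a quick sketch (the SD line is just an extra comparison, not part of the example above):

```python
# Uncompressed bitrate: H rez * V rez * color bitdepth * framerate.
# 16 bits/pixel corresponds to YUV 4:2:2.

def uncompressed_mbps(width: int, height: int, bits_per_pixel: int, fps: float) -> float:
    """Uncompressed video bitrate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1_000_000

print(f"1920x1080 @ 29.97: {uncompressed_mbps(1920, 1080, 16, 29.97):.0f} Mbps")  # ~994
print(f"720x480   @ 29.97: {uncompressed_mbps(720, 480, 16, 29.97):.0f} Mbps")    # ~166
```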

    Scott
  9. Originally Posted by Cornucopia
    I think many people assume there is a relationship between rez, framerate, etc., and bitrate.

    And in ONE sense there is: uncompressed video.
    That is a really good point. I'd forgotten that, since uncompressed video goes through no coder/decoder (codec), the number of bits is simply whatever is needed to capture that number of pixels, with a certain number of bits for each pixel to describe luma and chroma, all times the number of frames you want per second. When you multiply all those big numbers together, you end up with an unbelievably large result. I wonder what it is for 4K at even 30 fps?
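    Actually, Scott's formula makes that easy to work out (assuming UHD 3840x2160 and the same 16 bits/pixel for 4:2:2):

```python
# 4K UHD at 30 fps, uncompressed, assuming 16 bits/pixel (YUV 4:2:2):
bitrate_bps = 3840 * 2160 * 16 * 30
print(f"{bitrate_bps / 1e9:.2f} Gbps")  # ~3.98 Gbps -- about 4x the 1080 figure
```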

    I remember, when digital video started, that the only way those early editing systems could keep up with the fantastic data rate required for even low-res SD video was with RAID arrays connected through the expensive SCSI interface.


