VideoHelp Forum
  1. Member (Join Date: Apr 2012, Location: Hungary)
    When I want to render an MP4 in Sony Vegas, or an MPEG-2 in VBR mode in my encoder, they all ask me to set average, maximum and sometimes minimum bitrates. How can I calculate the maximum and minimum once I have calculated how much average bitrate I need for the project?

    For example, last time I needed to render to MPEG-2. I calculated that the project should be 5644 kb/s to fit on the DVD, and set that as the average. The encoder asked for a maximum bitrate as well, but I had no clue what to set there, so I tried various values; since they did not alter the output file size significantly, I set the max to 8000 kb/s as a safe value for DVD players.
    But I have no idea whether that is fine.

    Sometimes encoders ask for a minimum bitrate as well, which I also have no clue how to determine. Sony Vegas does this when you encode to certain MP4 templates, for example.


    Is there a way to calculate these values properly?
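
    For reference, this is roughly how I arrived at the 5644 kb/s average (a sketch in Python; the disc capacity, runtime, audio bitrate and overhead below are assumptions to illustrate the method, not my exact project values):

    Code:
    # Average video bitrate that fits a target disc size.
    # All input values here are illustrative assumptions.
    disc_capacity_bytes = 4.7e9 * 0.96   # single-layer DVD, ~4% reserved for filesystem/mux overhead
    duration_s = 105 * 60                # total runtime in seconds
    audio_kbps = 192                     # bitrate of the audio track

    total_kbps = disc_capacity_bytes * 8 / 1000 / duration_s  # overall budget in kb/s
    video_kbps = total_kbps - audio_kbps                      # what is left for the video stream
    print(f"average video bitrate: {video_kbps:.0f} kb/s")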
  2. Member (Join Date: Mar 2008, Location: United States)
    When I used TmpgEnc 2.5 with 2-pass VBR encoding, I always set the min to about 1000 and the max to 9000.
    These settings are not that critical; they just give the encoder plenty of wiggle room as it tries to maintain the average.
  3. Member (Join Date: Aug 2013, Location: Central Germany)
    The maximum bitrate is usually predefined by the specifications of the devices that are supposed to play the result. In the case of DVD-Video, the maximum is tied to the reading speed of a DVD drive at default (1x) speed: 10.08 Mbps maximum multiplexed rate including control data, with a hard cap of 9.8 Mbps for the video stream alone. Subtract audio and some other overhead, and you get a practical maximum video bitrate of roughly 8.5 to 9.2 Mbps. Blu-ray has its limits as well... but I didn't learn those by heart.

    The minimum bitrate is related to the fill level of the video decoding buffer. It must not drop below a level where the drive could accidentally read more than one GOP at once, even for the simplest video, like a black intermission screen; in such a case the video stream gets padded with stuffing bytes. I read of 300 kbps for DVD-Video years ago; 1 Mbps is a pretty safe value.

    If your intended playback is exclusively from a hard disk on a PC, you don't have to care that much.
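
    If you want to see how those limits combine, here is a small sketch (Python; the spec constants are the DVD-Video figures mentioned above, while the audio bitrate and overhead are assumed example values):

    Code:
    # Clamp encoder max/min settings to DVD-Video limits.
    DVD_MAX_MUX_KBPS = 10080     # max multiplexed rate: video + audio + subpictures + control
    DVD_MAX_VIDEO_KBPS = 9800    # hard ceiling for the video stream alone
    audio_kbps = 448             # assumed AC-3 audio track
    overhead_kbps = 150          # rough allowance for muxing/subpictures (estimate)

    max_video_kbps = min(DVD_MAX_VIDEO_KBPS, DVD_MAX_MUX_KBPS - audio_kbps - overhead_kbps)
    min_video_kbps = 1000        # comfortably above the ~300 kb/s buffer floor mentioned above
    print(f"max = {max_video_kbps} kb/s, min = {min_video_kbps} kb/s")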
  4. Member (Join Date: Apr 2012, Location: Hungary)
    I see. Previously I thought these values needed to be determined relative to the average bitrate, but I see they are more a matter of hardware specs. However, I read somewhere that at least 5-10% headroom above the average should be allowed for the max bitrate of VBR videos. Do you think that is a useful reference point besides the hardware-related factors?
  5. Originally Posted by Bencuri
    I see. Previously I thought these values needed to be determined relative to the average bitrate, but I see they are more a matter of hardware specs. However, I read somewhere that at least 5-10% headroom above the average should be allowed for the max bitrate of VBR videos. Do you think that is a useful reference point besides the hardware-related factors?
    Average bitrate is an artificial concept; the question is: average over what? For example, if you have the video size in bits (i.e. the number of bytes multiplied by 8) and the video duration, then the average video bitrate is simply size divided by duration. A 100 MB video is 800 Mb; if its duration is 100 seconds, the average bitrate is 800 Mb / 100 s = 8 Mbps.
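
    A minimal sketch of that calculation in Python, with the 5-10% headroom rule of thumb from the previous post tacked on (the headroom figure is a convention people quote, not a spec value):

    Code:
    # Average bitrate from file size and duration, plus a headroom guess for the VBR max.
    size_bytes = 100_000_000   # 100 MB video file (example value)
    duration_s = 100           # example duration in seconds

    avg_mbps = size_bytes * 8 / 1_000_000 / duration_s   # 800 Mb / 100 s = 8 Mbps
    max_mbps = avg_mbps * 1.10                           # +10% headroom above the average
    print(f"average: {avg_mbps:.1f} Mbps, suggested max: {max_mbps:.1f} Mbps")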


