When I want to render an MP4 in Sony Vegas, or a VBR MPEG-2 in my encoder, they all ask for average, maximum, and sometimes minimum bitrates. How can I calculate the maximum and minimum after I have worked out how much average bitrate I need for the project?
For example, last time I needed to render to MPEG-2. I calculated that the video should be 5644 kb/s to fit on the DVD, so I set that as the average. The encoder asked for a maximum bitrate as well, but I had no clue what to set there, so I tried various values. I saw they did not alter the output file size significantly, so I set the max to 8000 kb/s as what I assumed was a safe value for DVD players.
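For reference, the way I calculated that average was along these lines (a rough Python sketch; the duration, audio bitrate, and overhead below are placeholder assumptions, not my exact project numbers):

Code:
# Target average video bitrate so the project fits on a single-layer DVD.
# All inputs are placeholders for illustration.
disc_bytes   = 4.7e9       # single-layer DVD capacity (decimal gigabytes)
duration_s   = 95 * 60     # assumed project length: 95 minutes
audio_kbps   = 224         # assumed AC-3 audio bitrate
mux_overhead = 0.04        # assumed ~4% muxing/filesystem overhead

total_kbps = disc_bytes * 8 / 1000 / duration_s       # whole disc budget in kb/s
video_kbps = total_kbps * (1 - mux_overhead) - audio_kbps
print(f"average video bitrate target: {video_kbps:.0f} kb/s")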
But I have no idea whether that 8000 kb/s maximum is actually fine.
Sometimes encoders ask for a minimum bitrate as well, which I also have no clue how to determine - Sony Vegas does this, for example, when you encode with certain MP4 templates.
Is there a way to calculate these values properly?
The maximum bitrate is usually predefined by the specifications of the devices which shall be able to play the result. In the case of DVD-Video, the maximum bitrate is related to the reading speed of a DVD drive at default (1×) speed (10.8 Mbps gross including control data, 9.8 Mbps net for all content streams, so subtract audio and some other overhead ... giving you a maximum for the video bitrate probably between 8.5 and 9.2 Mbps). Blu-ray will have its limits as well ... but I didn't learn those by heart.
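As a rough sanity check of that subtraction (Python; the 9.8 Mbps net budget is the figure above, while the audio bitrate and the allowance for subpictures are assumptions):

Code:
# Ceiling for the DVD video bitrate: net stream budget minus everything
# that is not video. Audio and subpicture allowances are assumed.
net_budget_kbps = 9800   # net budget for all content streams (see above)
audio_kbps      = 224    # assumed AC-3 audio track
subpic_kbps     = 300    # assumed allowance for subpictures/other overhead

max_video_kbps = net_budget_kbps - audio_kbps - subpic_kbps
print(f"video bitrate ceiling: ~{max_video_kbps} kb/s")  # ~9276 kb/s here

With a heavier audio track (multichannel or PCM, say) the ceiling drops accordingly, which is why a range is quoted above rather than a single number.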
The minimum bitrate is related to the filling level of the video decoding buffer. It must not drop below a value where the drive could accidentally read more than one GOP at once, even for the simplest video, like a black-screen intermission; in such a case the video stream gets padded with junk bytes. I read of 300 kbps for DVD-Video, years ago; 1 Mbps is a pretty safe value.
If your intended playback is exclusively from a hard disk on a PC, you don't have to care that much.
I see. Previously I thought these values needed to be determined relative to the average bitrate, but I see they depend more on hardware specs. However, I read somewhere that the maximum bitrate should give VBR videos at least 5-10% of headroom above the average. Do you think this is a useful reference point besides the hardware-related factors?
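To make that concrete with my earlier numbers (just illustrating the rule of thumb I read, not asserting it is correct):

Code:
# Max bitrate from the 5-10% headroom rule of thumb mentioned above.
avg_kbps = 5644               # the average from my DVD example
max_low  = avg_kbps * 1.05    # ~5926 kb/s with 5% headroom
max_high = avg_kbps * 1.10    # ~6208 kb/s with 10% headroom
print(f"suggested max: {max_low:.0f}-{max_high:.0f} kb/s")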
Average bitrate is an artificial concept - the question is: averaged over what? For example, if you have your video size in bits (i.e. the number of bytes multiplied by 8) and the video duration, then the average video bitrate is simply size divided by duration: a 100 MB video is 800 Mb, so if the duration is 100 seconds, the average video bitrate is 800 Mb / 100 s = 8 Mbps.
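In code form that definition is just the following (Python; the file size and duration are the sample numbers from the example above):

Code:
# Average bitrate = total size in bits / duration in seconds.
size_bytes = 100 * 10**6     # 100 MB (decimal), as in the example
duration_s = 100             # seconds

avg_bps = size_bytes * 8 / duration_s
print(f"average bitrate: {avg_bps / 1e6:.1f} Mbps")  # 8.0 Mbps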