I've been trying to better understand video compression: bitrate, resolution, fps, VBR and such.
I came up with a question that may help me better understand video compression.
With a resolution of 1080p/i or 720p/i, do you need the bitrate to be at a certain minimum, or within a certain range, to achieve that resolution? Because if your bitrate is set too low, won't the result not be a true 1080 or 720?
Is there a chart that gives the bitrate settings used to achieve a wanted resolution?
I just found this link, http://forum.videohelp.com/threads/332905-Bitrate-for-true-1080p-or-720p, which asks much the same question, and the answer is no. But if you have 1080 resolution with a bitrate of 3000, doesn't that diminish the quality so that it's not a true 1080 video? Something has got to give when encoding video.
If you want a small size you sacrifice quality, and vice versa: if you want quality you have to give up size.
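To put rough numbers on that trade-off: at a given average bitrate, file size is just bitrate times duration, so halving the bitrate halves the size. A quick sketch (the 3000 kb/s figure echoes the one mentioned above; the audio bitrate and 90-minute runtime are my own example values):

```python
def file_size_mb(video_kbps, audio_kbps, seconds):
    """Approximate file size as (video + audio bitrate) * duration.
    Ignores container overhead, which typically adds a few percent."""
    total_kilobits = (video_kbps + audio_kbps) * seconds
    return total_kilobits / 8 / 1000  # kilobits -> kilobytes -> megabytes

# A 90-minute encode at 3000 kb/s video + 192 kb/s audio:
print(round(file_size_mb(3000, 192, 90 * 60), 1))  # 2154.6 (MB)
```

So "wanting quality" at a fixed resolution really means accepting whatever size that arithmetic produces.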
Thread: Bitrate and Resolution
Last edited by AndreL; 13th Jan 2013 at 23:03.
I've done some more reading on resolution and bitrate and kind of see that they are not directly related, but I was wondering about the problems that come with too high or too low a bitrate for a given resolution.
For instance, with a 720 or 1080 resolution, how high a bitrate is recommended before you're just wasting hard drive space and the eye can't tell the difference?
Same question for low bitrates, where you notice poor quality in the video.
I think I know what you may say: "It depends" on whether the video is high, moderate, or low in action.
Could you give me a range for each, or a link to a web page with a table or spreadsheet?
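For a very rough starting point, some people use a "bits per pixel" heuristic: bitrate divided by (width × height × fps). Figures around 0.1 bpp are often quoted as reasonable for H.264, lower for static content, higher for noisy or high-action content. This is my own sketch of that rule of thumb, not an authoritative table:

```python
def suggested_bitrate_kbps(width, height, fps, bpp=0.1):
    """Bits-per-pixel rule of thumb: roughly 0.05 bpp for easy,
    static content up to 0.15+ for noisy, high-motion content (H.264).
    Purely a starting point, not a guarantee of quality."""
    return width * height * fps * bpp / 1000

print(round(suggested_bitrate_kbps(1280, 720, 30)))   # 2765 (kb/s)
print(round(suggested_bitrate_kbps(1920, 1080, 30)))  # 6221 (kb/s)
```

The "it depends" caveat from the replies below still applies: two sources at the same resolution can need wildly different bitrates.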
The main problem is that, as you wrote yourself, "it depends". There are no exemplary clips out there, and the bitrate required also depends on the encoder and format used.
Even if you choose a common format, let's say H.264, there are huge differences in the bitrate different encoders need to achieve a similar quality level.
Since a lot of people struggle with choosing a good bitrate, the developers of x264 (an H.264 encoder implementation) added a rate control mode called 'constant rate factor' (CRF), which is meant to hold a certain level of quality (i.e. limit the loss of detail) without taking the bitrate into account.
-> If you use x264 for lossy encoding, most people stick with CRF 16-18 for archiving and CRF 20-22 for normal watching. (x264 also has a lossless mode, but you're probably not aiming for lossless compression.)
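For what it's worth, a CRF encode is easy to drive through ffmpeg's libx264 wrapper, which exposes x264's presets and CRF values. This little Python helper just assembles the command line (the filenames are placeholders, and the defaults mirror the ranges above):

```python
def crf_encode_cmd(src, dst, crf=20, preset="medium"):
    """Build an ffmpeg command line for a single-pass CRF encode
    using libx264 (ffmpeg's x264 wrapper). Returns the argument
    list; run it with subprocess.run() if ffmpeg is installed."""
    return ["ffmpeg", "-i", src,
            "-c:v", "libx264", "-preset", preset, "-crf", str(crf),
            dst]

cmd = crf_encode_cmd("input.mkv", "output.mkv", crf=18, preset="slow")
print(" ".join(cmd))
# ffmpeg -i input.mkv -c:v libx264 -preset slow -crf 18 output.mkv
```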
It definitely depends, which is why, when encoding with the x264 encoder, you only select a bitrate/file size if you need to hit a specific bitrate/file size. Otherwise, use the quality-based, single-pass encoding method (CRF, or constant rate factor).
The x264 encoder basically encodes exactly the same way whether you use 2-pass or single-pass encoding, assuming the same encoder settings are used each time. This means if you encode a video using a particular CRF value, note the resulting average bitrate, and then run a 2-pass encode specifying that bitrate, the two encodes will be virtually identical. Keep in mind a 2-pass encode takes longer than a single-pass encode, unless you use faster (lower-quality) encoder settings for it. Here's a quote from one of the x264 developers:
Given the same amount of encoding time, CRF is superior to 2-pass at the same bitrate. Given the same settings rather than the same time, they're effectively identical within margin of error. My recent tests show that CRF generally has a very slight edge, albeit the difference is so small that you'd have to have OCD to care.
So the recommended encoding method when using the x264 encoder is to use its built-in speed presets and tunings. Ideally you'd pick the slowest speed preset you can live with, and then a CRF value that gives you the quality you want (a given CRF value only gives the same quality each time when the same x264 settings are used). You'll get the same quality relative to the source, and the bitrates/file sizes will vary accordingly.
And they will vary quite a lot, which is why it's hard to offer a "one size fits all" resolution/bitrate table. The amount of noise in a source will affect how hard it is to compress, as will the sharpness of the resizer if you're resizing. As an example, I recently looked at a few PAL DVD encodes I still had sitting on my hard drive while discussing this subject in another thread. Using a CRF value of 18 each time and the slow x264 speed preset, the lowest-bitrate encode was 1196 kb/s at 1024x428. The highest was 2022 kb/s at 1020x424.
Ultimately you'd use CRF encoding if you want to specify the quality each time, while the file size will be unknown, and 2-pass encoding if you want to specify the file size and let the quality vary. The default x264 speed preset is medium, and at a low enough CRF value (20 is the default) it'll give you very good quality. If you're really fussy you might choose a CRF value of 18 and a slower speed preset. Some people go as low as CRF 16, but the file sizes tend to be relatively large.
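The "CRF first, then 2-pass at the resulting bitrate" comparison described above can be sketched the same way, again by assembling ffmpeg/libx264 command lines (a sketch only; pass 1 writes a stats file and discards its video output, pass 2 reuses the stats; use NUL instead of /dev/null on Windows):

```python
def two_pass_cmds(src, dst, bitrate_kbps, preset="medium"):
    """Build the two ffmpeg invocations of a 2-pass libx264 encode
    at a given average bitrate, e.g. the bitrate a previous CRF
    encode produced. Returns (pass1_cmd, pass2_cmd)."""
    common = ["-c:v", "libx264", "-preset", preset,
              "-b:v", f"{bitrate_kbps}k"]
    # Pass 1: analysis only, no audio, output thrown away.
    pass1 = (["ffmpeg", "-y", "-i", src] + common +
             ["-pass", "1", "-an", "-f", "null", "/dev/null"])
    # Pass 2: the real encode, using the stats gathered in pass 1.
    pass2 = ["ffmpeg", "-i", src] + common + ["-pass", "2", dst]
    return pass1, pass2

p1, p2 = two_pass_cmds("input.mkv", "output.mkv", 1850, preset="slow")
```

Per the developer quote above, running these at the bitrate a CRF encode produced should give a virtually identical result, just in twice the time.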
PS. Not that it's something you need to think about, but as you mentioned frame rate in your original post, I thought I'd mention the x264 encoder is "frame rate aware". I've tried it myself, just to see. I encoded a video at its original 23.976fps, and then again at 25fps using the same x264 CRF value and settings. The 25fps encode produced a slightly smaller file. I don't recall the difference being anything but very small, but there was a difference.
Not really. A 1080p or 720p rip is still 1080p or 720p regardless of bitrate. But at a certain point, if there isn't enough bitrate, the picture is just going to look really crappy.
Every video requires a different bitrate to keep from losing detail and breaking up into blocky artifacts. The amount required depends on the frame size, frame rate, amount of detail, amount of motion, amount of noise, even the brightness of the picture. Each person's tolerance for loss of quality is different. The viewing environment also matters. Watching a video on a 19 inch screen from 10 feet away will make it much harder to see quality loss than watching the same video on a 70 inch screen from 10 feet away. What looks ok on a 2 inch cell phone screen may not look good on a 70 inch TV screen.
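The screen-size/viewing-distance point can be made concrete with a bit of geometry: what matters is how many pixels land in each degree of your field of view (roughly 60 px/degree is often cited as the resolving limit of 20/20 vision). A rough sketch, assuming flat 16:9 screens and 1080p material:

```python
import math

def pixels_per_degree(h_pixels, diag_inches, distance_inches, aspect=16 / 9):
    """Horizontal pixels per degree of viewing angle for a flat
    screen. Simple geometry only; ignores off-axis viewing."""
    width = diag_inches * aspect / math.hypot(aspect, 1.0)
    angle = 2 * math.degrees(math.atan(width / 2 / distance_inches))
    return h_pixels / angle

# The same 1080p video from 10 feet (120 inches) away:
print(round(pixels_per_degree(1920, 19, 120)))  # 243 on a 19" screen
print(round(pixels_per_degree(1920, 70, 120)))  # 67 on a 70" screen
```

At ~243 px/degree the 19" screen packs far more detail into each degree than the eye can resolve, so compression flaws vanish; the 70" screen sits right around the visual limit, so the same flaws become visible.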