My videos are all 2,000 kbps, 30 fps, 1920x1080 MP4s. Is that a good bitrate for an average 1080p video? Can a video at this bitrate actually be 1080p quality, or is it lower? A 2-hour video comes out to about 2 GB.
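For context, the ~2 GB figure follows directly from the bitrate: size is just bitrate times duration. A quick sanity check (a sketch assuming "kbps" means 1000 bits per second, video stream only, ignoring audio and container overhead):

```shell
# Sanity-check the file size implied by a 2,000 kbps, 2-hour video stream.
# Assumes kbps = 1000 bits/s; ignores audio and container overhead.
bitrate_kbps=2000
duration_s=$((2 * 3600))    # 7200 seconds
size_mb=$((bitrate_kbps * 1000 / 8 * duration_s / 1000000))
echo "${size_mb} MB"        # prints "1800 MB"
```

So ~1.8 GB of video, plus audio, is right in line with the ~2 GB files described above.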
2000 kbps is very low quality. Exactly how low depends on the nature of the particular videos and the codecs that are used. For example, Blu-ray h.264 video is typically 10 to 15 times higher than that.
I read once that with x264 you need a bitrate of about 8,000 kbps for 1080p video to actually deliver 1080p resolution, and I've come to agree with this. Lower bitrates end up looking like lower-res video that's been run through a sharpening filter, with too many artifacts. With x265 I think you could probably get true 1080p with at least 5 or 6 Mb/s.
Here's how you should look at it:
If your videos are ALL of the same kind of content - the same amount of motion, dynamic range, edits & overlays (timing & # of layers), complexity of detail, types of objects - THEN and ONLY THEN would you expect your bitrates to be comparable.
Otherwise, you should expect your bitrates to vary quite a bit based on content. That's why quality-focused encoding uses CRF instead of VBR or, even worse, CBR.
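To make the CRF point concrete, a typical quality-targeted x264 encode with ffmpeg looks like the sketch below. The filenames are placeholders, and the CRF value is a matter of taste (lower = higher quality and larger files; around 18-23 is the commonly used range for libx264):

```shell
# Quality-targeted (CRF) encode: the encoder spends bits where the content
# needs them instead of holding a fixed bitrate.
# input.mp4 / output.mp4 are placeholder filenames.
ffmpeg -i input.mp4 -c:v libx264 -crf 20 -preset slow -c:a copy output.mp4
```

The resulting bitrate will then vary from video to video, which is exactly the behavior described above.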
I compare my 2,000 kbps H.264 MP4 movies to the HD movies on TV and there is little difference, except in dark scenes. That's where I can see the pixelation, but the lighter scenes look 90% as good as a regular 1080p TV movie.
1080p x264 with stripped DTS core
Some other things you'll notice if you look closely:
Loss of small, low-contrast details, like film grain, small wrinkles on actors' faces, fuzzy sweaters, etc. This can cause posterization in shallow gradients -- like blue skies, dark areas, etc. The video I posted above shows posterization in the skies -- but that was in the source I started with (a cable TV broadcast).
Rough edges on moving objects. This is hard to see if the objects are moving quickly. But it's more obvious when they're moving at slow to moderate speeds.
Also keep in mind that "1080p" is 24 fps when dealing with movies, not 30 fps or 60 fps video. And the frame is usually not 1920x1080 but more like 1920x816. That's ~25 percent fewer pixels to deal with.
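The pixel-count difference is easy to verify, assuming a typical 2.39:1 scope movie with a 1920x816 active picture area:

```shell
# Compare pixel counts: full 1080p frame vs. a typical scope-movie frame.
full=$((1920 * 1080))       # 2,073,600 pixels
scope=$((1920 * 816))       # 1,566,720 pixels
echo "$(( (full - scope) * 100 / full ))% fewer pixels"   # prints "24% fewer pixels"
```

Fewer pixels per frame (and fewer frames per second at 24 fps) means the same bitrate stretches further on film content than on 30 or 60 fps full-frame video.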
As mentioned (and I forgot to say earlier), something like anime generally doesn't need as much bitrate as live action. However, Pixar-type animation does.
And with good anime, using 10-bit color makes more of a difference to me than having 1080p. Seriously, you don't know just how good Miyazaki pictures (or the original Ghost in the Shell) are if you haven't seen them in 10-bit color.
Many films/TV shows, especially newer ones, are always panning the camera to give an illusion of motion where there isn't really any in the scene. This will gobble up more bits too.
Also, I don't think you always need 1080p video. For a lot of things I watch, DVD resolution is just fine. However, if it's something like Lord of the Rings, with thousands of little animated Orcs onscreen, yes, I do want HD.