Ok guys, I bought a Hisense HDTV player yesterday and it should arrive in a few days (next Thursday).
I have a small collection of Blu-ray movies, and there's one movie I've been experimenting with ripping to try on the Hisense.
The movie is a 2009 release, so it's about as recent as they come.
I have two rips of this to test.
One was ripped at
and another was ripped at:
Obviously that's a huge difference, but they're both 1080p, so shouldn't the quality be similar?
My question is: does a higher bitrate equal better quality?
If not, why are there options to rip at different bitrates if the quality would be the same?
Originally Posted by SgtPepper23
Keeping that in mind: if you have a source that's 30 Mb/s (30 million bits per second) and you do two different encodes, one at 1/3 the bitrate of the other, the one with the higher bitrate will be closer to the original, and all other things being equal (such as other encoder settings) the one with the higher bitrate will be of higher quality.
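To put some rough numbers on that, here is a minimal sketch (the `file_size_gb` helper and the 2-hour runtime are my own assumptions, not from the thread) of how much disk space those two bitrates imply:

```python
def file_size_gb(bitrate_mbps, duration_s):
    """Approximate stream size in decimal GB for a given average bitrate."""
    bits = bitrate_mbps * 1_000_000 * duration_s
    return bits / 8 / 1_000_000_000  # bits -> bytes -> GB

two_hours = 2 * 60 * 60  # 7200 seconds, a typical movie length
print(file_size_gb(30, two_hours))  # 27.0 GB at the full 30 Mb/s
print(file_size_gb(10, two_hours))  # 9.0 GB at 1/3 the bitrate
```

The 1/3-bitrate encode is 1/3 the size, which is usually the whole point of re-encoding in the first place.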
Keep in mind that if your source is a 25-30 Mb/s H.264 stream, it makes no sense to re-encode it at a higher bitrate than the source, and frankly it makes little sense to re-encode at a lower bitrate either, even if the resolution is unchanged, because you are discarding data from the video stream.
While I'm at it: if you are going to re-encode to a much lower bitrate, say under 10 Mb/s, you will keep more quality by reducing the resolution to 1280x720 and encoding at that bitrate than by staying at 1920x1080 with the same low bitrate. By dropping to 720p you are distributing those 10 million bits per second among 921,600 pixels per frame (times the frame rate, assume 24 fps) rather than distributing the same 10 million bits among 2,073,600 pixels per frame.
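The pixel arithmetic above works out like this (a quick sketch; the `bits_per_pixel` helper is mine, the 10 Mb/s and 24 fps figures are from the post):

```python
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits available per pixel per frame at a given bitrate."""
    return bitrate_bps / (width * height * fps)

rate = 10_000_000  # the "under 10 Mb/s" target from the post
print(round(bits_per_pixel(rate, 1920, 1080, 24), 3))  # ~0.201 bits/pixel at 1080p
print(round(bits_per_pixel(rate, 1280, 720, 24), 3))   # ~0.452 bits/pixel at 720p
```

At the same bitrate, the 720p encode has more than twice the bits to spend on each pixel, which is why it tends to look cleaner than a starved 1080p encode.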
The only time it makes sense to re-encode to a lower bitrate while keeping the same resolution is when you switch to a more efficient codec: for instance, if your source is MPEG-2 and you re-encode with VC-1 or H.264, you can go to a lower bitrate without sacrificing quality.
1920x1080 at 11,000 kbps is a pretty low bitrate. You should expect artifacts as a result, especially during scenes with smoke, sky, or water.
"Quality is cool, but don't forget... Content is King!"
First of all, if you are downloading videos from the internet, you have no idea what the person who uploaded them did to the video. So bitrate is often not a good proxy for quality.
In general, though, other things being equal, the higher the bitrate the higher the quality, up to the point where adding more bitrate won't substantially improve it.
A 30,000 kbps file is likely to be a straight rip from a Blu-ray disc, i.e., the data was simply copied off the disc with no loss of quality. A 10,000 kbps file is likely to have been re-encoded and to be lower in quality than the straight rip. How much lower depends on many factors, including the nature of the video (an hour-long still frame will look just fine; an hour of high-action video might look like crap), the skill and knowledge of the person doing the compression, and the software that was used.
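If you want to judge a file the way this post describes, you can back out its average bitrate from the file size and runtime. A rough sketch (the helper name and the example sizes are hypothetical; note the result includes audio and container overhead, so it slightly overstates the video bitrate):

```python
def avg_bitrate_kbps(file_size_bytes, duration_s):
    """Average overall bitrate in kbps implied by a file's size and runtime."""
    return file_size_bytes * 8 / duration_s / 1000

# Two hypothetical 2-hour (7200 s) movie files:
print(round(avg_bitrate_kbps(27_000_000_000, 7200)))  # 30000 kbps -> likely a straight rip
print(round(avg_bitrate_kbps(9_000_000_000, 7200)))   # 10000 kbps -> likely re-encoded
```

Tools like MediaInfo report this per stream, but the back-of-the-envelope number is usually enough to tell a straight rip from a re-encode.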