Hey,
I'm really confused right now. I'm rather new to video editing.
My problem is: I've got a DVD with VOB files containing 720x576 MPEG-2 clips. I've also got the same clips in a Blu-ray version: M2TS files containing them as 1280x720 H.264. After encoding the DVD to H.264 and changing the resolution to 1280x720, I compared the movies pixel by pixel and they seem to be 100% identical, although the resolution is declared differently on the other disc.
Well, since it seems not to matter whether the clip comes from the DVD or the Blu-ray, I don't know which one to go with. But the 720p movie has a much higher video bitrate than the 576p one. I don't understand why, since I can't tell the visuals apart.
__________________________________________________
THAT WAS MY FIRST QUESTION.
THE SECOND ONE:
Some clips look better in 720p than in 576p, so I'm thinking of encoding all the 576p clips to H.264 720p as well, also because H.264 gives me thumbnails on the files. But when I encode them, although the picture will look exactly the same as the native 720p clips, the bitrate will stay exactly what it was in the MPEG-2 source. So should I somehow raise the bitrate, so that encoding to a higher resolution won't be bottlenecked, or won't that happen? I can imagine there being unnoticeable improvements if I do. Or is the 720p bitrate too much? It seems to work fine with half the bitrate of the former 576p clip.
__________________________________________________
Oh, and if those small, easy questions haven't bothered you too much, this one is my biggest problem:
Some particular clips, when encoded to H.264, drastically shift their colors towards warm, despite the increased resolution. Here's the difference, 576 vs 720:
The left one is the 576 and the right one is the 720, which I scaled down in Paint to compare them better. The 576 has the original colors, while the 720 natively comes with these fake colors straight after ripping. And when I encode the 576 (by the way, I use the free Any Video Converter for this), its colors change to the fake ones too. I'd like to know both how to prevent the color change when encoding and how to get the original colors back on the 720. I couldn't find any color filter option in Any Video Converter.
So basically I have three problems. I'd be tremendously happy for any help.
-
Last edited by draig-llofrudd; 5th Dec 2016 at 03:52.
-
Check the colorimetry (it must be correct: SD usually uses BT.601 and HD usually uses BT.709). Besides this, HD versions (remasters or improved transfers) frequently have some tweaks to contrast and saturation to create the impression of a more vivid picture (a marketing approach to show the advantage of HD over SD).
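To illustrate the colorimetry point with a quick sketch of my own (not from any tool mentioned in this thread): decoding the very same Y'CbCr pixel with the BT.601 constants versus the BT.709 constants gives noticeably different RGB, which is exactly the kind of shift you can see when an encoder or player applies the wrong matrix. Full-range maths for simplicity; the sample pixel value is arbitrary.

```python
# Sketch: why the same Y'CbCr pixel looks different under BT.601 vs BT.709.
# Full-range conversion; Cb/Cr are centred on 128.

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Generic Y'CbCr -> R'G'B' from the colour-matrix constants Kr/Kb."""
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * (cr - 128)
    b = y + 2.0 * (1.0 - kb) * (cb - 128)
    g = y - (2.0 * kb * (1.0 - kb) * (cb - 128)
             + 2.0 * kr * (1.0 - kr) * (cr - 128)) / kg
    return tuple(round(max(0.0, min(255.0, v))) for v in (r, g, b))

# BT.601 (SD): Kr=0.299, Kb=0.114.  BT.709 (HD): Kr=0.2126, Kb=0.0722.
pixel = (120, 90, 170)  # an arbitrary warm-toned sample
rgb_601 = ycbcr_to_rgb(*pixel, kr=0.299, kb=0.114)    # -> (179, 103, 53)
rgb_709 = ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722)  # -> (186, 107, 49)
print("BT.601:", rgb_601)
print("BT.709:", rgb_709)
```

Same stored pixel, visibly different red/blue balance — so a tool that decodes HD with the SD matrix (or vice versa) will warm or cool the whole picture.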
-
I'd start by not using Any Video Converter, or at least check for any colour-enhancing options that might be enabled. I'd try Handbrake or VidCoder instead. They're not hard to use and won't mess with the colours. Are you sure the DVD and Blu-ray sources don't look just as different? I could be wrong, but that seems to be more than just a colorimetry difference (high definition and standard definition are converted to RGB slightly differently on playback, so upscaling or downscaling can cause the colours to look a little different).
Generally it pays to use CRF (quality based) encoding. You pick the quality but the bitrate will vary quite a bit, depending on how hard the video is to compress and the resolution etc. Somewhere around CRF18 (or it might be called constant quality in Handbrake/Vidcoder) should give you quite high quality. I don't know if Any Video Converter has that option. When you pick a bitrate you're effectively picking the quality in advance, without knowing what the quality will be, because different sources require different bitrates to achieve the same quality, depending on how hard they are to compress.
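If you ever move from a GUI tool to the command line, a quality-based encode along these lines would look something like this with ffmpeg's x264 encoder (the filenames are placeholders and the settings are a sketch, not a definitive recommendation):

```shell
# Quality-based (CRF) H.264 encode: keep the source resolution and
# let the bitrate vary with the content. CRF 18 is roughly "visually
# high quality"; lower CRF = higher quality and bitrate.
ffmpeg -i input.vob \
       -c:v libx264 -crf 18 -preset slow \
       -c:a aac -b:a 192k \
       output.mp4
```

Note there's no video bitrate to guess in advance — the encoder spends whatever each scene needs to hit the chosen quality level.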
The type of source (mpeg2 or h264 etc) and the source bitrate are fairly irrelevant. The video is decoded and the encoder just sees the uncompressed video it has to re-compress. If you use quality based encoding the quality will always be roughly the same, relative to the source (for a given quality setting). Sometimes the encode will have a much lower bitrate, sometimes around the same, and occasionally higher. The encoder is oblivious to the source bitrate.
There's not much point increasing the resolution when encoding, as a general rule. There's more video to encode with no extra detail. Usually it's better to encode it "as-is" and let the player/TV upscale it on playback as required. If it's 576p, I'd encode it that way.

Last edited by hello_hello; 5th Dec 2016 at 13:29.
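To put a number on the "more video to encode with no extra detail" point — a trivial sketch of my own: upscaling PAL SD to 720p more than doubles the pixel count per frame, so at the same quality the encoder needs roughly proportionally more bitrate while carrying zero new information.

```python
# Pixel-count comparison between a PAL DVD frame and a 720p frame.
sd_pixels = 720 * 576    # 414,720 pixels per frame (PAL DVD)
hd_pixels = 1280 * 720   # 921,600 pixels per frame (720p)
ratio = hd_pixels / sd_pixels
print(f"720p has {ratio:.2f}x the pixels of 576p")  # prints 2.22x
```

That ~2.2x factor is a rough ceiling on how much extra bitrate the upscale can consume just to preserve the same quality it already had.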
-
Everything is clear now. I won't increase the resolution, because it doesn't make sense; the player itself will upscale it for me when I switch to fullscreen. It's possible the colours are caused by the increased resolution, but somehow it also happens when re-encoding to the same resolution. It's not important to me anymore, though. I'll just leave the files untouched and not worry about the bitrate.