Suppose you took a screenshot of a scene in a 720x480 episode, which was the only resolution available for that episode. Years later, a BD of that series comes out, and you take a screenshot of the same scene from the new 1280x720 BD version. If you open a photo editor (ACDSee Photo Manager 9 in my case) and resize that 720p screenshot down to 720x480, the resized picture will have better quality than the original 480p screenshot.
Does this apply to resizing MKVs too? Or does the resolution just determine the dimensions of the picture, while the quality really rests on the CRF or bitrate you choose when encoding the MKV?
I just started encoding a few weeks ago. I use a .bat file for encoding with --preset veryslow (placebo is just ridiculous) and my CRF varies from 22 to 18. I'm more likely to use 18 and 19 when resizing 1080p to 720p since, so far, the 1080p BD sources I have are either upscales or just plain lousy. I use the 10-bit x264 build and placed it in my user profile on C:\.
These are my laptop specs: Windows 7 Home Premium, HP Pavilion dv6 Notebook PC, Intel Core i7-2670QM @ 2.20GHz, 8GB RAM, 1TB HDD
Now that I've established that, I could go on with my query.
Would resizing 1080p to 720p be better (in terms of quality) than getting an already 720p source? In another thread, I saw someone say "same bitrate = same filesize". Since I mentioned that resizing a 1280x720 picture to 720x480 would obviously show finer quality (given that it isn't an upscale, and since there are more details which you are basically compressing into a smaller size), I was wondering if that applied to encoding with the command line --vf resize:1280,720
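As a sanity check on the "same bitrate = same filesize" claim, here's a rough back-of-the-envelope sketch (Python is used purely for the arithmetic; the function name is made up for illustration, and this ignores container overhead and audio):

```python
# Illustrative arithmetic only: at a fixed average bitrate, video stream size
# depends on duration, not resolution.

def filesize_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate video stream size in megabytes: bitrate x duration."""
    return bitrate_kbps * duration_s / 8 / 1000  # kbit -> megabytes

# A 24-minute episode at 1500 kb/s is the same size whether it's 1080p or 480p;
# the resolution only changes how those bits are spread across the pixels.
episode = 24 * 60  # seconds
size_1080p = filesize_mb(1500, episode)
size_480p = filesize_mb(1500, episode)
print(size_1080p, size_480p)  # prints 270.0 270.0
```

So "same bitrate = same filesize" holds regardless of resolution; what changes is how many bits each pixel gets.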
Also another question about CRF and bitrate:
If my source originally has very high bitrate, say 23.0 Mbps, would re-encoding it to somewhere around CRF 22 give it quality as if re-encoding a 7000 kbps version of that episode to CRF 20 or CRF 19? And since it's a 1080p source, and I'm resizing it to 720p, even if I made it CRF 23, would it still have quality (better or a bit less) like the 7000 kbps to CRF 20?
Since I mentioned that resizing a 1280x720 picture to 720x480 would obviously show finer quality...
I've compared lots of old DVD encodes to newer encodes at the same resolution made from Blu-ray or high-quality 1080p/720p versions (I don't downscale like that any more, as all the players in the house now play HD files). While there are lots of variables, a standard-definition Xvid AVI encode taken from an HD source looks better than an SD Xvid AVI encode at the same dimensions (720x400 etc.) taken from DVD every time. Sometimes they're very similar, sometimes the difference is huge, but when you're resizing down while encoding, the more detail the source has, the more detail the downscaled version can retain. Well, to a point. There's only so much detail an SD resolution can store.
You seem to be making the wrong assumption regarding resizing. Every time you resize down you potentially lose detail. You can't really increase the quality of the encoder settings to compensate and end up with the same amount of detail after reducing the resolution, because the resolution ultimately determines the amount of detail a video can have. The higher the quality you use for encoding the more of that detail might be retained, but the resolution is the upper limit. When you're re-encoding, the encoding quality really only determines how accurately it'll be reproduced.
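To put rough numbers on that upper limit (plain arithmetic, nothing encoder-specific):

```python
# Rough illustration of the "resolution is the upper limit" point:
# the pixel count caps how much detail a frame can carry, whatever the
# encoder settings are.

resolutions = {"1080p": (1920, 1080), "720p": (1280, 720), "480p": (720, 480)}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 1080p carries 2,073,600 samples per frame, 720p carries 921,600, and
# DVD-resolution 480p carries 345,600. Downscaling 720p to 480p keeps only
# 37.5% of the samples, and no CRF setting can put the discarded ones back.
```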
Having said that, just because a video uses a high resolution (1080p or 720p etc.) doesn't mean it has 1080p or 720p worth of detail. You can resize 480p video to 1080p, but while you've used more pixels to encode it, you haven't really increased the picture detail as such. Only you can decide, when resizing a video down, whether you're losing detail and how much. Ideally you'd preview the downsized version before you encode it, because the encoding process can only lose you more.
The source bitrate and resolution are kind of irrelevant to the CRF value you'd use when encoding, whether you're resizing or not. Generally CRF 18 is considered fairly "transparent" using the default x264 settings, and the bitrate used will depend on how hard the video is to compress. Ideally you'd pick a CRF value you're happy with and use it for everything.
The only time I sort of equate resolution to a CRF value is if, for some reason, I want to keep the file size to a minimum. It's all personal preference, but if you want to maintain particular file sizes, there may be some compromise between the resolution you use and the encoding quality. Whether you reduce the resolution, decrease the encoding quality, or do a little of both really comes down to which one loses you the most detail, whether it increases the compression artefacts too much, and what you prefer to look at.
Last edited by hello_hello; 30th Dec 2012 at 06:03.
A lot will depend on the resizing algorithm used. Most people use resizers that sharpen (like Lanczos3). Professional studios use less sharp resizers to reduce aliasing and haloing problems.
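For reference, the "sharpening" character of Lanczos comes from the kernel's negative lobes, which boost edge contrast but can also cause the ringing/haloing mentioned above. A minimal sketch of the standard windowed-sinc formula (a = 3 gives "Lanczos3"):

```python
import math

# The Lanczos kernel: sinc(x) windowed by sinc(x/a), zero outside [-a, a].
# The negative lobes are what make Lanczos-resized video look sharper than,
# say, bilinear, and also what produce halos around hard edges.

def sinc(x: float) -> float:
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def lanczos(x: float, a: int = 3) -> float:
    """Lanczos windowed-sinc kernel; zero outside [-a, a]."""
    if abs(x) >= a:
        return 0.0
    return sinc(x) * sinc(x / a)

print(lanczos(0.0))  # 1.0 at the center tap
print(lanczos(1.5))  # about -0.135: a negative lobe, the source of the "sharpening"
```

Softer resizers (bilinear, or spline variants with smaller negative lobes) trade that sharpness for fewer aliasing and halo problems, which is presumably why studios prefer them.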
And sometimes downloaded files come from over-the-air captures which have less resolution than DVD.
It's like saving a really crappy JPG (say 250x350 px), then finding a bigger scan (say 1143x1800 px) of the same image elsewhere and resizing it down to a 250x350 px JPG. The latter would give you a better (clearer, cleaner) result than the first crappy JPG. Fewer 'artifacts'? I'm not sure what those nasty little dots are when you save as JPG. I think I understand 'artifacts' in MKV as being nasty color stuff (most prominent in dark scenes) which is common in 8-bit. Correct me if I'm wrong.
And yes, I know that re-encoding the same thing again and again would only degrade the quality, which is why I came here to verify about encoding 1080p source to 720p.
Anyway, thanks for all the replies
Where did that "already 480p picture" come from? Movies (film) are normally scanned at high resolution (2K, 4K). The DVD is made by downsizing those scans. If the DVD is less sharp than your downscaled Blu-ray release, that was the intention of the studio. Some crappy DVDs are made from analog tape sources; those naturally have less sharp pictures. Anything recorded from a standard-definition off-air broadcast will be even less sharp.
For example, a new DVD of The Avengers, made by the studio, is very likely going to be better than one you make by downscaling the BD version of the same movie. The only way your version could be better quality is if the studio deliberately made the DVD version poor quality.
A compressed rip of the BD movie has already had its quality reduced. Yes, your downscale of it would be better than a compressed rip of the DVD, but maybe not any better than the original studio DVD. Your downscaled conversion of a downloaded BD rip will not necessarily be any better than a straight rip of the studio 720x480 DVD.
There are a lot of resolutions available out there: 360p, 480p, 720p, 1080p. Some older shows from around 2004 would only have torrents available at 360p, newer ones at 480p, and from 2010 to 2012 you'd most likely find a BD version at last that is fully 720p. I'm not talking about re-encodes; I'm talking about the best available subbed version.
So basically, you'd be more likely to know what I mean if I'd mentioned that the subject I was referring to was anime. Unlike movies, anime tends to be available in 720p and 1080p; 480p is rare (but I don't know for sure, I'm not a movie guy). For anime movies, 480p is more frequent than with live-action movies.
Many retail anime BD releases are actually upscaled SD, including older titles re-released on BD. A true HD release would have to be re-done from the original material, and many studios find it cheaper to upscale SD. It also depends on what is available.
Many live-action BDs are terrible too, almost DVD quality.
So it really depends on the specific source.
I know the tendency is to have huge TV screens. If you have a show that doesn't have all that great a picture quality, sometimes it is best to watch that on a smaller screen and save your time and money.
I have a Hauppauge 150 that captures video at 720x480 at 8000 kb/s constant bitrate. I was testing out some files I converted to .mkv, and even the ones I encoded at 750 kb/s looked good on a 20 inch screen. I bought an LG 3D Blu-ray player and I am testing out various files encoded at various bitrates. Some of those files are 1080i captures from my Hauppauge HD PVR, which I converted to 720p. Those 720p files look beautiful on that 20 inch screen. I am saving up for a 32 inch LED TV that will be a 1080p television. I don't know what my 720x480 videos will look like on a 32 inch TV, especially the ones at 750 kb/s, but it's got to be better than watching them on a 50 inch or larger screen.
If you have a great number of low-quality video files, it is at your discretion to get better-quality ones. It is just a matter of time, effort, and money. Sometimes you can find something on YouTube, even at 360p, that looks better than VHS and nearly as good as DVD.
At a fixed bitrate, bits per pixel and pixel count are inversely proportional: when one goes up, the other goes down. Something has to give, and that's detail.
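Spelled out with rough numbers (the helper function and the film frame rate are my own assumptions, for illustration only):

```python
# Illustrative only: at a fixed bitrate, bits-per-pixel falls as pixel count
# rises, which is why a bitrate that looks fine at 480p can fall apart at 1080p.

def bits_per_pixel(bitrate_kbps: float, width: int, height: int,
                   fps: float = 23.976) -> float:
    """Average bits available per pixel per frame at a given bitrate."""
    return bitrate_kbps * 1000 / (width * height * fps)

print(round(bits_per_pixel(1500, 720, 480), 3))    # prints 0.181 (480p)
print(round(bits_per_pixel(1500, 1920, 1080), 3))  # prints 0.03 (1080p)
```

Same 1500 kb/s, but each 1080p pixel gets roughly a sixth of the bits a 480p pixel does, so compression artefacts show up much sooner.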
I've used an AviSynth script to resize the HD video down to SD and convert the colors, then opened the original HD video in another instance of the player, synced them up, maximised them both on my TV and switched between them. That takes re-encoding out of the equation, and all I see is a loss of detail (with maybe some of the usual resizing artefacts such as jagged edges or halos), or the resized-down version looks a bit blurry by comparison.
I'd agree resizing down can give the appearance of a contrast change. I'm not sure why; maybe it's simply the way the human eye or brain works. Take an HD image and resize it down to SD, watch them both side by side on the same HD screen, and the SD version sometimes appears to have more contrast, but run them both full screen and it doesn't.
Unless, of course, the upscaling of the SD version reverses any contrast change, which makes them look the same again when they're played back at the same size. But even if that's what happens, the end result is that if you play either the HD video or the SD version full screen on the same monitor, the contrast should be the same.
My theory is that a 480p encode taken from an HD source is always going to look better than the same encode from an SD source, simply because the HD source has more detail to begin with, which allows more detail to be retained when resizing. Or, if a sharp(ish) resizer is used, at worst it'll at least have the appearance of retaining more detail simply because it's a little sharper.
Last edited by hello_hello; 10th Jan 2013 at 05:05.
My other half has a 32 inch LCD in her living room. Sitting on her sofa, it's far enough away that almost anything looks good. Well, with the exception of the free-to-air TV where I am; it's usually broadcast in SD and so heavily compressed it's going to suck no matter what size screen you use.
hello_hello: Thanks for this information. Most of my 720x480 MPEG-2 conversions to .mkv are at 1500 kb/s and 1800 kb/s.
On the specific topic the original poster brought up, I would say downconverting from an HD source is going to look pretty good, depending on whether it looked good in the first place. My point is that having a smaller TV somewhere to watch his original SD encodes might save him some time, effort and money.
Although, as pdr said earlier, Blu-Rays are sometimes just upconverted from standard-def and it's often source-specific.
I'd imagine 1500 kb/s and 1800 kb/s should give you pretty good quality at 720x480, though. I've still got the last few encodes sitting on my hard drive, so I had a look. I'm in PAL-land, and after cropping I tend to resize up when encoding DVDs these days, as there are a couple of devices in this house which won't play anamorphic MKVs correctly. Naturally that increases the file sizes a bit. Using a CRF value of 18 each time and the slow x264 speed preset, the lowest-bitrate encode was 1196 kb/s at 1024x428. The highest was 2022 kb/s at 1020x424.
Of course, for DVD encodes I always de-interlace if need be (most DVDs seem to be progressive these days anyway). I don't think I've ever encoded an interlaced video, so I've no idea how that affects bitrates.
Last edited by hello_hello; 10th Jan 2013 at 23:32.