I'd heard before that 960x540 was considered an "HD" format, and Final Cut Pro lists this resolution as HD as well. I always thought HD was defined by frame size (1920x1080 or 1280x720).
So out of curiosity, I decided to do some Blu-ray encodes of Breaking Bad episodes. I did a 2-pass encode and went lower on average bitrate because of the reduced dimensions. All I can say is 1500kbps H.264 @ 960x540 looked... WOW. When I compared the same episode in 720p, the 960x540 seemed to maintain all the sharpness/detail and looked basically identical (and I scored 20/20 on a recent eye exam). The added bonuses were the reduction in file size (a 47 min episode at 570MB is DAMN GOOD) and the encode finishing 45-50 minutes faster because of the smaller frame size.
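(That file size checks out, too: 1500kbps of video over 47 minutes works out to roughly 530MB, with the audio track making up most of the rest.)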
Last edited by rdeffley; 11th Jan 2015 at 03:03.
Apple tried to promote this format for a short time using the "iFrame" logo standard. It's a bastard, orphan format, and about the only good things about it are that it interpolates easily to 1920x1080 and that it uses only intra-frame compression (handy for editing). It also maxes out at 25p or 30p (no 50/60 of any kind), which makes it limited. A BAD IDEA.
By the definition of most media engineers, ~576 is the vertical cutoff for Standard Definition (e.g. 1024x576, 768x576, 720x576); anything at or below it is SD, so the idea that 540 lines ought to be called HD is silly, even if it is meant to be a "proxy" for HD.
re: your test encode...
Of course, given a certain bitrate setting (and codec), a smaller frame size will look cleaner, with fewer artifacts, every time. Try the same bitrate with 640x480 or even with 320x240 - each smaller one requires that much less compression (compared to an uncompressed version OF THE SAME SMALL SIZE). And of course, the smaller you go, the less computing power it takes and the quicker it is to encode/decode. No surprise there. But is it sharper? - NO. You could take this to the extreme and say you could encode 160x120 at 1500kbps and it would be a perfect encode compared to a 160x120 master. Why? Because an uncompressed RGB frame at that size is only ~450 kilobits, so 1500kbps barely has to compress anything. But it's not sharp, it's fuzzy as hell.
And conversely, if you take the same master and try a 1920x1080p or, God forbid, a 4K encode AT THE SAME BITRATE, it is going to look worse and worse (more and more compression relative to its uncompressed master, and so more and more artifacting). Guess what, you just learned a fundamental rule of encoding: BITRATE, QUALITY, RESOLUTION, and DETAIL are interrelated but distinctly different things.
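To put rough numbers on it (assuming ~24fps material): 1500kbps spread over 960x540 is about 0.12 bits per pixel, over 1280x720 about 0.07, over 1920x1080 about 0.03, and over 160x120 more than 3 bits per pixel. Same bitrate, wildly different compression ratios.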
And I'm sorry, but regardless of your eye exam, there IS a difference in detail between the 720p and the 540. Your ability to notice those things may be based less on visual acuity and more on expectations & learned discernment (aka you don't have a very well TRAINED eye).
Last edited by Cornucopia; 11th Jan 2015 at 03:41.
Haha.. That may very well be true in terms of not knowing what to look for vs looking good to the human eye. However, I have done encodes @ 624x352 before, and no matter how high I went on the bitrate (from a 1080p source), the sharpness and detail never matched that of a larger frame size. 960x540 is the first "smaller" resolution I've seen offer really good sharpness and detail in facial features, etc. Since 960x540 has been referred to as a so-called "HD" resolution, that's why I was curious about it.
It can be considered (unofficially, as the whole concept was never official) part of what was called EDTV: http://en.wikipedia.org/wiki/Enhanced-definition_television .
Microsoft proposed this resolution as a balance between full HD (i.e. 1920x1080) and SDTV for cases where copy protection can't be applied to the source's digital/analog output (for example, HDMI without HDCP). However, the HDMI/HDCP side rejected the concept, and the policy used now is even stricter: when protected HDMI isn't supported, not even SD quality is allowed.
"Manufacturers will have to incorporate an analog restrict function that uses a combination of software (Intel's HDCP and Microsoft’s COPP) and hardware (HDMI) technologies to impend the playback of HD content over analog connections such as component and VGA . If these technologies are not present, the player will only output a video signal with a 960x540 resolution and won’t allow 720p and 1080i images to be displayed on the screen."
So the current DRM approach is more restrictive than what was proposed back then (and Microsoft is not so evil after all).
960x540 at 50/60p is better than SD and worse than HD (and since HD overscan is usually smaller than SD overscan, 480/576 sources deliver even less usable resolution in practice).
Last edited by pandy; 11th Jan 2015 at 08:45.
I never knew 960x540 was some sort of standard. I often resize PAL DVDs to 960x540 as it's exactly 16:9, and it cuts the required bitrate down a little compared to 1024x576 (without tending to reduce the picture detail). I'd have considered anything from PAL down to be SD though (anything at 1024x576 or lower), so that'd include 960x540.
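For a 16:9 PAL DVD that's just a one-line resize in Avisynth (assuming the source is already progressive):
Spline36Resize(960,540) # 720x576 anamorphic 16:9 to square-pixel 960x540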
When it comes to re-encoding 1080p video and resizing down, I often go to 900p (1600x900) or 720p (1280x720) depending on the amount of picture detail, but it's very rare I'd go any lower (maybe for a very old B/W movie with little picture detail, that sort of thing). I'm sure 960x540 can still look pretty good, but there'd usually be some loss of detail compared to 720p.
It's not always easy to compare video unless you can do so by pausing on identical frames and switching between them. I often create several Avisynth scripts at different resolutions, open them in MPC-HC and switch between them while paused on my TV. That also takes any loss of detail due to encoding out of the equation when picking a resolution. Any difference in sharpness is usually obvious that way when it mightn't be so much while trying to compare video as it's playing.
One "problem" for me when downscaling that much would be colorimetry. I dislike downscaled encodes that haven't been colour converted properly. Sometimes you can't tell, but sometimes it's obvious. Given HD uses rec.709 and SD uses rec.601 I'd probably convert the colours when downscaling to 960x540. You could always write rec.709 to the video stream but do any players pay attention to that?
Do you notice any colour difference on playback between downscaling to 720p and 540p, or are you converting the colours? And how are you comparing the two (media player, monitor or TV etc)? I'm just curious. If the player uses SD colorimetry on playback for the 540p encode and the colours weren't converted, the most obvious difference is usually in reddish colours. Skin tones tend to look a bit "redder" or darker, that sort of thing.
I had a bit of a play with some resizing comparisons via the script below today. The source video was 1080p. I opened the script with MPC-HC and ran it fullscreen on my TV (51" Plasma). Because the source is 1080p, the player isn't doing any resizing.
ClipA = Last
Original_Width = ClipA.Width
Original_Height = ClipA.Height
ClipB = ClipA.Spline36Resize(1280,720).Spline36Resize(Original_Width,Original_Height)
ClipC = ClipA.Spline36Resize(960,540).Spline36Resize(Original_Width,Original_Height)
Clip1 = ClipA.Subtitle("1080p")
Clip2 = ClipB.Subtitle("720p")
Clip3 = ClipC.Subtitle("540p")
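ClipA picks up the 1080p source loaded earlier in the script, and to switch between the three versions while paused the script ends with something like:
Interleave(Clip1, Clip2, Clip3)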
This was one of the few scenes with enough detail or sharp edges for the effect of the resizing to become fairly obvious, but that's comparing individual frames close up. I doubt I could pick 1080p from 540p back at normal viewing distance. The screenshots are about 3MB each as they're lossless.
Edit: I just realised the original video isn't exactly 1080p, it's 1916x1076. Not that it matters. The script is resizing to 720p/540p and back to the original resolution, so the result doesn't change.
I modified the script so a softer resizer was used for upscaling. I'd be willing to bet most players use sharper upscaling, so it's probably a "worst-case scenario" for upscaling in respect to sharpness. The difference between the resolutions is a little more pronounced, but 960x540 still isn't all that bad.
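The change was just swapping the upscale step to bilinear:
ClipB = ClipA.Spline36Resize(1280,720).BilinearResize(Original_Width,Original_Height)
ClipC = ClipA.Spline36Resize(960,540).BilinearResize(Original_Width,Original_Height)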
Not that my experiment really proved much. I just thought I'd share.....
[Screenshots attached: 720p bilinear upscaling / 540p bilinear upscaling]
Last edited by hello_hello; 12th Jan 2015 at 10:05.