VideoHelp Forum
  1. Member
    Join Date
    Dec 2014
    Location
    San Jose, CA
    I'd heard before that 960x540 was considered an "HD" format. Also, in Final Cut Pro, this dimension is listed as HD as well. I always thought that HD was defined by frame size (1920x1080 or 1280x720).

    So out of curiosity, I decided to do some Blu-ray encodes of Breaking Bad episodes. I did a 2-pass encode and went lower on the average bitrate because of the reduced frame size. All I can say is 1500 kbps H.264 @ 960x540 looked... WOW. When I compared the same episode at 720p, the 960x540 version seemed to maintain all the sharpness/detail and looked basically identical (I scored 20/20 on a recent eye exam). The added bonuses were the reduction in file size (a 47-minute episode at 570 MB is DAMN GOOD), and the encode was also 45-50 minutes faster because of the smaller frame size.
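
    Just as a sanity check on those numbers: bitrate times duration gives roughly the expected file size. A quick sketch (the 128 kbps audio bitrate is only an assumed figure, not something from the encode above):

    # Rough size estimate for a 47-minute episode at 1500 kbps video.
    # The 128 kbps audio bitrate is an assumption for illustration only.
    duration_s = 47 * 60
    video_kbps = 1500
    audio_kbps = 128
    total_bits = (video_kbps + audio_kbps) * 1000 * duration_s
    print(f"~{total_bits / 8 / 1e6:.0f} MB")  # ~574 MB, close to the reported 570 MB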
    Last edited by rdeffley; 11th Jan 2015 at 03:03.
  2. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Apple tried to promote this format for a short time under the "iFrame" logo standard. It is a bastard, orphan format, and about the only good things about it are that it interpolates easily to 1920x1080 and that it uses only intra-frame compression (for editing). It also maxes out at 25p or 30p (no 50p/60p of any kind), which is limiting. A BAD IDEA.

    By the definition of most media engineers, ~576 is the upper vertical-dimension cutoff for Standard Definition (e.g. 1024x576, 768x576, 720x576), so the idea that this ought to be called HD is silly, even if it is meant to be a "proxy" for HD.

    re: your test encode...
    Of course, given a certain bitrate setting (and codec), a smaller frame size will look cleaner, with fewer artifacts, every time. Try the same bitrate with 640x480 or even with 320x240 - each smaller one needs that much less compression (compared to an uncompressed version OF THE SAME SMALL SIZE). And of course, the smaller you go, the less computing power it takes and the quicker it is to encode/decode. No surprise there. But is it sharper? - NO. You could take this to the extreme and say you could encode 160x120 at 1500kbps and it would be a perfect encode compared to a 160x120 master. Why? Because a fully uncompressed RGB frame at that size is only about 450 kilobits, so 1500kbps barely has to compress it at all. But it's not sharp, it's fuzzy as hell.
    And conversely, if you take your same master and try a 1920x1080 or, God forbid, a 4K encode - AT THE SAME BITRATE - it is going to look worse and worse (more & more compression relative to their uncompressed masters, and so more & more artifacting). Guess what, you just learned a fundamental rule of encoding: the concepts of BITRATE, QUALITY, RESOLUTION, and DETAIL are interrelated but distinctly different things.
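
    To put rough numbers on that, here is the bits-per-pixel you get from a fixed 1500 kbps at different frame sizes (the 23.976 fps film rate is just an assumption for the sake of the arithmetic):

    # Bits per pixel at a fixed 1500 kbps for different frame sizes.
    # Lower bpp means the encoder has to compress harder for the same bitrate.
    bitrate_bps = 1500 * 1000
    fps = 24000 / 1001  # assumed film rate
    for w, h in [(320, 240), (640, 480), (960, 540), (1280, 720), (1920, 1080)]:
        print(f"{w}x{h}: {bitrate_bps / (w * h * fps):.3f} bits/pixel")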
    And I'm sorry, but regardless of your eye exam, there IS a difference in detail between the 720p and the 540p. Your ability to notice those things may be based less on visual acuity and more on expectations & learned discernment (aka you don't have a very well TRAINED eye).

    Scott
    Last edited by Cornucopia; 11th Jan 2015 at 03:41.
  3. Member
    Join Date
    Dec 2014
    Location
    San Jose, CA
    Haha.. That may very well be true in terms of not knowing what to look for vs. what looks good to the human eye. However, I have done encodes at 624x352 before (from a 1080p source), and no matter how high I went on the bitrate, the sharpness and detail did not match that of a larger frame size. 960x540 is the first "smaller" size I've seen that offers really good sharpness and detail in facial features, etc. Since 960x540 has been referred to as a so-called "HD" resolution, that's why I was curious about it.
  4. It can be considered (unofficially, since the whole concept was never official) part of what was called EDTV: http://en.wikipedia.org/wiki/Enhanced-definition_television
    Microsoft proposed this resolution as a balance between full HD (i.e. 1920x1080) and SDTV for cases where encryption/copy protection can't be applied to the digital/analog output from the source (for example HDMI without HDCP). However, the HDMI/HDCP side rejected that concept, and the current policy is stricter still: not even SD quality is allowed when HDCP is not supported.

    "Manufacturers will have to incorporate an analog restrict function that uses a combination of software (Intel's HDCP and Microsoft’s COPP) and hardware (HDMI) technologies to impend the playback of HD content over analog connections such as component and VGA . If these technologies are not present, the player will only output a video signal with a 960x540 resolution and won’t allow 720p and 1080i images to be displayed on the screen."
    http://files-recovery.blogspot.nl/2010/08/high-definition-dvd-as-next-generation.html

    So the current DRM approach is more restrictive than what was proposed in the past (and Microsoft is not so evil after all).

    960x540 at 50/60p is better than SD and worse than HD (and HD is usually displayed with less overscan than SD, so 480/576 effectively provides even lower quality).
    Last edited by pandy; 11th Jan 2015 at 08:45.
  5. I never knew 960x540 was some sort of standard. I often resize PAL DVDs to 960x540 as it's exactly 16:9 and it cuts the required bitrate down a little compared to 1024x576 (and doesn't tend to reduce the picture detail compared to 1024x576). I'd have considered anything from PAL down to be SD though (anything with a 1024x576 or lower resolution) so that'd include 960x540.
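
    For what it's worth, both sizes work out to exactly 16:9, and 960x540 has roughly 12% fewer pixels than 1024x576, which is where the small bitrate saving comes from. A quick check:

    from fractions import Fraction

    # Both frame sizes are exactly 16:9; 960x540 simply has ~12% fewer pixels.
    for w, h in [(1024, 576), (960, 540)]:
        print(f"{w}x{h}: aspect {Fraction(w, h)}, {w * h} pixels")
    print(f"pixel reduction: {1 - (960 * 540) / (1024 * 576):.1%}")  # ~12.1%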

    When it comes to re-encoding 1080p video and resizing down, I often resize to 900p (1600x900) or 720p (1280x720) depending on the amount of picture detail, but it's very rare I'd go any lower (maybe only for a very old B/W movie with little picture detail... that sort of thing). I'm sure 960x540 can still look pretty good, but there'd usually be some loss of detail compared to 720p.

    It's not always easy to compare video unless you can pause on identical frames and switch between them. I often create several AviSynth scripts at different resolutions, open them in MPC-HC and switch between them while paused on my TV. That also takes any loss of detail due to encoding out of the equation when picking a resolution. Any difference in sharpness is usually obvious that way, when it mightn't be while trying to compare the video as it's playing.

    One "problem" for me when downscaling that much would be colorimetry. I dislike downscaled encodes that haven't been colour converted properly. Sometimes you can't tell, but sometimes it's obvious. Given HD uses rec.709 and SD uses rec.601 I'd probably convert the colours when downscaling to 960x540. You could always write rec.709 to the video stream but do any players pay attention to that?

    rdeffley,
    Do you notice any colour difference on playback between downscaling to 720p and 540p, or are you converting the colours? And how are you comparing the two (media player, monitor or TV etc)? I'm just curious. If the player is using SD colorimetry on playback for the 540p encode and the colours weren't converted, the most obvious difference is usually reddish colours. Skin tones tend to look a bit "redder" or darker, that sort of thing.
  6. I had a bit of a play with some resizing comparisons via the script below today. The source video was 1080p. I opened the script with MPC-HC and ran it fullscreen on my TV (51" Plasma). Because the source is 1080p, the player isn't doing any resizing.

    ClipA = Last

    Original_Width = ClipA.Width
    Original_Height = ClipA.Height

    ClipB = ClipA.Spline36Resize(1280,720).Spline36Resize(Original_Width,Original_Height)
    ClipC = ClipA.Spline36Resize(960,540).Spline36Resize(Original_Width,Original_Height)

    Clip1 = ClipA.Subtitle("1080p")
    Clip2 = ClipB.Subtitle("720p")
    Clip3 = ClipC.Subtitle("540p")

    Interleave(Clip1,Clip2,Clip3)
    For those not overly familiar with AviSynth, the script displays a frame of the original 1080p video, followed by the same frame resized down to 720p and back to 1080p again, followed by the same frame resized to 960x540 and back. It still surprised me how often I couldn't see any loss of detail, even when the video was downscaled to 540p. Mostly, I'd need to look for frames with objects with nicely defined edges, as sometimes the edges would look a little blurred after downscaling, but there still wasn't always a massive difference. I'm not saying this is typical (I've definitely seen 1080p video with more fine picture detail), but maybe it's also an indication of how good Spline resizing is.....

    This was one of the few scenes with enough detail or sharp edges for the effect of the resizing to become fairly obvious, but that's comparing individual frames close up. I doubt I could pick 1080p from 540p back at normal viewing distance. The screenshots are about 3MB each as they're lossless.

    1080p
    [Attachment: 1080p.png]

    720p upscaled
    [Attachment: 720p.png]

    540p upscaled
    [Attachment: 540p.png]

    Edit: I just realised the original video isn't exactly 1080p, it's 1916x1076. Not that it matters. The script is resizing to 720p/540p and back to the original resolution, so the result doesn't change.

    I modified the script so a softer resizer (BilinearResize) is used for the upscaling step. I'd be willing to bet most players use sharper upscaling, so it's probably "worst-case scenario" upscaling in respect to sharpness. The difference between the resolutions is a little more pronounced, but 960x540 still isn't all that bad.

    Not that my experiment really proved much. I just thought I'd share.....

    720p bilinear upscaling
    [Attachment: 720p Bilinear.png]

    540p bilinear upscaling
    [Attachment: 540p Bilinear.png]
    Last edited by hello_hello; 12th Jan 2015 at 10:05.
  7. Hello, I just got here from a Google search. I've been downscaling FHD anime for a while now and have tried just about every SD frame size I could come across.
    I once read about the downscaling algorithm FFmpeg uses, and most of the replies I found recommended something close to an integer scaling factor (it was too complicated for me to remember all the details).
    So I decided to divide 1920x1080 by 2, which gave me 960x540, and I like it; before that I was using 1024x576 and didn't like it much.
    Anyway, I also noticed that the file-size saving isn't that big compared to 720p, but the 720p images are sharper, even if I lower the quality setting in FFmpeg (e.g. -crf 26). In that case the file-size difference is negligible, since I've always had to keep the CRF at 26 with 960x540 or the image quality decreases dramatically.
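
    For reference, a minimal sketch of that kind of FFmpeg downscale, driven from Python (the file names are placeholders, and passing the audio through untouched is just an assumption):

    import subprocess

    # Downscale to 960x540 and encode with x264 at CRF 26; audio is copied as-is.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-vf", "scale=960:540",
        "-c:v", "libx264", "-crf", "26",
        "-c:a", "copy",
        "output.mkv",
    ], check=True)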
  8. Originally Posted by www-data View Post
    Hello, I just got here from a Google search. I've been downscaling FHD anime for a while now and have tried just about every SD frame size I could come across.
    I once read about the downscaling algorithm FFmpeg uses, and most of the replies I found recommended something close to an integer scaling factor (it was too complicated for me to remember all the details).
    So I decided to divide 1920x1080 by 2, which gave me 960x540, and I like it; before that I was using 1024x576 and didn't like it much.
    Anyway, I also noticed that the file-size saving isn't that big compared to 720p, but the 720p images are sharper, even if I lower the quality setting in FFmpeg (e.g. -crf 26). In that case the file-size difference is negligible, since I've always had to keep the CRF at 26 with 960x540 or the image quality decreases dramatically.
    Most displays nowadays are 1920x1080 or UHD, so it is quite obvious that 960x540 fits nicely into the rigid pixel structure of those displays.
  9. Banned
    Join Date
    Nov 2022
    Originally Posted by rdeffley View Post
    I'd heard before that 960x540 was considered an "HD" format. Also, in Final Cut Pro, this dimension is listed as HD as well.
    Old Apple machines could not handle H.264 HD in realtime, so FCP would convert your AVCHD into ProRes for editing. For this reason Apple created iFrame, which could be edited directly. Fun fact: at around the same time, Vegas 8 (or was it Vegas 9?) added support for H.264/AVCHD, handled it natively without re-encoding, and straight cuts were pretty much realtime. This is when I became a Vegas fanboy.

    Originally Posted by Cornucopia View Post
    By the definition of most media engineers, ~576 is the upper vertical-dimension cutoff for Standard Definition (e.g. 1024x576, 768x576, 720x576), so the idea that this ought to be called HD is silly, even if it is meant to be a "proxy" for HD.
    576p50 was a broadcast HD format in Australia.

    Originally Posted by www-data View Post
    So I decided to divide 1920x1080 by 2, which gave me 960x540
    Strange math you've got here. You need to divide 1920x1080 by 4 to get 960x540.
  10. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    I agree. It seems that from around 2001, Apple stopped being a performance leader in video compression (ProRes being the exception). They never did Xvid/DivX properly (their MPEG-4 Part 2 Simple Profile vs. DivX/Xvid's Advanced Simple Profile), their H.264 and H.265 encoders were and still are soft compared to others', no AV1, etc.

    576p50 probably was considered HD (or at least ED). 576i or 576p30 was probably NOT considered HD.

    Yeah, that 1920x1080 needs dividing by 2 in each dimension, so by 4 overall in pixel count.


    Scott
  11. Member DB83's Avatar
    Join Date
    Jul 2007
    Location
    United Kingdom
    ^^ Well, for the sake of bloody-mindedness, I would disagree with the maths.

    960 * 540 is half-size of 1920*1080 or 1/2 or two times. True that the number of actual pixels is 1/4 so then one gets the 4 times. But then WGAF (except for the usual suspect and I do not include Scott in that)
  12. Banned
    Join Date
    Nov 2022
    Originally Posted by DB83 View Post
    Well, for the sake of bloody-mindedness, I would disagree with the maths.
    It seems that quite a few of your recent posts have been made for the sake of bloody-mindedness. Plumbing at your home is not made of lead pipes, is it?

    Originally Posted by DB83 View Post
    960 * 540 is half-size of 1920*1080
    960 × 540 = 518400
    1920 × 1080 = 2073600
    2073600 ÷ 2 = 1036800 ≠ 518400

    Originally Posted by DB83 View Post
    or 1/2 or two times.
    Is it one half or two times? Or is it all the same to you? I bet you would not be so dismissive if it was half a pint versus two pints.

    Originally Posted by DB83 View Post
    True that the number of actual pixels is 1/4 so then one gets the 4 times.
    Really? How did you get 4 times? Oh, I see: you divided two by one half. Right on.
  13. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Now you're just goading, Bwaak.
  14. Capturing Memories dellsam34's Avatar
    Join Date
    Jan 2016
    Location
    Member Since 2005, Re-joined in 2016
    Originally Posted by pandy View Post
    Most displays nowadays are 1920x1080 or UHD, so it is quite obvious that 960x540 fits nicely into the rigid pixel structure of those displays.
    I would have preferred this resolution over the stupid 1280x720. At least we wouldn't have problems upscaling it now. Gosh, I hated the over-processed and badly upscaled 720p footage from the 2000s era; even broadcast facilities are guilty of it.