VideoHelp Forum
  1. Member LouieChuckyMerry's Avatar
    Join Date: Jun 2010
    Location: Singapore
    Hello, and thanks in advance for any help. My apologies if I don't explain myself clearly enough, as I'm rather confused. I have two Blu-ray rips of the same movie, with the same video bit rates, and I'm trying to understand which video is technically the higher quality and why. When I play the videos with VLC and KMPlayer they display the exact same aspect ratio, with the same size black bars top and bottom and the same visuals at the left and right edges of the screen; that is, neither video appears to be cropped, at least as I understand cropping, because neither seems to be missing any video information. However, MediaInfo reports the two files quite differently: one is 1920*800 (2.40:1) and the other 1920*1080 (16:9). VLC snapshots of each show the same corresponding differences in width and height, yet when opened with Windows Photo Viewer the 1920*800 snapshot is fullscreen while the (bigger?) 1920*1080 snapshot appears as a smaller rectangle within the frame (see attachments). To (hopefully) simplify:


    1) How can two video files with radically different MediaInfos appear the same when played back?

    2) Which of these files, if either, is technically of higher quality and why?


    Thanks again for any help, and sorry for my lack of technical knowledge in my explanation of the files.

    [Attachment: VLCSnapshot,1920x800.JPG (1920*800 VLC snapshot)]

    [Attachment: VLCSnapshot,1920x1080.JPG (1920*1080 VLC snapshot)]
    Last edited by LouieChuckyMerry; 28th Feb 2014 at 22:16. Reason: Lucidity
  2. This is a joke, right?
    Originally Posted by LouieChuckyMerry View Post
    1) How can two video files with radically different MediaInfos appear the same when played back?
    The top one was reencoded with the black bars cropped away and the player adds them during playback. The bottom one was reencoded with the black bars intact.
    2) Which of these files, if either, is technically of higher quality and why?
    If, as you say, the bitrates are identical, then the quality of the two is virtually the same. Encoding black bars takes almost no bits. But there may be some other different encoding settings and there's no real way to tell, given the limited amount of information we've been supplied.

    That bottom picture isn't the same percentage size as the top one. I think if you pressed that magnifying glass thing with the +, you could make them the same width.
  3. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    I'll try to give the answer to what I think is going on here:

    It may not be the actual original, but the 1920x1080 is likely more upstream than the 1920x800. Why?

    1. Blu-ray (and DVD) does NOT support a native DAR of 2.40:1. Neither supports anything but 16:9 (and 4:3 in the case of SD material).

    Therefore, a movie with a native DAR of 2.40 must be fit into a 16:9 box, and the standard way of doing that is via Letterboxing.

    The other clip (1920x800) was extracted from this and subsequently cropped (and then re-encoded).

    Note: both clips display the same desired payload AR (the part of the screen YOU are interested in), but the former has an actual CONTENT AR of 16:9 (which includes the hardcoded letterboxing in the storage and in the calculation of AR), while the latter has a content AR of 2.40:1 (which does NOT include the letterboxing).
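    The payload-vs-content AR distinction above can be checked with a little arithmetic. A minimal Python sketch (the function name is illustrative, and the 140-pixel bar height is derived from (1080-800)/2):

```python
from fractions import Fraction

def display_ar(width, height, par=Fraction(1, 1)):
    """Display aspect ratio = storage aspect ratio x pixel aspect ratio."""
    return Fraction(width, height) * par

# Content AR of each file (both use square pixels, PAR 1:1)
print(display_ar(1920, 1080))            # 16/9  -> the letterboxed encode
print(display_ar(1920, 800))             # 12/5  -> 2.40:1, bars cropped away

# Payload AR of the letterboxed file: drop the two 140-pixel black bars
print(display_ar(1920, 1080 - 2 * 140))  # 12/5 again -> the same picture
```

    Both files carry the same 2.40:1 payload; only the 1920x1080 one stores the bars as part of its content AR.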

    Since you didn't show the MediaInfo text, we couldn't verify this, but I'm confident it would concur with my guess.

    What a media player does is attempt to play both in full screen (assuming a 16:9 output display). The former is already formatted for 16:9, so it's either a 1:1 pixel match or a simple aspect-constrained scale up or down (depending on the rez of the display), while the latter defaults to adding its own letterboxing upon playback (in realtime) to make the ARs fit. In some software, like VLC, that can be adjusted to stretch, zoom, crop, or leave unscaled, just like the options usually available on consumer TVs. But not ALL software has that capability. Windows Photo Viewer is likely a good example: in the instance above, the windowed user display area is ~2.40:1, so the 1920x800 fills it exactly, while the 16:9 letterboxed 1920x1080 version cannot be anamorphically stretched, nor symmetrically zoomed & cropped, so it fills the biggest 16:9 space within that windowed area that it can.
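    The fit-to-window behaviour described above can be sketched in a few lines of Python (a simplified model of what a player or viewer does; the names and rounding are illustrative):

```python
def fit_into(src_w, src_h, win_w, win_h):
    """Scale the source to fit the window, preserving aspect ratio.
    Returns (display_w, display_h, side_bar, top_bar)."""
    scale = min(win_w / src_w, win_h / src_h)
    disp_w, disp_h = round(src_w * scale), round(src_h * scale)
    # Leftover space becomes pillarboxing (sides) or letterboxing (top/bottom)
    return disp_w, disp_h, (win_w - disp_w) // 2, (win_h - disp_h) // 2

# A ~2.40:1 viewing area, like the Windows Photo Viewer window above:
print(fit_into(1920, 800, 1920, 800))    # (1920, 800, 0, 0)   fills it exactly
print(fit_into(1920, 1080, 1920, 800))   # (1422, 800, 249, 0) shrinks, bars at sides
```

    The cropped file fills the ~2.40:1 window edge to edge, while the letterboxed 16:9 file has to shrink until its height fits, leaving bars at the sides, which is exactly what the two screenshots show.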

    In answer to your 2nd question, while my intuition would lead me to think that the 1920x1080 is the higher quality (due to not being re-compressed?), there's no real way for us to know since we haven't been given the generational provenance of the files.

    Scott

    <edit>Got too windy and manono beat me to the punch!</edit>
    "When will the rhetorical questions end?!" - George Carlin
  4. Member LouieChuckyMerry's Avatar
    Join Date: Jun 2010
    Location: Singapore
    manono: no joke, just ignorance. I understand your explanation (thanks), but why would someone actually take the time to crop the black bars before reencoding if the quality doesn't change and the player adds them back during playback anyway? And why doesn't the player honor the video's new (post-crop) aspect ratio? Sorry if these questions seem stupid to you, but I really am trying to learn something new.


    Edit: Thanks for your reply, Cornucopia, I really appreciate it. And not at all "windy", it's actually very educational. When you type "May not be the actual original, but the 1920x1080 is likely more upstream than the 1920x800. Why?", what do you mean by "upstream"? Also, would uploading MediaInfo for each file be of any use? If so, then which viewing information option would be best?
    Last edited by LouieChuckyMerry; 28th Feb 2014 at 23:34. Reason: Cornucopia's Fast Reply
  5. Originally Posted by LouieChuckyMerry View Post
    ... but why would someone actually take the time to crop the black bars before reencoding if the quality doesn't change...
    Of course the quality changes. They've both been reencoded from some retail Blu-Ray. Those might be 25-30GB. Are these that size? I crop black bars too. Maybe some people don't. This is downloaded stuff, right? When you encode your own stuff you can make them any way you like. If you want to turn around and make another Blu-Ray you need the black bars, and maybe that's the reason one encoder left them on - to make it easier to make a Blu-Ray, so you don't have to reencode yet again.
    And why doesn't the player honor the video's new (due to cropping of the black bars) aspect ratio?
    What new aspect ratio? The active video for both is the same. The aspect ratio is the same (square pixel, 1:1). One is 1.778:1 with black bars, the other 2.40:1 with the letterboxing cropped away. Why shouldn't they play exactly the same way?
  6. It's been covered in detail.

    1) How can two video files with radically different MediaInfos appear the same when played back?
    There are technical differences between the two pictures the OP posted, even though they're very hard to notice with the naked eye. It's very rare that two different rips are technically exactly the same, even when the differences are hard to spot.

    Differences hard to notice:


    The man standing in the dark right-hand corner is hard to notice among the few birds or bats flying until highlighted.
    And this is the reason why I always insist on preserving as much detail as possible.
    And wouldn't you agree with me that some xyz encoder chops lots of detail, not only in dark areas?
    Just because we don't see it, maybe the xyz encoder doesn't see it either.
    I perform way better when I am drunk, I guess.
    Last edited by enim; 1st Mar 2014 at 08:35.
  7. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Generational media compression provenance (or Generational provenance or media provenance) is a term I created and use here to explain what happens to media that affects how it holds up throughout the cycle of production + distribution + consumption.

    Using a Hollywood movie as an example:
    1st Generation - Original "Master" Edited File of a title (say, "Inception"). It is a 2K or 4K DI file with little to NO compression (so ~500-1500Mbps data rate)
    2nd Gen - 2K JPEG2000-compressed MXF copy @ ~200Mbps, used for DCI authoring & production (digital projection in theatres)
    3rd Gen - 1080p AVC-compressed copy @ ~40Mbps, used for Blu-ray disc authoring & production (could be 2nd gen when taken from original DI/Edit Master, but in practice is OFTEN taken from DCI master instead)
    4th Gen - 1920x1080 MP4 recompressed copy from ripped BD @ ~12Mbps?
    5th Gen - 1920x800 cropped MKV recompressed copy from #3 or #4 @ ~6Mbps?
    (BTW, as most consumers only can get a BD file as their best version, this is often assumed to be the start of the generations and people count up from it, even though there have been more things going on prior to its existence, and if you wanted to get real technical you could add back the generations that led from camera/CGI up to the master)

    See how the earlier generations are "upstream"? The data from them will ultimately flow down into the subsequent formats with modification along the way.
    The upstream files are less compressed (with fewer artifacts), less modified, and less deteriorated.

    The provenance lists what those versions all were (and if it went into more detail, it would explain what modifications were made - resize, color adjust, etc - and maybe even by which app and using what settings/methods). This will GREATLY explain WHY a given clip is or is not as optimal a quality as it ought to be.

    Assuming the 1920x800 file is a descendant of the 1920x1080 file (a safe assumption), why someone would do the cropping is a mystery to me. It certainly doesn't improve playback. I honestly believe it happens from misinformed & misguided attempts to avoid letterboxing, and to "maximize compression" (even though those bars don't even significantly affect the compressibility).

    Both players are honoring both files' DAR, but players aren't smart enough to know that the encoded "black" isn't part of the active picture. They have to work with the whole contained resolution: one is DAR 2.40:1 and one is DAR 16:9 (both with PAR of 1:1), regardless of what the humanly-desired portion's ratio is. When a player DOESN'T honor a file's aspect ratio, that's when anamorphic stretching occurs (or when anamorphic stretching which is supposed to occur DOESN'T occur).

    MediaInfo is an invaluable tool for examining media (particularly on this site). The detailed text info (which can be exported and then cut+pasted to here) is the preferred layout, IMO.

    I don't know if that simplified things, or confused them further.

    Scott
    Last edited by Cornucopia; 1st Mar 2014 at 01:24.
    "When will the rhetorical questions end?!" - George Carlin
  8. I appreciate that Cornucopia & manono have covered lots of details the average person normally doesn't know, and provided as much information as they can. Somewhere down the road I said that this forum rocks, and with reason, while elsewhere posters get frustrated waiting for responses.

    Newbies always blunder when they assume the people here are stupid.
  9. Originally Posted by LouieChuckyMerry View Post
    manono: no joke, just ignorance. I understand your explanation (thanks), but why would someone actually take the time to crop the black bars before reencoding if the quality doesn't change and the player adds them during playback anyway? And why doesn't the player honor the video's new (due to cropping of the black bars) aspect ratio? Sorry if these questions seem stupid to you but I really am trying to learn something new .
    The black bars are included in the Bluray aspect ratio as though they were part of the picture, or as if they actually contained picture. In your example though, the picture itself is 2.40:1, but the black bars combined with the picture gives you 16:9. Once the black bars are removed, what's left is 2.40:1.
    If you don't crop the black bars when encoding, you're wasting bits encoding a picture area which is nothing but black, and the presence of black bars can have a small effect on how the picture itself is encoded. If they're removed, they can't.
    The player does add the black bars back on playback, but it'll only add them back in the same way if the screen being used is 16:9. If the screen has a different aspect ratio, then the player will display the video until the width (or height) fills the screen. It'll then add back whatever black bars are required to make up the difference between the picture aspect ratio and the screen aspect ratio (when the video is running in fullscreen mode).

    In the case of your screenshots that's effectively what's happening. Your monitor may be 16:9, but the player itself is occupying part of that with the menu and navigation bars so what's left for displaying the video isn't 16:9. When the black bars are included in the video (total 16:9 aspect ratio) the player resizes the video so the entire 16:9 picture (black bars included) fill what's left of the screen. The end result is it runs out of screen height before the width is wide enough to fill the display. Normally the player would then add black bars down each side too, but in your second screenshot they're white as Windows Photo Viewer wouldn't do that.
    Without the black bars, the picture (in this case) can be resized until the width fills the screen. The player would then add smaller black bars top and bottom to make up the difference, but once again in the screenshots using Windows Photo Viewer they're white.

    The upshot of it all is that if you're viewing the video on a 16:9 screen, and the video is running fullscreen (the player window isn't using part of the screen area), the version containing black bars and the version without them should display in exactly the same way. With the black bars included, the picture plus black bars = 16:9 and fills the screen. Without them, the picture is 2.40:1 and the player adds the black bars to once again make up a total aspect ratio of 16:9 and fill the screen.

    Assuming both those videos were encoded from the same Bluray source (they're both only one generation away from the "original"), and assuming the video bitrate was the same each time (and the same encoder settings were used etc), then the quality of your two videos would be pretty much the same. The black bars do use a little bitrate if they're encoded, but it's pretty minimal, and they can affect the way the picture itself is encoded, but the effect is very, very small, so all else being equal the quality would be the same.
    Last edited by hello_hello; 1st Mar 2014 at 10:59.
  10. Originally Posted by Cornucopia View Post
    Assuming the 1920x800 file is a descendant of the 1920x1080 file (a safe assumption), why someone would do the cropping is a mystery to me. It certainly doesn't improve playback. I honestly believe it happens from misinformed & misguided attempts to avoid letterboxing, and to "maximize compression" (even though those bars don't even significantly affect the compressibility).
    I wouldn't necessarily assume your assumption is safe, especially given the OP doesn't seem like someone who's encoded video much before, if at all. How would they have obtained the second encode taken from the source they already had? It's probably just as safe to assume the two encodes came from the same source, even if they were encoded by different people.

    Originally Posted by Cornucopia View Post
    Both players are honoring both files' DAR, but players aren't smart enough to know that the encoded "black" isn't part of the active picture. They have to work with the whole contained resolution: one is DAR 2.40:1 and one is DAR 16:9 (both with PAR of 1:1)
    There's one reason for cropping.
    Last edited by hello_hello; 1st Mar 2014 at 02:22.
  11. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Fair enough, but I disagree. The assumption that the less modified of the two is a direct rip, while the cropped version would have to have been taken from a rip, is the most likely scenario. We wouldn't know for sure, though, unless we had more info - that's why MI is so helpful and why I mentioned all that provenance stuff.

    Cropping to simplify AR recognition would have been a partial reason if all things were equal, but they aren't, because one HAS to re-compress to get the 1920x800 version, and generational losses are always a bad thing. I would go so far as to say they are WORSE than having to deal with black bar padding.
    Add to that the fact that every player/display combination I know allows one to modify the default to apply stretch/zoom/crop when necessary. Lessens the impact of non-standard ARs.

    I'm curious: do you (normally) play your videos in full screen? If so, the players would achieve the same ends regardless of the source's overall picture AR, thus making moot the need for cropping.

    I think, by and large, (this kind of) cropping is done by people who don't fully understand how digital video works. Education is the better answer.

    Scott
    "When will the rhetorical questions end?!" - George Carlin
  12. Originally Posted by Cornucopia View Post
    Fair enough, but I disagree. The assumption that the less modified of the two is a direct rip, while the cropped version would have to have been taken from a rip, is the most likely scenario.
    And I'll disagree with you. We know they both have the same video bitrate. I'd say there's next to no chance of the one with black bars being a direct unreencoded port from the Blu-Ray. They were both shrunk considerably to make them easier to upload/download. I'd bet dollars to donuts that hello_hello is correct in speculating they're both from the same retail Blu-Ray source, reencoded and made considerably smaller than the source by two different people before being uploaded for distribution.
    Originally Posted by Cornucopia View Post
    I'm curious: do you (normally) play your videos in full screen?
    I'm not hello_hello, but I mostly watch my downloaded movies on the television. So, yes, they're played fullscreen.
    I think, by and large, (this kind of) cropping is done by people who don't fully understand how digital video works.
    Be that as it may, I have yet to even see one with the black bars remaining. They're always removed.
  13. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    You (both) make a good argument. What makes your case stronger is the bit about the bitrates being equal.

    The MI text readouts would iron this all out, though. I'll hold off conceding until then.


    Scott
    "When will the rhetorical questions end?!" - George Carlin
  14. Cornucopia,
    If in a few years time 16:9 becomes old hat and everyone owns TVs with a wider aspect ratio, aren't you going to be a tad miffed while watching your non-cropped 16:9 encodes? You'll have double the joy.... encoded black bars top and bottom with the player adding black bars to the sides.
    Or maybe you could hang some nice curtains down each side of the screen.... sorry, I'm being silly..... although if that were to happen I could imagine a lot of people re-encoding just to remove the black bars.

    I'll confess, assuming you're going to re-encode the video anyway to reduce the file size or apply filtering etc, I can't see any advantage to keeping the black bars. I can see some, if only minor, advantage to removing them. If for no other reason than it allows a player to resize the video with more flexibility, assuming the available screen real estate isn't 16:9.
    Plus I'm a little OCD when it comes to the picture having nice clean edges. That's not always the case, so cropping lets you trim until the edges are nice and sharp. I've read a few posts where people complained that even with sharp black borders, the very edges of the picture can still look a little blurry after encoding (maybe it happens if the picture/borders aren't mod16).

    http://forum.videohelp.com/threads/353220-Why-do-most-people-crop-out-the-black-bars-w...=1#post2221277
    For my test encodes I took a PAL 16:9 DVD, cropped and resized it to 1024x432, then encoded it. The second time I did the same thing, but also added the borders back for 1024x576. Same CRF value each time. And for fun I encoded just the black borders, to see how much bitrate they'd required.
    810.6 MB without black borders.
    800.2 MB with black borders.
    2.1 MB for just the black borders.
    So it's probably fair to conclude the presence of the black borders caused the encoder to spend 12.5MB less worth of bitrate for encoding the actual picture.
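    A quick sanity check of that conclusion, using the figures quoted above:

```python
# Figures from the test encodes above (MB), same CRF each time
without_bars = 810.6   # 1024x432 encode: every byte goes to the picture
with_bars = 800.2      # 1024x576 encode: picture plus black borders
borders_only = 2.1     # encoding nothing but the black borders

# Bits spent on the actual picture inside the bordered encode
picture_share = with_bars - borders_only   # ~798.1 MB
savings = without_bars - picture_share
print(round(savings, 1))   # 12.5 -> MB less spent on the picture itself
```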

    Originally Posted by Cornucopia View Post
    Cropping to simplify AR recognition would have been a partial reason if all things were equal, but they aren't, because one HAS to re-compress to get the 1920x800 version, and generational losses are always a bad thing. I would go so far as to say they are WORSE than having to deal with black bar padding.
    Agreed. I wouldn't re-encode simply to remove the black bars, but when I'm re-encoding anyway.....

    Originally Posted by Cornucopia View Post
    I'm curious: do you (normally) play your videos in full screen? If so, the players would achieve the same ends regardless of the source's overall picture AR, thus making moot the need for cropping.
    Yes.
    Watching wide-screen video encoded onto a 4:3 frame is fun. Unless the black bars are cropped.

    Originally Posted by Cornucopia View Post
    I think, by and large, (this kind of) cropping is done by people who don't fully understand how digital video works. Education is the better answer.
    I understand enough about digital video to know if I encoded a 2.40:1 picture using the full 1920x1080 resolution it's possible to retain more detail than when it's 1920x800 with black bars. All it requires is a player clever enough to understand more than two aspect ratios. To be honest I don't really understand why we're still stuck with 16:9 and 4:3 aspect ratios as being the "standard", often requiring the use of black bars to display the picture correctly.
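    That detail claim is easy to quantify (simple arithmetic; it assumes a player could squeeze an anamorphically stored 1920x1080 frame back to 2.40:1 on playback):

```python
anamorphic_lines = 1080   # 2.40:1 picture stored anamorphically in a full frame
cropped_lines = 800       # 2.40:1 picture stored with square pixels (1920x800)

gain = anamorphic_lines / cropped_lines
print(f"{(gain - 1) * 100:.0f}% more vertical samples")   # 35% more
```

    In other words, the anamorphic approach keeps 1080 lines of the picture instead of 800, at the cost of needing a player that understands the non-square pixels.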

    As a side note..... because of the way Windows, or the Windows renderers (or wherever the decision is made), decide the colorimetry to use based on resolution, some "720p" video will display with the wrong colorimetry if the black bars are cropped. For instance, a 1280x720 video with black bars will display using BT.709, but encode the same video while removing the black bars for a 1280x544 resolution (as an example) and it'll display using BT.601. So there's one argument for not cropping.
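    The renderer behaviour described above can be modelled as a simple threshold rule. The exact thresholds vary by renderer and are an assumption here, not a documented spec; SD limits of 1024x576 are a common choice:

```python
def guessed_matrix(width, height):
    """One plausible renderer heuristic: use the HD matrix (BT.709) only when
    the frame exceeds SD dimensions, otherwise fall back to BT.601."""
    if width > 1024 and height > 576:
        return "BT.709"
    return "BT.601"

print(guessed_matrix(1280, 720))   # BT.709 -> the full "720p" frame
print(guessed_matrix(1280, 544))   # BT.601 -> same video with the bars cropped
```

    Under a rule like this, cropping the bars off a 1280x720 encode drops the height below the SD threshold and flips the assumed matrix, which matches the wrong-colorimetry behaviour described above.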

    Speaking of "standards".... anyone know why a lot of iTunes PAL video seems to be encoded using a 960x720 resolution, but with a (correct) 16:9 aspect ratio?
    Last edited by hello_hello; 1st Mar 2014 at 11:27.
  15. Member
    Join Date: Jun 2012
    Location: USA
    Originally Posted by hello_hello View Post
    To be honest I don't really understand why we're still stuck with 16:9 and 4:3 aspect ratios as being the "standard", often requiring the use of black bars to display the picture correctly.

    Speaking of "standards".... anyone know why a lot of iTunes PAL video seems to be encoded using a 960x720 resolution, but with a (correct) 16:9 aspect ratio?
    It's a little bit of history and a little bit of engineering efficiency, but as you've noticed, everyone ignores the standards when they want to do it differently.
  16. Member
    Join Date: Oct 2004
    Location: Freedonia
    The weird iTunes resolution may also be a legacy thing, because older versions of Apple TV only supported 720p output and Apple probably still encodes to support the lowest common denominator of 720p for iTunes HD video. iTunes videos are DRMed, so they aren't really intended for viewing on anything but an Apple TV or a PC/Mac anyway. I can't speak to exactly why THAT particular resolution was chosen, other than that my previous reasons should at least explain the 720 part of it.
  17. I'll confess I've only seen it in non-DRM video, and only for frame rates higher than 24fps, but I've assumed that's sometimes the way it's done for iTunes too.
    This post seems to indicate you're probably correct and 1280x720 was only supported up to 24fps.
    http://forums.macrumors.com/showthread.php?t=882223

    I didn't even know there was an AppleTV. You can learn something new every day.... http://en.wikipedia.org/wiki/Apple_TV


