  1. That's interesting. So far I've only been working with DVD sources (mainly just one) and nothing spectacular in terms of quality. I might try repeating your tests later. Actually, if anything it'll be tomorrow, but it'll be worth a look. What sort of CRF values were you using?
  2. I usually work at CRF=18. But there was mention of CRF=25 earlier in the thread so I tried both.
  3. I'd have to re-install it to check exactly, but I think CRF25 was mentioned because when using MediaCoder's quality setting, unlike most other GUIs it makes you pick a quality percentage rather than a specific CRF value, and the choices of CRF value change in quite large steps as a result. I know I couldn't work out how to pick a CRF value in between 15 and 20..... it was either 15 or 20..... and I think the next step from there was 25.

    Probably why Slipster was so keen to compare his 1333Kbps preset to using CRF20 rather than compare encodes using the same average bitrate, because it's the only CRF value MediaCoder can use which, if the moon happens to be in the right phase and it isn't raining too heavily, might just produce an average bitrate somewhere close to 1333Kbps. It's probably also why he's seeing CRF vs 2 pass differences the rest of us don't.

    One other thought.... the very last video I ran through MediaCoder using Slipster's preset was PAL, 100% interlaced, mpeg2. By that stage I was afraid to alter anything for fear the gods of 2 pass encoding would take vengeance on me, so I left the de-interlacing setting on auto as per the preset. MediaCoder encoded it while completely failing to de-interlace it at all (well the section of it I was encoding anyway), so I'm not even sure I'd be willing to automatically blame any differences on the encoding method itself. I didn't save the pics and the links have been deleted as part of his dummy-spit, but it did make me wonder if the minor differences in the screenshots Slipster posted could be caused by something other than CRF vs 2 pass. When de-interlacing is set to manual you can choose the de-interlacing method, change parameters according to the chosen method or disable it completely etc, but when in auto mode I don't know if there's any way to tell whether MediaCoder is making the same decisions every time. Which is why I was setting some of those options manually, until discovering doing so wasn't permitted.
    I'm not even sure he understands the concept of anamorphic encoding correctly, given he gave me a hard time about cropping and encoding that way while referring to the resizing I wasn't actually doing. I think his preset encodes DVDs the same way, just without any cropping, but in the end I gave up asking him which method Mediacoder uses to resize DVDs and just worked it out for myself. I'm not sure if he ignored the question because he didn't understand what I was asking, and maybe he thought it implied I must be resizing while encoding.

    And I'd still like to know why MediaCoder converts the color space to I420 by default, and whether Slipster's preset does the same thing because he thought it was a terrific idea or because he thought MediaCoder must know best etc. Color space is a topic on which I know nothing but the very basics.... so I've no idea whether converting to I420 is spectacularly clever or effectively doesn't matter at all.
    Last edited by hello_hello; 10th Sep 2012 at 21:23.
  4. Post deleted on the basis that every single word in this one was complete and utter bollocks too apparently.
    Last edited by Slipster; 12th Sep 2012 at 20:59.
  5. No..... there's no proof I wasn't actually reading anything because you removed the evidence of what you'd posted during your dummy spit.

    Yes, your logic behind using CRF20 as a reference was probably explained numerous times, and if I questioned it more than once it'd be because of how ridiculous it is to use it as a reference if the bitrates aren't the same. Why you can't seem to get your head around that fairly simple concept while you fixate over one specific 2 pass bitrate is a mystery to me. The discussion began due to your questioning 2 pass being identical to CRF at the same bitrate. It ain't rocket science.

    Cropping is not resizing in relation to encoding in any language and that won't change because it's something else on which you have your own special perspective. Cropping removes part of an image. What remains has not been resized.
    And whether you're now claiming you had no idea if I was invoking a rescale (as you put it) because continuing to lie is easier, or because it was you who failed to read my posts, what I wrote is still there in black and white, unaffected by dummy spitting. Post #47:
    "The video looks correct in the preview window and I've selected "keep pixel aspect ratio" so I'm hoping I'm encoding this "as-is" without any resizing, but MediaCoder's summary tab still insists the output will be 720x576 even though Resize is unchecked.... I don't know what to believe but if I think the output is anything other than 716x424 anamorphic I'm just going to uninstall this thing and give up."
    Then #49:
    "For the record, the 2 pass MediaCoder encode gave me exactly the same resolution and aspect ratio as MeGUI, which is what I was after."
    Should I have drawn diagrams for you? Maybe a flow chart or two?

    Yes, I encoded a section of interlaced video with MediaCoder's de-interlacing set to auto and it failed to de-interlace it. Your claim MediaCoder has a 100% success rate doesn't mean much though, as I'd no longer take it as read you'd have a 100% success rate in determining what the resulting output might be.

    In post #47 I explained in the first paragraph what I'd changed and exactly why I changed it to keep the playing field level, and given you subsequently argued against my making the changes I specified simply because it might upset your finely tuned, encode anything perfectly, one bitrate fits all, MediaCoder preset, don't try to claim I wasn't clear now because it makes you seem fairly dim.

    Wow. Anecdotal evidence suggests that some decoders produce incorrect results on playback if the colourspace is YV12 instead of I420?? Yeah, probably in the same way anecdotal evidence suggests 2 pass encoding and CRF at the same bitrate offer different quality. Probably just as much a load of bollocks too.

    "Regarding the usage of "--fullrange", it's only a flag in the header, so it changes nothing in terms of the encoding itself..... yada, yada, nonsense etc."
    Full range is not just a "flag in the header" in this case.
    The Full range setting expands the TV levels to PC levels while encoding. End of story. (Edit: For anyone reading this thread in the future, it appears I was completely wrong about that. See post #75)

    You've only got to think about it for ten seconds. You take a video with TV levels and encode it while setting the "full range" flag without changing the levels, then expect it to be played back using the correct levels even though it was encoded using TV levels.... if the player respects the flag, which I can pretty much guarantee it won't. The levels flag doesn't tell the player which levels you'd prefer to use, it's supposed to tell the player which levels are actually being used. Anything else would be stupid. TVs don't change their input levels because one minute you're playing a video using TV levels, then the next an encoded version with the TV levels expanded to PC levels. One of those videos is going to display incorrectly.
    You're basically saying I can encode a video which uses high definition colorimetry, convert it to standard definition without changing the colorimetry, set the "colorimetry flag" as being standard definition, and it'll somehow play back using the correct colorimetry even though the colorimetry was still high definition when encoding. It's quite ridiculous.

    Anyway, I'm not going to keep arguing about it. You want to mess with the levels every time you encode.... be my guest. But if you're able to play a DVD using a DVD/Bluray player, then play a "full range encode" of the same DVD using the same player while it displays in the same way..... well it ain't going to happen. When you realize what's really going on though, you won't be a happy camper.

    I don't know why you keep going on about my questioning you multiple times, apparently from some viewpoint you're an encoding guru who's being questioned because I didn't pay attention to the answers, when in fact it was probably because the answers given were as dumb as your explanation of how the full range setting in MediaCoder works.

    I've got a very good understanding of human nature, which is why I don't see your removal of posts from other threads as well as this one as being anything other than a dummy spit..... the type which usually follows trolling accusations when some man-child thinks the moral high ground from which he's posting isn't having the desired effect.

    LOL! I think the squinting smiley summed up your last comment perfectly. And it's probably the same expression I'd have with my head up my arse too. I think you've at least made a good final choice.... leaving the rest of the discussion to the grownups.
    Last edited by hello_hello; 12th Sep 2012 at 21:46.
  6. Originally Posted by hello_hello View Post

    Wow. Anecdotal evidence suggests that some decoders produce incorrect results on playback if the colourspace is YV12 instead of I420?? Yeah, probably in the same way anecdotal evidence suggests 2 pass encoding and CRF at the same bitrate offer different quality. Probably just as much a load of bollocks too.
    They are treated as the same in virtually all programs.

    If for whatever reason some obscure decoder or program doesn't, it will be immediately obvious (the colors will be shifted)



    "Regarding the usage of "--fullrange", it's only a flag in the header, so it changes nothing in terms of the encoding itself..... yada, yada, nonsense etc."

    1. Full range is not just a "flag in the header" in this case.
    2. The Full range setting expands the TV levels to PC levels while encoding. End of story.
    I don't know about MediaCoder, but the behavior in x264 changed a while back.

    It used to be true that --fullrange was just a VUI parameter (no change in the actual video data). The syntax for the VUI parameter is just --range now

    However, if --range and --input-range are different, then there will be a conversion in the actual data
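
    If it helps to picture what that conversion does to the data, it's roughly the same as doing the expansion yourself in Avisynth. A rough sketch only (assumes a normal limited-range source and that ffms2 is installed; the filename is made up):

    # roughly what newer x264 builds do to the pixel data when the declared
    # ranges differ, e.g. --input-range tv together with --range pc
    FFVideoSource("source.mkv")   # made-up filename, any normal limited-range source
    ColorYUV(levels="TV->PC")     # expands TV range (Y 16-235) out to full range 0-255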





    To keep it simple, during comparisons I would avoid all GUIs unless you know the exact preprocessing steps used, command line used, x264 version, and decoder used. It would be much easier to use the same x264 binary.

    You want to eliminate all the possible confounding variables so you actually test what is meant to be tested.
  7. Anyone know for certain if the finalratefactor value being specified in MeGUI's log file following a 2 pass encode is the CRF value which was used (or not)? I've been told on numerous occasions there's no way to determine the CRF value used when running a 2 pass encode, but given the finalratefactor value does seem to pretty much match......

    I dug out the MeGUI log file for the earlier 2 pass encode I ran, and while I did mess up a little and managed to specify a marginally different bitrate to the CRF18 encode, the final ratefactor value in MeGUI's log file after 2 passes was 17.68.

    I just finished my new CRF20 v 2 pass encode (I'm going to check them visually a little later on) and this time the bitrates matched exactly according to MediaInfo, while the final ratefactor reported for the 2 pass encode was 19.79.

    Surely there's some relationship between final ratefactor and CRF?? Anyone know?

    And speaking of MediaInfo. It's the reason I didn't match the bitrates exactly the first time. I just checked my two new encodes and the same thing's happened again. For the CRF20 encode MediaInfo reports an "overall bitrate" of 1015Kbps, but under the video section it reports a bitrate of 995Kbps. After running a 2 pass encode though, the "overall bitrate" and the "bitrate" being reported are the same. In this case 1015Kbps each time. Anyone know why "overall bitrate" and "bitrate" don't match after a CRF encode?
  8. MediaInfo isn't too accurate. Use Bitrate Viewer instead.
  9. Originally Posted by poisondeathray View Post
    I don't know about MediaCoder, but the behavior in x264 changed a while back.

    It used to be true that --fullrange was just a VUI parameter (no change in the actual video data). The syntax for the VUI parameter is just --range now

    However, if --range and --input-range are different, then there will be a conversion in the actual data.
    I can't believe I installed Mediacoder once again to check. I'm never going to be rid of it.....

    I'm sure MediaCoder's "Auto Level" setting, when changed to Full Range, actually expands the levels. Nothing in my playback chain touches the levels as far as I'm aware, aside from the video card, so if video displays with different levels on my PC I'm pretty sure it's because the levels are different. And I really hope that's the case or the levels might be wrong and I'd not notice. Do you know of any software or hardware players which honor the --fullrange VUI parameter? Is the level info saved to the video stream in any other way etc?

    I just ran another quick encode to check and neither --range, --input-range, --fullrange, nor anything resembling a parameter including the word range is being added to the command line. At least not the one MediaCoder displays in its GUI. Nothing is mentioned in respect to levels/range when looking at the resulting encode using MediaInfo.

    To give me a little more confidence I'm correct, the Auto Level setting is found under the MediaCoder "Effects" tab along with things like Denoiser, Rotation, DeBlocking etc, completely separate from the x264 specific stuff under the "x264" tab. It seems very likely the Auto Level setting gets MediaCoder to change the levels via other means.

    I thought I might find some more info in the MediaCoder log file, and while there's an option in the top menu to clear the log file, I'm obviously too silly to work out how to open that log file or find its location.

    Obviously though, if the --fullrange VUI parameter was just that (a VUI parameter), and if it was applied while the video used TV levels, and if the player respected the --fullrange parameter as Slipster imagines it would, then it's very likely to display the video incorrectly, isn't it? Let's assume a player expands video with TV levels to PC levels to display correctly on a monitor expecting PC levels. All is good. Then according to Slipster along comes the encode using TV levels with a --fullrange flag the player respects..... which to my way of thinking stops it from expanding the levels as it should..... and the video displays incorrectly. I don't see how it could work any other way.

    Anyone know of an easy way to open a video and quickly compare the level range? That way I could open the MediaCoder encode I just ran along with the source and prove whether the actual levels are different.... or not.
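
    One thing I might try is Avisynth's Histogram in levels mode, which shades anything outside 16-235. Just a sketch (made-up filename, assumes ffms2 or another source filter):

    FFVideoSource("encode.mp4")   # made-up name; load the source the same way for comparison
    ConvertToYV12()               # the levels mode needs planar YUV
    Histogram(mode="levels")      # values outside 16-235 land in the shaded ends of the Y graph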
    Last edited by hello_hello; 11th Sep 2012 at 13:17.
  10. Originally Posted by hello_hello View Post

    I'm sure MediaCoder's "Auto Level" setting, when changed to Full Range, actually expands the levels. Nothing in my playback chain touches the levels as far as I'm aware, aside from the video card, so if video displays with different levels using my PC I'm pretty sure it's because the levels are different. And I really hope that's the case or the levels might be wrong and I'd not notice.
    How do you know something in your software or hardware isn't obeying the flag?

    If you don't know, then you shouldn't make comparisons using whatever methods you are currently using. It's imperative you know exactly how it's being rendered, from decoder, renderer, splitter, everything.

    I'll give you one example - If you use ffvideosource with avspmod in default configuration, the RGB conversion will be done ignoring flags, and with rec601. The renderer will be vfw dib. So this will be consistent.
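
    For instance, something along these lines keeps every clip you compare on exactly the same decode and conversion path (only a sketch; the plugin path and filename are placeholders):

    LoadPlugin("C:\path\to\ffms2.dll")   # placeholder path, skip if the plugin is auto-loaded
    FFVideoSource("encode.mp4")          # placeholder filename
    ConvertToRGB32(matrix="Rec601")      # pin the matrix/levels yourself so every clip gets the same treatment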

    Do you know of any software or hardware players which honor the --fullrange VUI parameter?
    For example, Adobe Flash obeys the full range flag.

    Is the level info saved to the video stream in any other way etc?
    What do you mean by this?
    The stream contents can conform to a level as well

    --level in x264 currently doesn't do what you think it does. It's just a label. It doesn't limit anything. You can have a stream that conforms to high@L3.1 but it might have a L5.0 "label"





    I just ran another quick encode to check and neither --range, --input-range, --fullrange, nor anything resembling a parameter including the word range is being added to the command line. At least not the one MediaCoder displays in its GUI. Nothing is mentioned in respect to levels/range when looking at the resulting encode using MediaInfo.
    It might be using an old x264 binary. Did you check versions?

    Do you have a log file? How do you know the command line? Often some GUIs hide parameters. Unless you know EXACTLY what command was passed you can't make any valid conclusions.



    Obviously though, if the --fullrange VUI parameter was just that, (a VUI parameter) and obviously if it was applied while the video used TV levels, and if the player respected the --fullrange parameter as Slipster imagines it would, then it's very likely to display the video incorrectly isn't it? Lets assume a player expands video with TV levels to PC levels to display correctly on a monitor expecting PC levels. All is good. Then according to Slipster along comes the encode using TV levels with a --fullrange flag added, stopping the player from expanding the levels as it should..... video displays incorrectly. I don't see how it could work any other way.
    What you're saying here makes sense, but I don't know what context this was in; the previous posts seem to have vanished.
  11. I guess I don't know 100% for sure it's not something obeying the flag, but a couple of questions......

    Assuming the video does still use TV levels as Slipster says, but the --fullrange flag is set and being respected, would the fullrange flag cause the video to display with more contrast or less? Because the effect of using the Full Range setting is to increase the contrast. As though the levels are actually being expanded. As per the pics in post #52. I'm trying to get my head around this.... but if the fullrange flag was being respected while the levels were actually TV levels...... wouldn't/shouldn't the effect be reversed?

    What if I encoded the same video using MediaCoder but in a different format and the results were the same as when encoding using x264 (i.e. huffyuv inside an AVI)? If they both display with the same level differences compared to the source, are the levels being converted while encoding? Because I just tried it using huffyuv and that's what happened. Dark stuff got darker, brighter stuff got brighter, video looks like crap. Any fullrange type flags to mess with the way the video is displayed there?

    I opened the original AVI and the huffyuv version using VirtualDub. They display differently in the preview window. Using clrtools (which I've barely used) the levels certainly appear to have changed to me.

    I was asking earlier if the VUI method was the only way of storing the --range parameter or if that info can be stored elsewhere in the video stream. I've used the word "level" referring to "range". Sorry if that confused the issue.

    I do realize --fullrange was just a label, which is pretty much why I was originally convinced the video levels were being converted, contrary to Slipster's claim the --fullrange flag was being added and obeyed by my player, as that seemed way less likely to me anyway. I'm not sure if I've ever seen a player pay attention to that sort of thing.

    Anyway...... with MediaCoder's Auto Level set to "full range" and encoding using huffyuv I still see exactly the same level differences after encoding as when encoding using x264, and it appears I can see the difference using clrtools in VirtualDub, so unless there's some other likely explanation for why the level change is consistent regardless of encoder, would an assumption the levels are being changed during conversion be a fairly logical one?

    If so, I wonder how long Slipster's been encoding using that Full Range setting?

    PS I can't for the life of me find where, or if, this MediaCoder thing which I've now uninstalled for about the 30th time and hopefully the last time..... saves a log file. And I've looked fairly hard.
    Last edited by hello_hello; 11th Sep 2012 at 14:41.
  12. Originally Posted by hello_hello View Post
    Assuming the video does still use TV levels as Slipster says, but the --fullrange flag is set and being respected, would the fullrange flag cause the video to display with more contrast or less? Because the effect of using the Full Range setting is to increase the contrast. As per the pics in post #52. I'm trying to get my head around this.... but if the fullrange flag was being respected while the levels were actually TV levels...... wouldn't/shouldn't the effect be reversed?
    Almost all video uses "TV levels". Certainly all retail DVDs and Blu-rays.

    If a player respected the fullrange flag and the video data was "limited range" or "legal" Y 16-235, the conversion matrix would be the PC matrix in Avisynth terms - so you can simulate this in Avisynth. It looks more washed out, less contrast.
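
    In Avisynth terms the simulation is just the choice of matrix. A rough sketch (made-up filename, assumes ffms2):

    src     = FFVideoSource("limited_range_encode.mp4")   # made-up filename
    normal  = src.ConvertToRGB32(matrix="Rec601")          # standard conversion, 16-235 expanded to 0-255
    flagged = src.ConvertToRGB32(matrix="PC.601")          # what honouring a full range flag would do: no expansion
    StackHorizontal(normal, flagged)                       # the right hand side looks washed out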






    What if I encoded the same video using MediaCoder but a different format and the results were the same as when encoding using x264 (ie huffyuv inside an AVI). If they both display with the same level differences compared to the source, are the levels being converted while encoding? Cause I just tried it using huffyuv and that's what happened. Dark stuff got darker, brighter stuff got brighter, video looks like crap. Any fullrange type flags to mess with the way the video is displayed there?
    I don't know what MediaCoder does, but beware some programs don't treat huffyuv as "lossless" YUV. Some convert it to RGB, and some use full range (e.g. Vegas), some limited range (e.g. older versions of Premiere Pro).

    I opened the original AVI and the huffyuv version using VirtualDub. They display differently in the preview window. Using clrtools (which I've barely used) the levels certainly appear to have changed to me.
    I missed it somewhere - What was the "original AVI" and huffyuv again?


    I was asking earlier if the VUI method was the only way of storing the --range parameter or if that info can be stored elsewhere in the video stream. I've used the word "level' referring to "range". Sorry, if that confused the issue.
    As mentioned earlier, in newer x264 versions, --range can actually change the YUV data when --input-range differs. So be careful, it's not just VUI information anymore.

    I do realize --fullrange was just a label, which is pretty much why I was originally convinced the video levels were being converted, contrary to Slipster's claim the --fullrange flag was being added and obeyed by my player, as that seemed way less likely to me anyway. I'm not sure if I've ever seen a player pay attention to that sort of thing.
    It might be the MediaCoder version using an older vs. newer x264. If it was using the old syntax with an older x264 build, --fullrange would do nothing to the YUV data. In newer builds --range pc could convert the actual YUV data, unless --input-range was entered and also set to pc.


    Anyway...... with MediaCoder's Auto Level set to "full range" and encoding using huffyuv I still see exactly the same level differences after encoding as when encoding using x264, and it appears I can see the difference using clrtools in VirtualDub, so unless there's some other likely explanation for why the level change is consistent regardless of encoder, would an assumption the levels are being changed during conversion be a fairly logical one?
    That's reasonable, but not proof. Some programs' output range differs depending on the format as well. For example, Vegas will output a different range when exporting wmv/asf vs. h.264/mp4 under certain settings.


    If so, I wonder how long Slipster's been encoding using that Full Range setting?
    To me, it doesn't make sense to use a full range flag or full range anything, unless you have full range data.




    There are so many little quirks and idiosyncrasies between programs that I would avoid doing any of this. All these GUIs do other stuff and processing behind the scenes. If your original intention was to see 2 pass vs. CRF - this is the KISS rule - just use x264.exe. You want to eliminate all variables except the one being tested.
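
    For example, one shared source script fed to the same x264.exe keeps everything else constant. A sketch only (filenames and the bitrate are placeholders):

    # test.avs - the single source script both encodes use
    FFVideoSource("source.mkv")   # placeholder; a DVD could use MPEG2Source("source.d2v") with DGDecode instead
    ConvertToYV12()
    # then, roughly:
    #   x264.exe --crf 20 --output crf.mkv test.avs
    #   x264.exe --pass 1 --bitrate 1000 --output NUL test.avs
    #   x264.exe --pass 2 --bitrate 1000 --output 2pass.mkv test.avs
    # with --bitrate set to whatever average the CRF encode produced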
  13. Originally Posted by poisondeathray View Post
    If a player respected the fullrange flag and the video data was "limited range" or "legal" Y16-235, the conversion matrix would be PC matrix in avisynth terms - so you can simulate this in avisynth. It looks more washed out, less contrast
    Thank you. That's what I wanted confirmed. So the fact I'm seeing more contrast means the video can't have been encoded using the same TV levels while the player respects the incorrectly set --fullrange flag, because if that were the case the contrast would be decreasing, not increasing as it is.

    I feel fairly safe using MeGUI/AVIsynth as what's going on is fairly transparent. It's MediaCoder, which I never use myself (until now), which makes it hard to work out what's happening behind the scenes. And in front of the scenes, and most other places....

    I agree. Setting a full range flag if the video uses TV levels is silly, expanding it to PC levels when encoding is silly. Either could cause problems. I don't see how adding the --fullrange flag to a limited range video could do anything other than cause it to be displayed incorrectly, assuming it has any effect at all. Slipster's theory seems to be something along the lines of it allowing his encodes to automatically have their levels expanded by the player to display correctly on a monitor/TV expecting PC levels..... or something to that effect. Along with his belief most TVs expect PC levels at their inputs by default.

    The "original AVI" I was referring to was just one I plucked off my hard drive at random to re-encode and test the effect of Mediacoder's full range option. Just a 5 minute Xvid encoded music video.

    Anyway..... whatever the reason, I've encoded a DVD using MediaCoder without the fullrange option set and it's displayed using exactly the same levels as the source on my PC. As have the encodes of the same DVD using MeGUI. So far I've re-encoded an AVI and a DVD using MediaCoder's full range option and that's when things change. When encoding using both x264 and huffyuv. One way or another it's messing with the way the video is displayed and not in a nice way. My bet is it's expanding the levels while encoding. I'll be absolutely astounded if MediaCoder happens to be setting a --fullrange flag when encoding using x264 without changing the levels, expanding the levels when encoding with huffyuv instead, and somehow either way the output displays as though the levels have been expanded on my PC. "Astounded" doesn't really cut it. I'll probably need a defibrillator if it turns out that's what's happening.

    Anyway.... time for me to go bye bye. To be continued...... I guess........
    Last edited by hello_hello; 11th Sep 2012 at 16:08.
  14. jagabo,
    I tried a CRF20 vs 2 pass encode using a similar method to yours. Similar in that I didn't have the motivation to rip a Bluray disc from scratch to do it, but I started with a 720p CRF18 encode and resized it to a lossless 704x396 AVI to use as the starting point. Then I re-encoded that AVI "as-is".

    The source was pretty clean (I ran a noise filter on it when encoding the original disc) but the main difference I could see as a general rule was in any background blocking when displaying the encodes on my TV. But that's in relation to the way the pattern of blocks was arranged rather than one being more blocky than the other, and it's frame by frame comparing.... nothing you'd be able to pick while the video was playing.

    Anyway, I went through quite a few frames of different types (static, motion, low detail, high detail) and nothing's changed for me. Definite minor compression differences to be seen frame by frame. Quality differences zero.
    In fact, compared with the last comparisons using an average quality DVD as the source, once you're using a better quality source there seem to be even smaller differences in the way individual frames have been compressed when comparing CRF to 2 pass at the same bitrate.

    So nobody knows what this "final ratefactor" in MeGUI's log file after running a 2 pass encode is exactly? The one which has suspiciously CRF type values.
  15. Just to complete the earlier MediaCoder "Auto Level" discussion (hopefully):

    I've been reading up on color spaces to improve my knowledge, so I re-installed and looked at MediaCoder's conversion options just to see what they are, and while I was at it I opened MediaCoder's settings and found the one for Auto Level. According to the tooltip for Auto Level, Slipster is completely wrong regarding what Auto Level does, and I turned out to be 99% as wrong as him. It's nothing to do with setting --range type flags, and it's nothing to do with converting TV levels to PC levels. According to the tooltip:

    Auto Level: Automatic brightness/contrast correction.

    I very much suspect it works in much the same way as ffdshow's luminance level fix under Picture Properties, which also has a "full range" option. From what I've observed when using ffdshow's "luminance level fix" it works very much like volume normalizing, which slowly turns the volume up then drops it again when the peaks reach a certain level, except instead of it being applied to audio, it's being applied to luminance.

    I've watched it in action using ffdshow, and when there's a static scene the luminance levels change gradually, then appear to "reset" when the brightness increases. That sort of thing. And it explains something I noticed yesterday while looking at the MediaCoder encode and comparing it to the source. Maybe the penny should have dropped then, but there were a few times I compared identical frames and thought they looked basically the same, then I moved on a bit and they didn't..... and even though according to clrtools the levels were different to the source they did still appear to be clipped at 16-235, which I thought was a bit odd, but I just put it down to not knowing how to use clrtools properly. I guess I was just looking at the Auto Level function in action, gradually adjusting brightness and contrast.

    Whether that's exactly how it works or not (I can't see myself running an encode just to check), as Slipster deleted his last post like a baby again it appears he's at least still reading the thread, so I thought I'd post back with that info, as it's something no sane person would want to apply while encoding. I've no idea how consistently Auto Level adjusts brightness and contrast over repeated encodes of the same movie, or whether its adjustments are frame rate dependent (meaning it could change luminance differently according to the playback (encoding) speed... which ffdshow's auto deblocking definitely does), but it's another variable which needs to be removed before comparing CRF and 2 pass encoding.

    Mind you I don't know how anyone could encode with Auto Level enabled and not notice fairly quickly, especially with it cranked up to full range, given I noticed it almost immediately the first time I used it. Especially when that same person seems confident they can pick CRF vs 2 pass encoding differences at 50 paces with one eye closed, and lectured me on using my eyes. Oh well...... if Slipster's still reading the thread I may have done him a favor when it comes to future encoding, and given he's probably applied Auto Level adjusting to dozens, if not hundreds of encodes.... well knowing that can be my reward for being the adult in the discussion.

    PS. Under Mediacoder's x264 options there's one called "Full range samples setting" which can be enabled or disabled (disabled after loading Slipster's preset). And while the tooltip just describes it as "full range samples setting" it appears to be the one which would add "--fullrange" to the command line (although it doesn't appear in the command line MediaCoder displays), so I guess MediaCoder is using an old version of the x264 encoder if "--fullrange" is now redundant.
    Last edited by hello_hello; 12th Sep 2012 at 21:40.
  16. Yes, with professionally prepared sources you usually don't need automatic gain or any other type of auto brightness, contrast, or color "enhancement".
    Last edited by jagabo; 12th Sep 2012 at 21:33.
  17. Originally Posted by hello_hello View Post
    Auto Level: Automatic brightness/contrast correction... it's something no sane person would want to apply while encoding.
    I'll give you that one in terms of what it's doing as you are right and I am undeniably wrong. Thanks for investigating and pointing it out.

    Just to recap for jagabo, I had made it clear that the odd 'chunky' macroblock quantiser behaviour at CRF20 was annoyingly visible here at 1 metre away from an HDMI connected 21.5" screen when using MeGUI on default settings. I posted a link to a problematic CRF25 test encoding in which h_h_ saw nothing to complain about, but everyone's monitor setup and tolerance to artifacts is different.

    I think we agreed that it was most likely down to a CPU family related bug (or something similar) as at least one other person has spotted broadly similar behaviour as reported by Makavelli84(?) on Doom9, although I see that he's basically being told that he's imagining it too.

    As jagabo says, it shouldn't be necessary to adjust anything, but I'm noticing a large difference in black level here between DVDs when applying no adjustments at all (down to poorly prepared professional sources I guess), so I'd rather it be slightly wrong occasionally than clearly wrong throughout some entire movies. I wanted PC levels out so that it plays back on PC media centres without the need to touch any media player settings. That's exactly the behaviour I'm getting here across all sources, or at least it looks that way here in both MPC-HC and VLC on 3 different PCs.

    I think it's an "each to their own" setting rather than something "no sane person would want to apply" if the sources are clearly 'broken', unless each source is to be fixed on a one-by-one basis. I'll not make the preset publicly available again though without making this clear and giving instructions on what it does and how to disable it.

    Under Mediacoder's x264 options there's one called "Full range samples setting" which can be enabled or disabled (disabled after loading Slipster's preset). And while the tooltip just describes it as "full range samples setting" it appears to be the one which would add "--fullrange" to the command line (although it doesn't appear in the command line MediaCoder displays), so I guess MediaCoder is using an old version of the x264 encoder if "--fullrange" is now redundant.
    It does in MediaCoder x64 2011 build 5226. I've just checked, although I've no interest in playing with it right now and probably won't.

    PS This post stays if we can mutually agree to stop winding each other up.
    Last edited by Slipster; 12th Sep 2012 at 23:35.
  18. The main reason I think the Auto Level function is a really bad idea is because if it works the way I think it does, it'll change the contrast/brightness gradually as the video progresses. Much of the time scenes will change as the video progresses and you won't notice it, but if it's anything like the way the luminance level fix function in ffdshow works, once you do.....
    I've literally watched a section of video, which over the course of a couple of minutes went from being a little dark to very washed out, and if MediaCoder's Auto Level setting is doing something similar I wouldn't want it doing it to my encodes. If I wanted to apply something like that I'd be doing it on playback.

    I've adjusted brightness and contrast when encoding a few times in the past because some video simply needed it, but only very old low quality stuff, and I keep it consistent throughout. I've even, would you believe, expanded the levels using AVIsynth's TV->PC function, but only in cases where it was obvious the video was actually transferred to disc or broadcast incorrectly. Most of the extras on the Anthrax DVD and an episode of The Universe (documentary) come to mind, and I'd have to check which, but there's an entire DVD of episodes from season 3 (I think) of Battlestar Galactica which were transferred to (PAL) DVD incorrectly. In the latter case you can tell because the color of "space" in the opening credits is dark grey. Converting the levels from TV to PC fixed it and the rest of the video no longer looks washed out. Or is it the other way around..... I'm confused now.
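
    For anyone curious, the Avisynth fix for those genuinely broken discs is only a couple of lines. A sketch (the d2v name is made up and needs DGIndex/DGDecode):

    MPEG2Source("broken_episode.d2v")   # made-up project file created with DGIndex
    ColorYUV(levels="TV->PC")           # expands 16-235 out to 0-255 - only for discs mastered with the wrong levels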

    To be honest I think there's something possibly wrong with the levels between your playback devices as aside from VGA, I don't think I've ever met an input on a TV which uses PC levels by default. I'm sure even the dedicated PC HDMI input on my TV was set to "low" by default, which I'm sure = TV levels. Changing it to normal gave me PC levels. I remember thinking it was pretty counter-intuitive at the time, but I will check that later in case I'm just confused.... got to head out for a bit.
  19. Originally Posted by Slipster View Post
    I wanted PC levels out so that it plays back on PC media centres without the need to touch any media player settings.
    Every player I'm aware of converts YUV 16-235 to RGB 0-255 for display. Actually, it's normally the job of the graphics driver's proc amp since the player sends YUV to the graphics card and the graphics card does the conversion to RGB. If you're not getting that behavior your graphics card is set up wrong. And you are screwing up all your encodings by setting the levels wrong to compensate for a maladjusted video card. YUV should always be Y=16-235, U and V=16-240.

    Get a levels calibration video and adjust your graphics card's video proc amp.
    https://forum.videohelp.com/threads/326496-file-in-Virtualdub-has-strange-colors-when-o...=1#post2022085
    https://forum.videohelp.com/threads/309485-Bad-quality-downscaling?p=1907812&viewfull=1#post1907812
    Last edited by jagabo; 13th Sep 2012 at 08:47.
  20. I had a quick play just to make sure I wasn't going silly and that's how my TV works. By default the inputs (except VGA) seem to expect TV levels. I checked a few different ways to make sure I'd got it correct, but basically with my PC connected via VGA and the video card set to expand TV levels to PC levels (which I'm sure is correct), after I reset the HDMI input, I had to stop expanding the levels to get video to display the same way as it was via VGA. So the default HDMI input setting of "low" = TV levels, while "normal" = PC levels. VGA won't let me change the input level.

    When connected via HDMI my video card lets me change the output from RGB to YCbCr444, which I think is the same format the Bluray player outputs (even if I knew how to do it I can't check it at the moment anyway because it's been so long since I've used it I can't find the remote). With the video card set to output YCbCr444 the TV no longer gives me an option to change input level. Only when sending it TV levels does the video display correctly.

    So for me the default HDMI input for video is TV levels when the input is RGB, and TV levels only when the input is YCbCr444.

    Given I still had MediaCoder installed I ran another couple of quick re-encodes of my 5 minute music video, as it begins with some static scenes. The bad news is the Full Range Auto Level setting does adjust contrast/brightness by varying amounts. As the first image was displayed for a couple of seconds I opened the original AVI twice and looked at the first and last frame (before the scene changed) and the image hadn't changed at all. After re-encoding it I did the same and while the first frame was different to the original, the last frame was even more different. Not by a massive amount, but it was different. It seemed to keep adjusting as the video progressed but obviously fairly slowly as I couldn't find a point where it seemed to suddenly change, but depending on the scene the difference between the encode and the original was sometimes small, often huge, in quite a few places enough for the encode to look horrible by comparison.
    Unfortunately though, no matter how much or little it changed the contrast/brightness as the video progressed, the effect was never the same as actually expanding the levels of the original video. Most of the time, actually expanding the levels looked better.

    The good news though is at least the Auto Level function is consistent. I tried CRF15, CRF25 and 2 pass, and while they obviously weren't compressed equally, I couldn't see any obvious differences in the way Auto Level was "adjusting" the video as it progressed.
    Last edited by hello_hello; 13th Sep 2012 at 08:53.
  21. Originally Posted by jagabo View Post
    Every player I'm aware of converts YUV 16-235 to RGB 0-255 for display. Actually, it's normally the job of the graphics driver's proc amp since the player sends YUV to the graphics card and the graphics card does the conversion to RGB.
    By default? My experience has been the opposite. They convert to RGB without changing the levels. In my case the default Nvidia setting in the Nvidia control panel for video is "use the player settings", and as MPC-HC is using the VMR9 renderer on my PC, it has no way to expand the levels (unless I deliberately do it using a pixel shader). The option to change output levels is greyed out. I have to change the Nvidia setting to "with Nvidia settings" then the "Range" option to 0-255. If I leave it on 16-235 the video displays the same way as when using the "use the player settings" option.
    Same with the actual PC monitor. I have to specifically set the levels as 0-255 in the video card's options for video to display correctly.

    And it's one area which is an annoyance with my setup: the Nvidia "full range" setting doesn't generally survive a reboot for the secondary display, so I need to remember to reset it each time.

    When using the EVR renderer though, MPC-HC will change levels via the right click Renderer Settings menu and I'm sure it defaults to 0-255, so maybe the default level expansion thing is renderer specific? Recently I set up my other half's laptop and it uses ATI video, yet I had to set MPC-HC to output 0-255 so video would display correctly using its inbuilt display, as the video card offered no way to expand the levels and obviously wasn't doing it by default.
    And just to make that setup more fun, her old(er) TV only seemed to want TV levels via HDMI, although fortunately her ATI card would adjust the levels for an external display, so I'm pretty sure in the end I had MPC-HC expanding the levels for the laptop display, then the video card reducing them back to TV levels for the TV, so she could still switch between displays without the levels being wrong. I'll have to check that next time I see her but that's how I remember it anyway.....
  22. Any idea how this works? I just opened a video using MPC-HC (VMR9) to use as a reference, then I opened another instance of MPC-HC, switched to the EVR renderer, closed it so the change would take effect, then opened the same video. The output range is definitely 0-255 by default, but even set that way the video still displayed the same way as when using the VMR9 renderer. Switching to 16-235 made no difference. So it seems with MPC-HC outputting 16-235 the video card will expand the levels as per its full range setting, but with MPC-HC set to output 0-255 it leaves the levels alone, not expanding them a second time. I'm just curious how the video card knows when the output is already 0-255 and it should leave the levels untouched. Obviously that's a good thing, I just don't know how it works.
  23. Originally Posted by hello_hello View Post
    Originally Posted by jagabo View Post
    Every player I'm aware of converts YUV 16-235 to RGB 0-255 for display. Actually, it's normally the job of the graphics driver's proc amp since the player sends YUV to the graphics card and the graphics card does the conversion to RGB.
    By default?
    It's supposed to be the default. There have been many bugs regarding video levels with all the graphics device manufacturers.

    It's very simple. You have two systems: YUV where black is Y=16 and white is Y=235, and RGB where black is R=G=B=0 and white is R=G=B=255. Whenever you convert from YUV to RGB you should expand the contrast. When converting from RGB to YUV you contract the contrast. What you see on the screen of any device is always RGB. At that point black=0, white=255.

    Studio RGB (where black is R=G=B=16, white is R=G=B=235) is a specialty RGB system used by some editors (most notably Sony Vegas) when working in RGB to allow recovery of blacker-than-black, and whiter-than-white from YUV sources.

    I only know of one codec where YUV routinely ranges from 0 to 255: Motion JPEG. That's because JPEG internally uses Y values from 0 to 255 (it's a photo codec, not a video codec). Those sources have to have their contrast reduced when converted to any of the standard video formats.
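
    Ignoring the chroma/matrix side of it, the levels part of those conversions works out to roughly this for a grey pixel (U = V = 128):

    R = G = B = (Y - 16) * 255 / 219   (YUV to RGB: expand, so Y=16 becomes 0 and Y=235 becomes 255)
    Y = R * 219 / 255 + 16             (RGB to YUV: contract, so 0 becomes 16 and 255 becomes 235)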
  24. Why is it always me who ends up arguing......

    I tried your test videos, only I'm not sure I agree with everything you wrote in your posts. 16-235 is the level range used by most video, however it's not compulsory to clip the levels at 16 and 235.... at least that's my understanding. Isn't that how blacker than black and whiter than white come about?
    Likewise I don't think TVs are necessarily limited to 16-235, if they're capable of displaying blacker than black and whiter than white?

    Anyway..... here be the thing.....
    I have the settings for the VGA and HDMI inputs set exactly the same (the TV lets me adjust brightness/contrast etc per input). There's some contrast/brightness differences between them, but nothing to an extent I've ever tried to match them better. If the two inputs aren't displaying video using the same levels though, I'll ban myself from having an opinion an anything, ever again.
    Using your test videos, I can't see anything blacker than 16 or whiter than 235 when viewing your videos via VGA. Using HDMI I can. There's not a huge difference, it's very subtle, but I can see very slight gradients below 16 and above 235. Well, at least one level below 16.

    Maybe my TV calibration isn't all that flash.... I used a couple of test videos I downloaded to calibrate it by eye at the time, nothing professional, although I was surprised how close the most "boring" factory preset came to being right, and then from there made some minor adjustments according to what I preferred to look at.

    Edit: Actually on second look I might take a break because I must be imagining things.....
    TV levels out and in, I can see a fair distinction between 239 and 251, and a very, very tiny one between 251 and 255.
    PC levels out and in, no distinction above 239.

    I'm sure I've got the input levels matched each time, because if they're not I either lose all the stuff above and below 235 and 20 completely, or I can see a huge difference at each end of the scale.

    Enough for me.... I think I'll actually watch a video for a change.....
    Last edited by hello_hello; 13th Sep 2012 at 10:23.
  25. Originally Posted by hello_hello View Post
    Why is it always me who ends up arguing......

    I tried your test videos, only I'm not sure I agree with everything you wrote in your posts. 16-235 is the level range used by most video, however it's not compulsory to clip the levels at 16 and 235.... at least that's my understanding. Isn't that how blacker than black and whiter than white come about?
    Of course the video file can contain Y<16 or Y>235. They are there for overshoot. But Y<16 should render the same shade of black as Y=16 on a properly set up display. Y>235 should render the same as Y=235. That's the defined standard for digital video; significant detail outside that range points to an improperly encoded video or a maladjusted source device.

    Originally Posted by hello_hello View Post
    Likewise I don't think TVs are necessarily limited to 16-235, if they're capable of displaying blacker than black and whiter than white?
    You can use the brightness/contrast controls to bring out details in the illegal regions (the proc amp controls work in YUV, before conversion to RGB to be put on the screen). But there isn't supposed to be any significant picture information there, and those details normally shouldn't be visible. You should only use the adjustments when you have an improperly encoded video.

    Originally Posted by hello_hello View Post
    Using your test videos, I can't see anything blacker than 16 or whiter than 235 when viewing your videos via VGA. Using HDMI I can.
    Then your TV and/or computer isn't adjusted properly. When you have your computer's HDMI port set up to output RGB it should use the same 0-255 range as VGA*. YUV sources should be expanded by the player or graphics card. If you have it set up to output YUV it should use the standard 16-235 range, then your TV should be performing the contrast expansion. The TV should be displaying the RGB and YUV inputs exactly the same.

    * Keep in mind that VGA is analog and prone to all the problems of analog video -- including inconsistent levels from maladjusted or malfunctioning electronics.
  26. Originally Posted by jagabo View Post
    Then your TV and/or computer isn't adjusted properly. When you have your computer's HDMI port set up to output RGB it should use the same 0-255 range as VGA*. YUV sources should be expanded by the player or graphics card. If you have it set up to output YUV it should use the standard 16-235 range, then your TV should be performing the contrast expansion. The TV should be displaying the RGB and YUV inputs exactly the same.
    Maybe a miscommunication......
    I didn't actually compare RGB and YUV using your test videos. I was comparing RGB/VGA to RGB/HDMI while changing the input/output levels.
    RGB/VGA looked pretty much as you said it should. RGB/HDMI changed just a little bit according to the way it was running. ie with the PC outputting PC levels and the TV input set for PC levels, it looked pretty much the same as VGA. With them both running at TV levels I could see a little more detail below 20 and above 235. Not much. Just a little bit.

    It probably is the way the TV is set up. As I said I just set it up manually using a few test videos similar to yours, but from there I adjusted it a tad according to what my brain liked when viewing video. I think basically I just gave the gamma a little nudge.

    I don't use YUV out from the PC for the same reason I expand the video levels to full range and set the TV for PC levels when using RGB. Using TV levels or YUV the video looks fine either way, but of course all the non-video stuff (ie Windows itself) which uses 0-255 looks a bit dark. I assume when the TV is expanding the levels it can't discriminate between what's video and what isn't.

    Given we have the opposite experience on what seems to be the default output level for video cards (RGB) what's your experience regarding the default input level used by TVs? I've not played with many TVs at all, but I'm sure I haven't met one yet which defaults to full range. That's HDMI of course. VGA uses full range levels.
    Last edited by hello_hello; 13th Sep 2012 at 17:22.
  27. Originally Posted by hello_hello View Post
    Given we have the opposite experience on what seems to be the default output level for video cards (RGB) what's your experience regarding the default input level used by TVs?
    Who cares what defaults any of the devices use. Most TVs have atrocious defaults. When people go to a showroom and see several different TVs they'll think the one with the brightest picture, most contrast, and most saturated colors is the best. So all the manufacturers pump them way up out of spec to look "good".

    There are defined international specs for all these video issues, YUV 16-235, RGB 0-255, analog VGA (RGB=0=0V to RGB=255=0.7V). It's your responsibility to adjust all your equipment. With properly set up equipment they will all look the same.

    Remember there are two ways pictures get from an application to the display under Windows: Windows GDI (used by Desktop applications) and Video Overlay (the application sends YUV to the video card, and the video card produces RGB output if necessary). There are separate proc amp adjustments for these two systems in Windows.

    What I usually do is set the graphics card's desktop proc amp to neutral. Then use a BMP or PNG 256 grey level test pattern to adjust the TV (or monitor) so that all 256 grey patches can be differentiated*, with 0 as dark as the TV can display and 255 as bright as the TV can display at the current brightness setting (e.g. I run my TV with the backlight set at 2 or 3 out of 10; regardless of the backlight setting nearly all 256 shades should be distinguishable, even though the brightness of the picture varies with it). Then I use a video test pattern like the ones I linked to to adjust the video proc amp.

    * In practice a few grey levels may not be different -- for example many TVs crush the darkest shades so RGB=0 may not look different than RGB=1.
  28. Well the default picture settings and the default input levels are two different things, although I'm not sure how inquiring which levels TVs generally use by default relates to whose responsibility it might be to adjust settings to calibrate one properly.

    Anyway, in the case of my Plasma (and I'm sure different manufacturers do it differently) when I first fired up the TV it asked me if it was for showroom viewing or normal home viewing. I picked the latter. Each input has (generally) three different modes which can be configured independently for each (standard, movie, gaming). Once I disabled all the image enhancing crap, the default standard mode actually ended up being very close to properly calibrated according to the test videos I looked at (ones similar to yours). Once I thought I had it correct I wrote down the settings and reset the input to see what the differences were and aside from a minor brightness level difference I think they were basically identical.

    The other way I worked out I could do it was by displaying widescreen video where the player was adding black bars, dropping the brightness down to nothing and setting the contrast at around 90% to 100%. If I then increased the brightness until I could first begin to see a few grey pixels appearing in the "black" area (looking a few inches from the screen), then dropped the brightness back to where they disappeared, that left the TV pretty well calibrated at both ends of the scale according to the video test samples I was running. I don't know if it'd work for every TV but it seemed a nice and easy way to go about it.
    Last edited by hello_hello; 13th Sep 2012 at 20:03.
  29. Originally Posted by hello_hello View Post
    Once I disabled all the image enhancing crap, the default standard mode actually ended up being very close to properly calibrated according to the test videos I looked at (ones similar to yours).
    But how do you know the player output was correct? That's why I use an RGB image with the desktop proc amp set to neutral. It's the best way of assuring the RGB values in the image reach the TV unchanged. Then you address the video proc amp.
  30. I guess the only way to know if the player's output was correct would be to compare its addition of black bars to the black the TV displays itself when it's not displaying anything else. I have a vague feeling I did try it.... running 4:3 video with black bars then comparing the effect of increasing the brightness to the black each side of the 4:3 picture area, but it was a fair while ago now. I might try that again later, out of curiosity.

    One thing I have been meaning to try for quite a while is to test how devices make colorimetry decisions (HD v SD) because that's an area which annoys me more than TVs which aren't calibrated exactly. I've had a few PC vs standalone player debates in the past (involving which makes a better media player) but nobody's ever been able to tell me which colorimetry their standalone player might use at any particular time, or if it's likely to pay any attention to any colorimetry info in the video itself. I recall reading a few posts in another forum where someone was complaining about the WD media players in respect to colorimetry.
    I've run a few tests using the PC (MPC-HC with the video card output being RGB) and while I'd have to find where I wrote it down to remember exactly, I know when using the VMR9 renderer, video changes colorimetry according to an odd combination of width and height, which basically causes it to display some cropped 720p video using standard definition colorimetry (using XP at least), i.e. 1280x544 will display incorrectly while 1280x720 will display correctly. I think madVR and Haali make their choices based on similar rules to the ones ffdshow uses when converting video to RGB so they'll usually get it right. I've not done any testing using the EVR renderer.

    Anyway, even though it'll probably be harder to test fairly accurately, I've been meaning to work out if the Bluray players here (2 different brands) change colorimetry according to resolution and if they do it the same way. I assume if the players are both outputting 1080p to the TV they'll be choosing the colorimetry if it changes. Likewise I don't even know how the colors might be affected when simply using the TV's built in player to play video via USB, and then there's the media player in my Android phone.... Changing colorimetry for HD video was really dumb if you ask me.

    Given all the x264 "scene" encoders keep the original HD colorimetry when resizing down to standard definition I'm kind of curious to know if players pay any attention to the colorimetry VUI info written to the video stream when using the x264 encoder and display it correctly. I've not seen a software player pay any attention to that sort of thing. Now I've thought about it again I might put some test videos on a USB stick and make it a project for one day over the weekend as I don't think I've got anything important I need to be doing.
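
    If I do get around to it, something like this would probably do for checking the software side (made-up filename, assumes ffms2):

    src = FFVideoSource("1280x544_encode.mkv")   # made-up cropped 720p encode
    sd  = src.ConvertToRGB32(matrix="Rec601")    # what a player guessing "SD" colorimetry would show
    hd  = src.ConvertToRGB32(matrix="Rec709")    # what HD material is supposed to use
    StackHorizontal(sd, hd)                      # skin tones and reds shift if the wrong one is picked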
    Last edited by hello_hello; 14th Sep 2012 at 14:05.


