VideoHelp Forum




  1. Member Ethlred (Join Date: Feb 2008; Location: United States)
    The banding in the shots you posted was caused by the increase in contrast between the original and the copies. Ten bit could have decreased that, I suppose, but it wouldn't have been needed if the original contrast had been maintained.

    I had to look carefully to see any banding. The main area I saw it in was the flare, which had a big increase in range between dark and light. The thing about 10 bit on PCs is that no one has a ten bit monitor or a ten bit OS. On a PC the 10 bits are going to be constrained to 8 bits in the final result no matter what you do. Increasing the contrast is going to cause banding.

    X264 used to have a real banding problem before they added aq-mode:

    aq-mode

    Adaptive Quantization Mode

    Default: 1

    Without AQ, x264 tends to underallocate bits to less-detailed sections. AQ is used to better distribute the available bits between all macroblocks in the video. This setting changes what scope AQ re-arranges bits in:

    0: Do not use AQ at all.
    1: Allow AQ to redistribute bits across the whole video and within frames.
    2: Auto-variance AQ (experimental) which attempts to adapt strength per-frame.

    See also: --aq-strength

    Make sure AQ is not disabled in your encodes and you will have less banding. But if you increase the contrast, you can get banding on a PC even if you use 10 bit, because you are increasing the slope beyond what the original data supported as a smooth gradient. A difference of one bit in density can become two, and thus a band forms.
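
    For example, spelling the AQ settings out explicitly on the x264 command line looks something like this (just a sketch with the default values written out; the .avs input assumes a build compiled with AviSynth support):

        x264 --crf 18 --aq-mode 1 --aq-strength 1.0 -o encode.264 input.avs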

    Ethelred
  2. 10bit means HIGHER INTERNAL PRECISION WHICH ELIMINATES BANDING even if output is 8bit!!!

    [Attached image: XtqLc.png]
  3. Member (Join Date: Mar 2012; Location: Lebanon)
    @Ethlred, next time I do an encode I'll make sure to try the AQ thing, thanks for the tip!
  4. Banned member (Join Date: Oct 2004; Location: Freedonia)
    Originally Posted by Deadly9000 View Post
    No idea why you even mentioned it although I never mentioned original files weren't 10bit?
    I get that English is not your native language. However, you'd do well to stop adopting moronic habits native speakers now use like putting a freaking question mark at the end of any sentence that indicates puzzlement. What you wrote is NOT a question in English. Don't put a question mark at the end of it.

    You didn't mention whether they were or were not 10 bit. He just guessed. I think it's way overkill for animation but it's your time to spend doing this, so I don't care.
  5. As mentioned earlier, 10bit x264 and decoders are widely available now, and there are hundreds of posted tests demonstrating the benefits, and the benefits are not limited to gradient banding. It's hard to quantify because it varies with content, but I've been seeing anywhere from a 5-20% difference on typical content (in terms of SSIM or PSNR values).

    The negatives: it's still slower than 8bit, even with the new optimizations in recent patches, and hardware support is poor (strictly limited to PC playback currently). Also, I find 10bit x264 2pass rate control b0rked currently; it's way off.

    For banding, AQ helps (I've posted several comparisons on varying AQ before, and the effect compared to other h264 encoders), as does lowering deadzones, increasing psy, or post-processing filters like dithering or adding grain. But most of these will end up requiring more bitrate to achieve the desired effect.
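
    As a rough command-line sketch of those knobs (the values are only illustrative, not a recommendation, and the .avs input assumes an AviSynth-enabled build):

        x264 --crf 18 --aq-mode 1 --aq-strength 1.2 --deadzone-inter 6 --deadzone-intra 6 --psy-rd 1.0:0.15 -o banding_test.264 input.avs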

    It's important to compare at equivalent bitrates. So either do 2pass tests, or do multiple encodes at varying CRF until you get equivalent bitrates.
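
    In 2pass terms that means something like the following, run once through the 8bit build and once through a 10bit build (the x264-10bit name and the NUL output for pass 1 are just Windows-style placeholders for whatever your builds use):

        x264 --pass 1 --bitrate 5000 --stats 8bit.stats -o NUL input.avs
        x264 --pass 2 --bitrate 5000 --stats 8bit.stats -o test_8bit.264 input.avs
        x264-10bit --pass 1 --bitrate 5000 --stats 10bit.stats -o NUL input.avs
        x264-10bit --pass 2 --bitrate 5000 --stats 10bit.stats -o test_10bit.264 input.avs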

    If Atak's example made it difficult to compare the differences, these were 1280x720p25 5Mbps encodes from an uncompressed 8-bit 4:2:0 source. They illustrate typical gradient banding more clearly; included is an exaggerated luma view done with histogram("luma"). Dithering can get you similar results to the 10bit encode, but it would require much higher bitrates (dithering is essentially noise, and hard to compress, even ordered dithering).
    Attached thumbnails: 8bit.png, 8bit exagerrated luma view.png, 10bit.png, 10bit exagerrated luma view.png
  6. Member Ethlred (Join Date: Feb 2008; Location: United States)
    I can just barely manage to see a difference between the two in IrfanView. I have to look at the area where the blue sky is turning gray near the trees on the left side and NOT look directly at it. Might be easier to see on an LCD. I still have a CRT.

    Seems to be a very good example in that there is the same contrast in both.

    I might try playing with it a bit, but since AQ mode was added I have not had a visible banding problem. Prior to that it often irked me, and mostly I was still using XviD to avoid the problem. I started with the beta versions of AQ and switched to X264 for everything at that point. I don't see a real need for it even with DVD conversions, where the source is 10bit. If I can get the same results as I do now without the dithering and get a lower bitrate, it might be worth it.

    The catch is it can only play on a PC, which is an 8bit system, and the only LED screens for a remotely reasonable price that aren't 8bit are 6bit. I have my doubts that it is worth the loss of compatibility.

    Ethelred
  7. Might be easier to see on an LCD. I still have a CRT.
    I've noticed that banding and ringing are less visible on CRTs and on cheap glossy LCDs (laptops). On a decent matte LCD/plasma you suddenly notice all those compression artefacts.

    For example, my XviD/DVD videos look a lot better on my old CRT TV than on my expensive LCD monitor. Typical film noise/grain is not visible. Macroblocking is also not visible, and so on...
  8. Originally Posted by Ethlred View Post

    The catch is it can only play on a PC, which is an 8bit system, and the only LED screens for a remotely reasonable price that aren't 8bit are 6bit. I have my doubts that it is worth the loss of compatibility.
    The gains are because of internal precision of x264. Even on a 6bit LCD panel you will see a difference, you don't need a 10bit capable graphics card, displayport or 10bit panel

    It's not just the "banding" either; it simply outperforms the 8bit version (quality-wise, not speed-wise). You can do a series of encodes, varying the bitrate; you don't require as much bitrate for a given "quality" level. You can check objective metrics like PSNR/SSIM, and view the encodes as well for subjective assessment. Look at the reference links posted earlier; Broadcast Engineering is a respected publication.
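
    For instance, a quick CRF sweep with the metrics switched on (sketch only; x264 reports the PSNR/SSIM figures at the end of each encode when these flags are given):

        x264 --crf 24 --psnr --ssim -o crf24.264 input.avs
        x264 --crf 21 --psnr --ssim -o crf21.264 input.avs
        x264 --crf 18 --psnr --ssim -o crf18.264 input.avs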
  9. Member (Join Date: Mar 2012; Location: Lebanon)
    Alright, since I have no idea what you guys are saying and I feel like Mr. Awkward since I started the topic, I feel like I have an important thing to say.

    Peanut butter jelly time?! Kidding, anyway jman98's post didn't really make any sense since I doubt he even understood what I meant.
    And uhh, to put it as simply as possible, the difference between 8bit and 10bit is that 10bit shows more depth and colors are closer to each other; in a simple way, it's that different colors near each other look smoother (what you guys call "banding"), but are harder to notice on most PC screens because they don't support 10bit, or more like they can't tell the difference between 8bit and 10bit since their technology isn't high enough and only high quality screens are able to show 10bit depths.

    ^ Was that a correct summary, or were there any wrong statements in it?
  10. but are harder to notice on most PC screens because they don't support 10bit, or more like they can't tell the difference between 8bit and 10bit since their technology isn't high enough and only high quality screens are able to show 10bit depths.
    can you read?

    the gains are because of internal precision of x264. Even on a 6bit lcd panel you will see a difference, you don't need a 10bit capable graphics card, displayport or 10bit panel
  11. I saw the obvious difference between poisondeathray's 8 bit and 10 bit samples. If you can't see it, try using a screen magnifier.
  12. And to answer the question - you could try this tool.
  13. Member Ethlred (Join Date: Feb 2008; Location: United States)
    Atak_Snajpera said this
    can you read?
    Yes. Can you keep track of the order of posting?

    the gains are because of internal precision of x264. Even on a 6bit lcd panel you will see a difference, you don't need a 10bit capable graphics card, displayport or 10bit panel
    That was posted AFTERWARDS and I see little reason to agree with it in any case. A six bit panel is still six bit no matter what you start with. And ALL remotely affordable PCs use no more than 8 bits for the final result, thus putting strong constraints on any theoretical advantage.

    YOUR example was NOT due to using 8bit vs 10bit. The blocking was due to the increase in contrast. If you had not increased the contrast and used AQ there would have been no banding.

    I'm not saying you shouldn't use 10bit if you want to. I am saying it might not be worth the compatibility issues. Four times the number of density levels won't help even one bit if it won't play on all the devices you want to play it on.

    Ethelred
    Last edited by Ethlred; 8th Mar 2012 at 02:35.
  14. Member Ethlred (Join Date: Feb 2008; Location: United States)
    jagabo said this
    I saw the obvious difference between poisondeathray's 8 bit and 10 bit samples.
    I saw it. It just wasn't obvious on my CRT, even at 100%, though I no longer had to use the off-axis trick to see it. I suspect I wouldn't see it at all as part of a video, but it might be more obvious that way if there was crawling of the bands.

    If it is that much more obvious with an LCD, I may have to re-encode a lot of stuff when I get a new monitor later this year. A higher contrast ratio would certainly make any banding more obvious. Banding bugs me almost as much as film shown at 29.97.

    Ethelred
  15. You can also try the deband filter in ffdshow. It works very well without a noticeable quality drop.
  16. Member (Join Date: Mar 2012; Location: Lebanon)
    Thanks for the replies guys, I've tried to figure out a bit more about how to use scripts and stuff like that. I've managed to encode a whole bunch of stuff into 55 MB files, which gives me ~300 kbps for 480p, and 85 MB files for 720p. I'm quite thankful to those that helped. And sorry for the late reply, I have had family issues so I've been quite busy lately. Anyway, again, thanks for the help & tips.


