I'm trying to encode a 1080p video down to 720p (with both x264 and x265).
The source video is clean and doesn't need any debanding, but the output video is very dirty with banding.
I'm using CRF 24 for both the x264 and x265 encodes.
Use 10-bit encoding with either encoder. Even with low bitrates you won't get banding.
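As a sketch of the 10-bit advice above (filenames and the CRF value are placeholders, and ffmpeg must be built with 10-bit-capable libx264 for the first command to work):

```shell
# 10-bit x264 via ffmpeg (requires an ffmpeg linked against a
# high-bit-depth x264 build); scale=-2:720 downsizes to 720p:
ffmpeg -i input.mkv -vf scale=-2:720 -c:v libx264 -crf 22 \
       -preset slow -pix_fmt yuv420p10le out_x264_10bit.mkv

# 10-bit x265 via ffmpeg (libx265 selects the depth from the pixel format):
ffmpeg -i input.mkv -vf scale=-2:720 -c:v libx265 -crf 22 \
       -preset slow -pix_fmt yuv420p10le out_x265_10bit.mkv
```

The extra precision in 10-bit quantization is what suppresses the banding, even at bitrates where an 8-bit encode would show visible steps in gradients.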
Use CRF 22. CRF 24 is a bit too low on quality; you will see more artifacts at 24 since you want to stick with 8-bit encoding. I think, therefore I am a hamster.
Last edited by ben45; 7th Aug 2016 at 14:54.
You can try this: find out what bitrate you are getting at CRF 24 with the medium preset, using MediaInfo. Plug that bitrate into your encoder using
2-pass (with a slow first pass) and a slow preset, and try turning down the adaptive quantization strength (aq-strength 0.5 to 0.6). For further improvement
you can add tune grain without the file growing, since 2-pass pins the bitrate. As for me, I use a CRF of 17 or 18 when doing single-pass for typical encodes.
This method has helped me before, with varying degrees of improvement, whenever I've encountered banding but felt I needed to control the
file size. With this method you can also play around with adding some grain to act as dither, as suggested in an earlier post.
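The workflow above can be sketched with ffmpeg's libx264 wrapper (filenames and the 800k target are placeholder examples, not the poster's exact numbers):

```shell
# 1) Read the bitrate of the existing CRF 24 encode with MediaInfo:
mediainfo --Inform="Video;%BitRate%" crf24.mkv

# 2) Re-encode 2-pass at the chosen bitrate, slow preset, lower AQ
#    strength, grain tuning. ffmpeg speeds up the first pass by
#    default; -fastfirstpass 0 keeps it slow as suggested above.
ffmpeg -y -i input.mkv -vf scale=-2:720 -c:v libx264 -b:v 800k \
       -preset slow -tune grain -fastfirstpass 0 \
       -x264-params aq-strength=0.5 -pass 1 -an -f null /dev/null
ffmpeg -i input.mkv -vf scale=-2:720 -c:v libx264 -b:v 800k \
       -preset slow -tune grain \
       -x264-params aq-strength=0.5 -pass 2 out_2pass.mkv
```

Lower aq-strength spends fewer bits redistributing quality toward flat areas, and tune grain preserves the dither-like noise that masks banding.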
Just checked the bitrate of the CRF 24 file: it's 813 kbps! Re-encoded the same file with the 2-pass method at 500 kbps and... 2-pass killed 50% of the banding, and the quality is much better than CRF 24!!
2-pass encoding speed is the only problem: it takes 2x realtime to encode a 720p and 4x realtime for a 1080p.
Last edited by ben45; 7th Aug 2016 at 15:37.
You're welcome. And yes, encoding time would obviously at least double. Patience is everything when dealing with video encoders.
I never use crf greater than 20
It's quality versus file size
Which one is more important to you
If you're after a specific file size, you have to live with the quality limitations that it introduces.
DitherTools can help quite a bit. Start by sticking gradfun3() at the end of a script. I rarely encode without it. Admittedly I also tend to use lower CRF values, but it should still help.
Even though technically it adds noise, I find if anything gradfun3() reduces the bitrate a little for a given CRF value. I'm not 100% sure why.
Or DitherTools can convert to 16 bit, resize in 16 bit, then dither back down to 8 bit; in that case you probably wouldn't need gradfun3() in the script, as resizing that way is less likely to induce banding than 8-bit resizing.
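gradfun3() and DitherTools are Avisynth functions, but a rough ffmpeg-side analogue of the same deband-then-dither idea (my assumption, not the poster's exact chain; zscale needs an ffmpeg built with libzimg) would be:

```shell
# gradfun smooths banded gradients (strength:radius, here the defaults);
# zscale resizes at high internal precision and error-diffusion
# dithering masks banding when converting back to 8-bit yuv420p.
ffmpeg -i input.mkv \
       -vf "gradfun=1.2:16,zscale=w=1280:h=720:d=error_diffusion,format=yuv420p" \
       -c:v libx264 -crf 22 -preset slow out_debanded.mkv
```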
For the record, there are 8-bit and 10-bit builds of the x264 encoder, but only one x265 encoder, which can encode at 8, 10, or 12 bit according to the profile you set. MeGUI's x265 encoder configuration is as basic as it gets at the moment, but adding the appropriate switch to the custom command line section will enable 10-bit x265 encoding.
x265 also has an option for setting the output bitdepth (--output-depth) which is also the bitdepth it'll use internally, apparently.
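Presumably the switch meant for MeGUI's custom command line is just `--output-depth 10`. For reference, a minimal standalone x265 invocation would look something like this (filenames are placeholders):

```shell
# x265 reads raw/y4m input; --output-depth 10 selects 10-bit encoding
# inside the single unified x265 binary (no separate build needed).
x265 --input input.y4m --crf 22 --preset slow \
     --output-depth 10 --output out_10bit.hevc
```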
I don't use 10 bit x264 myself as hardware player support is fairly non-existent and probably always will be.
h265 included 10 and 12 bit encoding in the spec from the get-go, so with any luck that'll be a different story... once h265 playback support is mainstream. Hardware player support for h265 is still pretty limited regardless of the bitdepth at the moment anyway, so if I was to use the x265 encoder I'd be pretty tempted to go straight to 10 bit.
Last edited by hello_hello; 7th Aug 2016 at 16:39.
[QUOTE=ben45;2455279][/QUOTE] For Avisynth there's the already-mentioned DitherTools by cretindesalpes (http://avisynth.nl/index.php/Dither_tools). I've never tried zscale, which is implemented in ffmpeg; it can provide, for example, a 6-bit YCbCr pattern dithered over 8-bit YCbCr 420p, which is quite strange, btw...
Thank you so much, guys.
Seems I have to increase the bitrate a bit.