A lower value means better quality, right? I've read users mentioning that faster settings mean a better quality video and others say the opposite.
I'm using DAIN APP to interpolate videos and its default CRF is 16, but as I have to use the reduced VRAM usage setting, a 1080p file is reduced to 540p in the output file, so I'm losing quality. Would lowering the default CRF help in any way? Thanks.
Yes, lower is better but uses more bitrate.
It only gives you better quality as long as the source is very good to excellent quality; it will never exceed the original quality.
I think, therefore I am a hamster.
@OP, CRF is a bit rate control method, lower CRF values will produce, in theory, higher quality, because they use more bit rate.
As someone else mentioned, the practical limit is that it can never give you more quality than the source. An equally important limit is the settings of all the so-called "psy-optimizations" (AQ, MB-Tree, Psy-RD, Psy-Trellis, Trellis, etc.), which all work together to distribute the bit rate intra- and inter-frame in what the developers hope will result in optimal visual quality at any given bit rate range.
It's all a bunch of horse feathers of course, but that is the theory.
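To make the "CRF is a rate control method" point concrete, here is a minimal sketch of how a CRF-based x264 encode is usually invoked via the ffmpeg CLI. The `-crf` and `-preset` options are real libx264 options; the helper name and file names are placeholders, not something from this thread. Note that the preset (the "faster settings" the OP mentions) trades encoding time against compression efficiency, while CRF sets the quality target:

```python
# Build an ffmpeg command line for a CRF-based x264 encode.
# CRF is the quality target: lower value = better quality = more bitrate.
# The preset only changes how hard the encoder works to hit that target.
def crf_encode_cmd(src, dst, crf=16, preset="medium"):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        "-crf", str(crf),    # constant rate factor, 0 (lossless) to 51
        "-preset", preset,   # speed/efficiency trade-off, not a quality cap
        "-c:a", "copy",      # pass the audio through untouched
        dst,
    ]

print(" ".join(crf_encode_cmd("in.mp4", "out.mp4", crf=16)))
```

So "faster settings mean worse quality" is only true at a fixed bitrate; at a fixed CRF, a faster preset mostly means a larger file, not a visibly worse one.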
Well, in reality and not just in theory, what butterw says is valid. For an HD source you can get away with a higher CRF, 19-20. For SD resolution watched on a big TV screen, the content gets upscaled and every artifact is blown up, so what passes at 19-20 for HD might show up worse; the CRF is better kept lower, say 16, 17 or 18.
Of course, start using a 120" TV screen and then again that CRF 19-20 for HD content might not be enough. That is the relativity of these things. But I'm sure he is talking about a given screen and viewing scenario, using similar or the same settings, so things make sense in this regard. Not what the settings do.
The encoding business IS a business dealing with illusion. Bringing up what a setting does and criticizing it makes little sense, because in the end it is the illusion that matters. That's the point.
It's advice such as yours and the other guy's that results in constant questions on this and other forums as to why their encodes don't look good.
Think before you type.
As for the other hogwash about illusions,
You are what you are.
In the end, the size of the content and the upscaling while watching matter, because it is us who are watching it, not some rule book. I can encode 100x100 pixels with CRF 0, and guess how that is going to look on my HD TV? Can you comprehend this?
Last edited by _Al_; 19th Aug 2020 at 10:40.
Artifacts in an SD encode are much bigger on an HD screen than artifacts in an HD encode on the same screen.
HD still needs more bits of course, and using a fixed CRF will of course generate a much higher bitrate for the HD encode relative to the SD encode. However, it's often good to use a lower CRF for SD content.
It's common knowledge to use higher CRF values as the resolution goes up.
From the official Handbrake Documentation:
Recommended settings for x264 and x265 encoders:
RF 18-22 for 480p/576p Standard Definition
RF 19-23 for 720p High Definition
RF 20-24 for 1080p Full High Definition
RF 22-28 for 2160p 4K Ultra High Definition
Imperfections tend to be more noticeable at larger display sizes and closer viewing distances. This is especially true for lower resolution videos (less than 720p), which are typically scaled or “blown up” to fill your display, magnifying even minor imperfections in quality.
You may wish to slightly increase quality (lower CRF) for viewing on larger displays (50 inches / 125 cm diagonal or greater), or where viewing from closer than average distances. Reduced quality may be acceptable for viewing on smaller screens or where storage space is limited, e.g. mobile devices.
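The quoted table can be folded into a tiny lookup helper. This is only a sketch: the ranges are copied straight from the Handbrake documentation quoted above, and `recommended_rf` is a hypothetical name, not anything Handbrake exposes:

```python
# Recommended RF ranges per resolution tier, per the Handbrake docs above.
# Keyed by frame height; value is the (low, high) RF range.
RF_RANGES = {
    576:  (18, 22),   # 480p/576p Standard Definition
    720:  (19, 23),   # 720p High Definition
    1080: (20, 24),   # 1080p Full High Definition
    2160: (22, 28),   # 2160p 4K Ultra High Definition
}

def recommended_rf(height):
    """Return the (low, high) RF range for the smallest tier >= height."""
    for tier in sorted(RF_RANGES):
        if height <= tier:
            return RF_RANGES[tier]
    return RF_RANGES[2160]  # above 4K: fall back to the largest tier

print(recommended_rf(540))   # 540p falls in the SD tier -> (18, 22)
```

Per the documentation's own caveat, you would then shade toward the low end of the returned range for big screens or close viewing distances.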
You encoded the 4k samples at CRF24, and they looked great. I would never encode a SD source at CRF24. Try downscaling one of your 4k sources to SD and encode at CRF24, then upscale the encode for playback on a 4k display. It's not hard to deduce which encode is going to have more visible encoding artefacts, when viewed at the same resolution.
Well, the original source is 1080p HD, but as I said, due to the program's VRAM limitations (720p requires 10-11GB of VRAM) it must be halved in resolution (960x540) before interpolation. So the output is SD, albeit very good quality SD. I then use AI to upscale to 4K, which produces a folder of several TB of TIF images, which is then compressed to H.265, where I aim for a bitrate similar to that of 4K UHD discs, 60-90 Mbps.
My follow-up questions are about the resulting SD files from the DAIN APP. If I set the CRF for DAIN at 14, that gives a higher-bitrate SD file. So if I'm looking for the best quality, is it reasonable to try to get a bitrate that is half that of the original file, given that it is half the resolution? Did that make sense? Ultimately this is about the quality I feed the AI upscaling software, as that creates (literally) uncompressed 4K data.
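One nit on "half the resolution": going from 1920x1080 to 960x540 halves each dimension, so the pixel count drops to a quarter, not half. A quick sanity check on the arithmetic (the "half the bitrate" target is the poster's own rule of thumb, not an established rule):

```python
# Halving both width and height quarters the pixel count, so
# "half the resolution" in that sense means 1/4 the pixels per frame.
hd_pixels = 1920 * 1080
sd_pixels = 960 * 540
print(hd_pixels / sd_pixels)   # -> 4.0

# Bitrate does not scale linearly with pixel count anyway, which is
# exactly why a constant-quality target (CRF) is usually preferred
# over deriving a bitrate from the resolution ratio.
```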
Would you care to say why it is crap? Encoding UHD and SD is not the same, because size does matter.
The meaning of CRF values actually depends on which codec you are using (e.g. x264); see the ffmpeg H.264 Video Encoding Guide: https://trac.ffmpeg.org/wiki/Encode/H.264
I don't know what DAIN APP is or what it outputs.
540p is mod4 (544 would be mod16). 540 to UHD is 4x on height for AI upscaling.
Given your very specific application, it's probably a good idea to test out what works best for you on a small representative sample of your sources.
I know almost nothing about how video encoding works, but take the use of motion vectors as an example. You have a picture that's spread over 4k's worth of pixels. The picture changes from frame to frame, but isn't it more likely the encoder will find a good match for a block at a higher resolution, because neighbouring blocks are more likely to be different at lower resolutions? Just a thought....
1: A 4k frame, encoded at CRF24 (x264), downscaled to 1080p on playback.
2: The same frame, downscaled to 960x400 and encoded at CRF24 (default settings), then upscaled to 1080p on playback.
3: The same frame, downscaled to 960x400 and encoded at CRF18 (default settings), then upscaled to 1080p on playback.
Notice how the SD CRF24 encode looks pretty average next to the CRF24 UHD source at the same resolution (1080p)?
Notice how the SD CRF18 encode compares far more favourably?
Conclusion.... when upscaling SD to HD or UHD on playback, the encoding quality, or lack thereof, is more noticeable. Therefore, lower resolutions require lower CRF values, while for higher resolutions you can get away with higher CRF values.