I did some test encodes with NVENC H265 and got better compression and fewer artifacts. Still doesn't beat x264 medium, imo. I have a GTX 1070 card; the newer RTX 20 and RTX 30 series cards may be a different story.
Last edited by stonesfan187; 20th Sep 2020 at 16:53.
The only problem with 10-bit H264 is that no hardware decoder (none that I know of, anyway) supports 10-bit decoding. So you were stuck with software decoding, and 10-bit software decoding takes more CPU power than 8-bit. With H265 they seem to have had 10-bit in mind from the start.
AFAIK 10-bit compression means smaller files:
- x265 - Always choose 10bit?
- QUICK COMPARE: AVC vs. HEVC, 8-bit vs. 10-bit Video Encoding
- Why does 10-bit save bandwidth (even when content is 8-bit)?
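The gist of that last link is rounding: an encoder's internal stages (prediction, filtering, transforms) round back to the working bit depth over and over, and a 10-bit pipeline loses less precision per stage than an 8-bit one, so fewer bits get spent encoding rounding noise. Here's a toy sketch of that effect, not a real encoder, just assumed repeated quantize-then-filter passes over an 8-bit gradient:

```python
# Toy illustration of the "10-bit saves bandwidth even for 8-bit content"
# idea: repeated quantize/filter passes lose less precision at 10 bits
# than at 8 bits. This is NOT an encoder, just a simplified analogy.

def process(values, bits=None, passes=5):
    """Run several filter passes, optionally quantizing to `bits` each time."""
    for _ in range(passes):
        if bits is not None:
            scale = (1 << bits) - 1
            values = [round(v * scale) / scale for v in values]
        values = [min(1.0, v * 1.001) for v in values]  # a mild 'filter' stage
    return values

gradient = [i / 255 for i in range(256)]   # an 8-bit source ramp in [0, 1]
reference = process(gradient)              # same passes at full float precision

err8 = sum(abs(a - b) for a, b in zip(process(gradient, 8), reference)) / 256
err10 = sum(abs(a - b) for a, b in zip(process(gradient, 10), reference)) / 256

print(f"mean error, 8-bit pipeline:  {err8:.6f}")
print(f"mean error, 10-bit pipeline: {err10:.6f}")
```

Even though the source is 8-bit, the 10-bit intermediates stay closer to the full-precision result, which is roughly why Main10 encodes can come out smaller at the same quality.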