Did tests with 10-bit and this time 2.0 is leading, doing significantly better than 1.2 but for some reason still getting a lower score than 1.2 8-bit.
Method   Encoder / version   Filesize   SSIM+
CRF      x265 1.2, 10-bit    1763       90.18
CRF      x265 2.0, 10-bit    1764       93.55
CRF      x264, 10-bit        1762       92.86
This sample has pretty much been exhausted. Still no clue why x264 got an edge over x265 on sneaker's source, which had more jagged edges. Oh well, all the more reason to clean up before compressing.
I'm pretty sure 10-bit will perform better on sources likely to experience banding.
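For anyone who wants to reproduce this kind of comparison, here is a rough sketch of scoring an encode against its source with ffmpeg's ssim filter. It assumes ffmpeg is on the PATH, the filenames are placeholders, and it measures plain SSIM rather than whatever tool produced the "SSIM+" column above.

```python
# Rough sketch: score an encode against the source with ffmpeg's ssim filter.
# Assumes ffmpeg is on the PATH and both clips have the same resolution/frame count.
import re
import subprocess

def ffmpeg_ssim(encoded, reference):
    cmd = ["ffmpeg", "-hide_banner",
           "-i", encoded, "-i", reference,
           "-lavfi", "ssim", "-f", "null", "-"]
    stderr = subprocess.run(cmd, capture_output=True, text=True).stderr
    # The summary line on stderr contains an "All:<value>" figure;
    # if the log format differs, just return the raw log to inspect.
    match = re.search(r"All:([0-9.]+)", stderr)
    return float(match.group(1)) if match else stderr

# Placeholder filenames -- substitute your own encode and source.
print(ffmpeg_ssim("x265_crf_10bit.mkv", "source.mkv"))
```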
EDIT: _Al_ from another thread gave me the idea to compare x265 to MPEG2, or at least MPEG1 since I can't get the former to work right. As it's largely off-topic I'll keep it short.
To match the quality that the best x265 encode produced at 68 kb/s, the best encodes from the following codecs required (worked out as ratios in the snippet below):
x264 @ 75 kb/s
Xvid @ 262 kb/s
MPEG-1 @ 659 kb/s
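Put another way (a trivial sketch using only the numbers quoted above):

```python
# Express the equal-quality bitrates as multiples of the x265 result.
x265_kbps = 68
others = {"x264": 75, "Xvid": 262, "MPEG-1": 659}
for codec, kbps in others.items():
    ratio = kbps / x265_kbps
    saved = (1 - x265_kbps / kbps) * 100
    print(f"{codec}: {ratio:.1f}x the bitrate of x265 ({saved:.0f}% saved by x265)")
```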
-
Did another test at a higher bitrate. x265 did a lot better than x264 this time, achieving 27% more efficiency with 1.2; 2.0 is still slightly behind.
-
Does it leverage OpenCL to aid encoding? OpenCL can do a very good job of decoding x265 (HEVC) video, even when the video card has no native HEVC decoding support. I had an old dual-core Socket 939 Athlon 64 playing 720p x265 perfectly with an nVidia 8800 GT on Windows XP, using the Strongene OpenCL decoder and LAV Filters.
Replaced that box with a newer one, same 8800 GT and XP but a dual core Socket AM2 CPU. Only the most detail intensive 1080p causes minor stuttering on it, with Potplayer. That's a real nice player, puts VLC, GomPlayer, MPC-HC and others in the weeds.
I'm putting together a Dell Inspiron 530 with a Core 2 Quad and 8 gig (there were two versions of the 530; one can only take a dual core and 4 gig), running Windows 10, which will initially have a 1-gig 9800 GT. That card, on an AM3 Phenom II 555 dual core with Windows 10, Strongene OpenCL and LAV Filters, plays 1080p without a hitch in Potplayer. -
No.
The LAV Video decoder uses hardware decoding via APIs like DXVA2 or CUVID. An 8800 GT won't provide any HEVC decoding via such APIs because it's too old. And the last time I looked, the Strongene OpenCL decoder didn't support Nvidia at all, even though Nvidia supports OpenCL. Maybe that's different today, but even if it worked, it wouldn't necessarily be faster than software decoding. And of course, sharing the HEVC decoding load between LAV Video and another decoder is definitely impossible. So I have my doubts about what you say. -
My ATI 4670 (1 GB) from 2008 did a pretty decent job of helping with decoding 720p/1080p HEVC content, using MPC-HC with LAV Filters via DXVA2. I only recently replaced it with an AMD RX 460, and I'm still using DXVA2 without problems. But now I can also decode 4K HEVC in real time with the RX 460, which I could not do smoothly with the ATI 4670.
-
I was getting pretty decent GPU (ATI 4670) usage spikes when using it for HD HEVC, like +40% in the ATI control panel, taking a big load off of the CPU. Trying to play the same video in VLC would put the entire load on the CPU, usually causing 100% peaks and dropped frames. I never had dropped frames in MPC-HC until I started trying 4K content, for both H.264 and HEVC.
-
You can see directly in LAV Video which decoder is active. No need to look at unreliable GPU usage figures, since the GPU may be busy with other things, like scaling by the renderer. VLC's decoder was known to be slow; the LAV author cherry-picked optimizations from OpenHEVC, and 64-bit LAV was (still is?) like 3 or 4 times faster at HEVC decoding than 32-bit VLC.
-
I see we already had this discussion before:
https://forum.videohelp.com/threads/379693-Struggles-when-playing-back-certain-4K-files...=1#post2453848
You can even see in your screenshot that it was using "avcodec"(=ffmpeg software decoder) with "<none>" as "Active Hardware Accelerator", not DXVA. -
It was not much of a discussion but just me posting a comment and you replying.
I have no idea what I was decoding in that screenshot, or whether I went out of my way to decode an HEVC video for it. I might have also just opened some SD file I had; as you can see, the SD box is unticked. I can even get the same thing today when using DVB Viewer to decode H.264 1080i video over the air, or when playing SD content through MPC-HC. https://forum.videohelp.com/images/imgfiles/neLLDtp.png
I'm pretty certain that the "Active Hardware Accelerator" would say "4600 Series" and the "Active Decoder" would say DXVA2 when decoding HD H.264 or HEVC in MPC-HC. I could reinstall the 4670, as I still have it, but I'm not eager to do that since installing the RX 460 and getting it working properly took an entire evening.
What are you going off of that makes you believe the 4670 could never use DXVA2 (native)? I've seen it stated that "AMD/ATI: Radeon HD 6xxx and newer" is needed for DXVA2 (copy-back), but that native is supported on older cards.
Also found this. http://kodi.wiki/view/HOW-TO:Enable_Hardware_Accelerated_Decoding_via_DXVA2_in_XBMC_for_Windows -
This is a thread about the x265 HEVC encoder. Optimizing decoding is not the topic here, and encoding HEVC is a magnitude more complex than decoding it. So even if GPGPU features or specialized decoding areas of a chipset can support HEVC decoding in some way (even partially), it hardly matters for encoding; the two are technically very different.
-
Then it is using software decoding, for example because the renderer doesn't support DXVA2 native, or because there's an intermediate filter breaking the chain.
I didn't say it doesn't support DXVA2. I'm saying it doesn't support HEVC decoding via DXVA2. Just like your RX 460 driver supports HEVC via DXVA2 but not VP9 (though AMD has promised a future driver with VP9 support on Polaris). Every format needs to be implemented separately by the GPU driver, and this simply isn't the case for older GPUs and HEVC.
http://bluesky23.yukishigure.com/en/DXVAChecker.html
http://bluesky23.yukishigure.com/en/dxvac/screenshot/ss.html
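Besides DXVAChecker, a quick scripted cross-check, independent of the player chain, is to ask ffmpeg to decode a sample with DXVA2 requested and watch what it logs. A minimal sketch, assuming an ffmpeg build with DXVA2 support is on the PATH and that "clip_hevc.mkv" is any HEVC sample you have (both are assumptions, nothing taken from LAV itself):

```python
# Rough cross-check: request DXVA2 hardware decoding in ffmpeg and surface the
# relevant log lines. If the driver doesn't expose HEVC via DXVA2, ffmpeg falls
# back to software decoding and notes it on stderr.
import subprocess

cmd = ["ffmpeg", "-hide_banner", "-loglevel", "verbose",
       "-hwaccel", "dxva2", "-i", "clip_hevc.mkv",  # placeholder sample file
       "-f", "null", "-"]
result = subprocess.run(cmd, capture_output=True, text=True)

# Print only the lines that mention the hwaccel, rather than guessing exact wording.
for line in result.stderr.splitlines():
    if "dxva" in line.lower() or "hwaccel" in line.lower():
        print(line)
```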
But LigH is correct, maybe a mod can move the discussion. -
Using CUVID hardware decoding on a GeForce 9800 GT. This setting only became available after installing the OpenCL Strongene HEVC decoder. Also works on an 8800 GT but not quite as well due to the less powerful GPU.
-
Like I said, watch the "Active Decoder:" item. It will be "avcodec" when playing HEVC on a PC with only an old GPU, no matter whether CUVID, DXVA2 or QuickSync is selected. You're fooling yourselves.
-
What matters is how well or badly it plays. On the dual-core Socket 939 box, without Strongene OpenCL and with CUVID selected in LAV, it could not play 720p HEVC video without a ton of stuttering.
-
Note: The title of this thread is 'x265 HEVC Encoder.' Please keep replies to the original topic.
If you want to discuss H.264 or other encoding, start a new topic. Thanks
Moderator redwudz -
Just wanted to make a last comment on HEVC decoding with my old ATI 4670. I spent a bit of time reinstalling the 4670 and found that it could hardware decode MPEG-2/VC-1/H.264 via LAV Filters and DXVA2. But when I tried any HEVC videos, there was no indication that hardware decoding was happening, and it was using that avcodec decoder. Makes for some pretty good software decoding if it convinced me it was hardware decoding. Reinstalled the RX 460 and am getting HW decoding of HEVC.
No more derailing this thread from me, promise. -
Suggestion: separate the SEI preset parameters from the user custom/override parameters.
It is good that x265 includes the encode parameters in the SEI. Lately, I've noticed that the parameter list has grown to roughly double its former size. It's good to include all parameters; however, one thing would help people who encode with x265 and review the parameters to get better results:
Separate the --preset parameters from the user-customized or overridden parameters. For instance, put the --preset parameters first and the user-customized ones after. This would help the user (or anyone reviewing someone else's encode) see which parameters were actually used in the final video. How is a person supposed to know which parameters were used if extra parameters were added, or worse, if some of the values in the --preset were revised? I see people post the --preset they used, but which parameters did they change or override? As it stands, the parameters are just jumbled output with no structure. No, I'm not asking you specifically, just asking in general: if I were reviewing someone else's work, I'd like to know exactly which parameters were added (after the --preset) and/or overwritten.
A separator or special character could be used to mark where the --preset ends. For instance: { --preset a, b, c, .. g || h, i, j, .. z }
Thus h, i, j, .. z would be the custom params to look at. -
The SEI parameter string contains the parameters at the encoder core level, not at the application level. CLI parameters – including "preset" and "tune", which are expanded into basic parameters even earlier – are translated into core parameters before they are handled by the encoder library's core functions. By the time the function that writes the SEI parameter string runs, it no longer knows whether a preset or tune parameter was used in the application and translated by the API.
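A practical workaround, since that split is already lost by then: encode a short clip with only the --preset and diff its "options:" string (the part after "options:" in the SEI, visible for example in MediaInfo's "Encoding settings" field) against the one from the encode you are reviewing. A rough sketch; the option fragments below are made-up placeholders:

```python
# Hypothetical helper: diff two x265 "options:" strings to spot overridden settings.
# Paste the real strings from the SEI / MediaInfo "Encoding settings" of each file.

def parse_options(options_string):
    """Turn 'key=value' tokens and bare flags (e.g. 'wpp', 'no-sao') into a dict."""
    params = {}
    for token in options_string.split():
        if "=" in token:
            key, _, value = token.partition("=")
            params[key] = value
        else:
            params[token] = True
    return params

def diff_options(preset_only, this_encode):
    ref, cand = parse_options(preset_only), parse_options(this_encode)
    for key in sorted(set(ref) | set(cand)):
        if ref.get(key) != cand.get(key):
            print(f"{key}: preset-only={ref.get(key)!r}, this encode={cand.get(key)!r}")

# Made-up fragments for illustration only:
diff_options("crf=28.0 ctu=64 rd=3 wpp",
             "crf=20.0 ctu=64 rd=5 wpp no-sao")
```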
It doesn't matter whether you order a "Pizza Prosciutto" or a "Pizza Margherita with additional ham"; eventually, it is only known that the pizza dough shall be topped with tomato sauce, cheese, and ham. And then it doesn't matter anymore how you ordered it. -
PowerPC? Why? How well can it possibly perform on Macintosh computers that are at least 11 years old now, for the G5 models? Anyone have a final model dual CPU dual core Powermac G5 with OS X 10.5.8 to do a benchmark?
Aside from that, are there any other common uses of newer POWER architecture CPUs outside of big servers and supercomputers? -
https://forum.doom9.org/showpost.php?p=1778725&postcount=4210
IBM has contributed a set of files containing POWER optimizations. Although this is not in proper patch format, I'm guessing that any competent developer familiar with IBM POWER 8 servers can build and run the optimized version of x265. The x265 team will review, test and commit these improvements as soon as possible, but it may take some time as we've got every developer and then some spoken for right now. So any test or code review feedback from the community of x265 developers would be welcomed. Thanks to IBM for contributing to x265! -
The only reasons I can see for implementing PowerPC support are for TV networks using supercomputers to compress video feeds for distribution at higher quality, and for online streaming sources - if YouTube, Vimeo and friends ever support HEVC.
Unless support can be added to the firmware of ATSC televisions, I don't see any near-term use for over-the-air broadcast. Unless a new PowerPC hardware platform is produced for the consumer market, PPC support in the encoder won't make any direct difference to userland. -
Did anyone try the new '--multi-pass-opt-analysis' option and can comment on its worth?
-
Just announced on the developer mailing list:
Originally Posted by Pradeep -
Marsia Mariner (Guest):
Overdue update...
Originally Posted by Ma -
Shouldn't --ssim-rd help when aiming for better SSIM results, and thus be included in a 'true placebo' setting, which seems to aim for better PSNR/SSIM results?
-
x265 2.4+27-e9e574bbed93 (merge with stable): more fields in rcStats and CSV file