Hey forum! Hope you can answer this question.
I've been doing a fair bit of research on re-encoding video (mostly H.264/AVC) to more efficient formats such as HEVC (x265) and, more recently, AV1. From what I've gathered, several GPUs (including mine, a 2080 Super) support hardware HEVC encoding and tend to be much faster than CPUs, but the video quality they produce is noticeably lower than a CPU-encoded x265 video at the same bitrate. Great for streaming games, but not so great for shrinking high-quality video without significant quality loss.
That said, until the recent Intel ARC GPU series, no GPU or mainstream CPU that I'm aware of had hardware AV1 encoding support. Intel has, however, had hardware-accelerated HEVC encoding since Skylake under their "Quick Sync Video" brand.
Will the ARC GPUs handle encoding the same way other GPUs do, i.e. emphasizing speed over quality/compression efficiency? Or, since this hardware comes from Intel's Quick Sync Video lineage, which historically lived on the CPU package, can it encode at as high a quality as a CPU given the same task?
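For reference, the kind of comparison I've been running looks like this. A sketch only: it just builds the two ffmpeg command lines I'd compare (software x265 vs NVENC HEVC at the same bitrate); the encoder names assume an ffmpeg build with libx265 and NVENC support, and "in.mkv" is a placeholder.

```python
# Build (but don't run) two ffmpeg command lines: a software x265 encode
# vs an NVENC hardware HEVC encode at the same target bitrate.
def encode_cmd(encoder, preset, outfile, infile="in.mkv", bitrate="4M"):
    return ["ffmpeg", "-i", infile,
            "-c:v", encoder,    # video encoder under test
            "-preset", preset,  # speed/quality preset for that encoder
            "-b:v", bitrate,    # same bitrate so quality is the only variable
            outfile]

cpu_cmd = encode_cmd("libx265", "slow", "cpu_x265.mkv")
gpu_cmd = encode_cmd("hevc_nvenc", "p7", "nvenc_hevc.mkv")  # p7 = NVENC's slowest/best preset
print(" ".join(cpu_cmd))
print(" ".join(gpu_cmd))
```

Pinning the bitrate keeps the comparison fair; whichever output looks better at identical size is the better encoder for archiving.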
Since hardware video encoding is unique, proprietary technology, I assume nobody outside the vendor can say in detail how a particular implementation works - most likely it's fixed-function hardware plus firmware, i.e. software running on dedicated silicon.
Intel's current software interface to Quick Sync Video is called oneVPL (the successor to the Media SDK).
Intel has implemented Quick Sync encoding both fully in hardware and in a HW+SW hybrid mode driven by the CPU - which is still something other than pure software encoding.
Hardware encoders focus on speed, not quality - mostly because of the limited silicon resources available to the encoding process - so if you need high quality from them, you must increase the bitrate.
If you're trying to get both high quality and high encoding speed at the same time, GPU-based encoders may not provide satisfactory results compared to the traditional software-based approach.
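To put rough numbers on that bitrate trade-off - a sketch using plain bitrate × duration arithmetic, ignoring container overhead; the 90-minute/4-vs-6 Mbps figures are just illustrative:

```python
# Rough size estimate: what "just add bitrate" costs in storage.
def encoded_size_mib(video_mbps, duration_min, audio_kbps=128):
    seconds = duration_min * 60
    # total bits = video bits + audio bits over the whole runtime
    total_bits = video_mbps * 1e6 * seconds + audio_kbps * 1e3 * seconds
    return total_bits / 8 / 2**20  # bits -> bytes -> MiB

# e.g. a 90-minute movie at 4 Mbps vs 6 Mbps video
print(f"4 Mbps: {encoded_size_mib(4, 90):.0f} MiB")
print(f"6 Mbps: {encoded_size_mib(6, 90):.0f} MiB")
```

So covering for a hardware encoder's quality deficit by raising the bitrate ~50% costs ~50% more disk per title - which is exactly what matters for an archive.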
Thanks for your response. My goal is the highest quality possible at a given bitrate, since this is for archiving. I understand that GPUs prioritize speed over quality. However, CPUs - at least for HEVC - have hardware acceleration that doesn't seem to lower quality compared to a non-accelerated CPU (I tested this last night with a 3770K vs an 8750H and got the same file size/quality).
The thing is, as of the latest generation, Intel supports AV1 encoding (currently) only in its GPUs, whereas historically all of its hardware acceleration lived on the CPU package - yet it's still listed under the same Quick Sync technology. So I'm wondering whether Intel ARC GPUs will actually have a quality focus, unlike the (non-AV1) encoders built into NVIDIA/AMD GPUs.
You may be right, though - I might not be able to get this answer from a forum, unless someone here has an ARC GPU and can test it.
The issue is that GPU encoding (usually not the GPU itself but a dedicated video engine independent of it) has limited resources, whereas a CPU has effectively unlimited ones - a CPU can perform exhaustive motion search where the hardware encoder can't, and a CPU can use every coding technique where the hardware supports only some. There are many similar differences.
Of course, every iteration of these hardware encoders is better than the previous one (except perhaps AMD's), but at some point, if your goal is low bitrate and high quality, a CPU may be the only possible choice. If you can spare some extra bitrate, though, hardware encoders can deliver high quality at high speed.
To judge Intel's new video encoder we'll probably need to wait not only for the first tests but also for some degree of driver maturity.
Perhaps Intel will even be able to combine the GPU (e.g. via OpenCL) with the video engine, and maybe the CPU, into some advanced hybrid encoder capable of delivering high quality, low bitrate, and high encoding speed.
There aren't a lot of benchmarks around, and this one is geared toward 1080p gaming (realtime streaming), but software AV1 (non-realtime) encoding still beats Intel's ARC AV1 encoder handily, at least going by the VMAF results:
https://www.tomshardware.com/news/intel-arc-av1-encoder-dominates-nvenc
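For anyone who wants to run that kind of VMAF comparison themselves, the ffmpeg invocation is roughly as below. A sketch only - it assumes an ffmpeg build with libvmaf enabled, and the file names are placeholders.

```python
# Build the ffmpeg command that scores an encode against its source with VMAF.
def vmaf_cmd(distorted, reference):
    return ["ffmpeg",
            "-i", distorted,      # first input: the encode under test
            "-i", reference,      # second input: the original source
            "-lavfi", "libvmaf",  # compute VMAF over the input pair
            "-f", "null", "-"]    # discard output; the score goes to the log

print(" ".join(vmaf_cmd("arc_av1.mkv", "source.mkv")))
```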
That's good to know. I'll steer clear for my purposes then unless I hear otherwise. Thanks so much y'all!!
For now I'll consider either waiting for CPU-accelerated (HQ) AV1 encoding to become a thing, or going for as many cores as I can fit in my budget.
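For the many-cores route, the software AV1 command I'm planning to test looks like this. A sketch: it assumes an ffmpeg build with the SVT-AV1 encoder (libsvtav1), and the file names plus CRF/preset values are just placeholders to tune.

```python
# Software AV1 encode command: quality-targeted (CRF) rather than speed-targeted.
def av1_cmd(infile, outfile, crf=30, preset=6):
    return ["ffmpeg", "-i", infile,
            "-c:v", "libsvtav1",     # SVT-AV1 software encoder, scales across cores
            "-crf", str(crf),        # constant-quality target (lower = better/bigger)
            "-preset", str(preset),  # 0 = slowest/best ... 13 = fastest
            outfile]

print(" ".join(av1_cmd("source.mkv", "archive_av1.mkv")))
```

Lower presets trade encode time for compression efficiency, which is the right direction for archiving.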
Out of curiosity, I looked at an Intel ARC discrete graphics card today with the idea of upgrading one of my systems. From what I read, ARC cards don't work well with every CPU and motherboard, even relatively recent ones. Intel recommends using the Intel® Driver and Support Assistant (Intel DSA) to check whether your system is ready for Intel® Arc™ discrete graphics. See https://www.intel.com/content/www/us/en/support/articles/000091128/graphics.html
From what I've read they don't work well period. Intel has admitted that the cause of the recent delays is driver problems. I don't know if that extends to the video encoders or if it's just for gaming.
https://www.tomshardware.com/news/intel-may-delay-arc-desktop-gpus-until-the-end-of-august
https://www.tomshardware.com/news/intel-blames-poor-software-for-arc-delays-shipments-miss
Last edited by jagabo; 2nd Sep 2022 at 08:34.