VideoHelp Forum
  1. Hey forum! Hope you can answer this question.

    I've been doing a fair bit of research regarding encoding (mostly x264) video to more efficient formats such as x265 and, more recently, AV1. From what I've been able to gather, with x265 encoding, while several GPUs support this (including the one I have, a 2080 Super) and tend to be much faster than CPUs, the video quality they tend to produce is much lower than a CPU-encoded x265 video. Great for streaming games, but not so great for encoding high-quality video to a smaller size without significant quality loss.

    With that said, until the recent Intel ARC GPU series, no GPU or mainstream CPU that I'm aware of had hardware AV1 encoding support. Intel has, however, had CPU-based hardware HEVC encoding since Skylake, under its "Quick Sync Video" brand.

    Will the ARC GPUs handle encoding in the same way that other GPUs handle video encoding; that is, putting an emphasis on speed rather than quality/compression factor? Or, since this architecture comes from Intel's Quick Sync Video brand, which was historically CPU-bound, is it still able to encode at as high a quality as a CPU given the task?
  2. Originally Posted by qb_master View Post
    Will the ARC GPUs handle encoding in the same way that other GPUs handle video encoding; that is, putting an emphasis on speed rather than quality/compression factor? Or, since this architecture comes from Intel's Quick Sync Video brand, which was historically CPU-bound, is it still able to encode at as high a quality as a CPU given the task?
    Since video encoding implemented in HW is unique, proprietary technology, I assume nobody outside the vendor can say in detail how a particular vendor implements it - most likely it is HW plus SoC firmware, i.e. software that runs on dedicated silicon.
    Intel currently refers to Quick Sync Video as 'oneVPL'.
    Intel implemented Quick Sync video encoding both in HW and in HW+SW (a CPU-driven hybrid mode) - this is something other than SW encoding.

    HW encoders focus on speed, not quality - this is mostly due to the limited HW resources available to the video encoding process; if you need high quality, you must increase the bitrate.
    If you are trying to get high quality and high encoding speed at the same time, GPU-based encoders may not provide satisfactory results compared to the traditional SW-based approach.
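    The bitrate/size trade-off above is easy to put numbers on. As a rough sketch (the function name, the 2% container-overhead figure, and the example numbers are illustrative assumptions, not from any encoder's documentation): given a target file size and a duration, the average video bitrate you can afford is

```python
def target_bitrate_kbps(size_mb, duration_s, audio_kbps=128, overhead=0.02):
    """Average video bitrate (kbit/s) that fits into size_mb megabytes,
    after reserving the audio bitrate and a container-overhead fraction."""
    total_kbits = size_mb * 8 * 1000 * (1 - overhead)  # MB -> kbit, minus overhead
    video_kbits = total_kbits - audio_kbps * duration_s
    return video_kbits / duration_s

# e.g. squeezing a 90-minute film into 2 GB:
print(round(target_bitrate_kbps(2000, 90 * 60)))  # -> 2776
```

    At that kind of average bitrate a SW encoder still has headroom to look good; the point in the post above is that a HW encoder typically needs a higher number than this to reach the same quality.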
    Thanks for your response. My goal is to output the highest quality possible for a given bitrate, as I'm looking to do this for archiving purposes. I do understand that GPUs prioritize speed over quality. However, CPU encoding (at least for HEVC) can use hardware acceleration without lowering quality compared to a non-accelerated CPU (I tested this last night with a 3770K vs. an 8750H and got the same file size/quality).

    The thing here is that, as of the latest generation, Intel supports AV1 encoding (currently) only in its GPUs, whereas historically all of its hardware acceleration has been CPU-bound, yet it's still listed under the same technology. So I'm wondering if Intel ARC GPUs will actually have a quality focus, unlike the (non-AV1) encoders built into NVIDIA/AMD GPUs.

    You may be right though, I might not be able to get this answer from a forum. Unless someone has an ARC GPU and can test this.
  4. Originally Posted by qb_master View Post
    Thanks for your response. My goal is to output the highest quality possible for a given bitrate, as I'm looking to do this for archiving purposes. I do understand that GPUs prioritize speed over quality. However, CPU encoding (at least for HEVC) can use hardware acceleration without lowering quality compared to a non-accelerated CPU (I tested this last night with a 3770K vs. an 8750H and got the same file size/quality).

    The thing here is that, as of the latest generation, Intel supports AV1 encoding (currently) only in its GPUs, whereas historically all of its hardware acceleration has been CPU-bound, yet it's still listed under the same technology. So I'm wondering if Intel ARC GPUs will actually have a quality focus, unlike the (non-AV1) encoders built into NVIDIA/AMD GPUs.

    You may be right though, I might not be able to get this answer from a forum. Unless someone has an ARC GPU and can test this.
    The issue is that the GPU (usually not the GPU proper but a dedicated video engine independent of it) has limited resources, whereas the CPU has effectively unlimited resources - the CPU can perform exhaustive motion search where the GPU cannot (due to HW limitations), the CPU can use all coding techniques where the GPU can use only some, and there are many similar differences.
    Of course, every iteration of a video encoder is better than the previous one (perhaps except AMD's), but still, at some point, if your goal is low bitrate and high quality, then the CPU may be the only possible choice. If you can add some bitrate, though, HW encoders may deliver high quality and high speed.

    To judge Intel's new video encoder we probably need to wait not only for the first tests but also for some degree of maturity.
    Perhaps Intel will be able to combine the GPU (e.g. via OpenCL) with the video encoder, and perhaps the CPU, to form an advanced hybrid encoder capable of delivering high quality, low bitrate, and high encoding speed.
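    The motion-search point above can be made concrete with a toy example. This is a deliberately simplified sketch (1-D "frames", sum-of-absolute-differences as the cost, names invented here) of why a restricted search window, like a HW encoder's, can miss matches that an exhaustive CPU-style search finds:

```python
def sad(a, b):
    """Sum of absolute differences between two equal-length blocks."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_match(block, ref, center, search_range):
    """Best offset within +/-search_range of center, by SAD cost."""
    best_off, best_cost = None, float("inf")
    for off in range(center - search_range, center + search_range + 1):
        if 0 <= off <= len(ref) - len(block):
            cost = sad(block, ref[off:off + len(block)])
            if cost < best_cost:
                best_off, best_cost = off, cost
    return best_off, best_cost

ref = [0, 0, 9, 9, 9, 0, 0, 0, 0, 0, 0, 0, 5, 9, 9]  # reference frame
block = [9, 9, 9]                        # block from the current frame
small = best_match(block, ref, 8, 2)     # HW-style: tiny search window
full = best_match(block, ref, 8, 8)      # CPU-style: exhaustive search
print(small, full)  # full search finds the exact match at cost 0
```

    A worse match means a bigger residual to encode, which is exactly why, at the same bitrate, the restricted search costs quality. Real encoders use 2-D blocks and much smarter heuristics than this, but the asymmetry is the same.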
  5. There aren't a lot of benchmarks around, and this one is geared toward 1080p gaming (realtime streaming), but software AV1 (non-realtime) encoding is still much better than Intel's ARC AV1 encoding. At least going by VMAF results:

    https://www.tomshardware.com/news/intel-arc-av1-encoder-dominates-nvenc
    I was actually going to post about this; the only article I have seen is the one jagabo linked to, and I was far from impressed.

    It seems that in this test it is far behind SVT-AV1; I was hoping it would be much closer.
  7. That's good to know. I'll steer clear for my purposes then unless I hear otherwise. Thanks so much y'all!!

    For now I'll consider either waiting for CPU-accelerated (HQ) AV1 encoding to become a thing, or going for as many cores as I can fit in my budget.
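    On the "as many cores as I can fit" plan, one thing worth keeping in mind: software encoders speed up with core count only as far as Amdahl's law allows, so the serial fraction of the encoder caps the gain. The parallel fraction used below is an illustrative assumption, not a measured figure for any encoder:

```python
def amdahl_speedup(cores, parallel_fraction):
    """Upper bound on speed-up from `cores` workers when only
    `parallel_fraction` of the work can run in parallel."""
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Assuming 95% of the encode parallelizes:
for cores in (4, 8, 16, 32):
    print(cores, round(amdahl_speedup(cores, 0.95), 2))
```

    The curve flattens quickly, so past a certain point, money is often better spent on per-core speed than on core count. SVT-AV1 in particular is built for heavy threading, so in practice its effective parallel fraction is high, but the returns still diminish.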
  8. Out of curiosity, I looked at an Intel ARC GPU discrete graphics card today with the idea of upgrading one of my systems. From what I read about them, Intel ARC GPU discrete graphics cards don't work well with every CPU and motherboard, even relatively recent ones. Intel recommends using Intel® Driver and Support Assistant (Intel DSA) to check if your system is ready for Intel® Arc™ discrete graphics. See https://www.intel.com/content/www/us/en/support/articles/000091128/graphics.html
  9. Originally Posted by usually_quiet View Post
    Out of curiosity, I looked at an Intel ARC GPU discrete graphics card today with the idea of upgrading one of my systems. From what I read about them, Intel ARC GPU discrete graphics cards don't work well with every CPU and motherboard
    From what I've read they don't work well period. Intel has admitted that the cause of the recent delays is driver problems. I don't know if that extends to the video encoders or if it's just for gaming.

    https://www.tomshardware.com/news/intel-may-delay-arc-desktop-gpus-until-the-end-of-august
    https://www.tomshardware.com/news/intel-blames-poor-software-for-arc-delays-shipments-miss
    Last edited by jagabo; 2nd Sep 2022 at 08:34.


