VideoHelp Forum
  1. Time for a new computer, and I'm hoping for some brief responses. I will be doing some high-def video editing (up to 1920x1024), but mostly it will just be for general purpose. Very simple video editing--cutting, pasting, joining, maybe some fades. No gaming.

    This would be a modest Windows machine, with an Intel i5 or maybe i3 processor. It would also dual boot Linux. The video choices are (1) built-in Intel video and (2) a modest video card, probably Nvidia. I'm wondering whether a card is really necessary. I have an i3 on a tablet that has absolutely no problem viewing video without a card. I suppose rendering might be a different matter. I don't need ultimate speed, as long as it's reasonable.

    The reason I hesitate to get the card is that I will be dual booting Linux, and support for cards may not be the greatest. I don't demand ultimate speed, but I don't like problems, and so far I've found Intel video to be flawless. On the other hand, the Linux video editing programs I have used are unusable because of severe performance problems--usually, the interface is not even responsive. I don't know if a card would help or not. Haven't tried Windows programs recently, and haven't tried them with high-def.

    What are your experiences? Does a dedicated card make a hugely important difference? Do inexpensive Windows editing programs run well without a card? Does a card fix the Linux programs? Please confine your comments to personal experience or solid information you consider reliable, and be brief. Thanks.
  2. Member
    Join Date: Aug 2006
    Location: United States
    I have a Windows system with a Haswell i5, and its integrated GPU is the only video adapter. It works fine using VideoReDo TVSuite V5 to do simple edits on video up to 1080i. (1080p is a supported resolution, but I have not edited any 1080p video so far.) I have not tried using Linux with this system.
    Ignore list: hello_hello, tried, TechLord, Snoopy329
  3. Member
    Join Date: May 2014
    Location: Memphis TN, US
    1920x1024? What is that? It's not 1.77778:1 (16:9), it's 1.875:1. Maybe you mean 1920x1080?

    What kind of card are you talking about? Hardware cards aren't used for editing. Use an editor, which is software, not hardware. That could be anything from Corel's budget VideoStudio to something like Adobe Premiere Pro. What you could use for Linux I don't know, but the vast tonnage of free and paid video software and utilities available for Windows makes Linux a non-competitor.
    You won't get anywhere in HD work with an i3 CPU. You need at least an i5; an i7 would be better.
    - My sister Ann's brother
  4. Member
    Join Date: Aug 2006
    Location: United States
    The GPU can be used by editing software. Some (advanced) NLEs are able to use the GPU for image processing (DaVinci Resolve is one) and to encode when exporting video. I don't know whether most editors use GPU-based hardware acceleration for decoding, but I know that my simple editor, VideoReDo TVSuite V5, can.

    Editors with Linux versions listed in VideoHelp's software section: Kdenlive, Lightworks, OpenShot, and Shotcut
    Ignore list: hello_hello, tried, TechLord, Snoopy329
  5. Member
    Join Date: May 2014
    Location: Memphis TN, US
    I've never seen anyone praise GPU video processing, and most of what I've seen and read about it isn't very favorable. Maybe GPU processing is where all the godawful user videos I see in forums are coming from?
    - My sister Ann's brother
  6. The answer to this question isn't simple. It depends on the software you're using, the files you're editing, the filters you're using, what codecs you use for output, and how much compression you need. I'm going to use the term "GPU" for both add-in graphics cards (AMD/Nvidia) and the processor's built in graphics device and hardware video decoder/encoder (Intel's Quick Sync).

    In theory, the GPU can be used for decompressing the source video, various filtering operations (noise reduction, sharpening, etc.), and encoding the final output. But all of this depends on the software you are using. Some programs may only support AMD/ATI, some only Nvidia's older drivers, some only Nvidia's newer drivers, some only Intel's Quick Sync, or some combination of them. And the software may only support decoding, or filtering, or compression, or some combination of those. So you'll have to check the software you are using or will be using.
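
    If the editor is built on ffmpeg (many of the free ones are), a quick way to see which acceleration methods your build and hardware expose is to ask ffmpeg directly. A minimal sketch, assuming ffmpeg is installed and on the PATH -- editors with their own decoders (VideoReDo, Premiere, Resolve, etc.) may support a different set entirely:

    Code:
    # List the hardware acceleration methods this ffmpeg build was compiled with.
    # Assumes ffmpeg is installed and on the PATH; the set your editor actually
    # uses may differ, so treat this only as a starting point.
    import subprocess

    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-hwaccels"],
        capture_output=True, text=True, check=True
    )
    print(result.stdout)  # typically entries like cuda, qsv, dxva2, d3d11va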

    Hardware decoding of the source is important with long-GOP codecs like h.264, especially with high definition sources. If you're sitting at the end of a 250-frame GOP and want to seek backward by one frame, the decoder may have to go back 250 frames to the first frame of the GOP and decompress every frame up to the desired frame. With software decoding and a slow CPU that may take several seconds. With GPU decoding it's much faster. Unfortunately, GPU decoding is sometimes problematic, delivering a corrupt picture, so you may have to disable it at times.
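
    To put a rough number on that seek cost, here is a toy calculation -- not any real decoder, just the arithmetic, assuming a fixed, closed 250-frame GOP:

    Code:
    # Toy illustration of backward-seek cost in a long-GOP stream (e.g. h.264).
    # Assumes a fixed, closed 250-frame GOP; real streams vary.
    GOP_SIZE = 250

    def frames_to_decode(target_frame):
        """Frames that must be decoded to show target_frame, counting from
        the previous keyframe (I-frame)."""
        keyframe = (target_frame // GOP_SIZE) * GOP_SIZE
        return target_frame - keyframe + 1

    print(frames_to_decode(249))  # 250 -> stepping to the last frame of a GOP
                                  #        means decoding the whole GOP
    print(frames_to_decode(250))  # 1   -> landing on a keyframe is cheap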

    As was pointed out, GPU encoding isn't of the highest quality. But it can be faster than CPU encoding. Some people don't mind giving up a little quality, or using higher bitrates, in exchange for faster encoding.
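
    For what it's worth, this is the kind of comparison I mean, sketched with ffmpeg driven from Python. The encoders (libx264 for the CPU, h264_nvenc for an Nvidia GPU), the input name, and the quality settings are assumptions -- adjust for whatever your software and hardware actually support:

    Code:
    # Sketch: encode the same clip once on the CPU (libx264) and once on the GPU
    # (NVENC) and compare wall-clock time. Assumes an ffmpeg build with both
    # encoders and a placeholder "input.mp4"; options vary by ffmpeg version.
    import subprocess, time

    def encode(video_args, outfile):
        start = time.time()
        subprocess.run(["ffmpeg", "-y", "-i", "input.mp4", *video_args, outfile],
                       check=True)
        return time.time() - start

    cpu = encode(["-c:v", "libx264", "-preset", "medium", "-crf", "20"], "out_cpu.mp4")
    gpu = encode(["-c:v", "h264_nvenc", "-cq", "20"], "out_gpu.mp4")
    print(f"libx264: {cpu:.1f}s   h264_nvenc: {gpu:.1f}s")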

    GPU filtering is generally limited to certain filters. If you use those a lot, it makes sense to have the GPU hardware do the work -- it may be several times faster.

    There is another issue right now. Because of the high demand for GPUs for cryptocurrency mining there is a shortage of graphics cards. That has caused prices of the affordable graphics cards to spike by 2x or more. This is not a good time to be buying a graphics card.

    Unless you know you need an Nvidia or AMD add-in card right now, I would buy a system without one. Just make sure it has room and an open slot in case you decide you need one in the future. If you can live without an add-in graphics card, you've saved yourself $300. Adding a graphics card later is easy.
  7. Member DB83
    Join Date: Jul 2007
    Location: United Kingdom
    One thing has not been mentioned, and I do wonder if it is relevant.

    Any HD encoding will consume memory. Surely built-in graphics will use system memory even for display purposes, whereas a bespoke graphics card has its own RAM, thus freeing general RAM for the important stuff.

    Like I said, not sure if it is relevant, but thought I would bring it up all the same.
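
    For a rough sense of the numbers (back-of-the-envelope only, assuming uncompressed 8-bit 4:2:0 frames, which is roughly what an editor keeps in its preview/cache buffers):

    Code:
    # Back-of-the-envelope: RAM taken by uncompressed 1080p frames.
    # Assumes 8-bit 4:2:0, i.e. 1.5 bytes per pixel; editors often buffer
    # dozens of frames, and an iGPU carves its frame buffer out of system RAM.
    WIDTH, HEIGHT = 1920, 1080
    BYTES_PER_PIXEL = 1.5
    frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20
    print(f"one frame : {frame_mb:.1f} MB")        # ~3.0 MB
    print(f"100 frames: {frame_mb * 100:.0f} MB")  # ~297 MB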
  8. Thanks for the very thoughtful replies. Excellent information. Yes, I did mean 1920x1080, H264. I'm inclined not to get a card.


