VideoHelp Forum
  1. greetings,

    i've recently been comparing the dozens of nvidia 2080 graphics cards currently on the market with an eye toward making an informed choice for a new editing workstation i'll be getting in the next month or two. over the past few weeks, i've learned about DLSS, ROP, TMU, CUDA, bandwidth, etc. now i know what all of these things mean, but i still don't really understand how they might be important to me. i.e., should i prioritise cards with more ROP's, more CUDA's, higher bandwidth, or ... ?

    the system will be windows-based, and will primarily run adobe premiere and davinci resolve. in addition, i'll be using some AI-based image processors that depend on torch and tensorflow. i'm told having lots of CUDA's will be good for that -- but what about the rest ? do i care about texture-mapping units, raster operations, and such ?
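
    for reference, here's roughly how i'd plan to sanity-check that both frameworks actually see whatever card ends up in the box -- just a sketch, assuming CUDA-enabled builds of torch and tensorflow are installed, and device index 0 is only an example:

    Code:
    # sketch: confirm torch and tensorflow can see the card, and how much VRAM torch reports
    import torch
    import tensorflow as tf

    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)   # device index 0 is just an example
        print("torch sees      :", props.name)
        print("VRAM (GB)       :", round(props.total_memory / 1024**3, 1))
    else:
        print("torch: no CUDA device visible")

    # same idea on the tensorflow side
    print("tensorflow sees :", tf.config.list_physical_devices("GPU"))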

    note- i'm not really looking for advice like 'get card x' ; i'd really like to understand what importance these characteristics might have in my editing, if any [esp. rendering and filter-application times].

    thanks much !
  2. There shouldn't be any difference between the 2080's from various vendors beyond the amount of onboard RAM, RAM speed and GPU clock speed; at most some may sell slightly overclocked versions, or perhaps one with some extra RAM, but that's it.
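
    If you want to compare what two specific cards actually report, a quick sketch like this should print the memory and maximum clocks; it assumes the NVIDIA driver's nvidia-smi tool is installed and on the PATH:

    Code:
    # sketch: ask the driver what a card reports for memory size and maximum clocks
    import subprocess

    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,memory.total,clocks.max.graphics,clocks.max.memory",
         "--format=csv"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)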

    You should also look at what the software vendors recommend:

    https://helpx.adobe.com/premiere-pro/user-guide.html/premiere-pro/system-requirements.ug.html

    I can't seem to find the recommendations for Resolve.

    Having said this, I personally would not build or buy an x86 Windows-based system for video editing.

    I posted about this in another thread; the new Apple M1 Macs are beasts for video editing:

    https://www.youtube.com/watch?v=GMzT3bajoPs

    Warning: this video contains way more adverts than I usually like seeing in a YouTube video, or anywhere to be completely honest, but in this case it is worth it. What Apple has done with these ARM-based M1's is very impressive, all the more so considering they have just 8GB of shared unified RAM. I would probably wait for a 16GB variant, but even with 8GB, at one point he compares that laptop to a 15-thousand-dollar video editing Mac he bought, and it's shocking to see the new laptop take the older, expensive desktop to school.

    Something to consider.

    Edit, some more:

    https://www.youtube.com/watch?v=I_UxPmm2Tpo

    https://www.youtube.com/watch?v=mG5QT3R-90w
  3. thanks much for the info, @sophisticles. i have to admit, that's pretty mind-blowing...
    at first i was wondering how much of the performance might've been due to final cut, but then i watched the video w/davinci, and man...

    unfortunately, i don't have much of a choice when it comes to the platform; i'm taking advantage of a special offer, and it's windoze-only. :/
    but seeing those videos is really making me wonder whether i should forgo that offer and get a mac !

    Originally Posted by sophisticles
    There shouldn't be any difference between the 2080's from various vendors beyond the amount of onboard RAM, RAM speed and GPU clock speed; at most some may sell slightly overclocked versions, or perhaps one with some extra RAM, but that's it.
    so the number of shading units, ROP's and such aren't really a factor, then ?


    thanks again for your post -- and for giving me something to chew on !
  4. Sorry for butting in, but isn't the recently released 3070 the better choice overall? The benchmarks are quite a bit higher than a 2080, and the price, compared to a second-hand 2080, should be pretty even. Unless there is an alternative use I'm not aware of, then just forget about my comment.
  5. Originally Posted by davidfurst
    thanks much for the info, @sophisticles. i have to admit, that's pretty mind-blowing...
    at first i was wondering how much of the performance might've been due to final cut, but then i watched the video w/davinci, and man...

    unfortunately, i don't have much of a choice when it comes to the platform; i'm taking advantage of a special offer, and it's windoze-only. :/
    but seeing those videos is really making me wonder whether i should forgo that offer and get a mac !

    Originally Posted by sophisticles
    There shouldn't be any difference between the 2080's from various vendors beyond the amount of onboard RAM, RAM speed and GPU clock speed; at most some may sell slightly overclocked versions, or perhaps one with some extra RAM, but that's it.
    so the number of shading units, ROP's and such aren't really a factor, then ?

    thanks again for your post -- and for giving me something to chew on !
    CUDA cores and what are sometimes called shading units are the same thing; the number of processing units, whether specialized or general, should not differ from the reference design in any meaningful way. If you are going to get a 2080, get one that adheres to NVIDIA's spec.
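
    If you want to check that a particular card matches the reference design, something like the sketch below will print the SM count torch reports and a derived core count. Note that the 64-cores-per-SM figure is my assumption for Turing parts like the 2080; torch does not report it directly.

    Code:
    # sketch: derive an approximate CUDA core count from the SM count torch reports
    # (assumes 64 FP32 cores per SM, which holds for Turing cards such as the 2080)
    import torch

    CORES_PER_SM_TURING = 64

    props = torch.cuda.get_device_properties(0)       # device index 0 as an example
    print("device     :", props.name)
    print("SM count   :", props.multi_processor_count)
    print("CUDA cores :", props.multi_processor_count * CORES_PER_SM_TURING)
    print("VRAM (GB)  :", round(props.total_memory / 1024**3, 1))

    A reference 2080 should show 46 SMs, i.e. 2944 CUDA cores; a vendor card reporting the same numbers is the same silicon, just with a different cooler or factory clock.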
  6. Originally Posted by DVDude34
    Sorry for butting in, but isn't the recently released 3070 the better choice overall? The benchmarks are quite a bit higher than a 2080, and the price, compared to a second-hand 2080, should be pretty even. Unless there is an alternative use I'm not aware of, then just forget about my comment.
    For gaming, sure.

    But the OP is doing video editing and image processing; I haven't looked at the benchmarks for each card in that regard. I would still go with the M1 Mac though.
  7. Originally Posted by DVDude34
    Sorry for butting in, but isn't the recently released 3070 the better choice overall? The benchmarks are quite a bit higher than a 2080, and the price, compared to a second-hand 2080, should be pretty even. Unless there is an alternative use I'm not aware of, then just forget about my comment.
    no apologies necessary, you make a valid point. i did some checking after your post and, yeah, the prices of the 3070's are pretty good too. i'm currently looking at a brand-new 2080-ti at 50% off on clearance for 550 euros. it's true, that's only a couple hundred less than the 3070 but, according to what i've read, the 2080-ti may have more 'guts' to it for what i'm doing than the 3070, and a couple hundred is, well, not insignificant... of course, that's if i decide to take the subsidised offer i got rather than shell out everything myself to go for a mac !
  8. Originally Posted by sophisticles
    CUDA cores and what are sometimes called shading units are the same thing; the number of processing units, whether specialized or general, should not differ from the reference design in any meaningful way. If you are going to get a 2080, get one that adheres to NVIDIA's spec.
    cheers, that makes things clearer.


