greetings,
i've recently been comparing the dozens of nvidia 2080 graphics cards currently on the market with an eye toward making an informed choice for a new editing workstation i'll be getting in the next month or two. over the past few weeks, i've learned about DLSS, ROP, TMU, CUDA, bandwidth, etc. now i know what all of these things mean, but i still don't really understand how they might be important to me. i.e., should i prioritise cards with more ROP's, more CUDA's, higher bandwidth, or ... ?
the system will be windows-based, and will primarily run adobe premiere and davinci resolve. in addition, i'll be using some AI-based image processors that depend on torch and tensorflow. i'm told having lots of CUDA's will be good for that -- but what about the rest ? do i care about texture-mapping units, raster operations, and such ?
note- i'm not really looking for advice like 'get card x' ; i'd really like to understand what importance these characteristics might have for my editing, if any [esp. rendering and filtering times].
thanks much !
There should not be any difference beyond the amount of onboard ram, ram speed and gpu clock speed between the 2080's from various vendors; at most some may sell slightly overclocked versions, or perhaps one with some extra ram, but that's it.
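To put the "ram speed" point in numbers: theoretical memory bandwidth is just bus width times the effective data rate. A quick back-of-envelope sketch in Python, using the 2080's reference figures (256-bit bus, 14 Gbps GDDR6) as the example inputs:

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: bits moved per second / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 2080 reference design: 256-bit bus, 14 Gbps effective GDDR6 data rate
print(mem_bandwidth_gb_s(256, 14.0))  # 448.0 GB/s, matching NVIDIA's published spec
```

So a vendor card with slightly faster memory moves the needle a little, but no 2080 variant will differ dramatically from that 448 GB/s baseline.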
You should also look at what the software vendors recommend:
https://helpx.adobe.com/premiere-pro/system-requirements.html
I can't seem to find the recommendations for Resolve.
Having said this, I personally would not build or buy an x86 Windows-based system for video editing.
I posted about this in another thread; the new Apple M1 Macs are beasts for video editing:
https://www.youtube.com/watch?v=GMzT3bajoPs
Warning: this video contains way more adverts than I usually like seeing in a Youtube video, or anywhere to be completely honest, but in this case it is worth it. What Apple has done with these ARM-based M1s is very impressive, all the more so considering they have just 8GB of shared unified RAM. I would probably wait for a 16GB variant, but even with 8GB, at one point he compares that laptop to a 15 thousand dollar video editing Mac he bought, and it's shocking to see the new Mac take the older, far more expensive desktop to school.
Something to consider.
Edit, some more:
https://www.youtube.com/watch?v=I_UxPmm2Tpo
https://www.youtube.com/watch?v=mG5QT3R-90w
thanks much for the info, @sophisticles. i have to admit, that's pretty mind-blowing...
at first i was wondering how much of the performance might've been due to final cut, but then i watched the video w/davinci, and man...
unfortunately, i don't have much of a choice when it comes to the platform; i'm taking advantage of a special offer, and it's windoze-only. :/
but seeing those videos really makes me wonder whether i should forgo that offer and get a mac !
so the number of shading units, ROP's and such aren't really a factor, then ?
thanks again for your post -- and for giving me something to chew on !
Sorry for butting in, but isn't the just-released 3070 the better choice overall? The benchmarks are quite a bit higher than a 2080's, and the price, compared to a second-hand 2080, should be pretty even. Unless there is an alternative use i'm not aware of -- in which case, just forget about my comment.
-
Cuda cores and what are sometimes called shading units are the same thing; the number of processing units, whether specialized or general, should not differ from the reference design in any meaningful way. If you are going to get a 2080, get one that adheres to NVIDIA's spec.
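For a rough sense of how those processing units translate into throughput: peak FP32 compute is cores × 2 ops per clock (a fused multiply-add) × clock speed. A hedged sketch in Python, using the 2080's reference numbers (2944 CUDA cores, 1710 MHz boost) as assumed inputs:

```python
def fp32_tflops(cuda_cores: int, boost_clock_mhz: float) -> float:
    """Peak FP32 throughput in TFLOPS: each core does 2 ops (one FMA) per clock."""
    return cuda_cores * 2 * boost_clock_mhz * 1e6 / 1e12

# RTX 2080 reference design: 2944 CUDA cores at 1710 MHz boost clock
print(round(fp32_tflops(2944, 1710), 2))  # ~10.07 TFLOPS
```

Since core count is fixed by the reference design, the only knob vendors turn is the boost clock, and a factory overclock of a few percent changes this figure by the same few percent.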
-
no apologies necessary, you make a valid point. i did some checking after your post and, yeah, the prices of the 3070s are pretty good too. i'm currently looking at a brand-new 2080-ti on 50%-off clearance for 550 euros. it's true that's only a couple hundred less than a 3070 but, according to what i've read, the 2080-ti may have more 'guts' for what i'm doing than the 3070, and a couple hundred is, well, not insignificant... of course, that's if i decide to take the subsidised offer i got rather than shell out everything myself to go for a mac !