Read this article at https://www.pugetsystems.com/guides/1950

DaVinci Resolve Studio - NVIDIA GeForce RTX 3070, 3080 & 3090 Performance

Written on October 29, 2020 by Matt Bach

TL;DR: NVIDIA GeForce RTX 3070, 3080 & 3090 performance in DaVinci Resolve Studio

Overall, the new RTX 3000-series cards do extremely well in DaVinci Resolve Studio. For the tasks that rely heavily on the GPU (primarily noise reduction and OpenFX), the RTX 3080 is more than 60% faster than an RTX 2080 SUPER, while the RTX 3090 is 44% faster than a Titan RTX and within a few percent of two RTX 2080 Ti GPUs. The RTX 3070 is a bit less impressive, but still beats the RTX 2070 SUPER by a solid 22%. No matter how you slice it, a 20-60% performance gain in one generation is very impressive, and likely to result in a significant number of Resolve users upgrading their GPUs as soon as they can.

Excerpt from "Labs Open Office Hour: Labs Lads Discuss the New RTX 3070" live stream on 10/30/2020


DaVinci Resolve is known in the industry for its excellent GPU-acceleration support - greatly benefiting from a powerful video card (or several). This makes it very interesting with regard to NVIDIA's recently announced GeForce RTX 30 Series GPUs, since NVIDIA has been pushing its GeForce line of cards more and more into the professional content creation space with its "Studio" program. Gaming may still be front and center during these launches, but we have high expectations for what these new GPUs will be able to achieve in a professional application like DaVinci Resolve.

DaVinci Resolve Studio GPU Performance Benchmark - NVIDIA GeForce RTX 3070 8GB, RTX 3080 10GB & RTX 3090 24GB

If you want to see the full specs for the new GeForce RTX 3070, 3080, and 3090 cards, we recommend checking out NVIDIA's page for the new 30 series cards. But at a glance, here are what we consider to be the most important specs:

Card            VRAM   CUDA Cores   Boost Clock   Power   MSRP
RTX 2070 SUPER  8GB    2,560        1.77 GHz      215W    $499
RTX 3070        8GB    5,888        1.70 GHz      220W    $499
RTX 2080 SUPER  8GB    3,072        1.65 GHz      250W    $699
RTX 3080        10GB   8,704        1.71 GHz      320W    $699
RTX 2080 Ti     11GB   4,352        1.55 GHz      250W    $1,199
RTX 3090        24GB   10,496       1.73 GHz      350W    $1,499
Titan RTX       24GB   4,608        1.77 GHz      280W    $2,499

While specs rarely line up with real-world performance, it is a great sign that NVIDIA has doubled the number of CUDA cores compared to the comparable RTX 20 series cards with only a small drop in the boost clock. At the same time, the RTX 3080 and 3090 are also $500-1000 less expensive than the previous generation depending on which models you are comparing them to.

While it is a bit odd that the RTX 3080 has less VRAM than the 2080 Ti, all three of these new cards should be capable of working with 4K timelines in DaVinci Resolve. If you want to work with 8K and above media, however, only the 3090 (with its 24GB of VRAM) meets our current recommendation. Using a GPU with less than 20GB of VRAM with 8K and larger media is likely to result in constant "out of GPU memory" errors that are not conducive to a smooth workflow.
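To see why 8K pushes past smaller cards so quickly, a rough back-of-the-envelope calculation helps. The sketch below is ours, not Blackmagic's: the 32-bit float assumption and the frames-in-flight count are illustrative guesses, and Resolve's actual GPU memory manager is far more involved.

```python
# Back-of-the-envelope VRAM estimate for uncompressed frames. This is a
# hypothetical illustration only: Resolve's real GPU memory use depends on
# the node graph, caching, codec, and internal buffers, none of which are
# modeled here.

def frame_vram_mb(width, height, bytes_per_channel=4, channels=4):
    """Memory for one uncompressed RGBA frame (32-bit float per channel by default)."""
    return width * height * channels * bytes_per_channel / (1024 ** 2)

for name, (w, h) in {"4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}.items():
    per_frame = frame_vram_mb(w, h)
    # 24 frames in flight is an assumed, illustrative number (playback
    # buffers plus node intermediates), not a measured one.
    print(f"{name}: {per_frame:.0f} MB per frame, ~{per_frame * 24 / 1024:.1f} GB for 24 frames")
```

Even under these simplified assumptions, a single uncompressed 8K float frame is roughly half a gigabyte, so a modest number of buffered frames and node intermediates can quickly exhaust an 8-11GB card.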

With the launch of the RTX 3070, we can update our previous DaVinci Resolve Studio - NVIDIA GeForce RTX 3080 & 3090 Performance article with results for the 3070. We also have a separate article focusing on multi-GPU performance in Resolve with these new cards. It is important to note that most of the currently available GPU models are not a good choice for multi-GPU configurations, so while the performance in that article should be accurate, we would highly recommend waiting for blower-style cards to be released before getting a system with multiple RTX 3000-series GPUs.

DaVinci Resolve Workstations

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!

Test Setup

Listed below are the specifications of the system we will be using for our testing:

To test each GPU, we will be using the fastest platform currently available for DaVinci Resolve - most notably the AMD Threadripper 3970X. Since Resolve utilizes the CPU quite heavily, this should minimize the chance of a processor bottleneck and allow each GPU to perform at its fullest potential.

For the testing itself, we will be using an upcoming version of our PugetBench for DaVinci Resolve benchmark that is not yet available to the public. This new version is very close to being available for download, but since the tests are much better than the version that you can currently download, we opted to go ahead and use it for this comparison.

We will be using the "Extended" preset, which includes both 4K and 8K media as well as dedicated GPU Effects and Fusion tests. Using 8K media with most of the cards we are testing is actually not a good idea due to the "out of GPU memory" errors you would likely encounter, but our benchmark does not load the Resolve UI, which means that the VRAM load is much lower - allowing GPUs with just 8GB of VRAM to successfully complete the 8K tests.

Raw Benchmark Results

While we are going to go through our analysis of the testing in the next section, we always like to provide the raw results for those that want to dig into the details. If there is a specific codec or export setting you tend to use in your workflow, examining the raw results for that task is going to be much more applicable than our more general analysis.

NVIDIA GeForce RTX 3070, 3080 & 3090 performance in DaVinci Resolve Studio

Overall DaVinci Resolve Studio Performance Analysis

While many reviewers like to solely look at things like temporal noise reduction (often to an unrealistic degree) or OpenFX that heavily utilize the GPU, we first want to start off by looking at the overall performance we saw from our DaVinci Resolve benchmark with each GPU in order to show what most users would likely experience in their day-to-day work.

Looking at the Overall Extended Score, the new RTX 3000-series cards do very well, with even the RTX 3070 beating all the single GPU configurations from the previous generation. The RTX 3080 and 3090 even managed to sneak within a few percent of a dual RTX 2080 Ti setup!

Compared to the previous generation cards, the RTX 3070 is about 11% faster than the RTX 2070 SUPER, while the RTX 3080 is almost 20% faster than the RTX 2080 SUPER. The RTX 3090 isn't all that much faster than the RTX 3080 (although the extra VRAM is critical for 8K workflows), but compared to even the Titan RTX, it is a solid 13% faster.

If you are currently using a lower-end RTX card, an AMD Radeon GPU, or an older GTX 1080 Ti, the performance gains are even more significant. Depending on the exact card, you are looking at anywhere from a 20 to 50% increase in performance with the new RTX 3000-series cards.
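The "X% faster" figures above (and throughout this article) are simple ratios of benchmark scores. As a quick sanity check with made-up numbers - these are hypothetical scores for illustration, not our measured results - the math works like this:

```python
def percent_faster(new_score, old_score):
    """How much faster the new card is, expressed as a percentage of the old score."""
    return (new_score / old_score - 1) * 100

# Hypothetical benchmark scores, purely for illustration (not measured results):
print(round(percent_faster(1200, 1000), 1))  # a score of 1200 vs 1000 -> 20.0% faster
```

Note that this is relative to the older card's score, which is why "2x the performance" corresponds to "100% faster", not "200% faster".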

Bear in mind that this is looking at the Overall Extended Score, which measures performance across all of our tests - including the Fusion portion, which is almost entirely CPU limited. To get a better idea of the maximum performance difference between these cards, we should hone in on the "GPU Effects" portion of our benchmark, which looks at tasks like temporal noise reduction (TNR) and various GPU-accelerated OpenFX.

GPU Score Analysis

NVIDIA GeForce RTX 3070 8GB, RTX 3080 10GB & RTX 3090 24GB DaVinci Resolve Studio GPU Effects benchmark performance

The GPU effects portion of our benchmarks looks at the performance of individual GPU-accelerated effects such as temporal noise reduction, film grain, lens blur, optical flow, face refinement, and more. In our testing, these effects easily show the largest benefit from having a powerful GPU, which means that they should give us the best look at the maximum performance gain you may encounter from each of the GPUs we are testing.

In this test, the new RTX 3000-series cards put up some very impressive numbers. The RTX 3080 is terrific for its cost, beating the similarly priced RTX 2080 SUPER by a whopping 62%. Compared to the more expensive RTX 2080 Ti and Titan RTX, the RTX 3080 also handily beats those cards by around 30%. Dual RTX 2080 Ti is still faster than a single RTX 3080, but even that configuration is only ~15% faster.

The RTX 3090 is even more impressive, beating the RTX 3080 by 13%, the Titan RTX by 44%, and the RTX 2080 Ti by 52%. In fact, it comes within 3% of the dual RTX 2080 Ti setup - and considering that you get 24GB of VRAM compared to the 11GB on each 2080 Ti, and that a single RTX 3090 is significantly cheaper than two 2080 Tis, it is well worth the minor difference in performance.

The odd man out is really the RTX 3070, which is only 22% faster than the previous generation RTX 2070 SUPER. Now, a 22% performance gain is absolutely nothing to scoff at and is impressive in its own right; it just pales a bit in comparison to the 50-60% performance gain we saw with the RTX 3080 and 3090.

Compared to the lower-end RTX 20-series and GTX 1080 Ti cards, the new RTX 3000-series cards are like night and day. With the RTX 3080 and 3090 especially, you are looking at around a doubling of performance once you get down to the RTX 2060 SUPER or GTX 1080 Ti. And if you are considering moving from an AMD Radeon GPU to the RTX 3000-series cards, the performance gain is even more significant - up to 3x faster!

How well do the NVIDIA GeForce RTX 3070, 3080 & 3090 perform in DaVinci Resolve Studio?

Overall, the new RTX 3000-series cards do extremely well in DaVinci Resolve Studio. For the tasks that rely heavily on the GPU (primarily noise reduction and OpenFX), the RTX 3080 is more than 60% faster than an RTX 2080 SUPER, while the RTX 3090 is 44% faster than a Titan RTX and within a few percent of two RTX 2080 Ti GPUs. The RTX 3070 is a bit less impressive, but still beats the RTX 2070 SUPER by a solid 22%. No matter how you slice it, a 20-60% performance gain in one generation is very impressive, and likely to result in a significant number of Resolve users upgrading their GPUs as soon as they can.

No matter what GPU you are currently using, these new cards are likely going to be a significant upgrade. Being able to get this much power from a single card, especially at these price points, is a big deal for DaVinci Resolve Studio users. Considering that a single RTX 3090 24GB is $1000 less than a Titan RTX and has close to the same performance as two RTX 2080 Ti 11GB cards, this also makes editing 8K media more accessible than ever before.

Of course, this article is focused on a single-GPU setup, with the dual RTX 2080 Ti only included to show how close these new cards get to the performance of two video cards from the previous generation. If you are considering a multi-GPU setup, however, we have you covered with our DaVinci Resolve Studio - RTX 3080 & 3090 Multi-GPU Performance Scaling article, which looks at performance with triple RTX 3080 and dual RTX 3090 setups. That kind of configuration is still waiting on the availability of blower-style cards, but the performance gain when using OpenFX or noise reduction is very impressive.

As always, keep in mind that these results are strictly for DaVinci Resolve Studio. If you have performance concerns for other applications in your workflow, we highly recommend checking out our Hardware Articles (you can filter by "Video Card") for the latest information on how a range of applications perform with the new RTX 3070, 3080 and 3090 GPUs, as well as with different CPUs and other hardware.

Tags: Hardware Acceleration, hardware encoding, NVIDIA, NVIDIA vs AMD, AMD, Vega 64, Radeon RX 5700 XT, RTX 2060 SUPER, RTX 2070 SUPER, RTX 2080 SUPER, RTX 2080 Ti, Titan RTX, RTX 3080, DaVinci Resolve, RTX 3090, RTX 3070
Jr Pham

I was thinking about getting the ASUS TUF A15 (8-core Ryzen, GTX 1660 Ti). I want to know how good it would be for long 4K H.264 videos (30min-1hour) with effects on DaVinci Resolve?

Posted on 2020-10-29 16:42:38

You would probably have a pretty rough time. That GPU only has 6GB of VRAM, and you might get a bunch of "out of memory" errors trying to work with 4K. Plus, it isn't a particularly powerful GPU, so it will struggle with OpenFX. The CPU also isn't all that powerful for 4K, so you will probably have to work with lower-res optimized media.

It can probably work, but you likely will have to spend extra time generating proxies (optimized media) and export times are going to be fairly lengthy.

Posted on 2020-10-29 16:59:25
Jr Pham

On my desktop I use a 12-core Ryzen with an RTX 2060 SUPER 8GB. If I get a new GPU, I want to see export times cut in half - like 30 min to 15 min - when editing 4K H.264 with effects on DaVinci Resolve. Which GPU would get me this kind of result?

Posted on 2020-10-29 16:46:42

Depends on what is on your clips. If there are not many effects that are going to make the export GPU limited, then look at the 4K media scores: https://www.pugetsystems.co... . There isn't all that much difference between the GPUs when just working with clips without many effects or noise reduction. If you want to cut down on export times, a more powerful CPU is going to be far more beneficial than a GPU upgrade.

If you do have some heavier OpenFX or noise reduction, then look at the GPU effects score: https://www.pugetsystems.co... . In that case, an RTX 3080 would give you that 2x improvement in export times, but again, ONLY on timelines that are using a lot of effects or NR.

Posted on 2020-10-29 17:02:56
Nate Pultorak

Upgrading a system that has a 3900X but an R7 360 GPU. (Upgrades over time! :) ). Looking at GPUs, is the move from the 2060 SUPER to a 3070 groundbreaking enough in Resolve to warrant waiting for it to get back in stock? Maybe it is the fact that the 3090 and 3080 are so far ahead of the competition, but the other GPUs look so close together! Not sure if the wait is worth it for my price point (~$500 tops)

Posted on 2020-10-29 22:08:04

Yea, I would wait if you can. The RTX 2060 SUPER is also in pretty short supply, and many of the ones available have the price driven up. And the 3070 is 30% faster than the 2060 SUPER, which is nothing to scoff at.

Posted on 2020-10-29 22:12:27
Asaf Blasberg

Hi Matt,

What score result do I need to look at that is similar to Premiere's H.264 59.94 FPS MultiCam (full/half) res? Can you tell me the frames per second on playback with MultiCam in DaVinci Resolve using the native H.264 59.94 media (not using optimized media, of course) on your test machine (Threadripper) and RTX 3080? I'm interested in purchasing a system from you, but I really need to know the result of this specific test to see if it makes financial sense for me to spend money on a new Puget machine. Currently I am using the same X299 i7-7920X/GTX 1080 Ti combo and getting a very low reading of 7 FPS (according to Resolve). If I bring in 29.97 FPS footage (H.264) I can play it back in realtime. Thank you so much for your time.

Posted on 2020-10-29 23:08:24

We don't test multicam in Resolve right now. There is no good way to automate testing playback performance in the app itself, so we have to rely on exporting pre-made projects.

For our customers, we can load up a project on a machine to see how it performs as long as you can send us some test clips and project files. H.264 is a very inconsistent codec, so it is very important to test with the media you actually use. Not only do the exact recording settings change things, but different cameras seem to affect performance quite a bit as well. If you aren't already working with one of our consultants, get in touch with them and they can facilitate that testing. https://www.pugetsystems.co...

Posted on 2020-10-29 23:16:50
Asaf Blasberg

Thank you. Which results do I need to look at that tell me export times? It seems export times from an NVIDIA GTX 1080 Ti to RTX 3080 didn't yield a huge difference according to the test results (H.264 to H.264)? Did I read that correctly?

Posted on 2020-10-29 23:26:29
Peetz

Matt, I notice in the Fusion scores that the 2070 Super is top (aside from the Titan), and Fusion is “almost entirely CPU limited”. My videos will be hour-long 4K explainer videos for youtube, filled start to finish with 2D Fusion motion effects and animated diagrams. I won’t use much of the “GPU effects” named above. I aim to get a new 5900X or 5950X - whichever one Puget’s upcoming benchmarks will recommend for Fusion. Given all that -- and assuming GPU price is not an issue for me -- I wonder, is it conceivable, counterintuitively, that I actually might be better off with a 2070 Super rather than a 3080? (In the future, I hope to use Blender to create 3D models to use with Davinci Resolve / Fusion.)

Posted on 2020-10-31 10:03:04

The RTX 3000-series results in Fusion are definitely a bit odd, but Fusion in general is odd for GPU performance. For example, did you know that the more GPUs you have (and Resolve is set to use), the worse the performance gets in the Fusion tab?

At the moment, I would attribute it to early drivers and not worry too much about it. There is pretty much no way a RTX 2070 SUPER will be better than a 3080 once all the launch stuff gets ironed out.

Posted on 2020-11-01 17:58:09
Peetz

Thanks. Do you think it's worth mentioning that in the article itself? As the article stands - if people didn't seek the clarification from you that I did - there might be some who'd purchase a 2070S or 2080S thinking that the graphs indicate those are better for Fusion than the 3000 series, with minimal loss in 4K editing in exchange for a cheaper price.

We appreciate all the benchmark work you do - most of the benchmarks by others focus on games, with so few people offering benchmarks for video editors.

Posted on 2020-11-01 23:31:13
Jesus

Hi, I was wondering if the jump from a 2070 SUPER to a 3070 is worth it? Currently I can work in DaVinci Resolve with a 1080p timeline with smart render in the background. If I jump to a 4K timeline it begins to struggle with transitions and text. Will a 3070 allow for a 4K timeline? What about a 6800 XT AMD card - will that one perform better than a 2070 SUPER in DaVinci Resolve?

Posted on 2020-11-02 05:51:38

Going from a 2070S to a 3070 should help, but only by about 20% at most. If those transitions and text go down to something like 5 FPS on a 2070S, you aren't going to end up with smooth playback, just a bit higher FPS.

As for the 6800XT, we will have to wait until it releases to see.

Posted on 2020-11-02 16:49:12
Isiah Bucao

Thanks for this article. I've been coming back to it these past few days. I'm currently waiting for your benchmarks with the new AMD GPUs.

I just have one question. I've already been considering buying an RTX 3080 since it was announced, just haven't pulled the trigger yet because of the big Navi announcement, and it's a little hard to get a hold of any 30-series from where I am. Do you think I'll have a hard time editing 6K/8K footage with a 3080 even if I have a 3900X, 64GB RAM, and NVMe SSDs to go with it? Unfortunately, I currently can't afford the 3090, so I'll have to settle for the 3080 for now. Thank you.

Posted on 2020-11-03 15:50:07

A 3080 should be plenty for just editing even for 6K/8K, but if you do a lot of noise reduction or OpenFX, it may not be enough for realtime playback depending on your codec and effects. From the testing we have done with customer projects, however, you are more likely to run into a CPU bottleneck before the RTX 3080 is a problem in normal editing situations. But again, for things like NR and such, there is rarely a time where more GPU power wouldn't help.

Posted on 2020-11-03 20:00:11
tazztone

u mean VRAM, not just GPU power right?

Posted on 2020-11-05 09:56:35
Peetz

Matt, now that Davinci Resolve 17 has been announced, I wonder if your benchmark engines have to be re-done before Puget can give us data that reflects version 17? Hope it comes soon before stocks of these GPUs become more available, so we can factor in Puget's benchmarks before buying new gear.

Posted on 2020-11-06 03:46:56

No clue until it actually releases. Usually, there isn't much of a problem with the Resolve testing moving to new versions of Resolve, but there is no way to know for sure until we can test. Beta versions often don't work with the methods we use for testing, but what we use has worked since Resolve 14 (maybe even earlier, but that was when we started), so I'm hopeful it will work OK.

We are currently planning on re-doing GPU testing when AMD launches the Radeon RX 6800(XT) on November 18th (subject to when we can get cards), so hopefully Resolve 17 will be available for download by then. I know the announcement is on Nov 9th, but I'm not sure if that is a paper launch of Resolve 17, or if it will be immediately available.

Posted on 2020-11-06 17:36:33
Ampere


Blackmagic Design Announces DaVinci Resolve 17.

Posted on 2020-11-09 20:08:46

Yep, I was watching the live stream. Didn't see too much in there that should affect performance, but the AI tracking looks super interesting!

Posted on 2020-11-09 20:12:40
Gabriel Passarelli

I can't wait for benchmarks on the AMD RX 6800! Right now, on the verge of 8K and steady adoption of 4K as the new standard, VRAM is the major bottleneck in Resolve for the last GPU generation. Even if the performance is not as impressive as the gaming benchmarks suggest, 16GB of VRAM is nice to have. It may be too little for certain tasks, but it's much better than constant GPU Memory Full errors with 8GB GPUs.

Posted on 2020-11-18 15:18:57

I'm really curious as well, but we don't have any cards :( We have a reviewer who is willing to loan us their cards, but won't be able to until next week at the earliest. So probably at least a couple weeks before we have results.

Posted on 2020-11-18 17:53:17
Peetz

In an ideal world, it'd be nice to include the Apple M1 benchmarks in the mix. Is that wishful thinking, Matt? :)

Posted on 2020-11-19 07:14:42
Thaynara Santana

Is the 3070 paired with a Ryzen 7 3700X a good combination? Will it work fine with a 1080p/4K timeline and some Fusion work and color grading on the free software? Or should I go with the 3800X or another processor? I hear processors are more important with DaVinci, so wondering what would pair with the 3070 card. Thanks in advance

Posted on 2020-11-24 11:40:43

If you are using the free version, then definitely prioritize the CPU since the GPU isn't going to do much. However, definitely think about purchasing Resolve Studio - in general, that $300 investment will give you more performance than spending $300 on hardware upgrades, and it is a permanent investment since it should work with all future versions of Resolve. Better to get a used RTX 2060 and 3700X if that means you can get a Studio license.

Posted on 2020-11-24 18:15:39
Thaynara Santana

Thanks for the reply. I'm not looking at buying the license at the minute (maybe in 1-2 years? Will have to see). I'm just an amateur and really need an upgrade to my PC, which is currently 10 years old (I also game a bit). It barely runs Fusion and color grading. So I'm trying to find a good mid-tier combo... it seems that it's best to get the new GPU series because of price/performance compared to old generations? That's why I was looking at the 3070... but feeling very unsure about which CPU to go for... the more cores the better, right? So should I look for 12-core CPUs? Or just 8, since my timeline will mostly be 1080p/4K?

Would appreciate if you could give me thoughts on different combos for CPU/GPU that I can have... I want this PC to last me a good time, but hopefully I will improve my editing skills and maybe in the future I will buy the license for DaVinci!

Posted on 2020-11-24 18:31:06

Yea, price/performance is great with the new RTX 3000 cards, but only with the Studio version of Resolve. The free version is basically all CPU, so that RTX 3070 will perform just as well as a 1650 Ti. If you want to work with 4K timelines then you should get a GPU with at least 8GB of VRAM, but the actual performance of the card won't matter much until you get a Studio license.

You will have to balance with your other uses (gaming), but for Resolve free, you basically have the following priorities: system RAM > GPU VRAM > storage size/speed > CPU >>> GPU performance

Posted on 2020-11-24 18:44:56
Thaynara Santana

Thanks a lot. Why and how are these two resolve versions so different?

Posted on 2020-11-24 19:49:58

There are a ton of great articles on the two versions, just search for "DaVinci Resolve Free vs Studio". This one is one of the first results and has a good breakdown: https://www.toolfarm.com/tu...

As for why they have two versions, only Blackmagic really knows. My personal belief is that a great free version sucks people in when they are first learning, a student, or a hobbyist. Then, as they move to more professional roles, they will migrate to the paid version because they already have a ton of experience with Resolve.

Posted on 2020-11-24 20:24:07
Timo Teräväinen

Hi, as a Resolve user for more than a decade, I have to correct the statement of Resolve free version being "all CPU". Resolve free version uses GPU as an image processing engine in the same way as the paid version when doing basic editing and color grading work. Resolve Studio adds features like GPU encoding and decoding of certain codecs, such as R3D and H.264, and provides support for using multiple GPUs. But it doesn't mean you couldn't work with those codecs on the free version, they are just decoded/encoded with the CPU, just like they have been for many years before the GPU decoding became available. Also noise reduction and many of the open FX don't work on the non-paid version. But the basic image processing pipeline uses GPU in the same manner on both versions, and there IS a major difference between, for example, GTX 1650ti and RTX 3070, on the Studio version as well as the free version. Fusion, of course, is another animal altogether, and it uses mainly the CPU on both Studio and non-Studio versions.
I used to work with the non-studio (or it was called "Resolve Lite" then) for quite a while before getting the paid version. It can be very well used for basic editing and colour grading work, when working with UHD resolutions or below, since the non-paid version cannot output larger than UHD resolution. And for learning to use Resolve. When more features are needed, then it's always possible to upgrade to the Studio.
The combination of a Ryzen 3700X and RTX 3070 makes a powerful PC for 4K editing and colour grading, combined with fast enough disks and at least 32GB of RAM. I'm doing colour grading of 4K and 5K materials with a 3700X and RTX 2070, so a 3070 would make things even more fluid. But I'm going to wait and see if there's a 3080 Ti with 20GB of VRAM around the corner.

Posted on 2020-11-25 20:14:39

Yea, I was being very broad in my "All CPU" claim. The free version definitely does use the GPU for some stuff like R3D debayering, displaying stuff on the screen, and a number of other background tasks. But compared to the Studio version, the GPU usage is minimal. Outside of R3D codecs, I'm not sure off hand what tasks would show a significant difference between a 1650Ti and a 3070. Maybe face refinement and the new magic masks, but I believe those are Studio only as well.

Can you give an example where the GPU matters outside of R3D codecs in the free version? Genuinely curious since I mainly deal with the Studio version.

Posted on 2020-11-25 20:22:46
Timo Teräväinen

Are we talking about the same program, Davinci Resolve? It's hard to imagine where the GPU is NOT being used. The GPU is the main image processing unit in Resolve, just as much in Studio as in the free version, with the exception of the Fusion page. Just to ensure I was not going insane, I had to check this, so I downloaded the free version, opened a film that I have been colour grading (4K and 5K material, both R3D and ProRes, about 12-15 colour grading nodes on each shot with basic corrections, masks, tracking etc) and sure enough, the performance is exactly the same on the free version as on the Studio version. GPU usage is hovering around 85-100%. I would not exactly call that minimal. And as I said, it's the same in both the Studio and free version, and there is no difference in performance whatsoever. The only difference is that some OpenFX and noise reductions don't work on the free version, and there is a watermark on those shots. As a sidenote, I'm not using much of the OpenFX, except for an occasional glow or such, but just basic colour grading stuff: primary wheels, qualifiers, Power Windows, tracking and stabilizing, blurring and sharpening, scaling. The GPU is the main processing unit of these tools. The CPU mainly deals with decoding and encoding (and this is a broad statement of the use of the CPU... of course it does other things too)

Posted on 2020-11-25 20:53:31

It all depends on what you are doing. In your case, ProRes is a CPU-based codec in both the free and Studio versions, so there isn't much of a difference between the two. Basic grades (power wheels, qualifiers, etc.) use the GPU in both versions, but we usually don't see much of a performance difference between even low and high-end GPUs for simple things like that. With multiple blurs/sharpens you might start to see a difference, but I wouldn't call that a normal workflow. All the effects you are stacking might show a difference as well - I'll definitely give you that.

If you start using H.264 media (which is overwhelmingly the primary type of media used by free edition users in our experience), the free edition really suffers from not having GPU decoding. I'm actually in the middle of testing our new Resolve benchmark with the free version, and if you look at the raw results, you can see that the Studio version can be as much as 2x faster with H.264 media. Don't look at the overall score - there was a new BIOS that I think is making Fusion performance much better on the second test run.

Studio: https://www.pugetsystems.co...
Free: https://www.pugetsystems.co...

Basically, in my mind the free version of Resolve is less GPU dependent than Premiere Pro at this point (since they added GPU decoding/encoding for H.264), and most people view Premiere Pro as being vastly more CPU dependent than GPU. You were absolutely right to call out my "All CPU" generalization, but I still hold to the idea behind it that the free edition needs a powerful CPU far more than it needs a powerful GPU!

Posted on 2020-11-25 21:06:44
Timo Teräväinen

I certainly know that H.264 media is used a lot, and it's something that I frequently have to work with, but you really don't need Resolve Studio to work with it. The best way to work with it is to transcode it into a more edit-friendly format, like ProRes or DNxHD, but if there's no time for that, it works quite decently on the free Resolve with a decent CPU and GPU, at least up to 4K resolutions. A decent CPU would be anything above a Ryzen 3600X, and a decent GPU would be anything above an RTX 2060 SUPER. I actually haven't noticed any difference performance-wise with this quite recent addition of the H.264 encoding feature to Resolve. Maybe on the test bench the differences look big, but the real-life situation is not so dramatic.

And the test that you sent just proves my point - the free and Studio versions are very close to each other, performance-wise. I really don't know what people use Resolve for these days, maybe as some kind of swiss-army knife, but it was built to be a colour grading platform, and its core function is to perform these functions that you call basic grades. They are built into node trees, and a basic node tree can consist of something between 10-30 nodes. If you happen to follow some of the colorist forums such as LGG, the way colorists measure Resolve performance is by testing how many nodes (for example blur nodes or noise reduction nodes) it can handle with the playback staying in real-time. This is what Resolve is about: real-time performance. These new GPU encoding and decoding features for codecs such as R3D and H.264 are great when they help with getting real-time playback with large node trees and high resolution material. Rendering times are not that important, since the time used in rendering is usually less than a percent of the time used doing the actual colour grading or editing work.

I have to strongly disagree with your claim that the free version of Resolve is less GPU dependent than Premiere Pro. It is just as GPU dependent (or let's say 85% as much) as the Studio version. It's a different thing to look at benchmarks and render speeds than to actually work with these apps from morning till night. As someone who does that, I would say that Premiere can be used very nicely on a MacBook Pro with an integrated GPU for its core function (editing), whereas Resolve is totally crippled without a decent GPU for its core function (color grading).

Posted on 2020-11-25 21:51:19

I agree that the free edition can be more GPU dependent, I just argue that this isn't the norm. You are the first person I've talked to who is using the free edition with grades that complex. Almost every single person we work with using the free edition is using it for basic video editing like gaming montages, home movies, concerts, etc. The type of thing where if they do anything in the Color tab at all, it is going to be just basic adjustments with the color wheels. Heck, recently we have been seeing more and more customers with the Studio version who barely touch the Color tab at all and are concerned primarily with performance in the Edit tab.

It sounds like we just have different ideas of what "normal" usage is in the free version of Resolve, and that is totally fine! Especially in an app like Resolve that has coloring, editing, VFX, audio, and more, there really isn't going to be a single workflow that applies to even a relatively small portion of users.

The exact workflow of the end user is always going to influence hardware requirements heavily. You can make Premiere Pro incredibly GPU dependent as well with just a handful of specific GPU-based effects, but if someone were to ask if Pr was more CPU or GPU dependent without any more context, I would assume that they are a more typical user and tell them that the CPU is vastly more important (just like I did for the original question about the free edition of Resolve). That is also one of the reasons why we highly encourage all our customers to talk to one of our consultants before buying a system. It isn't to try to up-sell them on anything, but rather to talk to them about their workflow to tailor the system to their exact needs.

Posted on 2020-11-25 22:07:38
Timo Teräväinen

Surely, Resolve can be used for whatever purposes people see fit. My perspective comes from using it primarily as a color grading tool, and from having started with version 8 back in 2011, when color grading was the only thing you could do with it. There were no editing tools, just some clunky conforming tools. Back then it was almost a miracle that a color grading system and software that had previously cost around half a million could now be built from a gaming PC, and you could either work with the free version or buy the full version for just 995€. I used to work with the free version for quite a while without needing any of the features from the paid version, but have been working primarily with the Studio version for the last 5 years or so.

The development of Resolve has been kind of mind-boggling, with the addition of editing tools, sound tools, FX tools, the marriage of Fusion into Resolve... and now there are some new AI features too. I just want to remind people what this software was originally built for, and what in my opinion it still does best: color grading. It is one of the top color grading applications used for major films and TV series, along with systems like Lustre, Baselight, etc. And sure, anyone can use it for whatever they want, and different workflows may well require somewhat different hardware. But apart from Fusion and Fairlight use, it is really built around the GPU as the image processing engine.

What I originally reacted to was the sentence "the free version is basically all CPU" - to me that sounded really strange. It makes a world of difference which GPU you use with it, be it the paid or the free version, when using any of the image processing tools. If you're doing just basic editing of YouTube cat videos and such, then it doesn't really matter. I have no need to argue more about this. I appreciate your testing work and your articles a lot. As a suggestion, I wish there could be something like a "Standard Candle Test" alongside the Resolve benchmarks, as that is very useful from a color grading performance perspective. It's basically a measure of how many blur/noise reduction nodes Resolve can handle in real-time.

Posted on 2020-11-26 07:54:42
dnywlsh
and most people view Premiere Pro as being vastly more CPU dependent than GPU

Who, exactly?

Premiere has supported GPU decoding and encoding for years now.

They recently added VCE and NVENC support for Windows, but have supported Quick Sync for a while. Hardware encoding is enabled by default for H.264 and HEVC.

The CPU is no longer used for decoding or encoding, and hasn't been for years. The CPU definitely isn't nearly as relevant these days. It won't be the bottleneck, unless you have a very slow CPU.

Posted on 2020-11-30 05:01:19

I'm basing that on what we hear from our customers every day when we talk to them, exchanges at trade shows, plus what tends to get repeated on sites like Reddit and the Adobe forums.

I don't think I would agree that the CPU hasn't been used at all for encoding and decoding for years. That may be largely true for people who had a CPU that supported Quick Sync, but that is nowhere near the majority for desktops - at least not at the level that we deal with. For example, the Intel X-series was until recently one of the most popular CPU lines for our Pr systems due to its performance with professional codecs like ProRes, R3D, and various other raw codecs. More recently, that was taken over by AMD's Ryzen and Threadripper lines. Not to mention that many "corporate" level editing systems will often exclusively use the higher-end Xeons. None of those support Quick Sync.

Heck, plenty of people still use software encoding even with the new GPU encoding because they want the best quality in the smallest file size. Software decoding is also still in use for some workflows where the hardware decoding can't keep up for straight playback performance (primarily multicam).

Really though, please take all this in context of the comment I was replying to. I'm not going to get into the intricacies of CPU versus GPU in every possible workflow for each comment I reply to. I was tailoring my comment based on what I imagined their workflow to be based on the information they provided.

Posted on 2020-11-30 20:30:58
dnywlsh
I don't think I would agree that the CPU hasn't been used at all for encoding and decoding for years.

For decoding, absolutely. Every discrete or integrated GPU has supported hardware decoding for a very long time. Software decoding is inefficient, and performs poorly.

For encoding, it's been supported in Premiere since the April 2018 update to Premiere.

Xeon systems really aren't that common for editing. They're overpriced, and complete overkill for editing. My Core i9 and Radeon RX 5000 series GPU can easily play back 8K Redcode RAW footage.

Our company purchased i9 editing systems from Puget last year. Xeons just aren't needed at all. I guess we're "not the level you deal with".

For example, the Intel X-series was until recently one of the most popular CPU lines for our Pr systems due to its performance with professional codecs like ProRes, R3D, and various other raw codecs.

Again, it's unnecessary. My 10-core i9 is easily able to handle ProRes and R3D. You definitely don't need a Xeon or Threadripper to work with these formats.

Heck, plenty of people still use software encoding even with the new GPU encoding because they want the best quality in the smallest file size.

There's no noticeable difference in quality or file size with hardware encoding anymore, especially at higher bitrates. Quick Sync, VCE, and NVENC have dramatically improved since their early versions many years ago. Quick Sync in 2011 was pretty awful. It's not anymore.

At most, I've seen a 10-20MB difference in file size. File size isn't an issue.

Speed is what matters most to most people, which is why Premiere, Final Cut, Resolve, and others enable hardware encoding by default. If the quality were worse, they wouldn't enable it by default, but they do.

Encoding modern formats like HEVC in software is painfully slow. On my system, software encoding is 4x slower than hardware.

Posted on 2020-11-30 21:09:17
ben

Matt Bach hi matt, any plans to bench the 3060TI?

Posted on 2020-12-02 16:40:19

Yep! Once we get our hands on a card, we will put it through our normal suite of benchmarks. Depending on when that is, we may hold it a bit and combine the 3060 Ti and Radeon 6900 (XT), but we'll just have to see when we can get either of those cards.

Edit: I should mention that we are getting a handful of results uploaded to our public benchmark database: https://www.pugetsystems.co... . Nothing for Resolve, but in Premiere Pro it is looking like it is about 4% slower than the 3070 for GPU Effects - https://www.pugetsystems.co... vs https://www.pugetsystems.co...

Resolve usually shows a bigger performance difference between GPUs, so at a guess, maybe 10% slower than the 3070 at most? That would put it about on par with the 2080 Super, but we'll have to wait until we can do our benchmarks to be sure.

Posted on 2020-12-02 17:19:47
Ammon Ehrisman

Do you have plans to run this test with 12K BRAW footage? I know that it is very VRAM intensive.

Posted on 2020-12-12 13:03:48

We had considered it, but it increases the VRAM usage quite a bit - to the point that there isn't much reason to do a performance test since you should get an RTX 3090 24GB for the VRAM alone. If/when GPUs start to have more VRAM and 12K media gets a bit wider adoption, however, I can definitely see us including it.

Posted on 2020-12-12 18:26:03
Craig A

I am in the process of finalizing a build and have a couple of questions: 1) 32GB RAM vs 64GB - mostly I'll be starting with 1080p video for a YouTube channel but will more than likely move to 4K at some point. 2) RTX 3070 vs 3080 - I plan on doing color grading and will probably play with LUTs as I develop my "style". So far I've selected the Gigabyte Aorus Elite X570 WiFi, AMD Ryzen 3900X, 850 watt power supply, and Lian Li Lancool Mesh Performance. I'm thinking of 64GB RAM because... well, why not, but if the extra $130 doesn't get me much I'd stick with 32GB. Thanks for whatever advice you can give me.

Posted on 2020-12-22 21:28:42
Guilherme Montenegro

64gb + rtx3080

Posted on 2020-12-28 23:37:33