Read this article at https://www.pugetsystems.com/guides/1904

DaVinci Resolve Studio - RTX 3080 & 3090 Multi-GPU Performance Scaling

Written on September 24, 2020 by Matt Bach

TL;DR: NVIDIA GeForce RTX 3080 & 3090 GPU Scaling in DaVinci Resolve Studio

Power users of DaVinci Resolve Studio love to load up their workstations with multiple GPUs, and doing so with the new RTX 30-series cards is just as effective as it was with the previous generation. However, while each GPU you add can increase performance in Resolve by roughly 50%, we would highly recommend waiting before pulling the trigger on multiple RTX 3080 or 3090 GPUs.

The issue is that the cooler design on the currently available models is intended strictly for single-GPU configurations. Whether you are looking at the Founders Edition cards from NVIDIA or one of the various 3rd party designs, these cards are not going to work long-term in a multi-GPU setup. Once blower-style cards become available, the RTX 30-series cards should be an excellent option for a DaVinci Resolve workstation.


DaVinci Resolve is known in the industry as having excellent GPU-acceleration support - greatly benefiting from a powerful (or multiple powerful) video cards. This makes the new RTX 30-series cards from NVIDIA very attractive given the incredible performance we saw in our DaVinci Resolve Studio - NVIDIA GeForce RTX 3080 & 3090 Performance article. If you missed that post, the RTX 3090 is especially impressive as it matches a pair of RTX 2080 Ti 11GB cards while costing $1,000 less and having significantly more VRAM.

But for power users, one GPU is simply not enough. Dual, triple, and even quad GPU configurations are common in high-end Resolve workstations in order to further increase performance. However, the new RTX 30-series cards have a number of issues that, for the moment, prevent these types of configurations.

DaVinci Resolve Studio GPU Scaling - NVIDIA GeForce RTX 3080 10GB & RTX 3090 24GB

While Resolve can scale nicely with multiple GPUs, the design of the new RTX 30-series cards presents a significant problem. Not only is the power draw significantly higher (which means more heat is being generated), but the current cooler design on the FE (Founders Edition) cards from NVIDIA and all the 3rd party manufacturers is strictly designed for single-GPU configurations.

On the NVIDIA FE cards, the fan on the front of the card actually blows through the GPU - sucking cool air in from the bottom and exhausting out the top. That means that if you have multiple cards, you will be venting the hot air from one card directly into the intake of the card above it. Multiply this across two, three, or even four GPUs, and you have a recipe for thermal throttling, if not outright system crashes.

The aftermarket cards are not much better. They do not vent through the card, but almost none of them exhaust any of the waste heat outside the chassis, instead relying on the chassis fans to remove the hot air from the system. That is feasible for a single GPU, but with multiple cards the air simply gets recycled between the cards, getting hotter, and hotter, and hotter.

Triple NVIDIA GeForce RTX 3080 10GB on test bed

The good news is that Gigabyte has already announced a blower-style version of the RTX 3090 (the GeForce RTX 3090 TURBO 24G) which should in theory make multi-GPU configurations of the RTX 3090 possible. Oddly, there is no word on an RTX 3080 model as of yet, but we are hopeful that other manufacturers will take note and make blower-style models of both the RTX 3080 and 3090.

In the meantime, we can do some performance testing with multiple RTX 3080 and 3090 GPUs using the cards that are currently available. However, we need to stress that this is very early testing. The RTX 30-series cards have much higher power requirements than the previous generation, and the current coolers are not designed for multi-GPU configurations. We do not yet know what will be stable and reliable long-term, and we will likely need to wait for blower-style cards to find out. So, while we are looking at multi-GPU performance in Resolve in this article, we highly recommend waiting until our qualification team determines what will be stable and reliable long-term.

DaVinci Resolve Workstations

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!

Test Setup

Listed below are the specifications of the system we will be using for our testing:

Test Platform
CPU AMD TR 3970X 32 Core
CPU Cooler Noctua NH-U14S TR4-SP3
Motherboard Gigabyte TRX40 AORUS PRO WIFI
RAM 4x DDR4-2933 16GB (64GB total)
Video Card 1-2x NVIDIA GeForce RTX 3090 24GB FE
1-3x NVIDIA GeForce RTX 3080 10GB FE
1-3x NVIDIA GeForce RTX 2080 Ti 11GB
Hard Drive Samsung 960 Pro 1TB
Software Windows 10 Pro 64-bit (Ver. 2004)
DaVinci Resolve Studio (Ver. 16.2.7)
PugetBench for DaVinci Resolve (Ver. 0.92)

*All the latest drivers, OS updates, BIOS, and firmware applied as of September 15th, 2020

To test multiple GPUs, we will be using the fastest platform currently available for DaVinci Resolve - most notably the AMD Threadripper 3970X. Since Resolve utilizes the CPU so heavily, this should minimize the impact of the processor and allow each configuration to perform to its fullest potential.

For the testing itself, we will be using an upcoming version of our PugetBench for DaVinci Resolve benchmark that is not yet available to the public. This new version is very close to release, and since its tests are significantly improved over the currently downloadable version, we opted to go ahead and use it for this comparison.

One thing you might notice is that we used at most three RTX 2080 Ti and RTX 3080 GPUs, and just two RTX 3090 GPUs. The reason for not testing quad-GPU configurations is simply that we almost never sell that configuration for Resolve. Almost all professional workflows require a video monitoring card for unbiased video output, which has to take the place of a GPU. A quad-GPU setup would also require either a motherboard that we have not qualified or a move to an Intel Xeon platform, which is not as good for Resolve overall.

For the RTX 3090, we maxed out at just two cards because the RTX 3090 FE cards are triple-slot, which means we can only fit two of them into our current Threadripper platforms. With certain motherboards, it may be possible to cram three RTX 3090 cards into a single workstation, but that is going to be extremely difficult to do - not to mention likely impossible to keep cool. So until dual-slot blower cards come out, we have to stick with just a dual RTX 3090 configuration for this testing.

Once again, we want to stress that the GPUs we are using are not designed for being used in this manner. It shouldn't affect our results much, but we would absolutely not recommend using this setup for your own workstation.

Raw Benchmark Results

While we are going to go through our analysis of the testing in the next section, we always like to provide the raw results for those that want to dig into the details. If there is a specific codec or export setting you tend to use in your workflow, examining the raw results for that task is going to be much more applicable than our more general analysis.

NVIDIA GeForce RTX 3080 & 3090 DaVinci Resolve Studio GPU scaling raw testing results

Overall DaVinci Resolve Studio GPU Scaling Analysis

While many reviewers like to solely look at things like temporal noise reduction (often to an unrealistic degree) or OpenFX that heavily utilize the GPU, we first want to start off by looking at the overall performance we saw from our DaVinci Resolve benchmark with each configuration in order to show what most users would likely experience in their day-to-day work.

GPU scaling in Resolve is interesting to look at because there are so many facets to the application. Much of Resolve does not take heavy advantage of the GPU, with performance when working with bare media or just a simple grade almost always bottlenecked by your CPU rather than your GPU. Because our benchmark looks at such a wide range of tasks in Resolve, the Overall Score - and even the individual 4K and 8K media scores - is not all that exciting.

In fact, if you look at the last chart for the Fusion tests, you will see that performance actually gets worse with multiple GPUs.

So to get a better idea of the maximum performance benefit from using multiple GPUs, we should focus on the "GPU Effects" portion of our benchmark which looks at tasks like noise reduction and various GPU-accelerated OpenFX.

GPU Score Analysis

NVIDIA GeForce RTX 3080 10GB & RTX 3090 24GB DaVinci Resolve Studio Multiple GPU Scaling

The GPU effects portion of our benchmarks looks at the performance of individual GPU-accelerated effects such as temporal noise reduction, film grain, lens blur, optical flow, face refinement, and more. In our testing, these effects easily show the largest benefit of having multiple GPUs.

Starting with the dual RTX 3090 configuration, it does extremely well, easily out-performing the triple RTX 2080 Ti setup. This makes dual RTX 3090 not only more affordable (at $3,000 for two 3090s vs $3,600 for three 2080 Tis) but also gives you 24GB of usable VRAM versus the 11GB on the RTX 2080 Ti cards. The current triple-slot design of the RTX 3090 FE cards does mean that you will have a hard time fitting a video monitoring card which is a requirement for most professional workflows, but since we would never recommend using these exact cards in this configuration in the first place, we will give that a pass until the blower-style cards become available.

The dual and triple RTX 3080 configurations also do well, consistently beating the same number of RTX 2080 Ti cards by a solid 30%.

The main thing to take away is that the scaling with multiple cards is no different than what it is with the RTX 20-series cards. Two cards are roughly 50% faster than a single card, while three cards are about double the performance of one card. If we extrapolate this out for those that want a quad GPU setup, four cards should be roughly 2.5x the performance of a single card.
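That scaling pattern can be expressed as a quick back-of-the-envelope model. This is purely illustrative, using the roughly 50%-per-additional-card gain we observed in the GPU Effects tests; the function name and structure are our own, not anything from the benchmark itself:

```python
def estimated_speedup(num_gpus: int, per_card_gain: float = 0.5) -> float:
    """Estimated speedup over a single GPU, assuming each card added
    after the first contributes roughly half the performance of one card."""
    if num_gpus < 1:
        raise ValueError("need at least one GPU")
    return 1.0 + per_card_gain * (num_gpus - 1)

# 1 GPU -> 1.0x, 2 GPUs -> 1.5x, 3 GPUs -> 2.0x, 4 GPUs -> 2.5x (extrapolated)
for n in range(1, 5):
    print(f"{n} GPU(s): ~{estimated_speedup(n):.1f}x")
```

Keep in mind this only models the heavily GPU-bound "GPU Effects" portion of the benchmark; overall scores scale far less because so much of Resolve is CPU-limited.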

How well does the NVIDIA GeForce RTX 3080 & 3090 perform in DaVinci Resolve Studio?

Power users of DaVinci Resolve Studio love to load up their workstations with multiple GPUs, and doing so with the new RTX 30-series cards is just as effective as it was with the previous generation. However, while each GPU you add can increase performance in Resolve by roughly 50%, we would highly recommend waiting for blower-style cards before pulling the trigger on multiple RTX 3080 or 3090 GPUs.

The issue is that the cooler design on the models currently available is designed strictly for use with a single GPU. Whether you are looking at the Founders Edition cards from NVIDIA or one of the various 3rd party designs, these cards are not going to work long-term in a multi-GPU setup. Once blower-style cards become available, the RTX 30-series cards should be an excellent option for a DaVinci Resolve workstation.

The good news is that some manufacturers like Gigabyte have already announced blower-style cards (the GeForce RTX 3090 TURBO 24G) which should work significantly better for this. Of course, we need to wait and see how well those cards are able to handle the higher power draw of the RTX 3090, but at the very least there is still hope for using multiple GeForce RTX 30-series cards and not having to jump up to the more expensive Quadro models that (we assume) are coming down the pipe.

As always, keep in mind that these results are strictly for DaVinci Resolve Studio. If you have performance concerns for other applications in your workflow, we highly recommend checking out our Hardware Articles (you can filter by "Video Card") for the latest information on how a range of applications perform with the new RTX 3080 and 3090 GPUs (including multi-GPU setups when relevant), as well as with different CPUs and other hardware.

Tags: Hardware Acceleration, hardware encoding, NVIDIA, NVIDIA vs AMD, AMD, Vega 64, Radeon RX 5700 XT, RTX 2060 SUPER, RTX 2070 SUPER, RTX 2080 SUPER, RTX 2080 Ti, Titan RTX, RTX 3080, DaVinci Resolve, RTX 3090, GPU Scaling
Håkon Broder Lund

Lower scaling that I would expect on the overall score. I assume that is the CPU holding it back? GPU effect scaling looks great though!
How come the Fusion score was lower? Too much heat in the system so the CPU would thermal throttle?
Also, strange that optical flow does not scale from 1 to 2 GPU, but does when the third is added.

Posted on 2020-09-24 13:27:49

I don't think the CPU is holding it back, or else you would see a different slope with the more powerful cards. The problem is likely simply how much data has to be moved to and from the cards (and likely between the cards as well). That isn't just related to the PCI-E bus speed, but also the memory speed, CPU caching speed, and a whole host of other factors.

It isn't a perfect comparison since the tests are different (not to mention Resolve is likely better optimized now), but if you look at the "Color Grading - Torture" tests in our first GPU scaling testing for Resolve (https://www.pugetsystems.co...), we only saw a 30% improvement with two GPUs, and around 50% with three cards. So with the newer platforms, software, and everything else, GPU scaling in Resolve has actually doubled in efficiency since 2017.

For Fusion, we haven't been able to figure out why performance gets worse with more cards. It isn't due to throttling, so our only guess is simply badly optimized code. Fusion was a stand-alone app (and still is) before it was added into Resolve, so there probably is some sort of conflict between how it uses the GPU and how Resolve as a whole uses it. Perhaps the base code for Fusion was never programmed with multiple GPUs in mind?

The Optical Flow result is definitely strange as well! But... it is consistent in its strangeness, so it isn't due to a bad test run or anything like that. This is honestly why real-world testing like this is so important! We would never have guessed that outcome, but it is what it is.

Posted on 2020-09-24 16:45:28
Ethan Hallbeyer

Matt, would it be possible to put a 3090 and 3080 in the same system? Would it necessitate the use of a PCI extender for the 3090 and mounting it vertically? I’m considering a 3080 for gpu passthrough in Linux for Windows gaming and 3090 would be exclusively used by the host for Davinci Resolve. Is this feasible and do you think it’s a better solution than having 2 x 3080 instead, then using both 3080 for Resolve when the 2nd one isn’t used for gaming?

Posted on 2020-10-02 06:10:39

You can mix cards in Resolve, but we generally recommend against it. Windows Update is usually the biggest issue - it can get confused when you have different cards and try to apply a driver update to only one GPU, breaking things for the other GPU. Not an insurmountable issue, but it might mean manually re-installing drivers every once in a while.

Posted on 2020-10-02 16:47:12

Look like JF?

Posted on 2020-09-24 13:40:31
Jose Santos

Great article! I'm eagerly awaiting the 30-series being in stock here in Germany so that I can get one or two of them.

With that in mind: two 3080s will cost only $100 more than the 3090 but, according to your results, yield much better performance. What concerns me is the limited 10GB of VRAM on the 3080 (I currently own a Radeon VII and regularly do 4K projects). Would 10GB of VRAM be enough for 4K grading (sometimes with 6K/8K media)?

Also, I'm on a Ryzen 3850x, not a Threadripper, so I'm wondering whether PCIe lane availability could also be an issue?

Posted on 2020-10-08 12:34:56

4K timelines should be OK with 10GB of VRAM, but if you are putting 8K media in a 4K timeline and using multiple OpenFX or noise reduction, that might get dicey. In general, if someone is using 8K media, we would always say to get a GPU with at least 20GB of VRAM.
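To put rough numbers on why 8K media eats VRAM so quickly, here is a sketch of per-frame buffer sizes. It assumes an uncompressed 4-channel, 32-bit float working buffer, which is our illustrative assumption; Resolve's actual VRAM usage varies with caching, effects, and timeline settings:

```python
BYTES_PER_CHANNEL = 4  # assuming a 32-bit float working format

def frame_vram_bytes(width: int, height: int, channels: int = 4) -> int:
    """Approximate VRAM footprint of a single uncompressed frame buffer."""
    return width * height * channels * BYTES_PER_CHANNEL

for name, (w, h) in {"UHD 4K": (3840, 2160), "8K": (7680, 4320)}.items():
    gb = frame_vram_bytes(w, h) / 1024**3
    print(f"{name}: ~{gb:.2f} GB per frame buffer")
```

An 8K frame is four times the size of a 4K frame, so stacking noise reduction and OpenFX (each of which may hold several working buffers) is what pushes 10GB cards over the edge.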

Generally, I wouldn't worry about PCI-E lanes unless it is going to end up below x8. x8 vs x16 usually is only a handful of a percent difference which is rarely going to change your choice of CPU/platform.

Posted on 2020-10-08 20:51:10
Daniel O'Flaherty

Hi Matt! Following on from Jose's post, he is talking about the RTX 3080 with 10GB of VRAM. However, it is reported that NVIDIA will come out with an RTX 3080 with 20GB of VRAM. Would it be better to get 2x RTX 3080 20GB, or 1x RTX 3090, specifically for working with 8K footage and also using Fusion? You mentioned 20GB of VRAM as a minimum, but attested to Resolve working better with more GPUs (and these would be two of the exact same GPUs, which avoids the issue of mixing cards). Obviously testing is the best way to check, I'm sure, but I'm excited to hear your initial thoughts. Thanks!

Posted on 2020-10-09 13:16:11

I would take the reports of a 20GB model very lightly. That hasn't been confirmed by NVIDIA, and frankly I would be very surprised if they did it since it would eat into their RTX 3090 sales. I could be wrong, though, and I hope I am, because an RTX 3080 20GB would be amazing.

But if a 20GB model does come into existence, it would be a great way to be able to edit 8K media on a bit lower of a budget. Most of the time, you are going to be CPU limited when just doing edits, so you don't need an RTX 3090 unless you want to process OpenFX or do NR faster.

As for performance, two 3080s are faster (and cheaper) than a single 3090, but you are limiting yourself a bit for future upgrades, and the power draw/heat/noise will be higher. More cards also mean more complexity, which always means a higher risk of something breaking. The higher performance may be worth the tradeoffs, but that is something you have to decide for yourself.

Posted on 2020-10-09 17:22:09
Adrian Woods

Thanks for the above test. I know you mentioned the downside of the 3080 FE cooling for multi-GPU builds, but could we expect a test similar to this https://www.pugetsystems.co... for the 3080 FE in the near future? Curious to see if the 'half blower' would be sufficient to prevent clock speed drops. Or would the concern in this case be mostly for damage in long term usage?
Thanks so much for all the great articles!

Posted on 2020-10-17 18:57:49
Adrian Woods

I also wondered if you measured the power draw of the 3 x 3080 cards running at full load. I am wondering if a 1300W PSU would be sufficient for this setup to handle the peaks. I saw a Linus test with 2 x 3090 coming in just around 1000W (which needed a stronger PSU for the peaks), so in theory 3 x 3080 may work on a 1300W PSU?

Posted on 2020-10-18 11:14:12

I don't know if we will do an article like that for the FE cards, but if it looks like it is a problem we probably will. We are pretty committed to using blower-style cards because of exactly the issues we showed in that testing, and the higher wattage 3080/3090 cards are just going to be worse.

As for power draw, I believe we are planning on a 1200W PSU for dual 3080/3090. That should have plenty of power for the GPUs plus even the highest wattage CPU we currently use. Going up to 3-4 3080 or 3 3090 (we aren't planning on 4 as an option) will be using a 1600W PSU.
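As a rough sanity check on those PSU sizes, you can add up nominal board-power figures and pad for transient spikes. The TDP numbers, 150W platform overhead, and 25% transient margin below are our own illustrative assumptions, not measured values or a qualification:

```python
# Nominal board power in watts; illustrative figures, check your exact cards.
TDP_W = {"RTX 3080": 320, "RTX 3090": 350, "TR 3970X": 280}

def psu_estimate_watts(parts, platform_overhead=150, transient_margin=1.25):
    """Sum component TDPs, add motherboard/RAM/drive/fan overhead,
    then pad for transient power spikes."""
    return (sum(TDP_W[p] for p in parts) + platform_overhead) * transient_margin

triple_3080 = ["RTX 3080"] * 3 + ["TR 3970X"]
print(f"Triple RTX 3080 build: ~{psu_estimate_watts(triple_3080):.0f} W")
```

With these assumptions a triple-3080 Threadripper build lands well above 1300W once transients are accounted for, which lines up with the 1600W recommendation above.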

Posted on 2020-10-19 16:16:50
Jose Santos

Just watched the new Big Navi Announcement from AMD. I hope you will be testing those out as well! Will be looking forward to it.

Posted on 2020-10-28 20:01:17

We definitely will! Since the launch is so focused on gaming, I don't know if we will have testing results right at launch, but we certainly will as soon as we can!

Posted on 2020-10-28 20:03:28

It is also important to note that many of the applications we test only work with NVIDIA graphics cards. Off the top of my head, I believe OctaneRender, Redshift, RealityCapture, and Pix4D are all NVIDIA/CUDA only.

Posted on 2020-10-28 20:18:22

Hi,

I need some help please!

I have two RTX 3080s and I want to use them both together. How do I do that, since they do not support SLI?

Posted on 2020-11-16 23:23:12

I'm not sure what you mean by "group them". If you mean just making sure that Resolve is set to use both, you need DaVinci Resolve Studio (the paid version), and check in the "Memory and GPU" section of the DaVinci Resolve Preferences. You can set the GPU processing mode to CUDA, the selection mode to Manual, and select which GPUs you want to use. You can also leave it all on Auto and just make sure the "Use display GPU for compute" is checked.

Posted on 2020-11-16 23:32:13

Can I add a 1080 to my 1950x/1080ti and expect any decent improvements in GPU-specific loads? I am thinking of selling the 1950x rig and swapping the 1080ti currently in there into my 3950x which has a 1080 in it at present.

Posted on 2021-07-06 23:24:51