DaVinci Resolve 15: AMD Radeon VII 16GB Performance

Written on March 12, 2019 by Matt Bach

Introduction

DaVinci Resolve is one of the few applications in the video editing world that really pushes the envelope in terms of GPU acceleration. Not only will a higher-end GPU almost always give you better performance, but Resolve can take advantage of multiple GPUs at the same time. In fact, many professional Resolve workstations will have three, four, or even more GPUs all working together to provide the user with the smoothest color grading workflow possible.

For quite a while now, NVIDIA GPUs have been our go-to recommendation for DaVinci Resolve. AMD's Radeon video cards were fine for their price, but for the level of performance most of our customers desired, AMD simply couldn't keep up. However, AMD's new Radeon VII 16GB has a number of features that could shake things up.

Right away, we knew that this was a card that was going to be very interesting for DaVinci Resolve due to its 16GB of VRAM. Many of our customers and readers are starting to work with 8K media or are using Resolve for heavy noise reduction where having a high amount of VRAM is incredibly important.


While the 16GB of VRAM alone makes this an interesting option for DaVinci Resolve, raw performance is a major concern as well. To see how the Radeon VII fares, we are going to take a look at how it stacks up against the AMD RX Vega 64 as well as the entire NVIDIA GeForce RTX lineup - plus the Titan RTX - in DaVinci Resolve Studio 15.3. We also have (or will soon have) a number of articles available looking at the performance of the Radeon VII in a range of other applications that you can view by filtering our recent articles to just those about Video Cards.

If you would like to skip over our test setup and benchmark result sections, feel free to jump right to the Conclusion section.

Test Setup & Methodology

Listed below is the system we will be using in our testing:

Our testing primarily revolves around the minimum FPS you would see with various media and levels of grading in the Color Tab. The lowest level of grading we test is simply a basic correction using the color wheels plus 4 Power Window nodes with motion tracking. The next level up is the same adjustments but with the addition of 3 OpenFX nodes: Lens Flare, Tilt-Shift Blur, and Sharpen. The final level has all of the previous nodes plus one TNR node.

We kept our project timelines at Ultra HD (3840x2160) across all the tests, but changed the playback framerate to match the FPS of the media. For all of the difficult RAW footage we tested (CinemaDNG & RED), we not only tested with the RAW decode quality set to "Full Res" but we also tested at "Half Res" ("Half Res Good" for the RED footage). Full resolution decoding should show the largest performance delta between the different cards, but we also want to see what kind of FPS increase you might see by running at a lower decode resolution.

| Codec | Resolution | FPS | Bitrate | Clip Name | Source |
|---|---|---|---|---|---|
| H.264 | 3840x2160 | 29.97 FPS | 80 Mbps | | Transcoded from RED 4K clip |
| H.264 LongGOP | 3840x2160 | 29.97 FPS | 150 Mbps | | Provided by Neil Purcell - www.neilpurcell.com |
| DNxHR HQ 8-bit | 3840x2160 | 29.97 FPS | 870 Mbps | | Transcoded from RED 4K clip |
| ProRes 422 HQ | 3840x2160 | 29.97 FPS | 900 Mbps | | Transcoded from RED 4K clip |
| ProRes 4444 | 3840x2160 | 29.97 FPS | 1,200 Mbps | | Transcoded from RED 4K clip |
| XAVC S | 3840x2160 | 29.97 FPS | 90 Mbps | | Provided by Samuel Neff - www.neffvisuals.com |
| XAVC Long GOP | 3840x2160 | 29.97 FPS | 190 Mbps | | Transcoded from RED 4K clip |
| Blackmagic RAW | 4608x1920 | 24 FPS | 210 Mbps | A001_08122231_C008 | Blackmagic RAW |
| RED (7:1) | 4096x2304 | 29.97 FPS | 300 Mbps | A004_C186_011278_001 | RED Sample R3D Files |
| CinemaDNG | 4608x2592 | 24 FPS | 1,900 Mbps | Interior Office | Blackmagic Design [Direct Download] |
| RED (7:1) | 6144x3077 | 23.976 FPS | 840 Mbps | S005_L001_0220LI_001 | RED Sample R3D Files |
| RED (9:1) | 8192x4320 | 25 FPS | 1,000 Mbps | B001_C096_0902AP_001 | RED Sample R3D Files |

With the addition of the "Fusion" tab in Resolve, we are also going to be including some basic tests for that tab as well. At the moment these are relatively easy projects that specifically test things like particles with a turbulence node, planar tracking, compositing, and 3D text with a heavy gaussian blur node. These projects are based on the following tutorials:

One thing we want to note is that we ran into quite a few GPU driver issues during our benchmarking process. First, we discovered that the NVIDIA GeForce 418.x and 419.x drivers currently give about 10-20% lower performance in Resolve. We have a full write-up on this issue in our Support Guide: DaVinci Resolve 15 - Performance loss with NVIDIA 418 & 419 driver. Due to this, we used NVIDIA driver version 417.71 for our testing.

AMD also had some driver issues that caused image corruption with Blackmagic RAW footage. Even after trying every GPU driver AMD has available for the Radeon VII (ranging from 19.2.1 to 19.3.1) and going back to Resolve 15.0, we were unable to resolve the issue. The benchmark results are in line with what we would expect, however, so to keep this issue from holding up the article, we decided to go ahead with the latest AMD driver. Just be aware that the results with BRAW may not be 100% accurate.


Corruption of Blackmagic RAW footage with current AMD Radeon VII drivers (19.2.1-19.3.1)

Benchmark Results

While our benchmark presents various scores based on the performance of each type of task, we also wanted to provide the individual results in case there is a specific codec or level of grade someone may be interested in. Feel free to skip to the next section for our analysis of these results.

AMD Radeon VII 16GB DaVinci Resolve Studio 15 Benchmark Performance Results

Benchmark Analysis

Green = NVIDIA, Red = AMD. Darker colors indicate dual GPU configuration

Our results are divided into several categories based on the level of grade, an "Overall Color Grading Score" that is a combination of each grading score, and a dedicated result for Fusion. For most users, the Overall Color Grading Score should be a pretty accurate breakdown of how you would expect each GPU to fare in DaVinci Resolve. However, if you tend to do very heavy grades with things like noise reduction, you may want to scroll to the fourth chart which has the results for a grade with a combination of Power Windows, OpenFX, and TNR.

From a color grading perspective, the Radeon VII does extremely well. Even though it is priced right between the RTX 2070 and RTX 2080, it is about 20% faster than the RTX 2070 and 13% faster than the RTX 2080. In fact, since it scored a single point higher than the RTX 2080 Ti, it was technically the fastest single GPU we tested! Keep in mind that the Radeon VII has an MSRP about $500 less than the 2080 Ti and more VRAM than any NVIDIA GeForce video card. Only the Titan RTX 24GB (which isn't technically a GeForce card) has more VRAM, and it costs 3.5x more than the Radeon VII. Yeah... that makes the Radeon VII 16GB a pretty good GPU for Resolve.

While the Radeon VII isn't ideally suited for multi-GPU configurations due to its cooler design (we've talked about this before in regard to the similarly designed RTX cards), with proper chassis airflow you should be able to use two of these cards in a full-sized chassis. What is interesting is that in a dual GPU setup, the Radeon VII doesn't scale quite as nicely as the RTX cards. Two Radeon VII are definitely much faster than one, but where the NVIDIA cards saw about a 35-40% performance bump on our hardest grading tests with TNR, the Radeon VII only saw about a 22% improvement.
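The scaling figures above come from simple before-and-after arithmetic; a quick sketch of that math (with made-up FPS numbers purely for illustration, not our measured results):

```python
# Sketch of the dual-GPU scaling math described above.
# The FPS values below are illustrative, not measured benchmark results.

def scaling_gain(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Percentage performance improvement from adding a second GPU."""
    return (dual_gpu_fps / single_gpu_fps - 1.0) * 100.0

# e.g. an NVIDIA-like ~37% bump vs a Radeon VII-like ~22% bump
print(round(scaling_gain(10.0, 13.7)))  # 37
print(round(scaling_gain(15.0, 18.3)))  # 22
```

Note that a card starting from a higher single-GPU baseline (like the Radeon VII here) can show a smaller percentage gain while still finishing with the higher absolute FPS.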

Don't get us wrong, the Radeon VII 16GB is still very good in a dual GPU configuration since it is a hair faster (and has twice the VRAM) than a pair of slightly more expensive RTX 2080 8GB cards. But even if you were somehow able to get three or four Radeon VII cards in a system without them overheating, our guess is that NVIDIA would end up taking over in terms of raw price-to-performance once you got up to a triple GPU setup.

Is the AMD Radeon VII 16GB good for DaVinci Resolve?

The short answer is: YES! The AMD Radeon VII 16GB is an excellent GPU for DaVinci Resolve. We had issues with BRAW footage, but the fact that the Radeon VII has 16GB of VRAM and matches the RTX 2080 Ti 11GB in terms of single-GPU performance (and with an MSRP that is $500 less) makes it a very, very strong GPU for DaVinci Resolve.

However, nothing is perfect and there are two issues with the Radeon VII that you may want to consider:

  • First, supply is extremely tight at the moment. This usually improves over time, but if you are trying to get your hands on a Radeon VII at the time of this article, don't expect to get one quickly.
  • Second, the cooler design is not very good for multi-GPU setups, which are very common on high-end DaVinci Resolve workstations. You may be able to get two cards in a system that has enough space and airflow, but that is going to be a hard limit. To be fair, the NVIDIA RTX cards have the exact same issue, but there are "rear exhaust" versions available that you can get for multi-GPU setups. Whether a manufacturer will come out with a blower-style Radeon VII at some point in the future is completely unknown, but right now they simply don't exist.


While the Radeon VII is great for Resolve, we do want to point out that if you have a very healthy budget, you can still get better performance from NVIDIA. Two (or three, or four) RTX 2080 Ti will give you more performance in Resolve than a pair of Radeon VII, and if you really need the VRAM you can go up to the Titan RTX or the Quadro RTX 6000/8000. That will dramatically increase the cost of your system, but if you absolutely need the best performance possible, NVIDIA is still king.

Of course, very few workflows involve just one piece of software. Because of this, we highly recommend checking out our recent Video Card articles where we look at - or will be looking at - the performance of the Radeon VII in Premiere Pro, Photoshop, DaVinci Resolve, as well as a number of other applications.

Looking for a Content Creation Workstation?

Puget Systems offers a range of workstations designed specifically for video and image editing applications including Premiere Pro, After Effects, Photoshop, DaVinci Resolve, and more.

Tags: DaVinci Resolve, Vega 64, Radeon VII, RTX 2060, RTX 2070, RTX 2080, RTX 2080 Ti, Titan RTX
Misha Engel

First of all, thanks for the review and all the time and effort you put into this. Puget Systems is the best and the worst review site for post-production hardware/software; yes, you are the only ones.

I do have some questions/remarks:
What bios version did you use in the Davinci/radeon7 article?
(First comment Hwgeek in the photoshop/radeon7 article https://www.pugetsystems.co... )

After reading this: "One thing we want to note is that we had quite a bit of GPU driver issues during our benchmarking process. First, we discovered that the NVIDIA GeForce 418.x and 419.x are currently giving about 10-20% lower performance in Resolve. We have a full write up on this issue in our Support Guide: DaVinci Resolve 15 - Performance loss with NVIDIA 418 & 419 driver. Due to this, we used the NVIDIA driver version 417.71 for our testing."
I get the feeling that Puget Systems doesn't care much about security where NVidia does https://nvidia.custhelp.com... , NVidia doesn't change those drivers for fun; they got 3 months to come up with an update before the security flaws would go public.

Security is a big thing these days with Spectre, Meltdown and Spoiler.

How can the Radeon be faster with +3 OpenFX than without on many occasions?

Then this part: "But even if you were somehow able to get three or four Radeon VII cards in a system without them overheating, our guess is that NVIDIA would end up taking over in terms of raw price-to-performance once you got up to a triple GPU setup."
Based on what?

For a company like Puget Systems it should be easy to fit a waterblock ("Bykski AMD RADEON VII Full Coverage GPU Water Block") on a Radeon 7, or any other GPU you put in your systems; it would distinguish Puget Systems from box-building OEMs like HP and Dell.

Thanks again for the article, although it's pretty confusing

Posted on 2019-03-14 15:17:40

Lots of questions here, but I'll try to hit them all:

1) We are using BIOS version 106 which I believe is the newest available.

2) We absolutely care about security! However, when doing performance benchmarks we never want the results to be skewed by a bug that is going to be fixed in the near future. The performance issue with the newer NVIDIA drivers has already been confirmed by NVIDIA/Blackmagic and they are working on a solution. The problem is not being caused by the security updates at all - if it was, then the issue would not be present in the 418 driver since the security stuff was added in 419. If this performance loss was due to a permanent change, then we would of course use the latest driver (just like how we test all Intel CPUs with the meltdown/spectre fixes even though it can cause a drop in performance).

3) Higher FPS with more effects is definitely odd, but actually not that unusual. I can't really comment on the why since I don't have full knowledge of the render pipeline, but my understanding is that sometimes, shifting things around a bit between the different stages in the pipeline can actually make the software slightly more efficient. This actually comes up in Premiere Pro more often where applying a color grade with Lumetri Color can sometimes slightly raise the playback FPS. It seems to be more common with long-GOP codecs, but can happen with others as well. The couple of results you pointed out could also just be due to normal testing variations - we try to minimize those, but we aren't going to run the benchmark over and over until we get the results we expect since what we expect is not always what reality actually is.

4) We actually are very intentional that we do NOT do full system liquid cooling. We used to do it years ago, but what we found is that if a customer is not the kind of person who would build their own system, they are likely not the kind of person who is willing to do the maintenance required with full liquid cooling. There are of course exceptions, but way too many of those liquid cooled systems ended up being a bad time for everyone involved. So not offering liquid cooling is a very conscious choice we made in order to ensure that we are giving our customers the best possible experience.

Posted on 2019-03-14 16:45:32
Misha Engel

1) I think you are right about that.
2) So it's okay to use insecure drivers and leave your system open for known security issues till NVidia and BMD have solved this problem.
The RTX Titan X values look like they are from the patched drivers (Slower than the RTX2080ti with insecure drivers).
3) Will ask Wendell for this.
4) Good point, it's more something for tweakers and data centers to do custom loops.
I do however think that off-the-shelf AIO CPU coolers and AIO GPU coolers (assembled by Puget) would be a good option (also maintenance-free).

I noticed another strange conclusion:

"What is interesting is that in a dual GPU setup, the Radeon VII doesn't scale quite as nicely as the RTX cards. Two Radeon VII are definitely much faster than one, but where the NVIDIA cards saw about a 35-40% performance bump on our hardest grading tests with TNR, the Radeon VII only saw about a 22% improvement."

Looking at Basic + 4 Power Window + 3 OpenFX + TNR (where it makes sense to use multiple GPU's)

8k.red full

Radeon 7 15 fps
RTX2080 9 fps
2x Radeon 7 17 fps
2x RTX2080 9 fps

Long GOP relies on CPU-power when the hardware acceleration in the GPU is not supported by the software (davinci in this case).

Your conclusion could also have been that on average 1x radeon 7 has about the same speed as 2x RTX2080 (Looking at Basic + 4 Power Window + 3 OpenFX + TNR).

and another one:

"But even if you were somehow able to get three or four Radeon VII cards in a system without them overheating, our guess is that NVIDIA would end up taking over in terms of raw price-to-performance once you got up to a triple GPU setup."

Really!!!!!!!!!

2x RTX2080 has on average the same speed as 1 radeon 7 while costing twice as much
2x RTX2080ti is $2400 MSRP where 2x radeon 7 is $1400 MSRP
2x RTX2080ti is $2400 MSRP where 3x radeon 7 is $2100 MSRP
1x RTX Titan X is $2500 MSRP where 3x radeon 7 is $2100 MSRP.

How are you planning to achieve that?

You did a comparison a while ago with GTX 1080 Ti 1x..4x and Titan V 1x..4x with DaVinci Resolve 14 and 3 different CPU setups, and from the scaling in that it's easy to derive that it is often useless to use more than 2 GPUs.

https://www.pugetsystems.co...

DaVinci Resolve improves every year, but that part of the engine had no upgrade going from 14 to 15.

As much as I value your articles (yes, I check them on a regular basis) and insights, you have some confusing conclusions.

Posted on 2019-03-14 19:51:14
Jose Santos

Hi, thanks for another great review. I'm building my own PC; unfortunately I live outside the US and cannot order one of your amazing workstations. While my budget is definitely not small, I was planning on getting 1x 2080 Ti, but with the information in this article, I could buy 2 Vega VII for the price of the 2080 Ti and have better performance. I'm a colorist and the main purpose of the workstation would be color grading.

Would you recommend the dual Vega VII instead of the single 2080 Ti? (For the rest of the build I am basically copying your Resolve workstation configuration.)

Posted on 2019-03-17 09:42:15

Yes, dual Vega VII should be much better than a single RTX 2080 Ti. The only thing you need to be careful of is that you have enough airflow around the cards because they run really hot. At least 1 space between the cards and a side fan blowing over them should be sufficient, but you may want to have that side fan PWM controlled so it can ramp up under load. We are still working on the full qualification to add the Radeon VII to our product line (deciding we want to carry it is just step one, then we have to make sure it is reliable and work out cooling/power concerns), so I unfortunately can't give you much more detail than that.

Posted on 2019-03-18 16:17:18
Jose Santos

Thanks so much for your answer! If I could bother you some more, like I said I am basically replicating your Resolve workstation, but I've come across another interesting motherboard, the MSI Meg Creation. My question is, why did Puget Systems decided to go with the Designare EX from Gigabyte? Did you test the MSI Creation?

Posted on 2019-03-18 17:20:16

We actually used the X399 version of the MSI MEG CREATION for a bit and it was a fine board, so I'm sure the X299 is good as well. The main reason we use Gigabyte these days is because they seem to be among the most stable, have the best direct engineering/support/RMA for us, and Gigabyte by far has the best track record for Thunderbolt actually working properly. Thunderbolt is pretty important for us since a significant portion of our customers either need or want it, which is why we like the Designare EX in particular - it has Thunderbolt 3.

If you don't need Thunderbolt, the MSI MEG CREATION should work just fine since most of the benefits we see from Gigabyte are not things that you as an individual consumer would be able to take advantage of. I still feel like Gigabyte is one of the better brands currently available for motherboards, but I also don't think you are going to get yourself in any trouble by using MSI.

Posted on 2019-03-18 17:26:52
Jose Santos

wow thanks for answering so quickly. I definitely need and will make use of thunderbolt! Maybe I'll go for the gigabyte then! Thanks again!

Posted on 2019-03-18 17:29:09