Read this article at https://www.pugetsystems.com/guides/1552

Photoshop GPU Roundup: NVIDIA SUPER vs AMD RX 5700 XT

Written on August 14, 2019 by Matt Bach


AMD has recently made quite a splash in the PC industry with the launch of their 3rd generation Ryzen processors, but they have also released some new video cards like the Radeon RX 5700 XT 8GB. From "team green", NVIDIA has released the new RTX 2060 SUPER 8GB, RTX 2070 SUPER 8GB, and RTX 2080 SUPER 8GB cards. These models are priced the same as the non-SUPER versions, but feature a small performance bump across the board, plus an increase in VRAM on the RTX 2060 SUPER (from 6GB to 8GB).

NVIDIA GeForce SUPER vs AMD Radeon RX 5700 XT in Photoshop

While Photoshop does not typically need a super powerful GPU, we still always like to test new video cards in Photoshop to see how they perform. We have had surprises come up in the past, but the only way to know for sure is to simply put these cards through our Photoshop benchmark.

Since we are interested in how these new cards compare to the previous generation, we will also be including in our testing the full line of NVIDIA GeForce RTX cards as well as the AMD Radeon Vega 64 and Radeon VII. If you would like to skip over our test setup and benchmark sections, feel free to jump right to the Conclusion.

Looking for a Photoshop Workstation?

Puget Systems offers a range of workstations that are tailor-made for your unique workflow. Our goal is to provide the most effective and reliable system possible so you can concentrate on your work and not worry about your computer.

Configure a System!

Test Setup & Methodology

Listed below are the specifications of the test system we will be using:

To test each GPU, we will be using one of the fastest platforms currently available for Photoshop, built around the Intel Core i9 9900K. We do want to point out that in our latest Photoshop CPU Roundup, the AMD Ryzen 9 3900X was actually a touch faster than the i9 9900K, but we have had a few recent issues with that platform, so we opted to stick with the tried and true i9 9900K for now.

This does mean that the AMD Radeon RX 5700 XT will not be able to utilize PCI-E 4.0, but we have an entire series of posts coming up that will be examining PCI-E 4.0 in professional apps like Photoshop. For that testing, we will be using the new Ryzen 3rd Gen platform since that is currently the only way to get PCI-E 4.0 functionality.

The benchmark we will be using is the latest release of our public Photoshop benchmark. Full details on the benchmark and a link to download and run it yourself are available at Puget Systems Adobe Photoshop CC Benchmark.

Benchmark Results

While our benchmark presents various scores based on the performance of each test, we also wanted to provide the individual results. If there is a specific task that is a hindrance to your workflow, examining the raw results for that task is going to be much more applicable than any of the calculated scores. Note that the tests that utilize the GPU are highlighted in blue.

Feel free to skip to the next section for our analysis of these results if you would rather get a wider view of how each GPU performs in Photoshop.

NVIDIA SUPER and AMD Radeon RX 5700 XT Photoshop Performance Benchmark

Benchmark Analysis

Since we are specifically looking at video cards in this post, we are going to mostly examine the "GPU Score (16 Bits/Channel)" result. If you scroll to the next chart to look at the "Overall Score", you will notice that there isn't much of a difference between each GPU when looking at Photoshop performance from an overall perspective. In fact, in that case even the NVIDIA Titan RTX is only ~5% faster than the integrated graphics on the Core i9 9900K. However, if we look at just the tasks that actually benefit from using a discrete GPU, we at least get enough of a separation between the different models to pull out some useful information.

Starting with the NVIDIA SUPER cards, they are a tiny bit faster than the "normal" versions, but only by around 1% at most, which is not going to be noticeable in the real world. Besides the extra VRAM on the GeForce RTX 2060 SUPER 8GB, there really isn't much of an advantage to these new cards over the old ones - although since they are not more expensive, it is more of a "why not?" situation if you are already in the market for a new workstation or GPU upgrade.

Things get more interesting when we look at the AMD cards. First of all, if you are choosing between AMD and NVIDIA for Photoshop, it is pretty obvious that NVIDIA has a clear advantage. Like anything GPU-based in Photoshop, the difference isn't huge, but the Radeon RX Vega 64 is about 3% slower than the NVIDIA equivalent (the RTX 2070 SUPER) while the Radeon RX 5700 XT and Radeon VII are about 10% slower than their NVIDIA counterparts (the RTX 2060 SUPER and RTX 2080 SUPER respectively).

This brings up another interesting result: the Radeon RX Vega 64 is faster than both the Radeon VII and the new Radeon RX 5700 XT. This confused us quite a bit at first, and we ended up re-running our benchmark multiple times, reinstalling drivers, etc. to try to figure it out. What really threw us for a loop was that we did not see this same behavior in other apps like After Effects, so it did not seem to be a problem with the hardware, drivers, firmware, or system configuration.

We spoke with our contacts at AMD, and what appears to be happening is that the Radeon RX 5700 XT (and Radeon VII) currently include software optimizations for gaming, but not yet for office/professional workloads. While it is possible that post-launch software optimizations will improve performance in applications like Photoshop, we will have to wait to see if that happens.

How well do the NVIDIA SUPER and AMD Radeon RX 5700 XT perform in Photoshop?

While the new NVIDIA SUPER cards are not noticeably faster than the "normal" versions, they are still faster than any of the AMD GPUs currently available, including the new Radeon RX 5700 XT. In fact, since the 5700 XT does not yet appear to have optimizations for pro apps like Photoshop, even the previous generation AMD Radeon RX Vega 64 is faster than the 5700 XT right now.

To be fair, however, Photoshop is not exactly a GPU powerhouse. There isn't much reason to use a higher-end NVIDIA GPU, and even if we only look at the tasks that utilize the GPU, there is only about a 10% advantage at most for using NVIDIA over AMD. This isn't nothing, but it also isn't likely to be a deal breaker for many users.

In fact, the best GPU for you is likely going to be determined more by its performance in the other applications you use (After Effects, Premiere Pro, DaVinci Resolve, etc.) than by Photoshop - even if Photoshop is a big part of your workflow. Be sure to check our list of Hardware Articles for the latest information on how these GPUs perform with a variety of software packages.

Looking for a Photoshop Workstation?

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!
Tags: Photoshop, NVIDIA, AMD, NVIDIA vs AMD, Radeon RX 5700 XT, RTX 2060 SUPER, RTX 2070 SUPER, RTX 2080 SUPER

I know it has nothing to do with the article in question, but I would like to know if it's just me suffering from poor drive (SSD) read speeds on Adobe Lightroom export...

I have SSDs in the machine and it doesn't get much above 30 MB/s on export. Is it meant to be that way, or could it be something on my system?

Adobe, please let LR use my PC with all its power, I never asked you for anything!

Posted on 2019-08-15 00:48:09
Behive

Adobe just released a new update to LR. They've improved the performance generally quite a bit. Plus, it uses more of your graphics card, which takes the load off the CPU. Regarding drives, I find it best to have the catalog and LR cache on a fast 4K-read drive like the Intel Optane drives, then your RAW images on a fast sequential M.2 drive. Overall though, when exporting, it's all about the number of cores. Puget Systems has good Photoshop benchmarks showing this.

Posted on 2019-08-15 03:28:53
stowawayobx

LR/ACR export/save is just sending an image to a core, letting that chew on it for a bit, and then writing to disk. There's not going to be continuous drive access, and drive speeds don't matter.

Now, the efficiency of that processing is another story, and neither the GPU nor the new update appears to have any effect on save/export times.

Posted on 2019-08-21 23:19:37
Manthas

Suggesting that Nvidia released their mid-cycle refresh Super line of cards in response to a single new mid-tier card from AMD is disingenuous at best. Nvidia is following the pattern they've been using for many, many product cycles now. Their video cards operate on a two year cycle, alternating between a new mainline release then a mid-cycle refresh with modest upgrades. The only thing different this time around is the terminology. In previous years the refreshes were given the Ti moniker, but because they included the top of the line 2080 Ti at launch, they had to use something different, so we have the Super naming convention instead.

Unless Navi 2 really is the "Nvidia killer" GPU that internal memos from AMD suggest, Nvidia is in no danger from AMD, and has not had to "respond" to anything AMD has done in over a decade. So I'm not sure if you just didn't bother to do any research and tossed out the first thought that came to your mind or what, but this is just lazy. If you really are that far out of touch with how the hardware companies are operating today, then you should either learn, or stop writing this type of nonsense entirely.

When your article starts from a basic premise that is so easily demonstrated to be wrong, it paints the remainder of the article as completely untrustworthy.

Posted on 2019-08-15 15:24:36
RD94

I suggest you go to YouTube and look up a channel called CoreTeks, and see if you still feel the same way after watching his AMD vs. Nvidia videos. I'm pretty sure Puget's comments were directly referencing CoreTeks' same information, they just didn't go into all of the same technical how's and why's.

Posted on 2019-08-22 13:10:53
Pieter Trytsman

Wow, what an overreaction to an article not aimed at GPU buffs but rather Photoshop users, and in that context it gave the informative information intended. Please try not to break your keyboard; instead get out more and get some fresh air.

Posted on 2019-08-27 03:48:21
dogwalker350

Topaz has produced plug-ins based on artificial intelligence for sharpening, noise reduction, and image resizing. These require intense computation and can take many minutes per image. Topaz indicates that GPU processing can facilitate speedier results, and even suggests video cards that should work well. Growing numbers of Photoshop users take advantage of these new products (which can also process images stand-alone or in other image processing software hosts). Evaluating how well the components available in Puget computers (CPU, GPU) can speed up such processing would be a very useful addition to the testing you have done for general Photoshop processing.

Posted on 2019-08-15 18:17:26
tomdarch

Thanks for running these tests with the current crop of consumer GPUs, including the 5700XT. Looking forward to your test results in Resolve and other applications where GPU performance makes more of a difference. (Also very useful to know how little difference these make in Photoshop and AE.)

Posted on 2019-08-15 22:42:38

Resolve and Premiere Pro results should be up sometime mid/late next week. Resolve is going to be an interesting one since AMD usually performs very well for their price. In the past, only the Radeon VII managed to beat NVIDIA in terms of total performance, so I'm interested to see if the 5700 XT keeps up as well. Really hoping for a real NVIDIA competitor in the video editing field that isn't an odd-ball product like the Radeon VII was.

Posted on 2019-08-16 16:07:32
Eric Marshall

Awesome work on another comprehensive performance analysis!!!

Too bad it's just another example of Adobe's total lack of software optimization for modern hardware. No performance scaling with more CPU power or more GPU power in this application. As usual, Adobe is the bottleneck.

Posted on 2019-08-17 06:16:58
José R. Ramos

Hi Matt and all the good people at Puget Systems

Although we all appreciate Puget's efforts to test gaming cards, I wish that workstation cards were tested too. While gaming graphics cards may give a higher processing speed per dollar, they fail at something that may be more important than speed. Arriving at seeing in 8-bit sooner rather than in 10-bit per color is pointless to me, since I would not be visualizing the end product. It's like getting to an approximate address faster in a Ferrari with no GPS help vs getting to the right address a little later in a Corolla with GPS help. Previsualizing the end product may be more important in some applications, such as seeing gradations with 10-bit video, Photoshop and Lightroom in photography, and perhaps other scenarios.

Nonetheless, there is still another hurdle. I did not expect the better-known monitor manufacturers for creators (Eizo and NEC, as examples) to be manufacturing most or all of their monitors with 8-bit + FRC instead of true 10-bit per color. In fact, NVIDIA workstation video cards don't even work with the FRC eye-fooling technique. Eizo would not even answer a question I had about this FRC feature in their monitors, perhaps because I wrote that I use an NVIDIA workstation GPU. We are left with only a $3,000 Eizo and an over-$5,000 Eizo model for true 10-bit per color.

José R. Ramos

Posted on 2019-08-17 17:56:51

Hey José, as of July 29, 2019, GeForce cards now support 10-bit displays: https://www.nvidia.com/en-u... . However, be aware that Lightroom does not support 10-bit output in any form at the moment. I believe they have a ticket in their feature request forum somewhere, but it has been an ongoing request for a long time, so I doubt it will be added anytime soon.

At this point, the main value add for workstation cards like Quadro is the higher VRAM and slightly better reliability. We do have Quadro/Radeon Pro testing planned, we just typically have the testing for consumer and workstation cards separate since looking at just performance isn't really a fair comparison between those product lines.

Posted on 2019-08-27 16:51:10
Steve Burke

Thanks for an informative article.

Have you done similar tests using PTGui, which I gather makes more use of the GPU than Photoshop? The makers of PTGui recommend 4mB of RAM and as many cores in the GPU as possible.

Comments from others most welcome too!

Posted on 2019-09-05 13:28:44
Guillaume Chataignier

Hi, thank you for the test!

I have a question, and the answers I get from Google are not clear to me:

With an "art" monitor, which video cards offer a true 30-bit display?

Apparently NVIDIA now offers 10 bpc through their "Studio" drivers, even with their gaming cards (GTX and Titan), but what about AMD?

I struggle to find a proper answer to that. From what I understood, 10 bpc is available on gaming cards, but only in Windows and DirectX-based applications. For OpenGL-based applications, we have to use an AMD workstation card (like FirePro).
Is that still true in 2019? What about the 5700 XT? Is it software dependent? (Maybe only Adobe forces the use of pro cards?)

I plan to buy an Eizo CS2731 + an AMD 5700 XT and I would like to know what to expect ^^

Posted on 2019-11-26 14:28:51

AMD's spec page lists "10-bit HDR" in the text (https://www.amd.com/en/prod...), but my understanding is that is referring to 10-bit support in DirectX rather than OpenGL. I'm not 100% sure on that since we really don't sell Radeon cards (GeForce are faster at the same price points for the workflows we focus on), but I believe that in order to get a 10-bit display in an application like Photoshop, you would need either a Radeon Pro, GeForce, or Quadro card. This was the best answer I could find: https://community.amd.com/t... . The official response from AMD was that the Radeon VII doesn't support 10-bit in apps like Photoshop, and if that card doesn't, I highly doubt the 5700 XT (which is much more gaming focused) does.

Keep in mind that whatever application you are using also needs to support a 10-bit display. Photoshop and Premiere Pro do, but After Effects requires the use of a video monitoring card like a Blackmagic DeckLink.

Posted on 2019-11-26 19:01:27

Do you have plans to test cheaper graphics cards in Lightroom, Photoshop, Premiere, and DaVinci Resolve? Something like the GF 1650 Super, GF 1660 Super, RX 5500, or RX 5600 XT? Thank you for these tests.

Posted on 2020-02-05 06:08:21

Usually the lowest we go is the 1660 Ti (or SUPER, if there is one now) on the NVIDIA side and the 5700 XT on the AMD side. Below that, we simply have no demand from our customers, which makes it hard to justify spending the time and money on that testing.

Posted on 2020-02-05 06:48:44