Read this article at https://www.pugetsystems.com/guides/899

Photoshop CC 2017 NVIDIA GeForce GPU Performance

Written on February 15, 2017 by Matt Bach


Since it was first added in Photoshop CS6, Adobe has been continuously improving and expanding support for GPU acceleration in Photoshop. Since both hardware and software are constantly evolving, we like to periodically evaluate how well Photoshop is able to utilize various video card models.

In this article we will be looking at the performance of Photoshop CC 2017 with the latest GeForce video cards from NVIDIA, including the GTX 10 series and the Pascal-based Titan X. These cards are easily the most common models used in our Photoshop workstations due to the excellent performance they achieve at a relatively low cost.

The one downside to these cards is that they do not support 10-bit displays which NVIDIA currently reserves for their Quadro line of cards. If you use a 10-bit display, we recommend checking out our Photoshop 2017 NVIDIA Quadro GPU Performance article to see how the various Quadro cards perform.

Test Setup

To see how the different GeForce cards perform in Photoshop, we used the following workstation hardware and software:

Testing Hardware
Motherboard Asus PRIME Z270-A
CPU Intel Core i7 7700K 4.2GHz (4.5GHz Max Turbo) 4 Core
RAM 4x Crucial DDR4-2400 16GB (64GB total)
Hard Drive Samsung 850 Pro 1TB SATA 6Gb/s SSD
OS Windows 10 Pro 64-bit
Software Photoshop 2017.0.1

This hardware is essentially what we use in our Photoshop Recommended System and should be an excellent platform for our testing. The different video cards we will be testing are:

  • Intel HD Graphics 630 (integrated)
  • NVIDIA GeForce GTX 1050
  • NVIDIA GeForce GTX 1050 Ti 4GB
  • NVIDIA GeForce GTX 1060 6GB
  • NVIDIA GeForce GTX 1070 8GB
  • NVIDIA GeForce GTX 1080
  • NVIDIA Titan X (Pascal)

To benchmark the performance difference between each card we timed how long it took to complete various actions that are supposed to be able to utilize the power of the video card. These actions were applied to a 360MP image (21500x16718) in order to ensure that they took long enough for us to get accurate timing results.

The specific actions we tested are:

  • Field Blur
  • Iris Blur
  • Tilt-Shift Blur
  • Smart Sharpen
  • Render - Tree
  • Lighting Effect
  • Camera Raw Filter
  • Image Size (Preserve Details)
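The timing approach described above can be sketched in a few lines. This is a minimal illustration of the methodology, not Puget's actual benchmark harness; `time_action` and the dummy workload are hypothetical stand-ins for applying one of the Photoshop actions listed above:

```python
import time

def time_action(action, runs=3):
    """Time a stand-in for a Photoshop action several times and
    return the fastest result, reducing noise from background tasks."""
    results = []
    for _ in range(runs):
        start = time.perf_counter()
        action()  # stand-in for e.g. applying Field Blur to the 360MP image
        results.append(time.perf_counter() - start)
    return min(results)

# Dummy CPU-bound workload standing in for a GPU-accelerated filter
elapsed = time_action(lambda: sum(i * i for i in range(100_000)))
print(f"Fastest of 3 runs: {elapsed:.3f} s")
```

Using a very large test image, as we did, serves the same purpose as repeating the run: each action takes long enough that timer resolution and system noise become negligible.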

Benchmark Results

If you take even a quick look at the charts above, it is obvious that the results are very polarized. Some actions like the various blurs and "Smart Sharpen" show excellent performance gains with even a low-end GPU compared to the integrated graphics, and pretty decent gains with the higher-end GPUs (up to the GTX 1080 at least). Others like "Render - Tree" and "Lighting Effect" show a good performance gain from using a discrete GPU, but not much of a reason to get a higher-end card.

Interestingly, the "Camera Raw Filter" and "Image Size (Preserve Details)" actions show almost no difference between integrated graphics and any of the GPUs we tested. This means that from a performance standpoint it shouldn't matter what GPU you use for these actions.


Since the performance benefit from the different GPU models changes drastically depending on the specific action, we considered not having an overall average chart in our conclusion. However, since most Photoshop users perform a wide variety of actions throughout the course of their work, we decided that it is still a valid way to present our conclusions. Just keep in mind that if there is a specific action that you hate waiting on and want to optimize your system for, this average is not going to be as accurate as looking at the results for just that singular action.

Photoshop CC 2017 GeForce GPU Acceleration Benchmark
Looking at the average, it is clear that there is a benefit to using a discrete GPU instead of simply relying on the integrated graphics that come with an Intel CPU. However, beyond even a low-end GPU like the GTX 1050 or 1050 Ti, the performance gains diminish as you spend more and more money. There is still some improvement, but it is only going to be about 3-4% for every model you go up, and it plateaus at the GTX 1080.
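To make concrete how an overall average can mask per-action differences, here is a small sketch. The timings below are hypothetical placeholder numbers for illustration only, not our measured results:

```python
# Hypothetical per-action times in seconds (NOT measured data),
# illustrating how an overall average can hide per-action behavior.
times = {
    "Smart Sharpen":     {"integrated": 40.0, "GTX 1060": 8.0},
    "Camera Raw Filter": {"integrated": 12.0, "GTX 1060": 11.5},
}

# Per-action speedup: one action scales dramatically, the other barely at all
for action, t in times.items():
    speedup = t["integrated"] / t["GTX 1060"]
    print(f"{action}: {speedup:.1f}x faster on the discrete GPU")

# The overall average sits between the two and describes neither action well
avg = sum(t["integrated"] / t["GTX 1060"] for t in times.values()) / len(times)
print(f"Average speedup: {avg:.1f}x")
```

This is exactly the pattern in our results: a blur or sharpen action may be several times faster on a discrete GPU while "Camera Raw Filter" is essentially unchanged, and the average splits the difference.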

If possible, however, we would still advise using at minimum a GTX 1060 6GB, or possibly the GTX 1050 Ti 4GB. Even if you don't feel that you need the extra performance from one of these cards, the 4-6GB of VRAM can be very useful for your system as a whole. Photoshop itself won't need it unless you work with very large images - like the 360MP image we used in our testing - but it does allow you to comfortably use multiple monitors or a 4K monitor without any problems. If you want to use multiple 4K monitors, you might opt for the GTX 1070 8GB or above to ensure you have enough VRAM and raw power to drive those displays.

Photoshop Workstations

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!
Tags: GTX, 1050, 1050Ti, 1060, 1070, 1080, Titan X, Photoshop

Thanks for this very interesting article! I think it really shows the shortcomings of Photoshop's support for GPU acceleration, as only the Smart Sharpening seems to scale with the available computing power of the GPUs. The cross comparison with Quadro GPUs is interesting too.

Posted on 2017-04-20 08:54:09

Very helpful! I was about to buy a Quadro for 2D when a GeForce is almost exactly the same for half the price!

Posted on 2017-06-29 01:53:27

thank you for the test!

I'd like to know if the GeForce GT 1030 performs similarly to the 1050?

Posted on 2017-10-11 15:50:09

Looks useful, but I also wonder why no AMD cards were tested? I guess Ps is optimized for NVIDIA, but still.

Posted on 2017-10-28 14:21:58

Because CUDA cores (found only in NVIDIA cards) are better/more efficient for GPU parallel processing than OpenCL (available on NVIDIA and AMD).
So since CUDA is much better than OpenCL, they only need to measure CUDA performance, which excludes all AMD cards.
I'm not even sure Adobe codes their stuff to use OpenCL, since CUDA is that much better.

Posted on 2017-10-30 18:49:13

"CUDA is that much better" - I'm not sold. If AMD cards are so damn good for crypto-currency mining, and often called "too computation heavy" while nVidia is better balanced for gaming, this simplistic "nVidia good, AMD bad" doesn't quite work for me.

You could explain what makes CUDA so much better. Apparently even the Intel HD 630 doesn't do an awful job except for Smart Sharpen, though of course it loses in the other benchmarks too. And the CUDA core count doesn't seem to make a huge difference outside Smart Sharpen. So the "parallel processing" doesn't seem to be used to such a massive extent as to involve all the cores of the GPU in most of the application.

Posted on 2017-10-31 08:43:58
mike ware-lane

Given that you are specifically reviewing for Photoshop, it's interesting that you have made no reference to the Adobe RGB gamut percentage of the cards; this seems to me an important element of any such review.

Posted on 2018-05-14 20:16:37
David Hasbrouck

That's up to the monitor.

Posted on 2018-10-19 03:50:11

Yep, David is right. The important part from a performance standpoint is the bits per channel of the image itself, not what is going out to the display. I've been considering adding 16 bits/channel to our testing, but I'm unsure whether it will actually be any different beyond simply taking longer for things to complete. That is something I still need to test, however.

Posted on 2018-10-19 15:33:34