What is the Best CPU for Photography (2019)

Written on November 25, 2019 by Matt Bach

Introduction

Picking which CPU to use in your workstation can be an overwhelming task, with hundreds of options to choose from. Should you use Intel or AMD? How much does core count matter? Will an expensive CPU be worth the cost? Even for those who enjoy keeping up with the latest technology it can be daunting, which is why so many of our customers love that they can simply talk to our consultants about what they are doing and let us take care of figuring out the best choice for their unique workflow.

With the launch of the Intel Core X-10000 series and AMD Threadripper 3rd Gen processors, we now know how the latest and greatest CPUs from both brands perform in the real world. We already have several articles for these new CPUs that examine how they perform in applications like Photoshop and Lightroom Classic. However, those articles tend to dive pretty deep into the details, which can make them a bit overwhelming for many readers.

In this post, we want to keep things less tech-heavy for those who do not have the time or interest to closely follow PC hardware. Choosing the right CPU should be possible for anyone, no matter how much time they have invested in keeping up with the latest tech.

What is the best CPU for photography in 2019?

Currently, there are four main processor families that you should consider for a photography workstation:

  • Intel Core 9th Gen (up to 8 cores, $499 max MSRP)
  • Intel Core X-series (up to 18 cores, $979 max MSRP)
  • AMD Ryzen (up to 16 cores, $749 max MSRP)
  • AMD Ryzen Threadripper (up to 32 cores, $1,999 max MSRP)

While your overall budget will typically limit the number of CPU models you may be considering, many of these product lines have overlapping price points, so you are still often left with a number of options to choose from. In addition, more expensive does not always mean faster; in many cases, a more expensive CPU can actually result in worse performance.

In order to help you pick the right CPU, we will be going over some of our benchmark results to give you an idea of the relative performance between each of the latest CPU options. Then, in our Conclusion, we will discuss our recommendation for which CPU models to use for different workflows and budgets.

Looking for a Photography Workstation?

Puget Systems offers a range of workstations that are tailor-made for your unique workflow. Our goal is to provide the most effective and reliable system possible so you can concentrate on your work and not worry about your computer.

Configure a System!

What does the CPU (processor) do?

In keeping with the goal of making this post approachable to everyone, we first want to briefly go over what the CPU does, particularly in photography applications. In a nutshell, the CPU (or processor) is the most critical component when it comes to performance. We are starting to see more and more applications use the GPU (video card) to accelerate a limited number of individual tasks, but even in these cases, the CPU is still being used to at least some degree.

Within a CPU, there are several factors that determine how fast it is, but they can be simplified into two main specifications: core count and core frequency (speed). If you want this broken down into layman's terms, we have a terrific video that explains them using a car analogy.

However, in addition to the number of cores and frequency, there are a ton of other factors that affect the real-world performance of a processor. The amount of cache (similar to short-term memory) and even the general architecture can make a huge difference in how the CPU actually performs. This is why it is especially inaccurate to use core count and frequency to compare Intel and AMD CPUs; they are simply too different for that kind of comparison to work.
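To make that concrete, here is a minimal, purely illustrative sketch of why a naive "cores times frequency" comparison breaks down. The IPC (instructions per clock) and scaling numbers below are hypothetical, not measured values for any real CPU:

    # Purely illustrative model: the IPC and scaling values below are hypothetical,
    # not benchmark data for any real CPU.

    def estimated_throughput(cores, frequency_ghz, ipc, usable_core_fraction):
        # Rough mental model: work per second is (cores the application can actually
        # keep busy) x (cycles per second) x (work done per cycle).
        return cores * usable_core_fraction * frequency_ghz * ipc

    # Two made-up CPUs with the identical "cores x GHz" spec sheet value of 32...
    cpu_a = estimated_throughput(cores=8,  frequency_ghz=4.0, ipc=1.0, usable_core_fraction=1.0)
    cpu_b = estimated_throughput(cores=16, frequency_ghz=2.0, ipc=1.3, usable_core_fraction=0.5)

    print(cpu_a, cpu_b)  # 32.0 vs 20.8 - very different once IPC and scaling differ

The point is simply that per-core efficiency and how well an application scales across cores matter just as much as the spec sheet, which is exactly why we lean on real application benchmarks instead.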

So, if pure specs are not a reliable way to pick a CPU, what is? Honestly, that is why we spend the amount of time we do testing a plethora of processors in a range of applications. We have tried many different methods over the years, and in the end, actual performance benchmarks are the only reliable and accurate way we have found to determine how a specific CPU will compare to others.

Photoshop CPU Performance

A lot is going on in our performance charts below, so before getting into it we wanted to provide a key regarding the color scheme we used.

  • Light blue = Intel consumer CPUs (9th Gen)
  • Dark blue = Intel HEDT CPUs (X-10000 Series)
  • Light red = AMD consumer CPUs (Ryzen 3rd Gen)
  • Dark red = AMD HEDT CPUs (Threadripper 3rd Gen)

[Chart: What is the best CPU for Photoshop 2019]

Full benchmark and test data available in our post:
Photoshop CPU performance: Intel Core X-10000 vs AMD Threadripper 3rd Gen

Photoshop isn't all that great at using a large number of cores, so there isn't much of a reason to use the more expensive Intel X-series or AMD Threadripper CPUs. Recent technology advances have made it so that higher core count CPUs are no longer much slower in these kinds of workloads, but in this case, you simply won't get much of a return on your investment.

In fact, what is amazing is that between the slowest and fastest CPUs we tested, there is less than a 10% difference in performance. What this means is that your choice of CPU will likely be dictated by the rest of your workflow, even if you spend the majority of your time in Photoshop.

Lightroom Classic CPU Performance

Full benchmark and test data available in our post:
Lightroom Classic CPU performance: Intel Core X-10000 vs AMD Threadripper 3rd Gen

Unlike Photoshop, there is a lot of performance variation between different CPU models in Lightroom Classic. Luckily, choosing between Intel and AMD is at least an easy choice, since AMD is the faster option across the board.

For many, this makes your choice of CPU pretty straightforward: simply get the fastest AMD Ryzen or Threadripper CPU you can afford. However, if you scroll to the second and third charts, you will see that there is a bit of nuance here depending on whether you care more about performance in active tasks like culling or passive tasks like exporting.

If you don't need the best export performance, you can safely ignore the AMD Threadripper CPUs in their entirety. They are fine for tasks like scrolling through images and swapping between the various modules in Lightroom Classic, but they are no faster (or slightly slower) for these tasks than the more affordable Ryzen processors like the 3900X 12 Core or 3950X 16 Core.

On the other hand, if you regularly export a large number of images at a time, the AMD Threadripper 3960X 24 Core can significantly decrease export times. Compared to the AMD Ryzen 3950X 16 Core, the 3960X 24 Core is about 50% faster for these tasks, or more than 2x faster than the Intel Core i9 9900K. To put this into perspective, here are some theoretical export times you can expect with a few of these CPUs:

  • Intel Core i9 9900K: 20 minutes
  • Intel Core i9 10900X: 17 minutes
  • AMD Ryzen 9 3950X: 14 minutes
  • AMD Threadripper 3960X: 9.5 minutes
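If you want to sanity-check the relative claims above, the speedups follow directly from those theoretical export times. Here is a quick sketch using only the minutes listed above:

    # Relative export performance computed from the theoretical times listed above.
    export_minutes = {
        "Intel Core i9 9900K": 20.0,
        "Intel Core i9 10900X": 17.0,
        "AMD Ryzen 9 3950X": 14.0,
        "AMD Threadripper 3960X": 9.5,
    }

    fastest = export_minutes["AMD Threadripper 3960X"]
    for cpu, minutes in export_minutes.items():
        # A CPU that takes N times as long is N times slower for this batch export.
        print(f"{cpu}: {minutes / fastest:.2f}x the export time of the 3960X")

    # 14 / 9.5 is about 1.47, so the 3960X is roughly 50% faster than the 3950X,
    # and 20 / 9.5 is about 2.1, so it is a bit over 2x faster than the 9900K.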

Oddly enough, the Threadripper 3970X 32 Core is actually slower than the 24 Core for unknown reasons, so while the 3960X 24 Core can be highly beneficial for some Lightroom Classic users, we do not recommend using the more expensive 3970X at this time.

What processor should you use for a Photography workstation?

At the moment, we recommend the AMD Ryzen 3rd Generation processors for a general photography workstation. Depending on your budget, the Ryzen 7 3800X 8 Core, Ryzen 9 3900X 12 Core, and Ryzen 9 3950X 16 Core are all excellent choices that should be more than capable of handling even the most challenging photography workflows.

Overall recommended CPU for Photography: AMD Ryzen 3rd Generation

However, as is almost always the case, there are some times when a different CPU will make more sense:

Best CPU for processing large image sets in Lightroom Classic

The AMD Ryzen 3rd generation CPUs are still excellent for this, but if your budget allows for it, you may want to consider upgrading to the AMD Ryzen Threadripper 3960X 24 Core to speed up tasks like exporting. For these kinds of passive tasks, the AMD Threadripper 3960X 24 Core is on average 50% faster than the Ryzen 3950X. Note, however, that the Threadripper 3970X 32 Core is not any faster, so there is no reason to spend more money on that specific model.

Best CPU with Thunderbolt support

Thunderbolt is not something everyone needs, but if you own a number of Thunderbolt devices, you likely want to use a platform that has support for those devices. Unfortunately, no AMD platform has certified Thunderbolt support at this time which means you should use an Intel-based platform. ASRock has a few AMD motherboards that have their own un-certified implementation, but we highly recommend sticking with certified Thunderbolt solutions due to how finicky it can be on PC.

In this case, the Intel Core 9th Gen CPUs like the Core i7 9700K 8 Core and Core i9 9900K 8 Core are great all-around choices that are fairly affordable, and if you want to maximize performance for passive tasks like exporting, any of the Intel X-series CPUs will be slightly faster (and slightly more expensive). However, be aware that there is little performance difference between each of the X-series models in Lightroom Classic, so you may as well stick with the less expensive options like the Core i9 10900X 10 Core.

Hopefully, this post has helped you choose the right CPU for your photography workstation. Keep in mind that even with these recommendations, the exact right CPU for you may differ depending on the applications you use and what you do in those applications. If you are interested in purchasing a Puget Systems workstation, we highly recommend speaking with one of our technology consultants (425.458.0273 or sales@pugetsystems.com), as they can help you get the exact right system both for the work you do today and for what you hope to do in the future.


Tags: Intel 9th Gen, Intel X-series, Intel vs AMD, AMD Ryzen 3rd Gen, Intel X-10000, Lightroom Classic, Photoshop, Photography
Misha Engel

Intel has certified leaks (CVEs), a lot of them, and to make things even better they also have a lot of leaks coming up that will get certified soon. https://www.tomshardware.com/features/intel-amd-most-secure-processors.
ASRock's implementation of Thunderbolt on AMD motherboards works just fine, as do the leaks in Intel CPUs (certified and uncertified).

Royalty situation
On 24 May 2017, Intel announced that Thunderbolt 3 would become a royalty-free standard to OEMs and chip manufacturers in 2018, as part of an effort to boost the adoption of the protocol.[69] The Thunderbolt 3 specification was later released to the USB-IF on 4 March 2019, making it royalty-free, to be used to form USB4.[70][71][72] Intel says it will retain control over certification of all Thunderbolt 3 devices, though it will not be mandatory.[73]

Before March 2019 there were no AMD chipsets or computers with Thunderbolt support released or announced because of royalty situation (problem was Intel's refusal to certify non-Intel platforms). However, the YouTuber Wendell Wilson from Level1 Techs was able to get Thunderbolt 3 support on an AMD computer with a Threadripper CPU and Titan Ridge add-in card working by modifying the firmware, indicating that the lack of Thunderbolt support on non-Intel systems is not due to any hardware limitations.[74][75] As of May 2019, it is possible to have Thunderbolt 3 support on AMD using add-in card without any problems.[76] And motherboards like ASRock X570 Creator already have integrated Thunderbolt 3 support.[77]

https://en.wikipedia.org/wiki/Thunderbolt_(interface)

The choice is up to the user/customer: certified leaks in a CPU, or non-Intel-certified but working Thunderbolt on safer AMD CPUs.

And by the way, AMD officially advises using at least a 280mm AIO cooling solution on the 3950X (less heat means more performance with Zen 2).

Posted on 2019-11-26 01:35:40

We never said that Thunderbolt won't work on AMD boards, just that it is uncertified by Intel which keeps us from recommending it to people who need Thunderbolt to just work. We've been selling systems with Thunderbolt for a long time, and it is honestly still a big mess on PC. We've used pretty much every brand out there with both internal Thunderbolt and add-on cards, and it has always been very problematic.

The problem is that Thunderbolt is a very complex port. Not only does it handle data, but it also does video, as well as both at the same time! I can't say that I know all the inner workings of it, but my understanding is that you have a whole bunch of things working together, from the chipset, to the Thunderbolt controller, the cable, and the Thunderbolt device. If any one of those is not using the proper firmware or driver, or there is some other random conflict, you can run into issues with devices either not showing up, devices dropping randomly, or not getting the performance you should.

Right now, there is only one type of Thunderbolt we are comfortable selling in our systems, and that is Gigabyte motherboards with Thunderbolt built into the board. We have not had anywhere near as good an experience with other brands of motherboards, add-on cards, or even Gigabyte motherboards paired with Gigabyte Thunderbolt add-on cards. That doesn't mean they won't work for you; many times Thunderbolt will work fine on a bunch of devices, but will have issues with just a handful. If that handful is not something you use, then it is no problem, but we don't know what devices our customers may end up using, so we have to make sure it will work with pretty much everything.

All those problems we have had to deal with for years were all on certified platforms, which is why we are extremely hesitant about recommending people use Thunderbolt on a platform that is uncertified at all. If you are building your own system and want to use it, absolutely go ahead. We are simply making recommendations based on our 19 years of experience in the industry.

Posted on 2019-11-26 18:33:40
Misha Engel

Try the ASRock X570 boards with built-in Thunderbolt 3 ports, and yes, those Thunderbolt add-on cards are a pain in the...

Posted on 2019-11-28 12:14:22
Mark

It would be more useful if you broke it into content creation platforms. (You could exclude commercial video production.)

I think it would help if you threw in some actual numbers of images to describe a large amount.
There are different types of shooters.

(Trying to keep the options in general terms)

1) Low number of images with heavy editing: think fashion or headshots. 500 or fewer images. (Photoshop)

2) High number of images from multiple cameras: 1-day wedding. Start with 2,000-3,000 and export 500-800. (Lightroom)

3) High number of images from multiple cameras: multiple-day event. Start with 10,000+ and export 3,000-5,000 images. (Lightroom)

4) High number of images + video editing: 1-day event. 2,000-3,000 images and export 500-800, plus 100+ GB of video to deliver a 2-5 minute highlight.
(Lightroom + Premiere, or Lightroom + DaVinci Resolve)

5) Video-only option: wedding/event, all day. 200 GB-1 TB of video, depending on recording format.
(DaVinci Resolve, Premiere, or Final Cut)

I have personally done #2 & #3 for several years.

I am breaking into a video only option. (#5)

just my 2 cents.

Appreciate the information you provide.

-Mark

Posted on 2019-11-26 05:42:45

The reason why we don't do that for things like performance is that it is really subjective. You may feel that being able to export 500 images in 9 min with a 3960X versus 14 min with a 3950X is well worth the cost, but someone else looks at the price jump and can't make that justification. Even increasing that 10x, going from 90 minutes to 140 minutes sounds worth it to me personally, but way more people than I expect are fine with longer export times since they just have it run overnight. For others, anything that can reduce the time is worth any cost and they buy a new system as soon as they can get just a 10% performance bump.

We do have more firm recommendations for things like RAM capacity and storage speed, but those are more system requirements rather than user preference/priority. I wish there was some sort of rule that would make sense for performance like that, but so far we have found that there are just too many different situations and needs for that to work.

Posted on 2019-11-26 18:22:42
ImmigrantParadigms

Do I need a hardcore water cooler if I went with the Ryzen 9 12 Core processor? Or would I be fine using a regular liquid cooler like the H110 from Corsair with a 2-fan radiator?

Posted on 2019-11-29 19:25:45

I don't think there is any need for liquid-cooling on Ryzen. We use a nice Noctua heatsink and fan combo and it works really well at stock speeds.

Posted on 2019-11-29 19:47:15
ImmigrantParadigms

Thank you William!

Posted on 2019-11-29 21:18:35

Stock speeds shouldn't tax even a basic air cooler. If you want to OC, even with PBO, then a better cooler is recommended.

Posted on 2019-12-03 17:26:32
Michael B

Actually, for the 3950X, AMD recommends liquid cooling. Though this sounds scarier than it actually is, as liquid cooling for the CPU alone is not very complicated.

Posted on 2019-12-05 21:29:15
qwRad

Thanks for the article. Good stuff as always. Have you done any recent articles on SSD/HDD setup or performance in Lightroom/Photoshop/Premiere etc? I know you have the old article about disk recommendations for Premiere (best to have at least two disks) but would be nice to have an update on this topic with the 2020 version and maybe recommendations about drive setup for Lightroom too? And maybe at the same time see if catalog size matters for performance?

Posted on 2019-12-04 05:56:47

Our disk setup recommendations haven't really changed all that much. Really the only thing we would do in an updated version is push NVMe drives a bit more, since the price of those has dropped considerably. They still aren't really necessary in most cases, but if you can get a little boost for not much money, you may go for it.

For Lightroom, it really shouldn't matter all that much. We've done some internal testing and didn't notice a difference when moving the cache, catalog, and images to different drives. I would like to do a bit more in-depth analysis at some point though. As for catalog size, it isn't supposed to matter according to everything I've heard from Adobe, but that is something else that would be nice to test when we get a chance, just to confirm it.

Posted on 2019-12-05 21:47:43

New build is an overclocked Ryzen 3800X with 64GB RAM and a GTX 1660 on an X570-Plus, running Win10 and Photoshop CC 2020. I had higher speeds with an all-core 4.4GHz overclock, but had to boost voltage to 1.4V, which isn't good for longevity. Dialed it back to 1.3V and I'm still getting good performance with low temps. The CPU package hits 71C under load and 38C at idle, and that's with air cooling.
https://uploads.disquscdn.c...

Posted on 2019-12-04 07:50:42
Frank Field

Clearly, work such as this is one of the value-adds from Puget Systems. I wonder if you have any historical perspective on this question. That is, do Intel and AMD tend to swap off first place as each introduces new generations of processors? Or, has AMD had a distinct edge over time? Some photographers are moving away from Lightroom to competitive products such as Capture One. Wonder if you have done any testing. Adobe has a reputation for being on the trailing edge of multi-core programming, tending to leave a lot of computing cycles sitting on the table.

Posted on 2019-12-05 14:34:57

Intel and AMD have been pretty static for a long time. From a very generic perspective, AMD was the "value" option, while Intel was the "performance" option. These latest CPUs from AMD have really flipped things on their heads, and now AMD is in many instances the higher performance choice. Whether that will last or not is really hard to guess. It usually takes a long time for this kind of technology to make these leaps (AMD has been working towards this for years), so it will probably stay this way for a while. On the other hand, Intel is a massive company with insane resources, so they could dump a ton into R&D and come out on top again with their next CPU launch.

As for Capture One, that is one of the next apps we will probably be looking at. I think we want to tackle Avid Media Composer on the video side first though, and it will depend on how long it takes us to get that up and running as far as figuring out how to benchmark it and giving our service staff all the training they need to properly recommend hardware and support issues.

Posted on 2019-12-05 21:44:14
photogeezer

Matt, I recommend that your future articles about photography solutions include information about the coming AI-based solutions. I ordered a Quadro P2000 with my Puget Systems PC because of the requirements of my Eizo monitor, but it turns out I really need that Nvidia GPU and 5 GB of video memory for my new Topaz AI software, which is a great suite of photo-editing software. Processes which take seconds on my PC take minutes on my laptop, which only has 1 GB of video memory. Photographers need to know that it's the GPU which is the workhorse of AI computing.

Posted on 2019-12-05 14:50:11

AI is still pretty new in the photography field. Topaz is one of the biggest, but at the moment we don't do much in the way of performance testing for plugins (we are working on that though!). Lightroom now has an "enhanced details" feature that uses the GPU, which we looked at a while back (https://www.pugetsystems.co...), but from everything we've heard, not many people are actually using it.

It is definitely great to see Adobe starting to use the GPU and AI to improve performance. As that kind of thing becomes more common in people's workflows, we will definitely start looking at it in more detail!

Posted on 2019-12-05 22:13:28
Jim Reid

This is good information. I have a system with an Intel 8700 that you built for me several years ago. Last summer I moved away from Lightroom to Capture One since I believe the performance of Capture One is superior in many ways. I still use Photoshop for some of my needs. My experience with this CPU and Capture One has been very satisfactory. Do you have any test information on the newer hardware with Capture One?

Posted on 2019-12-05 17:01:13

Capture One is actually one of the next apps we will probably be looking at. I'm a bit worried about how we will do the actual testing, since the API is not that great from what I could find, and we really need to have automated testing in place because we test so many different hardware combinations. That may take us a bit to figure out.

I also think we want to tackle Avid Media Composer on the video side before getting to Capture One, and it will depend on how long it takes us to get that up and running as far as figuring out how to benchmark it and giving our service staff all the training they need to properly recommend hardware and support issues.

Posted on 2019-12-05 22:15:02
Frank Field

The folks at Capture One should be highly motivated to open an API for you and support your testing. You should not be shy about contacting them. I know of no one else doing the service for photographers that Puget is doing here. It really does help us make fact-based decisions. I can't say enough good things about the experience I've had with my now nearly four-year-old Puget Serenity (i7-6700K).

Posted on 2019-12-05 23:04:51

We definitely will be reaching out when we get to that point, and hopefully they will be receptive to helping out. In the end, the testing and hardware analysis we do directly helps their own customers, so it really is a win-win for everyone involved. I just know that software developers are often in a crunch trying to get bugs fixed, new features added, etc. and it can be hard to work in dev time that doesn't directly impact their user experience.

Posted on 2019-12-06 00:26:49