Read this article at https://www.pugetsystems.com/guides/1612

Lightroom Classic CPU performance: AMD Ryzen 9 3950X

Written on November 14, 2019 by Matt Bach


When AMD released the first of their 3rd generation Ryzen processors back in July 2019, they were quickly established as the fastest processors for Adobe Lightroom Classic. Their lead over Intel was not small either: the Ryzen 9 3900X was a very impressive 22% faster than the Intel Core i9 9900K in our Lightroom Classic benchmark.

Now, AMD is launching one more 3rd generation Ryzen CPU - the AMD Ryzen 9 3950X. This processor features a staggering 16 CPU cores, which is really starting to blur the line between "consumer" and "HEDT" (High End Desktop) processors. However, the increase in core count comes with a fairly steep MSRP of $749. For comparison, both the AMD Ryzen 9 3900X 12 Core and Intel Core i9 9900K 8 Core have an MSRP of $499. If you want more information on the specs of this new processor, we recommend checking out our New CPU Announcement: AMD Ryzen 9 3950X post.

AMD Ryzen 9 3950X CPU for Lightroom Classic

In this article, we want to see whether the increase in core count (and price) is worth it for Adobe Lightroom Classic. However, since Intel is launching their new Core X-10000 series processors and AMD is launching their new 3rd Gen Threadripper processors in the near future, we are only going to compare the 3950X to a handful of Intel and AMD CPUs. If you want to see how it stacks up against a wider range of Intel and AMD processors, check back in the coming weeks for articles that will include the AMD Ryzen 3rd Gen, AMD Threadripper 3rd Gen, Intel Core 9th Gen, and Intel Core X-10000 series processors in a number of applications.

If you would like to skip over our test setup and benchmark sections, feel free to jump right to the Conclusion.

Looking for a Lightroom Classic Workstation?

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!

Test Setup & Methodology

Listed below are the specifications of the systems we will be using for our testing:

AMD Ryzen Test Platform
CPU AMD Ryzen 9 3950X
AMD Ryzen 9 3900X
CPU Cooler Noctua NH-U12S
Motherboard Gigabyte X570 AORUS ULTRA
RAM 4x DDR4-2933 16GB (64GB total)
Intel 9th Gen Test Platform
CPU Intel Core i9 9900K
CPU Cooler Noctua NH-U12S
Motherboard Gigabyte Z390 Designare
RAM 4x DDR4-2666 16GB (64GB total)
AMD Threadripper Test Platform
CPU Cooler Corsair Hydro Series H80i v2
Motherboard Gigabyte X399 AORUS Xtreme
RAM 4x DDR4-2666 16GB (64GB total)
Intel X-Series Test Platform
CPU Intel Core i9 9960X
CPU Cooler Noctua NH-U12DX i4
Motherboard Gigabyte X299 Designare EX
RAM 4x DDR4-2666 16GB (64GB total)
Shared Hardware/Software
Video Card NVIDIA GeForce RTX 2080 Ti 11GB
Hard Drive Samsung 960 Pro 1TB
Software Windows 10 Pro 64-bit (version 1903)
Adobe Lightroom Classic 2020 (version 9.0)
PugetBench V0.8 BETA for Lightroom Classic

*All the latest drivers, OS updates, BIOS, and firmware applied as of November 11th, 2019

A few notes on the hardware and software used for our testing: First, we have decided to standardize on DDR4-2933 memory for the Ryzen platform. The officially supported RAM speed varies from DDR4-2666 to DDR4-3200 depending on how many sticks you are using and whether they are dual or single rank; DDR4-2933 sits right in the middle and is also the fastest supported speed if you want to use four sticks of RAM. In fact, this is the speed we are planning on using in our Ryzen workstations once JEDEC DDR4-2933 16GB sticks are available.

The second thing to note is that we are using our soon-to-be-released Lightroom Classic benchmark. Lightroom Classic is not an easy application to directly benchmark, but we hope to have a publicly available version for download in the coming months.

Benchmark Results

While our benchmark presents various scores based on the performance of each test, we also wanted to provide the individual results. If there is a specific task that is a hindrance to your workflow, examining the raw results for that task is going to be much more applicable than the total scores.

Feel free to skip to the next section for our analysis of these results if you would rather get a wider view of how each CPU performs in Lightroom Classic.

AMD Ryzen 9 3950X benchmark results PugetBench V0.8 for Lightroom Classic

Lightroom Classic Benchmark Analysis

Our Lightroom Classic benchmark tests a wide range of tasks that are divided between "active" tasks (scrolling through images, brush lag, etc.) and "passive" tasks (exporting, generating smart previews, etc.). These results are then combined into an overall score to give you a general idea of how that specific configuration performs in Lightroom Classic.
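Per the scoring method detailed further down in the comments, the overall score is simply the average of the active and passive sub-scores multiplied by 10, with each sub-score normalized so the reference system scores 100. A minimal sketch in Python (the function name is ours, not part of PugetBench itself):

```python
# Minimal sketch of the benchmark's overall-score formula:
# overall = (active + passive) / 2 * 10, where each sub-score is
# normalized so the reference system scores 100. The function name
# is illustrative, not part of PugetBench itself.

def overall_score(active: float, passive: float) -> float:
    """Combine the normalized active and passive sub-scores."""
    return (active + passive) / 2 * 10

# Using the Core i9 9900K sub-scores quoted in the comments
# (87.7 active, 96.5 passive) reproduces its overall score:
print(round(overall_score(87.7, 96.5)))  # prints 921
```

Note that a configuration matching the reference system exactly (100 active, 100 passive) would score 1000 overall by construction.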

No matter how you look at it, however, the AMD Ryzen 9 3950X performs very well in Lightroom Classic. It may only be about 5% faster overall than the AMD Ryzen 9 3900X, but that still makes it solidly the fastest CPU we have ever tested for Lightroom Classic. Compared to the Intel Core i9 9960X 16 Core or Core i9 9900K, you are looking at close to a 25-30% increase in performance!

Is the AMD Ryzen 9 3950X good for Lightroom Classic?

Overall, the AMD Ryzen 9 3950X is currently the fastest CPU we have tested for Lightroom Classic, but the extra 5% performance over the AMD Ryzen 9 3900X for a 50% increase in cost is likely to be hard to justify for most users. Either way you look at it, however, the 3950X further solidifies AMD's lead over Intel for Lightroom Classic.

Keep in mind that the benchmark results in this article are strictly for Lightroom Classic. If your workflow includes other software packages, you need to consider how the processor will perform in all those applications. Currently, we have articles for Photoshop, Premiere Pro, After Effects, DaVinci Resolve, and a number of other applications.

In addition, both Intel and AMD have new processors coming out in the near future which may change the price to performance picture. We will be publishing more articles as these new processors launch, so be sure to keep a close eye on our list of Hardware Articles in the coming weeks.

Tags: Intel 9th Gen, Intel X-series, Intel vs AMD, AMD Ryzen 3rd Gen, AMD Threadripper 2nd Gen, Ryzen 9 3950X, Lightroom Classic, Lightroom

So great that you did the test with the new 9.0 version! Could you make the benchmark downloadable to execute yourself? It is very hard to know where you stand with performance on your current system.

many thanks for all the great work

Posted on 2019-11-17 20:31:29

We are working on getting the benchmark up for download. We were close about a month ago, then we realized Lightroom 9.0 was going to launch during Adobe MAX so we held off. It is definitely one of the more "finicky" of our benchmarks (none of these apps are made for benchmarking, so we have to do some "creative" things to get them to work). I would guess maybe in 2-3 weeks we can have a version for Windows up for download.

Posted on 2019-11-19 18:07:59
Nuno Fernandes

Are you going to do a Lightroom Classic 9.0 GPU performance test?
It seems that Adobe has improved the GPU usage in Lightroom and I would like to know if I should update my graphics card or not.
Great article, keep up with the great work.

Posted on 2019-11-18 01:26:24

That is definitely something I want to look at! Right now our plate is pretty full, but that is pretty close to the top of my to-do list. It will probably end up being a pretty big project since we are going to have to take into account how many displays are being used as well as the resolution for each display (since that apparently is a big factor for Lightroom GPU performance).

Posted on 2019-11-19 18:06:15
Nuno Fernandes

Great, can't wait to see those results.

Posted on 2019-11-21 05:59:02
Josef Kissinger

Yes, that would be great!

Posted on 2020-03-13 00:44:24

Please add the Quadro RTX 4000 to your GPU test.

Posted on 2020-03-13 20:43:45
Sebastian Guerraty

Is there any chance you might add Capture One to the software you benchmark in the future? It's a strong alternative to Lightroom and it has better performance, but I can't seem to find how it responds to different hardware.
Great article BTW :D

Posted on 2019-11-18 21:38:48

Capture One is on our list, but it honestly will likely be at least a year or longer before we are able to take it on - we have a few other major projects to take on first. I have played around with it a bit as well, and it looks like it is going to be really difficult to accurately and reliably benchmark. The API is about as barebones as it could possibly be, which makes it really difficult to create a benchmark that isn't going to constantly break.

Posted on 2019-11-19 18:04:57

Thanks for the info on Lightroom's inability to use SMT. When I bought the 3900X I immediately noticed the huge difference when exporting images. I used to run this task, go out for lunch, return home and listen to music for a few hours before it finished. Now I can just take a small break and get back to work. I haven't tried exporting with SMT off, but I have turned off SMT when editing and it runs so much smoother.

Posted on 2019-11-20 22:04:58
Manuel Findeis

First things first: Thank you for the lightning fast testing of the new 3950X!
However, it is very difficult to draw meaningful conclusions without a closer look at your numbers:
• You seem to have tested Intel with HT-on. Why? You already know it better!
• Looking at the NEF numbers, there is really no reason to spend even a penny more for a 3950x instead of a 3900x (for Photoshop and Lightroom only). In other reviews, however, there are indications that the 3950x could do significantly better than the 3900x with SMT-off. Can you confirm this?
• Compared to your roundup on October 16, 2019, the NEF export of the 3900X is suddenly considerably slower - by 35%! Why? Is this due to another "performance optimization" of Adobe? Or is it a problem with your benchmark?
• NEF-Export: Intel 9960x is about the same as 3900x/3950x as expected. But 9960x is suddenly much worse with smart previews in comparison to your October-Benchmark. Why?
• Video Card: Is it really meaningful to use a graphics card that would normally not be installed in a Lightroom computer (RTX 2080 Ti)? Some of the active tasks are accelerated by LR through the GPU ... Perhaps the difference in CPU performance would be much clearer with a lower GPU.
• Many Lightroom users still have a Core i7-4770K in use. For years, neither Intel nor AMD have done anything to really justify an upgrade. A faster export is certainly welcome. Most important, however, is the performance leap in editing. So it would be really exciting to compare the new CPUs to a Core i7-4770K or Core i7-7700K. Could you do this, please?
• In comparison today vs 6 years ago (in IT-Calender: When the dinosaurs still walked the earth): you have to pay twice as much for the CPU and twice as much for the motherboard, to get a 2-3 times faster export, but only about 35% more power in active tasks. Is this right? In my opinion that is a shame for Intel, AMD and Adobe altogether and not a reason to hype anybody. At least today we have the option to get twice the performance for twice the money.

Posted on 2019-11-21 10:16:52

Things have actually changed a bit regarding HT/SMT with Lightroom Classic V9.0. The export/smart preview performance drop is still present, but everything else saw a pretty sizable increase in performance with Hyperthreading enabled. So overall, performance is now better with HT enabled than with it disabled, which is why we didn't disable it for this testing.

The reason we use a 2080 Ti in our CPU-based testing is simply to make sure that the GPU is not a bottleneck. There is no need for that high-end of a GPU, but on the off chance that it does make an impact, we want to make sure that performance is being primarily limited by the CPU rather than another component. That is the same reason we use an NVMe storage drive as well. It shouldn't affect performance much, but good benchmarking is about removing variables to get the most accurate results possible.

As far as performance relative to older systems, that is something we've done in the past and want to do more of - we just don't have the bandwidth to do that in addition to keeping up with the latest hardware and software updates. We do have a couple of projects planned for 2020 that we hope will help quite a bit with this, however. One of the first things is to get our Lightroom Classic benchmark up for public download. Even if we do our own testing on older platforms, nothing is ever going to be as accurate as comparing the performance of the exact system you are using today to whatever the latest hardware is.

Posted on 2019-11-21 19:04:01

Hey Matt, there are some things that are not clear to me. You say that the score of 1000 is made from the average of the Passive Score + Active Score of a system that is based on the Intel 9900K. Yet, if I take a look at the scores of the 9900K, it's 921 (87.7 active + 96.5 passive). Can you please explain this? Also, waiting for the LR benchmark. Thanks for all the reviews you're making, they are really useful. Maybe you should set up a database system where people could upload their results to compare with others.

Posted on 2019-11-26 14:36:55

Yep, you are right on the average thing, the only thing you missed was that we multiply the average by 10 because a bigger number means it is more important. The average of 87.7 and 96.5 is 92.1, which x10 is 921.

And hold that thought on the upload thing - that is a project we are hoping to get to next year. It is looking like a pretty massive programming project to not only allow people to upload, but sort, search, compare, etc., but that is something we are really excited about doing. There are also some back-end features we want to make that make it even more complex, but hugely useful for our articles. So stay tuned on that!

Posted on 2019-11-26 19:04:35

I understood how you calculate the total score, (Active + Passive)/2*10, but I don't understand why the 9900K is not 1000. I will quote from your Lightroom benchmark procedure: How does the scoring work?
The scoring system used in our benchmark is based on the performance relative to a reference system with the following specifications:

Intel Core i9 9900K 8 Core
NVIDIA GeForce RTX 2080 8GB
64GB of RAM
Samsung 960 Pro 1TB
Windows 10 (1903)
Adobe Lightroom Classic CC 2019 (ver. 8.4)
Overall Score: 1000
Active Tasks Score: 100
Passive Tasks Score: 100

I don't understand why, if everything is normalized to the 9900K, the score for the 9900K is not 1000 (100 active / 100 passive).

Yeah, compare is really interesting. So far I'm using OCR to get everything into Excel and compare things. Either way, great job :)

Posted on 2019-11-26 19:13:35

Ah, got you, sorry I misunderstood! That reference score is completely static and won't ever change until we add tests to our benchmark that force us to re-create it. We don't re-use results from previous testing (or do so very rarely and clearly mark them), and since performance changes over time, that means that the 9900K will pretty much never hit exactly the same scores that it did on that specific day.

Since that reference score was made, we've upgraded to Lightroom Classic 9.0 and there have been numerous BIOS, driver, and Windows updates that have come through. All of those can affect performance, and it looks like we have overall seen a performance drop of about 8% with the 9900K since that time. I honestly don't know what specifically has caused that drop, but there have been a number of Intel security vulnerabilities that have been fixed at the expense of performance, and Lightroom Classic is adding more GPU acceleration, which sometimes can reduce performance at first until they get it really dialed in.
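To illustrate why a static reference drifts like this, here is a hypothetical sketch (PugetBench's actual implementation may differ): if each task score is derived from measured time relative to the original reference run, any platform-wide slowdown lowers the score of the very hardware the reference was built on.

```python
# Hypothetical illustration (not PugetBench's actual code) of why a
# static reference score drifts over time. Assume each task score is
# the reference system's task time divided by the measured time, x100,
# so the original reference run scores exactly 100.

def task_score(reference_seconds: float, measured_seconds: float) -> float:
    """Higher is better; the original reference run scores 100."""
    return reference_seconds / measured_seconds * 100

# The reference run itself, by construction:
print(task_score(120.0, 120.0))  # prints 100.0

# After OS/BIOS/app updates slow the same task by ~8% on the same
# hardware, the score no longer matches the original reference:
print(round(task_score(120.0, 120.0 * 1.08), 1))  # prints 92.6
```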

Posted on 2019-11-26 19:20:17

On my system, for the Develop sliders (the only performance characteristic I care about as I spend 90+% of my Lightroom time dragging sliders), V9.1 was a slowdown and 9.2 a huge slowdown. 9.2 is at least 4 times slower than the last V8 release.

Example for dragging the Noise Reduction Luminance slider, Fuji X-T1 RAW image: from almost real time to 3 seconds. 9.1's biggest reduction was undo (Ctrl-Z), now with 9.2, applying the slider is as slow as undo.

HP Z440, 6-core Xeon, 64GB ECC RAM, Quadro K1200 4GB, five SSDs (dedicated Samsung 2TB 860 EVO on the PCIe bus for the library/catalog and 1 TB Samsung 840 EVO for the Preview Cache), two 4K monitors but Lightroom full screen on just one monitor. Calibrating the monitors had no impact as expected, Datacolor Spyder 5 Pro.

Posted on 2020-03-13 20:59:22
Subir Thapa

So for A7R3 42MP .ARW files, is the 9900K better than the 3900X?

Posted on 2019-12-26 08:54:25

How about a comparison between the fastest affordable Quadro (the RTX 4000) and the RTX 2080 Ti? For a number of reasons which I won't go into here, there is a preference for Quadro cards.

Posted on 2020-03-13 20:42:46

A Quadro RTX 4000 is going to perform about on par with a RTX 2060 Super or RTX 2070. To get up to the same performance as a RTX 2080 Ti, you are going to need a Quadro RTX 6000, and even then it will likely be slightly slower. The Quadro line is mostly about having high amounts of VRAM, which is almost never a problem for photography applications.

Posted on 2020-03-13 20:58:04

As has been stated in the benchmarks, the video card, above a minimum level, doesn't much impact Lightroom performance (except for the Texture slider); if I upgrade from the K1200 to the RTX 4000 vs the RTX 2080 Ti, am I going to see equivalent performance with the RTX 4000? Interestingly, the Texture slider on the K1200 is real time, no measurable delay.

HPZ440 with all the bells and whistles.

Posted on 2020-03-13 21:05:41

The K1200 is a pretty old GPU, so you should notice some difference with the newer versions of Lightroom Classic where they have been improving GPU acceleration support. Between a Quadro RTX 4000 and RTX 2080 Ti, however, you likely won't notice much of a difference. Something like a RTX 2060 is probably a better choice since it will likely perform about the same in Lightroom Classic, but at a much lower cost.

I think beyond a small GPU upgrade, you are going to be bottlenecked by your CPU. The CPUs in the HP Z440 are almost 6 years old now, so that is what is going to be holding you back. Even with all the improvements Adobe has made in the last couple of Lightroom versions to take advantage of the GPU, it is still primarily a CPU-driven application.

Posted on 2020-03-13 21:24:48

There is only a 5-10% improvement above the E5-1650 V4 by the latest 6-core Xeon processors. Not sure there is anything meaningfully faster that will go into the current CPU socket. The 8-core Xeon will fit but considering how much slower it is, not sure that would be an upgrade.

Putting a dual slot video card right next to the HP Z Turbo Drive would likely create heat issues as Hard Disk Sentinel says it's the hottest running drive in my machine. Hence the attraction of a single slot card.

Thanks for your advice.

Posted on 2020-03-14 00:05:11

As always you guys do great work, thank you for the excellent write-ups and tests!

Posted on 2020-07-26 20:09:01