While applications like Lightroom Classic utilize the GPU to accelerate a number of tasks, investing in a high-end GPU generally doesn’t net you much performance gain. With NVIDIA’s new RTX 30 series cards, will this continue to hold true, or is there a reason to invest in one of these new GPUs?
Lightroom Classic CPU performance: Intel Core 10th Gen vs AMD Ryzen 3rd Gen
Lightroom Classic has a number of interesting performance quirks – chief among them the fact that AMD processors are overwhelmingly faster than Intel processors for a number of tasks like exporting and generating smart previews. Will this hold true for the new Intel 10th Gen processors, or will we see Intel take over as our go-to recommendation for Lightroom Classic?
AMD Threadripper 3990X: Does Windows 10 for Workstations improve Adobe CC performance?
As AMD continues to release processors with more and more cores, we are getting to the point where there are concerns that the normal version of Windows 10 Pro is not able to effectively utilize all these cores. To find out, we decided to test the 3990X and a number of other processors with Windows 10 Pro for Workstations, as well as with SMT/HT disabled.
Lightroom Classic CPU performance: AMD Threadripper 3990X 64 Core
Lightroom Classic contains a number of tasks that can leverage a decent number of CPU cores to improve performance. The new AMD Threadripper 3990X, with a massive 64 cores, should excel when exporting images in particular, but many applications see less and less benefit as you get into extremely high core counts. Will Lightroom be able to leverage all 64 cores, or is there no benefit to using the 3990X over a much less expensive CPU like the Threadripper 3960X?
PugetBench for Lightroom Classic
Updated 12/17/2019. Want to see how your system performs in Adobe Lightroom Classic? Download and run our free public Lightroom Classic benchmark to see how your system compares to the latest hardware.
What is the Best CPU for Photography (2019)
While our hardware articles are extremely useful for picking the right CPU for your workflow, they only look at a single application and are often more technical than many readers may want. In this post, we will be discussing the best CPU to use for a photography workstation while keeping things at a relatively high level, so that it can help answer the question for anyone – regardless of how much they keep up with the latest tech.
Lightroom Classic CPU performance: Intel Core X-10000 vs AMD Threadripper 3rd Gen
AMD’s Ryzen processors are currently our go-to recommendation for Lightroom Classic, but both Intel and AMD are launching some very intriguing high-end desktop CPUs that may change things. On Intel’s side, the new X-series CPUs include a drastic reduction in price, while AMD has focused heavily on improving performance. Will either of these new processor lines end up taking the performance crown from Ryzen?
Lightroom Classic CPU performance: AMD Ryzen 9 3950X
When AMD launched their 3rd generation Ryzen CPUs, they took a commanding lead over Intel in terms of performance in Lightroom Classic. Now, AMD has released a new CPU called the Ryzen 9 3950X which increases the number of cores available on that platform to 16 physical cores. Will this allow AMD to extend their lead even further, or is Lightroom Classic not able to utilize those additional cores?
Lightroom Classic CPU Roundup: AMD Ryzen 3rd Gen, AMD Threadripper 2, Intel 9th Gen, Intel X-series
Lightroom Classic has changed dramatically over the last few years, with improved multithreading support and the recent addition of GPU acceleration. But exactly how much of a difference is there between the latest processors from both Intel and AMD? Does the higher core count on the new Ryzen CPUs make a difference?
Lightroom Classic CC 2019: Enhanced Details GPU Performance
In the latest version of Lightroom Classic CC (8.2), Adobe has added a new feature called “Enhance Details” which uses machine learning to improve the quality of the demosaicing process for RAW images. This is very GPU-intensive, so we wanted to see exactly how much faster it can be on a modern, high-end GPU.