Read this article at https://www.pugetsystems.com/guides/1643

What is the Best CPU for Rendering (2019)

Written on December 20, 2019 by William George


Picking the processor, or CPU, is one of the most important decisions when building or buying a workstation - but there are dozens of options to pick from at any given point, with varying specifications and price points, so making the right choice isn't always clear-cut. Is Intel or AMD the way to go? How many cores do you need? Will a more expensive CPU be worth the cost? It can be overwhelming to keep up with all the possibilities, even for those who like to read up on the latest technology, and many review websites focus on game performance and artificial benchmarks rather than real-world applications. That is why we prefer to directly test the applications our customers use, and also offer free consulting to help potential customers know what the best system will be for their unique workflow.

Having recently tested the latest Intel Core X-10000 series and AMD Threadripper 3rd Gen processors, along with all the other major CPU launches from the past year, we can now look back and say what the best processors from 2019 are for a variety of applications. In the case of rendering, we published some articles looking at how these new CPUs perform in Cinema 4D and V-Ray Next. In this final round-up for the year, though, we want to provide a more general overview - including a brief discussion of GPU based rendering - along with some solid recommendations for what processors are the best for this type of workload going into the new year.

What was the best CPU for rendering in 2019?

Right now there are four main processor families, two from each of the big manufacturers, that are worth considering for most users:

  • Intel's mainstream Core 9th Gen (up to 8 cores, $499 max MSRP)
  • Intel's high-end desktop Core X 10000-series (10 to 18 cores, $979 max MSRP)
  • AMD's mainstream Ryzen 3rd Gen (up to 16 cores, $749 max MSRP)
  • AMD's high-end desktop Threadripper 3rd Gen (24 to 32 cores, $1,999 max MSRP)

There is a lot of overlap between the prices of the first three product lines listed there, while Threadripper comes in at a higher cost but also offers substantially more cores. Your budget is typically going to limit the number of CPU models you may be considering, though, so after we cover some background about CPUs and application-specific performance data we will list a few recommendations at different price points. Also, just a reminder: more expensive does not always mean faster, and in some cases, a more expensive CPU can actually result in worse performance.

Looking for a Rendering Workstation?

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!

What does the CPU (processor) do?

Since the goal is to make this article as widely usable as possible, I want to start with a brief description of what the CPU does - specifically in rendering applications. In a nutshell, the CPU (or processor) is the 'brain' of a computer, and it is where the bulk of the calculations happen that trace rays of light as they bounce around a scene. Millions of such rays have to be calculated to get a proper picture out of a rendering engine, and in the case of videos / films a lot of individual frames need to be rendered over the course of the whole project. However, the calculation of each ray's interaction with objects and light sources can be handled separately - making this one of the best workflows for multi-core processors.
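Since each ray (or pixel) can be computed independently, the work maps cleanly onto one worker process per CPU core. Here is a minimal Python sketch of that idea - the `shade_pixel` function is a hypothetical stand-in for a real ray tracer's intersection and lighting math, not anything from an actual rendering engine:

```python
import math
from multiprocessing import Pool

WIDTH, HEIGHT = 320, 240

def shade_pixel(index):
    """Toy stand-in for tracing one ray: each pixel is computed
    independently, with no data shared between calls."""
    x, y = index % WIDTH, index // WIDTH
    # Fake "lighting" value in [0, 1]; a real tracer would intersect
    # the ray with scene geometry and light sources here.
    return 0.5 + 0.5 * math.sin(x * 0.1) * math.cos(y * 0.1)

if __name__ == "__main__":
    with Pool() as pool:  # one worker per CPU core by default
        image = pool.map(shade_pixel, range(WIDTH * HEIGHT))
    print(len(image))  # 76800 shaded pixels
```

Because no pixel depends on any other, doubling the core count roughly halves the wall-clock time - which is exactly why high-core-count CPUs dominate this workload.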

That same lack of interdependence between each ray's calculations means this type of work can also be handled very well on GPUs, or graphics processing units. Those are the primary processors on video cards that excel at handling a lot of small calculations at the same time. That has led to the rise of GPU-based rendering over the last decade or so, and many of the popular rendering engines used today are part of that breed (including OTOY's Octane Render, Maxon's Redshift, and more). In those cases, the CPU itself suddenly matters a lot less: it is primarily just feeding data to the video cards, making clock speed far more important than core count. The number of PCI-Express lanes, and in turn video cards, that a system supports also depends on the CPU. We won't be looking at those sorts of details in this post, as we are focusing on CPU-based rendering, but if that is what you use then check out our other articles on the topic.

Getting back to our focus for today, there are several aspects of a CPU that impact how fast it is, but they can be simplified into two primary attributes: core count and frequency (clock speed). To break this down into simpler terms, we have a brief video that explains those concepts using a car analogy.

However, in addition to the number and frequency of the cores, many other factors affect the real-world performance of a processor. The amount of cache memory, the presence of features like Hyperthreading, and even the architecture (internal design) of the CPU can all make a huge difference in how well it performs. This is why we can't just look at the specs on paper and determine which processor is best, but instead need to test them running real applications and data sets.
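As an illustration, here is a deliberately naive "paper spec" score of cores × clock for three of the CPUs discussed in this article (base clocks are approximate, and the `ipc` factor is a placeholder for all the architectural differences we can only learn by testing). Treat it as a sketch of why specs alone can't rank processors, not a performance model:

```python
# Crude "paper spec" score: cores x base clock. Real results also
# depend on IPC, cache, memory, and SMT, so this is only a
# first-order guess, never a prediction.
cpus = {
    # name: (cores, approximate base clock in GHz)
    "Core i9 9900K":      (8,  3.6),
    "Ryzen 9 3950X":      (16, 3.5),
    "Threadripper 3970X": (32, 3.7),
}

def paper_score(cores, ghz, ipc=1.0):
    # ipc is a fudge factor for architectural differences;
    # without measured data it stays at 1.0.
    return cores * ghz * ipc

for name, (cores, ghz) in cpus.items():
    print(f"{name}: {paper_score(cores, ghz):.1f}")
```

Two chips with identical scores here can land far apart in a real benchmark once cache, memory bandwidth, and instruction throughput come into play.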

With that in mind, let's proceed to an overview of modern processor performance in a couple popular CPU rendering engines.

Cinema 4D CPU Rendering Performance

A lot is going on in the various charts below, so before getting into it we wanted to provide a key regarding the color scheme we used.

  • Light blue = Intel consumer CPUs (9th Gen)
  • Dark blue = Intel HEDT CPUs (X-10000 Series)
  • Light red = AMD consumer CPUs (Ryzen 3rd Gen)
  • Dark red = AMD HEDT CPUs (Threadripper 3rd Gen)

Cinebench R20 Multi-core CPU Rendering Performance Comparison

Cinema 4D's built-in rendering engine is very good at utilizing a lot of processor cores, and in terms of per-core performance sees good results on both Intel and AMD chips. However, AMD has the high core count market cornered for workstations right now, with their 24- and 32-core CPUs far outpacing what Intel has to offer in this space.

V-Ray Next CPU Rendering Performance

Full benchmark and test data available in our post:
V-Ray Next CPU Performance: Intel Core X-10000 vs AMD Threadripper 3rd Gen

We have two charts for V-Ray Next, because this rendering engine offers two different modes of operation. The first is a traditional CPU-based rendering pipeline, aptly named V-Ray Next CPU, while the second is called V-Ray Next GPU and primarily uses the video card(s) in a system to perform ray tracing. However, Chaos Group built emulation into that mode to allow the CPU to contribute to performance as well, so having a strong processor will provide added benefits on top of the video cards when running the GPU version.

In both cases, AMD's Threadripper chips take the top performance spots among the processors we tested. Interestingly, among the other models, the CPU-only mode seems to slightly favor Intel's Core X chips - while the GPU mode favors AMD Ryzen.

What processor should you use for a CPU rendering workstation?

As shown in the graphs above, CPU-based rendering engines excel on processors that have a lot of cores while also maintaining good clock speeds. Right now, for single-processor workstations, that means AMD's Threadripper chips are king. If you have room in your budget, you won't regret getting a 3960X or 3970X - and since they also have good per-core performance, they will excel in modeling and 3D animation applications too. Not everyone can afford a Threadripper, though, so here is our list of recommendations at varying prices:

  • AMD Ryzen 9 3900X (~$499)
  • AMD Ryzen 9 3950X (~$749)
  • AMD Threadripper 3960X (~$1,399)
  • AMD Threadripper 3970X (~$1,999)
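One rough way to compare these options is price per core, using the list prices above and each chip's core count (12, 16, 24, and 32 respectively). Keep in mind that per-core speed, platform cost, and memory support differ between them, so this is only a back-of-envelope sketch:

```python
# Rough price-per-core for the recommended CPUs, using the list
# prices from the article and each chip's core count.
recommendations = [
    ("Ryzen 9 3900X",      499,  12),
    ("Ryzen 9 3950X",      749,  16),
    ("Threadripper 3960X", 1399, 24),
    ("Threadripper 3970X", 1999, 32),
]

for name, price, cores in recommendations:
    print(f"{name}: ${price / cores:.2f} per core")
```

The Threadripper chips cost more per core, but for a well-threaded renderer that premium buys total throughput a mainstream platform simply cannot reach with a single socket.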

Both Intel and AMD indeed offer higher core counts - as well as multiple physical CPUs - in their server processor lines... but those get a lot more expensive, and would be better suited to a render farm than a workstation, especially since they sacrifice per-core speeds which other applications benefit from (like Cinema 4D, Maya, 3ds Max, and others). We do offer systems that can be equipped with dual Xeons and the like, so if that is something you are interested in just reach out to us and we can advise you as to whether or not they would be a good fit for your situation and budget.

Hopefully, this post has helped you choose the right CPU for your rendering workstation. Keep in mind that even with these recommendations, the right CPU for you may be different depending on the combination of programs you use and exactly what you do in those applications. If at all possible, we recommend speaking with one of our technology consultants (425.458.0273 or sales@pugetsystems.com) if you are interested in purchasing a Puget Systems workstation, as they can help you get the exact right system for both the work you do today and what you hope to do in the future.

Tags: Intel 9th Gen, Intel X-series, Intel vs AMD, Rendering, CPU, AMD Ryzen 3rd Gen, AMD Threadripper 3rd Gen, Intel X-10000, Cinebench, Cinema4D, Cinema, 4D, V-Ray

I can't believe that next month we could see a new graph with the 3990X - the graph scaling is gonna look funny :-).
P.S.: Can you comment on Intel's 10980XE availability? Here in Israel there is only the 10900X, and even that is only on paper with very limited stock; the 10940X/10980XE aren't even listed.
And we never had Intel CPU shortages here before.

Posted on 2019-12-24 16:21:31

I know we just got a batch of the 10980XE in this week, but in general we've been having trouble getting adequate supply of both Intel and AMD's higher-end stuff recently. I think the heavy competition in this space is pushing them to launch before they have solid supply in place, hence the shortages in early weeks / months after launches.

Posted on 2019-12-26 20:02:46
Geoffrey Mark

How does the AMD 2990WX 32-core fare for V-Ray Next in this comparison?

Posted on 2019-12-27 16:47:04

I only included the latest-gen stuff in this article, but we did a bigger round-up with some of the previous-gen CPUs a little while back - and the 2990WX is on the charts in that article:


The short story? It only gives about 67% of the performance of the 3970X in GPU mode and less than 60% in CPU mode. The new Threadrippers are a *huge* jump in performance for a single generation :)

Posted on 2019-12-30 18:29:41

I'm wondering if CPU performance is so important these days. I ran some comparison Arnold Maya benchmarks with my lowly Intel i7 3770K 3.5GHz and an Nvidia GTX 970 graphics card. The benchmarks for the latest CPUs were taking around 2-5 minutes rendering the scene. My CPU took 25 minutes!! I then swapped to GPU rendering and it rendered the same scene in 46 seconds!!!
I used this as a benchmark: https://www.antoniobosi.com...
Be interested to know thoughts on this as I'm considering upgrading my PC. But maybe just a new GPU would suffice!?

Posted on 2020-01-01 11:45:25

It really comes down to which rendering engine you are using since some are still purely CPU-based, but I do personally believe that CPU-based rendering is going to go away in the next couple of years for most people. There are times where it will still make sense - such as extremely large scenes that can't possibly fit inside a GPU's VRAM - but those are fairly extreme cases.

We still have a significant number of customers that are using CPU-based rendering engines, however, which is why we need to continue to have options for both CPU and GPU rendering systems.

Posted on 2020-01-02 18:14:54

You'd be surprised how much faster your workflow can be with an updated CPU.
Snappiness is one thing, and while doing heavy renders on GPU/CPU the system does not get bogged down. I work while my system works, so it is definitely something that is important even if your GPU does the rendering.
Whether the 3990X is important is a different question, but the 3960X is definitely what a professional should get, regardless of CPU or GPU rendering.
Or a 3900X-3950X depending on budget.
Have the 3900X myself, and it's quite the workhorse compared to my Xeon W-2145 (SKL-X, roughly a 7900X), which feels utterly sluggish at all times.

Posted on 2020-01-23 13:31:31
Camilo Cano

Are you guys considering running some Renderman benchmarks? It would be awesome to see the performance on the 3990X, was funny how they showed the charts on 3 screens :P

Or is there any reason for keeping away from Renderman tests?


Posted on 2020-01-13 01:56:06

Renderman isn't one of the engines we specifically target, and as far as I am aware there isn't an existing benchmark for it from Pixar themselves (or anyone in the community). I'm sure there are some sample scenes we could use if we wanted to just manually measure rendering speed in something like Maya, but since we already test two CPU-based engines (V-Ray and Cinema 4D) I fully expect that Renderman and other engines would all show very similar relative CPU performance. What would be more interesting, to me at least, is when Renderman XPU is released - that is their foray into combining CPU and GPU performance into a single rendering pipeline, much like what V-Ray Next GPU has done. It was supposed to come out in Renderman version 22, but now 23 is out and they still haven't made that a public feature yet :(

Posted on 2020-01-13 17:44:23
Jason Osterday

Where does the W-3175X sit? Have you guys tested it directly in this lineup? It still poses an interesting option for Intel customers that are on LGA3647 and want good clocks across a lot of cores and additional PCIe lanes in an S1 install.

Posted on 2020-01-14 18:55:11

We tested the W-3175X and the Core i9 9990XE (another oddball, limited SKU from Intel) along with some of the other top CPUs that were available when they came out: https://www.pugetsystems.co...

That was long before the new Core X 10000-series or 3rd Gen Threadrippers, though, and we don't have a testbed for the W-3175X built up anymore... but from those results you can see where it landed compared to the previous-gen Core X and TR chips (just a little faster than the 2990WX) and then see where those are in relation to the new CPUs and extrapolate where the W-3175X would fall today. Here are the articles which include both the old and new generation processors to bridge that gap:



In my estimation, the W-3175X is not worth it for a new system - but if someone already has an S1 LGA3647 system and just wants to upgrade the CPU without having to invest in a whole new rig then it could make sense. Both of the new Threadrippers should be faster, though, as well as less expensive.

Posted on 2020-01-15 17:55:21
Andrei Tuduran

Why no 3700X is in the list but 9700K is?

Posted on 2020-01-23 13:08:18

The 3700X and 3800X are very similar, with the only difference being a little bit of clock speed. As such, if you want to know how the 3700X should perform you can simply look at the 3800X result and subtract around 5-10%. The 9700K, however, is substantively different from the 9900K because it lacks Hyperthreading in addition to having slightly lower clock speeds - so that seemed to merit testing separately from its bigger brother :)
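The "subtract around 5-10%" estimate above can be sketched numerically by scaling a measured 3800X score by the base clock ratio of the two chips (3.6 GHz on the 3700X vs 3.9 GHz on the 3800X, per AMD's published specs). The starting score here is hypothetical:

```python
# Hypothetical 3800X benchmark score, scaled by the 3700X/3800X
# base clock ratio (3.6 / 3.9 GHz) to estimate the untested chip.
score_3800x = 100.0
clock_ratio = 3.6 / 3.9
score_3700x_estimate = score_3800x * clock_ratio
print(round(score_3700x_estimate, 1))  # ~92.3, i.e. roughly 8% slower
```

Real scaling won't track clock speed exactly (boost behavior and thermals differ), but it lands comfortably inside that 5-10% window.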

Posted on 2020-01-23 18:08:39
Chuck Sexton

I'm just blown away to see an amd as top recommendation. YOU guys rock!

Posted on 2020-01-24 00:54:28

William M George Can you please check SMT ON/OFF for the 3990X review? Some Windows applications may not scale well with 128 threads (max 64T per NUMA node?).
Thanks in advance!

Posted on 2020-02-02 06:37:12

Hmm, that is an interesting question - I'm not sure if I'll be able to include that in the main 3990X articles (will depend on timing and whether the results are interesting / applicable)... but if I don't I will at least check it after launch and get back to you here.

Posted on 2020-02-03 18:18:04
Neo Morpheus

What is the best CPU for rendering 2019?



Because we are fanbois!

Posted on 2020-02-07 17:26:30

What?? Did you actually read the conclusion? All four of the processors we recommended are AMD... :/

Posted on 2020-02-07 17:40:35

I want to use Blender on my PC but it is not running well, even though I saw the recommended requirements on the Blender website. I want to buy a new laptop, and I want to know the exact processor, graphics, and RAM to check for in my new laptop!

Posted on 2020-04-13 10:18:31

Hmm, in order to answer that it would help to know a little bit more about your specific usage:

- Are you mostly doing modeling and animation, or is rendering a big part of your workload?
- For rendering, are you using the built-in capabilities within Blender or do you use plug-ins? (and if so, which ones?)

Posted on 2020-04-13 18:31:44
Fernando P

Hi, could you make a rendering test in Fusion 360 between Threadripper, Ryzen 9, and Intel 10th Gen? Here are some sample tests:


In the description is the file to download:


Posted on 2020-06-02 15:50:03

Fusion 360 is not currently one of the programs we test, but my understanding is that it is an entirely CPU-based renderer - so if you look at the results in this article from Cinebench and V-Ray (CPU mode) you should get a good idea of how those processors stack up. If you are curious about the new Intel Core 10th Gen models, we have articles looking at those now as well:



Posted on 2020-06-02 17:07:30
Rajesh Kundu

I have an AMD Ryzen 5 3600 with a GTX 1050 Ti graphics card. How can we use GPU hardware acceleration for video rendering in Edius software? Please reply.

Posted on 2021-02-15 04:33:30

Hey Rajesh, we don't really do much with Edius so I can't give you much guidance. I recommend you contact their support department or post on their forum: https://forum.grassvalley.c...

Posted on 2021-02-15 17:41:02