

Read this article at https://www.pugetsystems.com/guides/1189

Can You Mix Different GPUs in Octane and Redshift?

Written on June 28, 2018 by William George


GPU rendering engines like OctaneRender and Redshift utilize the computational power of the graphics processing chips on video cards to create photo-realistic images and animations. The more powerful the video card, the faster the rendering process goes - and multiple video cards can be used together to further improve performance. But can those video cards be a mix of different models, or do they all need to be identical?

Test Setup

To answer this question we need to look at two pairs of video cards: different models from the same hardware generation, and cards from different generations. We also need to check the performance of each card individually, so that we can compare the rendering speed of the combinations against their stand-alone performance. Since Octane and Redshift use CUDA, the cards must all be from NVIDIA. Given those criteria, we selected a GeForce GTX 1070 Ti, GTX 1060, and GTX 980 Ti.

For our testbed, we wanted to use a high clock speed processor so that the platform itself would not be limiting performance. In recent tests we found Intel's Xeon W-2125 processor to be ideal in that regard, especially for users who might want even more than just two cards. Even though we just needed to test them in pairs, we still used the Gigabyte MW51-HP0 board. That provides the right PCI-Express slot layout for up to four GPUs, and the Xeon W-2125 is quite fast: 4.0GHz base and up to 4.5GHz turbo.

If you would like full details on the hardware configuration we tested on, they are available in the original article linked at the top of this page.

Benchmark Results

First up is a graph showing individual OctaneBench results for the GTX 1060 and 1070 Ti, along with their score together. Estimates for dual-GPU results are also included as points of reference, based on the virtually perfect multi-GPU scaling we've seen with Octane.
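The "virtually perfect scaling" estimate amounts to simply summing each card's individual OctaneBench score. A minimal sketch of that arithmetic, using placeholder scores rather than our measured numbers:

```python
# Back-of-envelope estimate of a multi-GPU OctaneBench score, assuming
# the near-perfect scaling described above: the combined score is just
# the sum of each card's individual score.

def estimate_octane_score(individual_scores):
    """Estimate a combined OctaneBench score for a set of GPUs."""
    return sum(individual_scores)

# Hypothetical single-card scores (NOT our measured results):
gtx_1060 = 100.0
gtx_1070_ti = 150.0

print(estimate_octane_score([gtx_1060, gtx_1070_ti]))  # 250.0
```

The same addition works for three or four cards, which is why Octane rewards simply packing in as many GPUs as the platform supports.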

OctaneBench 3.0.8 Same Generation Mixed Multi GPU Rendering Performance Comparison

We've also got the same type of chart showing the results with mixed generations: the older GTX 980 Ti and current 1070 Ti.

OctaneBench 3.0.8 Different Generation Mixed Multi GPU Rendering Performance Comparison

Moving on to Redshift, here are the results in seconds from the 1060 and 1070 Ti cards. Redshift doesn't scale quite as well with multiple GPUs as Octane, but we've found going from one card to two increases performance by about 92% (hence the estimates used below).
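That ~92% gain from a second identical card works out to a per-card scaling efficiency of roughly 0.96 (1.92x total throughput). One rough way to extend this to mixed pairs is to add the cards' throughputs (1 / render time) and apply that same efficiency. This is only a simplified model of how such estimates can be built, with placeholder times rather than our measurements:

```python
# Rough render-time model for multiple GPUs in Redshift, assuming the
# ~92% two-card gain observed above (per-card efficiency ~0.96).
# Throughput is 1 / render_time, and throughputs of the cards add.

def estimate_redshift_time(render_times_s, efficiency=0.96):
    """Estimate a combined render time (seconds) from single-GPU times."""
    combined_rate = efficiency * sum(1.0 / t for t in render_times_s)
    return 1.0 / combined_rate

# Hypothetical single-card times (NOT our measured results):
# two identical 192-second cards land at 100 seconds, i.e. a 92% speedup.
print(round(estimate_redshift_time([192.0, 192.0]), 1))  # 100.0
```

For a mixed pair, plug in the two different single-card times; the model predicts a result between the two homogeneous dual-card estimates, which matches what the charts below show.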

Redshift 2.6.11 Demo Same Generation Mixed Multi GPU Rendering Performance Comparison

And lastly, we have a similar chart showing the render times with different GPU architectures: the older GTX 980 Ti and current 1070 Ti.

Redshift 2.6.11 Demo Different Generation Mixed Multi GPU Rendering Performance Comparison


In both OctaneBench and the Redshift demo, we found that the mixed video card configurations worked just fine. There was no problem in these benchmarks with different GPUs from the same technological generation or even when mixing the current, Pascal-based GeForce 1000-series cards with older Maxwell-based 900-series models.

Not only were no problems or errors encountered, but the performance of the mixed pairs was also quite good. In both cases, the measured results fell right between what we would expect for dual-card configurations using each individual GPU. Before testing this empirically, I was concerned that there might be a negative impact - especially when mixing in an older architecture - such that the combined performance would lean toward the lower-end card. No such tendency was found!


So, can you mix different GPUs in Octane and Redshift? Yes! Not only does mixing GPUs seem to work fine - we did not experience any crashes with either benchmark - but the performance increase is also right in line with what we expect when using multiples of the same video card model. There are some things to keep in mind which don't show up in these results, though:

  • When mixing GPUs, you are limited by the card with the lowest amount of memory. A card's onboard RAM determines how large and complex a scene can be rendered, so having even one card with less VRAM can limit your whole system. You could disable that card when rendering bigger scenes, but then there is no longer any benefit to having it installed.
  • We were only able to test a limited number of video cards for this article, and there are thousands of possible combinations when you factor in 2, 3, and 4 GPUs across the last few generations of GeForce (and potentially Quadro) models. As such, we cannot guarantee that every combination will be as trouble-free and effective as this. Your mileage may vary.
  • Since we only looked at GeForce models, these cards all used the same driver (397.93 is the version we used). If you were to mix GeForce and Quadro cards, though, you might need to install separate drivers. Windows 10 is supposed to be able to handle that, but it introduces yet another layer of complexity and potential for problems.
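The first caveat above (being limited by the card with the least VRAM) is easy to check on your own system. `nvidia-smi` can list each card's memory in CSV form; the sketch below parses a hardcoded sample of that output, so the card names and sizes shown are illustrative only:

```python
# Find the GPU with the least VRAM - the one that effectively caps scene
# size when rendering across all cards at once.
# On a real system, this text would come from:
#   nvidia-smi --query-gpu=name,memory.total --format=csv,noheader
# Here we parse a hardcoded sample instead (illustrative values only).

sample_output = """\
GeForce GTX 1070 Ti, 8192 MiB
GeForce GTX 1060, 6144 MiB
GeForce GTX 980 Ti, 6144 MiB
"""

def min_vram_mib(csv_text):
    """Return (card name, MiB) for the GPU with the least memory."""
    cards = []
    for line in csv_text.strip().splitlines():
        name, mem = line.rsplit(",", 1)
        cards.append((name.strip(), int(mem.strip().split()[0])))
    return min(cards, key=lambda c: c[1])

name, mib = min_vram_mib(sample_output)
print(name, mib)  # the whole system is limited to this card's VRAM
```

If the smallest number here is well below your scene's memory needs, that card is a candidate for disabling during heavy renders.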

Because of those issues, we strongly recommend equipping new GPU rendering workstations with uniform sets of video cards. If you are upgrading an existing system, though, or have a video card in good condition just sitting around gathering dust... then maybe give mixing GPUs a shot and see how it works out. I'd love to read your stories - successful or otherwise! - in the comments below.

Recommended Systems for Redshift

Recommended Systems for OctaneRender

Tags: Redshift, Octane, Render, Performance, Mixed, Multi, GPU, Rendering, Benchmark, NVIDIA, GeForce, 1070 Ti, 1060, 980 Ti, Video, Card
Joe S.

Could you try mixing them with Davinci Resolve? Trying to figure out if I need to sell my older card once I upgrade...

Posted on 2018-07-27 12:59:47

You can mix in Resolve, but the exact same caveats listed in this article apply. The few times we've done that for customers, however, it hasn't been great. I don't think Resolve itself ever had issues, but if I remember right there was secondary software like Photoshop that got confused and threw errors randomly.

I think what it comes down to is that the more different the cards, the more potential for problems. So mixing a 1070 and 1080 probably will be fine. But mixing a Quadro and GeForce you should expect to have to work through errors at some point at least once. And don't even think about mixing AMD and NVIDIA.

Posted on 2018-07-27 14:56:36
Joe S.

A few years ago I tried mixing a GTX 570 Ti with a GTX 970. At first it seemed to work fine, but over time Resolve refused to start, throwing a generic 0x00005 error in Event Viewer. Once I disabled the older card, Resolve worked fine again.

I just wanted to know if you have maybe had better experiences.

Also, a question I'm currently pondering: with several cards installed, which card should be used for what?
As far as I know, one is used to display the Windows GUI, and Resolve also uses it for displaying its own GUI. The other one Resolve uses exclusively for computing its goodness.
The question is, which one to use for which? I currently think it's best to use the weaker one for connecting all the displays (excluding Decklink ones), so that Resolve can use all the power/memory of the better/newer card?

Posted on 2018-07-27 17:09:33

Yea, that's the way you should do it. Actually displaying the GUI doesn't take much, so better to use the weaker card for that. Just make sure you have the second card set manually in the Resolve settings as the compute card and de-select the primary card and you should be good to go.

Posted on 2018-07-27 17:42:48