Read this article at https://www.pugetsystems.com/guides/1189

Can You Mix Different GPUs in Octane and Redshift?

Written on June 28, 2018 by William George


GPU rendering engines like OctaneRender and Redshift utilize the computational power of the graphics processing chips on video cards to create photo-realistic images and animations. The more powerful the video card, the faster the rendering process goes - and multiple video cards can be used together to further improve performance. But can those video cards be a mix of different models, or do they all need to be identical?

Test Setup

To answer this question we need to look at two pairs of video cards: different models from the same hardware generation, as well as cards from different generations. We also need to check the performance of each card individually, so that we can compare the rendering speed of the combinations against their stand-alone performance. Since Octane and Redshift use CUDA, they must also be NVIDIA graphics cards. Given those criteria, we selected a GeForce GTX 1070 Ti, GTX 1060, and GTX 980 Ti.

For our testbed, we wanted to use a high clock speed processor so that the platform itself would not be limiting performance. In recent tests we found Intel's Xeon W-2125 processor to be ideal in that regard, especially for users who might want even more than just two cards. Even though we just needed to test them in pairs, we still used the Gigabyte MW51-HP0 board. That provides the right PCI-Express slot layout for up to four GPUs, and the Xeon W-2125 is quite fast: 4.0GHz base and up to 4.5GHz turbo.

If you would like full details on the hardware configuration we tested on, they are available in the online version of this article (linked at the top of this page).

Benchmark Results

First up is a graph showing individual OctaneBench results for the GTX 1060 and 1070 Ti, along with their score together. Estimates for dual-GPU results are also included as points of reference, based on the virtually perfect multi-GPU scaling we've seen with Octane.
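Octane's near-perfect multi-GPU scaling makes these estimates straightforward to form: the expected score for any combination is roughly the sum of the individual cards' scores. A minimal sketch of that arithmetic (the scores below are made-up placeholders, not our measured results):

```python
# Estimate a combined OctaneBench score from individual-card scores.
# Octane scales almost linearly across GPUs, so the combined score is
# roughly the sum of the individual scores times a scaling efficiency
# (1.0 = perfect scaling).

def estimate_combined_score(individual_scores, scaling=1.0):
    return scaling * sum(individual_scores)

# Hypothetical individual scores (illustrative only):
scores = {"GTX 1070 Ti": 180, "GTX 1060": 110}
estimate = estimate_combined_score(scores.values())  # 290 with perfect scaling
```

With perfect scaling, a mixed pair is simply the sum of its parts; any shortfall shows up as the gap between this estimate and the measured benchmark result.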

OctaneBench 3.0.8 Same Generation Mixed Multi GPU Rendering Performance Comparison

We've also got the same type of chart showing the results with mixed generations: the older GTX 980 Ti and current 1070 Ti.

OctaneBench 3.0.8 Different Generation Mixed Multi GPU Rendering Performance Comparison

Moving on to Redshift, here are the results in seconds from the 1060 and 1070 Ti cards. Redshift doesn't scale quite as well with multiple GPUs as Octane, but we've found going from one card to two increases performance by about 92% (hence the estimates used below).
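Since Redshift results are render times rather than scores, forming an estimate means adding render *rates* (1/time) and applying the observed ~92% two-card speedup, which works out to about 96% efficiency per card. A rough sketch of that arithmetic, using made-up times rather than our measurements:

```python
def estimate_mixed_render_time(single_card_times, two_card_speedup=1.92):
    """Estimate a mixed pair's render time from each card's solo time.

    Render rates (1/time) add across GPUs; Redshift's observed two-card
    speedup of ~1.92x (vs. an ideal 2x) implies ~0.96 efficiency per card.
    """
    efficiency = two_card_speedup / len(single_card_times)
    combined_rate = efficiency * sum(1.0 / t for t in single_card_times)
    return 1.0 / combined_rate

# Hypothetical solo times of 600 s and 900 s give an estimated
# 1 / (0.96 * (1/600 + 1/900)) = 375 s for the pair.
```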

Redshift 2.6.11 Demo Same Generation Mixed Multi GPU Rendering Performance Comparison

And lastly, we have a similar chart showing the render times with different GPU architectures: the older GTX 980 Ti and current 1070 Ti.

Redshift 2.6.11 Demo Different Generation Mixed Multi GPU Rendering Performance Comparison


In both OctaneBench and the Redshift demo, we found that the mixed video card configurations worked just fine. There was no problem in these benchmarks with different GPUs from the same technological generation or even when mixing the current, Pascal-based GeForce 1000-series cards with older Maxwell-based 900-series models.

Not only were no problems or errors encountered, but the performance of the mixed pair was also quite good. In both cases, the measured results fit right between what we would expect for dual card configurations using each individual GPU. Before having tested this empirically, I was concerned that there might be a negative impact - especially when mixing in an older architecture - such that the combined performance would lean more toward the side of the lower-end card. No such tendency was found!


So, can you mix different GPUs in Octane and Redshift? Yes! Not only does mixing GPUs seem to work fine - we did not experience any crashes with either benchmark - the performance increase is right in line with what we expect when using multiples of the same video card model. There are some things to keep in mind which don't show up in these results, though:

  • When mixing GPUs, you are limited by the card with the lowest amount of memory. A card's onboard RAM determines how large and complex a scene can be rendered, so having even just one card with less VRAM can limit your whole system. You could disable that card when rendering larger scenes, but then there is no longer any benefit to having it installed.
  • We were only able to test a limited number of video cards for this article, and there are thousands of possible combinations when you factor in 2, 3, and 4 GPUs across the last few generations of GeForce (and potentially Quadro) models. As such, we cannot guarantee that every combination will be as trouble-free and effective as this. Your mileage may vary.
  • Since we only looked at GeForce models, these cards all used the same driver (397.93 is the version we used). If you were to mix GeForce and Quadro cards, though, you might need to install separate drivers. Windows 10 is supposed to be able to handle that, but it introduces yet another layer of complexity and potential for problems.
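The VRAM caveat in the first point is easy to quantify: because each GPU holds its own copy of the scene in these renderers, a mixed set's scene capacity is capped by the smallest card. A trivial illustration:

```python
def usable_vram_gb(vram_per_card_gb):
    # Each GPU keeps its own copy of the scene, so the smallest card's
    # memory caps how large a scene the whole system can render.
    return min(vram_per_card_gb)

# A 1070 Ti (8 GB) paired with a 980 Ti (6 GB) acts like a 6 GB
# system for scene capacity, even though both cards add speed.
```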

Because of those issues we strongly recommend equipping new GPU rendering workstations with uniform sets of video cards. If you are upgrading an existing system, though, or have a video card in good condition just sitting around gathering dust... then maybe give mixing GPUs a shot and see how it works out. I'd love to read your stories - successful or otherwise! - in the comments below.


Tags: Redshift, Octane, Render, Performance, Mixed, Multi, GPU, Rendering, Benchmark, NVIDIA, GeForce, 1070 Ti, 1060, 980 Ti, Video, Card
Joe S.

Could you try mixing them with Davinci Resolve? Trying to figure out if I need to sell my older card once I upgrade...

Posted on 2018-07-27 12:59:47

You can mix in Resolve, but there are the exact same caveats that are listed in this article. The few times we've done that for customers, however, it hasn't been great. I don't think Resolve ever had issues, but if I remember right there was secondary software like Photoshop that got confused and threw errors randomly.

I think what it comes down to is that the more different the cards, the more potential for problems. So mixing a 1070 and 1080 probably will be fine. But mixing a Quadro and GeForce you should expect to have to work through errors at some point at least once. And don't even think about mixing AMD and NVIDIA.

Posted on 2018-07-27 14:56:36
Joe S.

A few years ago I tried mixing a gtx 570 ti with a gtx 970, and at first it seemed to work fine. But over time, Resolve refused to start, throwing a generic 0x00005 error in Event Viewer. Once I disabled the older card, Resolve worked fine again.

I just wanted to know, if you maybe had better experiences.

Also, a question I'm currently pondering: with several cards installed, which card is better to use for what?
As far as I know, one is used to display the Windows GUI (and Resolve also uses it for displaying its own GUI), while Resolve uses the other exclusively for compute.
So which one to use for which? I currently think it may be best to use the weaker one for connecting all displays (excluding DeckLink ones), so that Resolve can use all the power and memory of the better/newer card?

Posted on 2018-07-27 17:09:33

Yea, that's the way you should do it. Actually displaying the GUI doesn't take much, so better to use the weaker card for that. Just make sure you have the second card set manually in the Resolve settings as the compute card and de-select the primary card and you should be good to go.

Posted on 2018-07-27 17:42:48

We had a customer ask today if the older GeForce GTX cards could be mixed with the newer RTX series, so I tested a GTX 1080 Ti with both a RTX 2080 and a 2080 Ti and they all worked fine together in OctaneBench 4 and Redshift Demo 2.6.22.

Posted on 2019-01-04 21:10:22

César Andrade

Thanks for the answer. I have a weird setup: I'll be getting an RTX 2070, and I have a 750 Ti from when I first assembled this computer. Could the benefits of combining these cards, if that's even possible, be worth putting both of them in the same system, or would I be better off using just the RTX? The use case is 3D modeling in 3ds Max and rendering in Marmoset Toolbag 3. Thanks for any answers!

Posted on 2019-01-16 16:57:20

The 750 Ti is pretty old at this point, and perhaps more importantly has only 2GB of VRAM (if my memory is correct). I suspect that including it in the mix with your new RTX 2070 would have a small impact, but the downside is that you would be limited by the much lower memory capacity of that card. Because of that, I think I would tend more toward using the single 2070... and then perhaps adding a second of those later on down the road, if possible.

Posted on 2019-01-16 17:06:21
César Andrade

I suspected as much. I guess I will be making a small build for my dad to use with the spare parts, or something. Thanks for the fast response and recommendation! I was definitely veering towards that in the future.

Posted on 2019-01-16 17:16:38
Michael Garske

How about these same tests with Adobe Premiere and After Effects? Thanks for your work, these findings are quite helpful.

Posted on 2019-05-07 20:46:19

PR and AE technically can use multiple cards and technically should work OK with mixed cards... but I wouldn't recommend it. I know a lot of work went into the most recent versions to improve multi-GPU support, but my understanding is that was focused on laptops using external GPUs rather than multiple discrete GPUs.

Overall, for Adobe apps I would recommend not mixing models (and especially not families like Quadro/GeForce) if you can avoid it.

Posted on 2019-05-09 20:09:31
Mike Alger

Using a 2080 Ti and a 980 Ti simultaneously with Octane for a final render works fine, but using the live viewer window in Cinema 4D causes C4D to crash frequently. The fix is to disable the second card's checkboxes in the Octane settings while using the live viewer, but enable both for the final render.

Posted on 2019-10-22 20:22:57

Thank you for posting that tip! I don't do testing with the various plug-in versions of rendering engines, as there are too many to keep up with all of them, so I wouldn't have run into this - but I'm sure a lot of our readers might. If you have a chance, it could also be worth submitting a report to OTOY and Maxon :)

Posted on 2019-10-22 20:37:55