Read this article at https://www.pugetsystems.com/guides/1935

Quad GeForce RTX 3090 in a desktop - Does it work?

Written on October 20, 2020 by Matt Bach

Introduction

Here at Puget Systems, our customers range from those needing relatively modest workstations, all the way up to those that need the most powerful system possible. These extreme workstations used to be dual or quad CPU systems, but recently, the shift has been to quad GPU setups. Having this much GPU power can provide incredible performance for a number of industries including rendering and scientific computing, and it even has some use in video editing applications like DaVinci Resolve.

With NVIDIA's latest GPUs - the GeForce RTX 3080 10GB and 3090 24GB - the amount of performance you can get from a single GPU has dramatically increased. Unfortunately, the amount of power these cards require (and consequently the amount of heat they generate) has also increased. In addition, almost all the GPU models available from NVIDIA and 3rd party manufacturers are not designed for use in multi-GPU configurations, which in many cases limits you to only one or two cards.

However, Gigabyte has recently launched a blower-style RTX 3090 that should give us our best chance of running three or four RTX 3090s in a workstation: the GeForce RTX 3090 TURBO 24G.

Gigabyte GeForce RTX 3090 TURBO 24G Quad GPU

This type of blower-style cooling system is much better for multi-GPU configurations as it exhausts the majority of the heat directly out the back of the chassis. And when we are dealing with four 350 watt video cards, that is 1,400 watts of heat that we certainly want out of the system as quickly as possible.

While the cooler design may be able to help with the heat output, we also have the problem of total power draw. We should be able to power 1,400 watts with a single 1,600 watt power supply, but that doesn't leave much room for any voltage spikes, not to mention having enough to power the CPU, motherboard, RAM, storage, and other devices inside the computer.
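To put rough numbers on that headroom, here is a back-of-the-envelope budget. The 350W GPU figure comes from the card's rating; the 200W allowance for everything else (CPU, motherboard, RAM, storage, fans) is an illustrative assumption, not a measurement:

```python
# Rough power budget for a quad RTX 3090 build on a 1600 W PSU.
# GPU rating is from the card spec; OTHER_COMPONENTS_W is an assumed estimate.
GPU_TDP_W = 350
NUM_GPUS = 4
OTHER_COMPONENTS_W = 200  # assumed: CPU, motherboard, RAM, storage, fans
PSU_RATING_W = 1600

gpu_total = GPU_TDP_W * NUM_GPUS               # 1400 W just for the GPUs
system_total = gpu_total + OTHER_COMPONENTS_W  # 1600 W nominal system load
headroom = PSU_RATING_W - system_total

print(gpu_total, system_total, headroom)  # 1400 1600 0
```

Even with these generous round numbers, the nominal load lands exactly at the PSU's rating, leaving nothing for transient spikes.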

This raises the question: is having four RTX 3090s inside a desktop workstation actually feasible? Or is the heat and power draw too much to handle?

Rendering workstations

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!

Test Setup

To see if quad RTX 3090 is something we even want to consider offering, we wanted to put a number of configurations to the test to look at performance, temperatures, and power draw. The main system we will be using has the following specs:

Test Platform
CPU: Intel Xeon W-2255 10 Core
CPU Cooler: Noctua NH-U12DX i4
Motherboard: Asus WS C422 SAGE/10G
RAM: 8x DDR4-3200 16GB Reg. ECC (128GB total)
Video Card: 1-4x Gigabyte RTX 3090 TURBO 24G
Hard Drive: Samsung 970 Pro 512GB
PSU: 1-2x EVGA SuperNOVA 1600W P2
Software: Windows 10 Pro 64-bit (Ver. 2004)

While our testing with up to three RTX 3090s is fairly straightforward, we have some serious concerns about power draw when we get up to four GPUs. We are going to attempt using just a single 1600W power supply at stock settings, but will also try setting the power limit for each card to 300W (which should bring us comfortably below the 1600W maximum), as well as using a dual-PSU setup to ensure none of the cards are starved for power.
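For reference, we applied the 300W cap through MSI Afterburner (as noted in the comments below), but NVIDIA's nvidia-smi tool can set the same per-GPU limit from the command line. A minimal sketch follows; the helper only constructs the commands (it does not execute them), and actually applying a limit requires administrator rights and a card that accepts that limit:

```python
# Sketch: build the nvidia-smi invocations that would cap each GPU at 300 W.
# nvidia-smi's -i (GPU index) and -pl (power limit in watts) flags are real,
# but the range of accepted limits depends on the card's VBIOS.
def power_limit_cmd(gpu_index: int, watts: int) -> list[str]:
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

# Commands for all four cards (printed only, not executed here).
cmds = [power_limit_cmd(i, 300) for i in range(4)]
for cmd in cmds:
    print(" ".join(cmd))
```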

Quad RTX 3090 power draw

Power draw is a bit of a concern with quad RTX 3090

For our testing, we will look at performance, power draw, and GPU temperature with OctaneBench, RedShift, V-Ray Next, and the GPU Effects portion of our PugetBench for DaVinci Resolve benchmark.

Performance - OctaneBench

OctaneBench is often one of our go-to benchmarks for GPUs because it runs extremely well with multiple video cards. As you can see in the charts above, the scaling from one to four RTX 3090 cards is nearly perfect, with four cards scoring almost exactly four times higher than a single RTX 3090.

In fact, the biggest surprise here was that limiting each of the GPUs to 300W only dropped performance by a little more than 1%. While we are going to talk about power draw in more detail later, we will tease that limiting the GPUs reduced the overall system power by ~16%, which is a great return for such a tiny drop in performance.
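As a sanity check on that tradeoff, a ~1% performance loss against a ~16% power reduction works out to a substantial efficiency win. A quick sketch using those rounded figures:

```python
# Relative performance-per-watt change from the 300 W power cap,
# using the article's rounded figures (~1% perf loss, ~16% power savings).
perf_ratio = 0.99   # performance with the cap, relative to stock
power_ratio = 0.84  # system power draw with the cap, relative to stock

efficiency_gain = perf_ratio / power_ratio  # relative perf per watt
print(round(efficiency_gain, 2))  # 1.18 -> roughly 18% better perf/watt
```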

Performance - V-Ray Next

V-Ray Next is quickly becoming another staple for our GPU testing because it not only scales just as well as OctaneRender, but it actually causes slightly higher overall system power draw which makes it a great benchmark for stressing GPUs.

Here, there again isn't much to say. Scaling up to four RTX 3090 cards is perfect, and limiting the GPU power reduced the benchmark result by less than 1%. We also aren't seeing any increase in performance with dual power supplies, which means that so far, a single 1600W power supply appears to be doing OK.

Performance - RedShift

RedShift is interesting because it does not scale as well as OctaneRender or V-Ray, but its recent acquisition by Maxon (makers of Cinema4D) means that we are likely to see more people using it in the near future. One thing to note is that this benchmark returns results in seconds, so a lower result is better (the opposite of our other tests).

In RedShift, we didn't see quite as good scaling, but four RTX 3090 cards are still 3.6 times faster than a single card. Once again, power limiting the cards and using dual power supplies didn't affect the performance to a significant degree.

Performance - DaVinci Resolve

To round out our testing, we wanted to look at something that wasn't rendering. We actually were going to include a few other tests such as NeatBench and a CUDA NBody simulation, but either the scaling wasn't very good with multiple GPUs, or we had issues running them due to how new the RTX 3090 is.

Our DaVinci Resolve benchmark, however, has support for these cards and the "GPU Effects" portion of the benchmark scales fairly well up to three GPUs. We didn't see a significant increase in performance with four RTX 3090 cards, which may in part be due to our choice of CPU. This test is going to load the processor more than any of the others, and while that shouldn't explain the performance wall entirely, it may be a contributing factor.

Power Draw, Thermals, and Noise

Performance is great to look at, but one of the main reasons we wanted to do this testing was to discover if putting four RTX 3090 cards into a desktop workstation was even feasible. The higher power draw and heat output means that only cards like the Gigabyte RTX 3090 TURBO 24G with a blower-style cooler will even have a chance, but these cards are still rated for 350W each.

Power draw was one of our biggest concerns, so we decided to start there:

Quad RTX 3090 24GB Power Draw

We measured the power draw from the wall during each benchmark, which showed us some very interesting results. First, the benchmark that pulled the most power was actually our DaVinci Resolve GPU Effects test. This is likely because it not only uses the GPUs, but puts a decent load on the CPU as well. As a result, this test is likely a better representation of the maximum load you might put on a system in an "everyday" situation.

Overall, what we found was that while quad RTX 3090s were able to run on a single 1600W power supply, it is cutting it extremely close. Remember that this is power draw from the wall, and going by the rated 92% efficiency of the EVGA unit we are using, our peak power draw of 1,717 watts should only translate to about 1,580 watts of power internally. That leaves a whole 20 watts to spare!

We didn't have the system shut down on us during our testing, but this is way too close for long-term use. Not to mention that if we used a more power-hungry CPU, or even just added a few more storage drives, we likely would have been pushed over the edge. So, while we technically succeeded with four RTX 3090s on a single 1600W power supply, that is definitely not something we would recommend.

There are larger power supplies you can get that go all the way up to 2400W, but at that point you are going to want to bring in an electrician to make sure your outlets can handle the power draw. The 1,717 watts we saw translates to 14.3 amps of current on the 120V circuits we are using, and most household and office outlets are going to be wired to a 15 amp breaker. That leaves almost no room for your monitors, speakers, and other peripherals. If you do go the route of a 2400W PSU, you are going to need to ensure that you are using a 20 amp circuit.
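The arithmetic behind those figures is worth spelling out. Using the measured 1,717 watt wall draw, the PSU's rated 92% efficiency, and a 120V circuit:

```python
# Convert measured wall draw into delivered DC power and circuit current.
wall_watts = 1717       # peak draw measured at the wall
psu_efficiency = 0.92   # rated efficiency of the EVGA SuperNOVA 1600W P2
voltage = 120.0         # standard US household circuit

dc_watts = wall_watts * psu_efficiency  # power actually delivered internally
amps = wall_watts / voltage             # current drawn from the outlet

print(round(dc_watts))  # ~1580 W, only ~20 W under the PSU's 1600 W rating
print(round(amps, 1))   # ~14.3 A on a typical 15 A household breaker
```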

But power draw aside, how did the Gigabyte cards handle all that heat? Nearly 1,700 watts is more heat than most electric space heaters put out, which can be very hard to manage within a computer chassis.

In the two charts above, we are looking at the peak GPU temperature for each card, as well as the peak GPU fan speed for each configuration in OctaneBench. We recorded results for the other tests as well, but OctaneBench was the only test that ran long enough to truly get the system up to temperature.

Surprisingly, the temperatures were not bad at all. Even with all four GPUs running at full speed, the temperatures ranged from 73C on the bottom GPU to just 80C on the top card. However, something to keep in mind is that the temperature of the GPU is only half the picture. GPU coolers - and CPU coolers for that matter - are tuned to increase the fan speed gradually, which means that the temperatures not only have to be acceptable, but there must be adequate fan speed headroom to account for higher ambient temperatures, longer load times, and the heat generated by other components that might be installed in the future.

In this case, quad RTX 3090s peaked at 88% of the maximum fan speed. To us, that is starting to cut it close, but technically it should be enough headroom - especially if you beefed up the chassis cooling with additional front or side fans.

The last thing we wanted to consider was noise. The temperature of these cards was actually fairly decent, but if the system sounds like a jet engine, that may not be acceptable for many users. Noise is very difficult to get across, but we decided the best way would be to record a short video of each configuration so you can at least hear the relative difference.

All things considered, the noise level of four RTX 3090 cards is not too bad. It certainly isn't what anyone would call quiet, but for the amount of compute power these cards are able to provide, most end users would likely deem it acceptable.

Quad RTX 3090 - Feasible or Fantasy?

Interestingly enough, we actually had very few issues getting four RTX 3090 cards to work in a desktop workstation. Using a blower-style card like the RTX 3090 TURBO 24G from Gigabyte is certainly a must, but even under load the GPU temperatures stayed below 80C without going above 90% of the maximum GPU fan speed.

The only true problem we ran into was power draw. We measured a maximum power draw of 1717 watts from the wall, which not only exceeds what we would be comfortable with from a 1600W power supply, but also means that you should run your system from a 20 amp breaker if possible. Most house and office outlets will be on 15 amp circuits in the US, which may mean hiring an electrician to do some electrical work if you decide to use one of the very few 2400W power supplies that are available.


So, will we be offering quad RTX 3090 workstations? Outside of some very custom configurations, we likely are not going to offer this kind of setup to the general public due to the power draw concerns. On the other hand, triple RTX 3090 is something we are likely to pursue, although that configuration has not yet passed our full qualification process. Even three RTX 3090 cards will give a very healthy performance boost over a quad RTX 2080 Ti setup, which is great news for users that need faster render times or those working in AI/ML development.

Tags: RTX 3090, Quad GPU, power draw, heat output
Juraj

Fantastic test. Would love to see what further performance loss there would be with 280W or even 250W and resulting noise.
I presume double NV-Link is still forfeited with GeForce cards, even with latest Studio Drivers?
That would make good argument to stick with 2x3090 in workstation and 2x3090 in nearby single node to keep 48GB pooled VRAM in Vray/Octane.

I bought AorusExtreme TRX40 with quad-PCI spacing specially because I wanted 4xAmpere GPUs, so my surprise with 3-slotted FEs was quite strong :- ). Not that I can get them in Europe now anyway..in near future.

Posted on 2020-10-21 10:54:32

Yea, dual NVLink doesn't work in Windows. Also, everything we tested won't benefit from NVLink anyways (although in a few cases the full software would - just not the benchmark). A few other things like Resolve is actually worse with NVLink since it requires SLI to be enabled, which causes Resolve to only see half the GPUs.

I hear you about supply. We're buying up every card we can through our disti supply lines, and still not keeping up with our own sales. Hopefully it improves soon!

Posted on 2020-10-21 16:09:13
k8s_1

Does dual NVLink work in Linux?

Posted on 2020-10-23 13:35:32
Eric Hjalmarsson

Is Redshift one of the programs that benefited from NVLink?

Posted on 2020-11-11 22:36:30
lemans24

Thanks Matt!!! Excellent article.

I am running Monte Carlo simulations for real time Option Pricing and I need the fastest cards possible.
The setup that I was going to get for 2021 comes in 2 stages:

1) dev station - dual 3090 Turbo style cards
2) gpu server - quad 3080 20GB Turbo style cards (RTX A5000 possibly)

I would love to run quad 3090 in a gpu server but I too already calculated that even with a 8-core server, quad 3090 cards would be cutting it close

Really looking forward to getting at least dual 3090 for workstation development but...including power draw and maybe with 300W limit, a small quad server
maybe possible but a quad 20GB 3080 Turbo cards looks very enticing from a price/performance point of view

Posted on 2020-10-21 14:30:59

Quad 3080 is still up in the air a bit since there isn't a blower-style card (yet). If the scaling is pretty good for your simulations (similar to OctaneRender), at best 4x 3080 should be about 70% faster than 2x 3090 (although with less VRAM). If scaling is just decent (similar to RedShift), then 2x 3090 could be 15% faster than 4x 3080. So it really depends on how well your code will scale across multiple GPUs.

Also, a 3080 20GB version is still just a rumor. I personally have a hard time believing NVIDIA will make one since it would cannibalize their 3090 24GB sales, but I hope I am wrong since that would be an amazing model.

Posted on 2020-10-21 16:05:06
lemans24

I think a 3080 20GB version is a done deal for Nvidia in 2021 and no doubt would bite a lot of 3090 sales but not all of them.
Price/performance would be better and a quad server version would be more palatable for power requirements as compared to quad 3090!!

Posted on 2020-10-21 18:49:03

have you considered undervolting for a test like this? I've been able to save upwards of about 50-80w power draw on a single card, i wonder if it would make the power and heat more manageable in a setup like this. Pure insanity these cards.

Posted on 2020-10-21 19:01:45

Not undervolting, but we did limit the cards to 300W max power draw for some of the testing. Definitely makes it more manageable, but if I was going to use it long term on a 1600W PSU, I would probably drop it down to 275W since we were still cutting the power draw pretty close.

Posted on 2020-10-21 19:06:32
Håkon Broder Lund

This is so sick! Wish I could justify a setup like this!

Posted on 2020-10-21 21:18:36
WCCFCommentsAreCancer

can't help but feel like this thing is begging for 3990WX

Posted on 2020-10-22 14:03:37

The problem with Threadripper is that there are no good, high quality motherboards that can do quad GPU. Triple GPU is easy enough (which is why Threadripper is what we tend to use for up to three GPUs), but I think there are only a few boards that can do quad GPU, and none passed our qualification for various reasons.

Posted on 2020-10-22 15:54:53
Ben Kamprath

Would be you able to elaborate on that, Matt? I've been curious about the Gigabyte TRX40 AORUS XTREME and thought it might be a good choice for a quad GPU Threadripper build. Not so?

Posted on 2020-10-22 23:37:48

That board would probably actually be pretty good - I believe the reason why we didn't decide to carry it was because it is absolutely massive. It is extra wide, so you need a special chassis to be able to fit it. If you can do that though, it would be a great choice!

Posted on 2020-10-23 00:36:48
Curtis Northcutt

Ben Kamprath Besides the spacing of the GPUs, what makes this board worth the 2.3x the cost? Why is the board so expensive?

Posted on 2020-10-23 00:50:28
Ben Kamprath

Curtis Northcutt I definitely hear you on the cost. I think there's likely some prestige pricing by Gigabyte going on there. That said, the board features seem excellent and I'm looking for something that gives me plenty of flexibility down the line.

By the way, big fan of your builds. I came across your site last year while researching quad-GPU setups. Great stuff.

Posted on 2020-10-23 05:04:56
Curtis Northcutt

Thanks Ben Kamprath :) I'm working on the next generation build now to continue to help researchers, but am unable to purchase these GPUs, from any location, by any means. If you have any suggestions, it will allow me to create the next post on the L7 Machine Learning Blog sooner for folks. https://l7.curtisnorthcutt....

Posted on 2020-10-23 19:20:24
Curtis Northcutt

Very interesting. What is missing from the various TRX40 pro motherboards??

Posted on 2020-10-22 23:43:41
Ernst Stavro Blofeld

No RDIMMs & no IPMI? Weak MKL also. Note also that in a quad-gpu setup, the mb has to provide 300W (75x4) through the slots. Allow another ~350 for the cpu and other systems. I don't know, but that's a bit too much for two EPS connectors. Note that Epyc boards do have a dedicated 8-pin connector to be used if you install more than 3 gpus (my EPYCD8-2T has it, anyway).

Posted on 2020-10-23 09:13:08
Zain Fadhil

Gigabyte Designare TRX40 could handle similar setup & its a PCIe 4.0 with 40gb/s TB card. I am sure the Asus MB was not the best option!

Posted on 2020-10-23 17:10:03
NoOneWantstoWatchYouStream

What about Epyc? And I'd like to see testing regardless to see if the PCIe 4.0 and the increased lanes per slot offered by Threadripper/Eypc offers any performance gain over the Intel PCIe 3.0 system with fewer lanes

Posted on 2020-10-23 00:05:46
Ernst Stavro Blofeld

I bet performance gains with pcie 4.0 would be scarce, if at all.

Posted on 2020-10-23 08:58:27
Yusuf Umar

How about Asrock TRX40 Creator? I've seen several builds using this with quad GPU on PCPartPicker

Posted on 2020-10-23 04:18:07
Ernst Stavro Blofeld

True, it has the correct layout and it's cheap. The main limitation of TRX40 setups is, in my opinion, their inability to use RDIMMs *and* the weaknesses of any amd processor at MKL, which is still important for many tasks.

Posted on 2020-10-23 09:08:35
Ernst Stavro Blofeld

I'd like to know such reasons. I don't like them any better than you (they all look game-ish and lack IPMI) but still I'd like to know if there are some other solid reasons.

Posted on 2020-10-23 08:57:33
Vlad N/A

I would like to know what penalty you get in terms of added heat with four cards compared to just one in this system. As I understand it, adding more cards creates heat that increases the temp of neighboring cards. How much added heat would you get after installing three more cards?

Posted on 2020-10-22 21:24:14

It definitely heats things up - that is why with one GPU you only saw a peak GPU temp of 67C, but with four it ranged from 73-80C (hottest card on top).

Posted on 2020-10-22 21:31:55
Ernst Stavro Blofeld

Maybe removing the drive cage could help. It blocks the airflow from the front fans.

Posted on 2020-10-23 08:59:11
Ernst Stavro Blofeld

I really wonder how a standard-sized blower cooler can manage to keep one card at 67C max, while pascal/turing cards with the same cooling system hit their throttle temp more often than not. Did you measure the rpms?

Posted on 2020-10-23 09:06:49
Zain Fadhil

and this a 2 slot card not a 3 slot one!

Posted on 2020-10-24 12:03:08
NoOneWantstoWatchYouStream

I would have thought an Eypc system would have been the perfect CPU for this (especially something like the Eypc 7H12) with x128 PCIe 4.0 lanes and this the ability to offer every videocard PCIe 4.0x16 bandwidth.

I know the Asus board used offers x16/x16/x16/x16, but that's through the use of not one, but two PLX chips and when all four cards are being utilized simultaneously that has to come into play considering the Intel CPU only offers x48 lanes with x8 being split between the Chipset and m.2 slot.

Posted on 2020-10-23 00:14:58

You certainly could get more performance than this config - In fact, we are seeing higher performance with 3 RTX 3090 even on a Ryzen platform than we did in this test. Honestly, we were more concerned about power draw and thermals in this testing, and the setup we used has been our tried-and-true quad GPU setup in the past. If it ended up that quad RTX 3090 was feasible for us to offer in our systems (which it's not), we probably would have looked into what we had to do to get Threadripper or EPYC working.

EPYC is just more difficult since there are very, very few options (if any) for quad GPU that also has workstation features like audio, Wifi, and plenty of USB ports.

Posted on 2020-10-23 00:41:03
Ernst Stavro Blofeld

I have nothing against pcie switches, except that they waste almost 50W each.

Posted on 2020-10-23 09:19:58
Curtis Northcutt

Hi Matt, this is a fantastic post. The main thing I'm surprised about is that you obtained 4 of the turbo blower-style cards. I haven't seen them anywhere, and I've been checking everyday. We need to purchase these for our academic institution to do research -- any pointers would be very helpful. cgn@mit.edu

Posted on 2020-10-23 00:43:57
Phúc Lê

I haven't found any information about the price for this blower model yet. Do you think it would be more expensive than the 1500$ of founder edition?

Posted on 2020-10-25 14:52:28
Lance Moody

Wondering if I bought the first 3 3090 system from you? Very excited to get it!

I bought my quad system 2(2/1080's and 2 Titan Pascals) almost exactly 4 years and trusted Puget again for my new system.

Posted on 2020-10-23 03:12:30
Curtis Northcutt

Amazing work Matt!!!! Which case did you use to fit two PSUs? And what is the wattage of the two PSUs?

Posted on 2020-10-23 04:55:17
Ernst Stavro Blofeld

It seems to be the Fractal XL. But afaik it does not have dual-psu capabilities (I may be wrong).

Posted on 2020-10-23 09:12:10
Ernst Stavro Blofeld

Now it would be interesting to know how the hell had you been able to get your hands upon four of these cards. I searched everywhere and even wrote to gigabyte, with no results.

Posted on 2020-10-23 09:01:38
Ernst Stavro Blofeld

Just another thing.. Would you test such configuration with a big Transformer? The whole deep learning community all around the world is craving such a benchmark, but no one has provided it as of yet.

Posted on 2020-10-23 09:02:32

Yes please, I’m also very interested in a solid Deep Learning benchmark with this config.

Posted on 2020-10-24 03:41:03
Zarck

What does it look like with Boinc or Folding @ home?

https://www.reddit.com/r/BO...

Posted on 2020-10-23 10:56:42
Daryl Rybotycki

Thank you for thinking outside the box.

I hope you can sell more than a few of these Beasts!

Posted on 2020-10-23 17:32:10
Ben Smith

Great article, and thorough, Matt. Did you limit GPU power to 300W using NVIDIA Control Panel UI or a Gigabyte UI? Which settings?

Posted on 2020-10-23 17:38:46

We just used MSI Afterburner for this test. I'm sure there are better ways if you wanted to do this long-term, but we aren't planning on selling anything with power limits enabled, so we just went with what we knew was easy to use.

Posted on 2020-10-23 17:46:30
Ben Smith

Understood. It would be easy to let it go at this, but it sorta captivates my imagination because power is a cost, and this seems like an opportunity - maybe even worth exploring in an article. Cost incentive is plain, but there is also in the back of my mind a pointing the finger at NVIDIA (for me anyway) that - rather than making sound architectural / engineering breakthroughs with this gen of cards, they simply wildly increased TDP. This "cop out" concern as speculative as I haven't researched what if any architecture advances were made, but it's there. Any thoughts? That said you'd probably want to test a wide variety of scenarios (and power control apps) with power limitations active before doing so long term. Thanks for your prompt reply and good work.

Posted on 2020-10-23 18:00:44

There are definitely major architecture advances with the 3000 series cards. In fact, if you look at the performance versus power draw, the new cards are actually more efficient than the 2000-series. They just also increased the power requirements so they could achieve even higher performance. I mean, the RTX 3090 (350W) is about the same performance as two 2080 Tis (2x 250W = 500W)!

That is one of the reasons why I'm excited to see what the RTX 3070 has to offer. It very well may beat the 2080 Ti, along with having decently lower power draw.

Posted on 2020-10-23 18:08:28
Ben Smith

Thanks Matt - good to hear! And one 3090 ≅ 2 x 2080 Ti! Impressive. Thanks again.

Posted on 2020-10-23 18:15:52
David MacLeod

Great review, are you able to confirm if these cards have P2P and RDMA disabled, like the previous generation (RTX Titan, 2080 Ti etc.)

Posted on 2020-10-23 19:07:47

I'm not sure, but that might be in our HPC-focused articles:

https://www.pugetsystems.co...
https://www.pugetsystems.co...

Posted on 2020-10-23 19:11:37
任柔

matt maybe we can take a Infrared photo, use it optimize the air duct

Posted on 2020-10-24 06:49:46
Zhenlin

Thanks for sharing this test! I wonder if you could try any deep learning benchmark on this setup. Quad 3090 server will be highly needed by DL researchers if it works.

Posted on 2020-10-24 16:16:12

This is at the same time very interesting and very entertaining: WELL DONE PUGET!

Posted on 2020-10-26 17:45:42
Felix Kütt

Hi Matt, what about the heat output? How much did the room heat up during load?

Posted on 2020-10-29 07:26:53

It puts out the same amount of heat that a 1700W space heater would. So in a small office, it will definitely get it nice and toasty in the winter pretty quick, or turn it into a sauna in the summer.

Posted on 2020-10-29 16:55:41
Felix Kütt

Right, so as I expected. Here in Estonia in the winter it would be pretty great, no need for an extra heater then, but might be a bit of a pain in the summer. In the meanwhile somewhere in the south it would probably need a air con in the room to make it viable any time of the year. For a professional freelance 3D artists(for example) working from home office I suppose the air con might hopefully be a tax write-off though. But something to keep in mind no doubt.

Posted on 2020-10-30 14:18:00

GDDR6 is supposedly less power hungry on the new "quadros" then GDDR6X is on the GeForce cards, probably going to have to go that route, for better multi-gpu thermals/power draw. Hmmm.

Posted on 2020-10-29 15:02:20
Felix Kütt

Probably, yes. But those Quadros most likely have more memory on them canceling out the difference.

Posted on 2020-10-30 14:20:54
Zain Fadhil

@Matt Bach
@Gigabyte website the 3090 blower card warranty period is not mentioned! all other 3090s card has 4 years warranty.
I contacted Gigabyte and was informed that the blower type ones has 3 years warranty. This card has the slowest
core clock @ 1695MHz (to control the temperature) also usually noise levels in blower types are much higher. Is this type of
cooling is really good for a 350W GPU? specially when stacking 3-4 gpus and doing long animations? are we going
to experience throttling after sometime? I am planning to buy 4 of these gpus & dont want to regret it! My other option will be
using dual 3slot 3090s directly on motherboard then the other two gpus will be on riser cables but this solution will consider
a wider case to host the gpus cage. Power draw is not a problem as our office has upgraded power outlets

Posted on 2020-10-30 12:48:18
Felix Kütt

The lower clocks enable lower power draw(assuming vcore is adjusted to be lower too) thus causing lower heat produced by the cards and less stress on power supply. You can always adjust the clocks and vcore(core voltage) yourself if you are worried about it. But buying from puget(edit: a whole pre-built system I mean) I'd expect them to make doubly sure that it's fine. So I wouldn't be too worried.

Posted on 2020-10-30 14:24:44
Zain Fadhil

My concerns are the durability (at least 5 years) & performance stability (overheating) of these cards when used under load for long periods of time. Its just the idea of a +350W being squeezed in a 2slot gpu.

Posted on 2020-10-30 15:41:06
Felix Kütt

Well, Voltage and frequency don't scale linearly. Even a small drop in frequency can enable a significant drop in voltage drawn, and the thermal rating(that wattage thing) is defined by voltage drawn. It's that power that has to pass through that causes the heating. But of course too aggressive under-clocking can end up being damaging in the long term as well.

Again though. It is not as if they are designed to be used only for a month or two.

Posted on 2020-10-31 08:51:48
vinsin

What configuration is better - buying 2 RTX 3080 or a single RTX 3090 in term of performance related to DL?

Posted on 2020-11-03 11:54:09

We have a whole article on that actually! https://www.pugetsystems.co... . Short answer is 2x 3080 is faster, but has less VRAM and it is of course two cards versus one which means more complexity, more chances for something to go wrong, etc.

Posted on 2020-11-03 20:01:57
dv

Is this setup even possible with AMD? I'm having a hard time finding a motherboard that will work with TR gen 3 and fit/support 4 GPUs

Posted on 2020-11-05 16:22:11
Curtis Northcutt

4 GPUs won't work with the 30-series. Ignoring the fact that it requires 2000W (or more) of power and most affordable PSUs stop at 1600W, the issue is that the current versions of the RTX 30-series use up 3-pins or are not-blower style. Until blower style are readily available, they will overheat and melt everything. You should use an EPYC processor and motherboard btw (not threadripper) if you want to use 4 of these GPUs. My blog has more info: https://l7.curtisnorthcutt.... I'll be updating with a post about the 30-series in the coming months.

Posted on 2020-11-05 16:27:10

That is the situation we are in as well, actually. Threadripper should be great for quad GPU (ignoring power issues), but there just are no boards that work all that well. I think Gigabyte has one, but it is massive and wouldn't work in any of our chassis I believe: https://www.gigabyte.com/Mo... .

That is one of the reason we are still doing (or rather, we were doing with the 2000-series) quad GPU with Intel-based platforms. Now that it is looking like triple GPU is as far as we will likely to be comfortable selling it isn't a big deal, but it sure would be nice to have something like a workstation-class EPYC board, or a not over-sized TRX40 board that could do quad GPU.

Posted on 2020-11-05 16:35:52
dv

Thanks Curtis, I've been using your blog - it's very helpful! Matt are you planning on offering a 3x GPU with a TR build at some point?

Posted on 2020-11-05 17:17:26

We actually already are! It is on the base Threadripper TRX40-E platform: https://www.pugetsystems.co... as well as on our solutions pages where it makes sense - like for DaVinci Resolve: https://www.pugetsystems.co...

We don't have it on any HPC systems yet, but that is something we are likely to do soon. That isn't as much my area, but I believe there are some concerns about AVX on AMD, and we are still seeing if we can do some sort of power limited quad RTX 3090 that would be reliable enough for us to be comfortable offering. Nothing stopping you from using Threadripper + 3x RTX 3090 if AVX or whatever other Intel advantage that may exist doesn't impact your work though!

Posted on 2020-11-05 17:24:19

I remain interested in seeing how RED RAW decode/debayer scales across multiple GPUs (if at all). The DaVinci benchmarks are a good overall proxy for illustrating performance, but I really wish the Resolve benchmarks were more explicit about 8K RED input feeding multiple 4K 12-bit HDR monitors (which describes my setup).

Posted on 2020-11-10 14:24:32
pete

Great work Matt! Any thoughts on how to get this rack mounted?

Posted on 2020-11-10 22:11:45
Ernst Stavro Blofeld

I have a 4-gpu system mounted in a Chenbro RM41300-F81. Works well, supports up to ssi-EEB mobos (in my case, Supermicro X11SPA-TF, Xeon Platinum 8260M, and four Turing gpus). Good thermals too.

Posted on 2020-11-11 08:09:48
Dmitrii Karpov

Thanks! Its excellent article! Does it make sense to use one 3090 Turbo for rendering or it will be better to consider the option with a fan?

Posted on 2020-11-11 23:27:07

I would use one of the "normal" cards if you are just using a single card. These blower-style are louder than the others, and really only beneficial in terms of cooling when you are stacking cards.

Posted on 2020-11-11 23:41:01
Dmitrii Karpov

Could yo tell me, please, why performance of this card on tests (per single card) less then same tests on another articles on this site?
I am going to buy this card (3090 turbo) to use 2 cards in stack in the future and wondered does it work fine except for noise problems.

Posted on 2020-11-12 12:20:24

It is hard to say since the platform is different than what we will be using for triple RTX 3090 configurations. This was more of a "will it work" set of testing than really trying to establish peak performance. I know there is some additional testing going on right now, so hopefully we will have articles with better performance numbers soon.

Posted on 2020-11-12 17:24:56
Dmitrii Karpov

Wish you success in future tests! )
Will the triple RTX 3090 configuration still use 3090 turbo?

Posted on 2020-11-12 18:54:01

Yea, anything more than two GPUs will use the Gigabyte Turbo card. Even two cards we will probably use it since getting all that heat out of the system as quickly as possible is incredibly important.

Posted on 2020-11-12 19:51:14
Dan

Matt Bach would the team be able to re-do this test with the https://www.evga.com/produc... ? I'm curious if subbing even one of these in makes a difference.

Posted on 2020-11-20 17:28:59