
Autodesk 3ds Max 2017 GeForce GPU Performance

Written on August 5, 2016 by Matt Bach

Introduction

When working in an application like 3ds Max, the ability to smoothly navigate a scene is critical to the creative process. Smaller scenes tend not to be a problem for even basic video cards, but as the complexity of the scene increases, so too does the demand on the video card. The difficult part is determining which video card you need in order to achieve a smooth framerate without blowing your budget on a card that is significantly more powerful than necessary.

When it comes to which video card to pick, most 3ds Max users will find themselves choosing between a professional-grade Quadro card and a consumer-grade GeForce card. Autodesk has historically made their stance clear that they only fully recommend and support professional cards, but this is complicated by the fact that their own Graphics Hardware Certification document lists multiple GeForce cards as tested with no problems (although they are still not officially certified). In addition, you will find a great number of 3ds Max users, even on Autodesk's own forums, who are using GeForce cards without issue.

In this article we will be benchmarking a number of GeForce cards - including the new GTX 1070 and GTX 1080 - to see how well they perform in 3ds Max. While we will not be directly comparing them to Quadro cards here, we do have a separate article that focuses on Quadro performance, which you can compare these results against if you wish.

Test Setup

Since the CPU and overall platform can make a difference, we used two different systems to compare the performance of the different GeForce video cards. The first is a Z170 system with a Core i7 6700K, which is typically what we would recommend for a general 3ds Max workstation. The second uses a Core i7 6950X, which has a higher CPU core count, making it better for rendering, although it should be a bit slower for general design and animation work due to its lower operating frequency.

The different video cards we will be testing are:

- NVIDIA GeForce GTX 970 4GB
- NVIDIA GeForce GTX 980 4GB
- NVIDIA GeForce GTX 980 Ti 6GB
- NVIDIA GeForce GTX Titan X 12GB
- NVIDIA GeForce GTX 1070 8GB
- NVIDIA GeForce GTX 1080 8GB

To help with consistency - and since the benchmarks ran for several days - we programmed a custom script using AutoIt to start 3ds Max, load the relevant project, change the view mode (Wireframe, Shaded, Shaded w/ Edged Faces), and then run a script that rotates the view while recording the FPS (frames per second) of the viewport; a rough sketch of that viewport-FPS measurement step is included after the model list below. We tested with three different models that give us a range of poly and vert counts, along with features such as high resolution textures:

Woman 003 (Copied 25 times)
172k Poly, 90k Verts

3ds Max 2017 Sample Files
AWom0003-CS-US.max

P47 (Copied 252/504 times)
8.6/17mil Poly, 4.3/8.6mil Verts

3ds Max 2016 Tutorial Files
P47.max

Benchmark Graphics
1.9mil Poly, 1.2mil Verts

Ze da Tripa on CGArchitect Forum
benchmark_graphics.max
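
To give a rough sense of what the FPS-measurement half of this looks like, below is a minimal sketch written against 3ds Max's Python scripting (the pymxs wrapper around MAXScript) rather than our actual AutoIt script, which drives the full UI and is not reproduced here. The scene path, frame count, and rotation step are hypothetical placeholders, it assumes the active viewport is a standard perspective view rather than a camera view, and timing forced redraws this way only approximates the viewport FPS counter we recorded.

```python
# Minimal sketch: approximate viewport FPS by timing forced redraws while
# orbiting the active (non-camera) view. Intended to run inside 3ds Max's
# bundled Python interpreter via pymxs; the scene path, frame count, and
# rotation step are hypothetical placeholders, not the values we used.
import time
from pymxs import runtime as rt

SCENE_FILE = r"C:\benchmarks\P47.max"   # hypothetical path to a test scene
FRAMES = 200                            # number of redraws to time
STEP_DEG = 1.5                          # orbit step per redraw, in degrees

rt.loadMaxFile(SCENE_FILE, useFileUnits=True, quiet=True)

start = time.time()
for _ in range(FRAMES):
    # Nudge the active viewport around the world Z axis, then force a
    # redraw so the GPU actually has to draw the new frame.
    rt.viewport.rotate(rt.quat(rt.angleaxis(STEP_DEG, rt.point3(0, 0, 1))))
    rt.redrawViews()
elapsed = time.time() - start

print("Approximate viewport FPS: %.1f" % (FRAMES / elapsed))
```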

Results - Woman 003

[Charts: viewport FPS on the Intel Core i7 6700K and Intel Core i7 6950X test systems]

For our first test, we are using the Woman 003 model from the "aXYZ HighRes Characters" folder in the 3ds Max 2017 sample files. This model uses high resolution textures and has a relatively low poly and vert count (only 172k and 90k respectively).

This is a fairly intense scene, as evidenced by the fact that we only saw about 30-35 FPS across the various video cards we tested. Interestingly, the video card didn't seem to make a very large difference. On the Core i7 6700K system, the difference between the fastest and slowest card was only about 2%. The Core i7 6950X system showed a somewhat larger spread, with the fastest cards (GTX 1080 and GTX 980 Ti) about 8% ahead of the slowest (GTX 970 and GTX 1070). Either way, such a minor difference suggests that this scene is largely limited not by the video card but by something else in the system, such as the CPU or RAM.

Results - P47 (252 Copies)

[Charts: viewport FPS on the Intel Core i7 6700K and Intel Core i7 6950X test systems]

This model uses only standard shaders, and possibly because of this we are seeing some nice variety in performance. While viewing the model in Wireframe mode resulted in an extremely high framerate across the board (higher than the eye can perceive), both Shaded and especially Shaded with Edged Faces showed differences that you might actually be able to notice.

Starting with the Shaded results, we saw a jump in performance of about 20% between the GTX 970 and the rest of the cards we tested. While there was some variance between the other cards in Shaded mode, there was no real pattern to it. Shaded with Edged Faces, however, showed more of a difference between the cards. The GTX 970 was still the slowest (about 30-40% slower than a GTX 980), although this time there was a real performance benefit to using the GTX 980 Ti and GTX 1070/1080. Compared to the GTX 980, these cards were an additional 15% faster on the Core i7 6700K system and 30% faster on the Core i7 6950X system.

Results - P47 (504 Copies)

[Charts: viewport FPS on the Intel Core i7 6700K and Intel Core i7 6950X test systems]

Doubling the number of copies of the P47 model is a quick and easy way for us to test a large number of polys and verts. Interestingly, while the raw results are lower than in the previous test, the relative performance between the cards is very similar. Once again, in Shaded mode the GTX 970 is the slowest, but the other cards all perform roughly the same. The higher-end cards like the GTX 980 Ti and GTX 1080 are a hair faster than the others, but not by any amount you would likely be able to notice.

Using Shaded with Edged Faces, the GTX 980 was about 36% faster than a GTX 970 on the Core i7 6700K system and about 20% faster on the Core i7 6950X system. The higher-end cards were all about 20% faster than the GTX 980 on the Core i7 6700K system and about 30-40% faster on the Core i7 6950X system. Out of these cards, the GTX 1080 was a hair faster than the others, but only by a few percentage points.

Results - Benchmark Graphics

[Charts: viewport FPS on the Intel Core i7 6700K and Intel Core i7 6950X test systems]

Our final test gave us a pretty high framerate across the board, which makes it not quite as accurate as the other tests, but we wanted to include it because it was one of the few scenes we found on the web that was created specifically for testing viewport framerates in 3ds Max.

There are a few oddities in these results (such as the GTX 1070 slightly outperforming the GTX 1080), but overall they are pretty similar to what we saw in the previous tests. Again, the GTX 970 was the slowest by a decent margin - up to 20% in some cases - and we still saw little evidence that you would notice a performance gain from using a card faster than a GTX 980.

Conclusion

Summarizing all our results, we saw the following performance gains over the GTX 970 (which was consistently the slowest card) on the two systems we tested with:

Average % faster than GTX 970:

                         GTX 980   GTX 980 Ti   GTX Titan X   GTX 1070   GTX 1080
                         4GB       6GB          12GB          8GB        8GB
Wireframe                18%       25%          21%           23%        21%
Shaded                   15%       18%          15%           15%        17%
Shaded w/ Edged Faces    19%       35%          32%           36%        37%
Average                  17%       26%          23%           25%        25%

Averaging results to this level definitely loses some of the finer nuances, but it is a great way to see at a glance what kind of performance you can expect from the different cards. In fact, looking at it this way shows a couple of things that were not quite as obvious when examining the results individually. For example, while we knew that the GTX 980 was slower than many of the other cards, we didn't quite realize that the gap was 6-10%. In addition, the fact that the GTX 1070 and GTX 1080 were on average identical surprised us - although the GTX 1080 is still faster for Shaded and Shaded w/ Edged Faces.
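
For anyone who wants to build a similar summary from their own measurements, the math is simply (card FPS / GTX 970 FPS - 1) for each test, averaged per view mode. Below is a minimal sketch of that bookkeeping in plain Python; the FPS figures in it are made-up placeholders used only to show the arithmetic, not our measured results.

```python
# Minimal sketch of the "% faster than GTX 970" summary. The FPS numbers
# below are made-up placeholders, not the measurements behind the table above.
results = {
    "Wireframe":             {"GTX 970": 210.0, "GTX 980": 248.0, "GTX 1080": 254.0},
    "Shaded":                {"GTX 970":  58.0, "GTX 980":  66.5, "GTX 1080":  68.0},
    "Shaded w/ Edged Faces": {"GTX 970":  24.0, "GTX 980":  28.5, "GTX 1080":  33.0},
}

def pct_faster(fps, baseline_fps):
    """Percent improvement of fps over the baseline card's FPS."""
    return (fps / baseline_fps - 1.0) * 100.0

for mode, fps_by_card in results.items():
    base = fps_by_card["GTX 970"]
    gains = {card: pct_faster(fps, base)
             for card, fps in fps_by_card.items() if card != "GTX 970"}
    print(mode, {card: "%.0f%%" % g for card, g in gains.items()})

# The "Average" row in the table is then just the mean of each card's
# per-mode gains.
```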

Overall, there are a few conclusions we can come to based on our testing:

  1. If you are not working with scenes that have at least 6-7 million polys, even a mid-range GeForce GPU (like a GTX 970 or the new GTX 1060) should give you a framerate that is much higher than what the human eye can physically see.
  2. If you use high resolution textures (like the "Woman 003" test scene), you are much more likely to become CPU or RAM limited so the speed of your GPU shouldn't make a huge difference. This is why all these GPUs performed almost identically for that test.
  3. If you do work with very large scenes, upgrading to a GTX 1070 can give a great boost to performance. Not only is it less expensive than a GTX 980 or 980 Ti, it also has more VRAM and a newer overall architecture. If you need even more performance, a GTX 1080 can give you a bit more speed in Shaded and Shaded w/ Edged Faces modes, but not by much.

One thing we want to make clear is that using a GeForce card in 3ds Max is not something we generally recommend (if only because Autodesk's stance is to officially support only professional-series cards). So for most customers, we recommend a Quadro card and suggest using our Quadro Performance article to help determine which card to use. However, we understand that there are circumstances where it is worth the risk of not having software support from Autodesk in order to use a GeForce card. In those isolated cases, our testing has shown that - as far as we can tell - there is no performance hit to using GeForce, and you can easily meet or exceed the performance available from a similarly priced Quadro card.

Tags: GeForce, 3ds Max, Performance
Turismo

how about rendering with geforce?

Posted on 2016-11-14 05:38:08
GKadas

How about a proper archviz scene test instead of stupid copies?
Export a scene from an architect made in autocad or rhino...
I want to see a bigger scene like an airport with forest pro and loads of props.

These tests are not giving me enough detail to make a decision on spending my money on a gpu where the price difference is like 600£ between the different cards.

Posted on 2016-12-20 13:27:03

Copies do a decent job of pushing the hardware, but I agree that a scene from AutoCAD, Rhino, or Revit would be a great addition. This was our first round of testing in 3ds Max and since we had to develop it from the ground up we started with relatively simple tests. One of the issues we ran into is that we traditionally rely on our customers to provide us with projects to use in our testing, but for architecture (and engineering in general) there seems to be a lot of red tape preventing our customers from providing us with test projects. They want to help, but they have to follow their company rules and are not allowed to provide data to anyone outside their company.

If you have a project that you think we should use in our next round of testing, toss me an email at labs@pugetsystems.com . I'd love to get my hands on some real-world data to use in our testing.

Posted on 2016-12-20 18:35:07
human

What about Quadros for viewports?

Posted on 2017-02-17 10:14:05
human

nevermind i'm just blind :)

Posted on 2017-02-17 10:15:06
moayad

https://www.pugetsystems.co...

Posted on 2017-11-17 00:42:49
Muhammad Farooqi

.....One thing we want to make clear is that using a GeForce card in 3ds Max is not something we generally recommend (if only because AutoDesk's stance is to only officially support professional-series cards) ...
then why are the GTX 690, Titan, Titan X and Titan Z in the recommended and certified range?
https://knowledge.autodesk....

are these 4 cards made with some special procedure?

Edited... I just read in the first paragraph that you did mention the Autodesk recommendations... they have listed GeForce cards..

If we compare both.. GeForce results and "THE PROFESSIONAL" grade Quadro cards.. there is a very very very minor difference. But I notice GeForce is showing faster framerates than Quadros..

Quadro M6000 24GB = https://www.pugetsystems.co... is quite slower than
GTX 980 4GB = https://www.pugetsystems.co...

and price... Quadro M6000 24gb = $5000
and Geforce gtx 980 4gb = $401

Are you kidding me!!!????

Is that just a myth??? or market trends.. or commissions by autodesk recommendations..

Posted on 2017-02-23 11:26:24

My best guess why Autodesk pushes the Quadro line is driver stability more than anything else. On the GeForce side, NVIDIA pushes driver updates pretty often to continually optimize for different games, and these updates can at times contain bugs that require a further driver update. Autodesk probably doesn't want to deal with the potential support calls from users who have a problem caused by a driver bug, so they stick to the more stable (or at least what they believe to be more stable) Quadro cards and drivers. It means their users have to spend more money on hardware to get the same performance, but they are a software company, so hardware cost probably isn't terribly high on their list of concerns.

Posted on 2017-02-23 18:23:41
Muhammad Farooqi

$5000/$400 = 12x 980 cards.. let's halve it to 6x 980 cards.. and put the cost of the remaining 6 cards toward workstations. Create a rendering farm, and the same video that a Quadro might produce in 60 hrs will be ready in 10 hrs. Who leads!!!!!

That $4,600 is still too high a price for just a stable driver... while I think frequent driver updates are mostly better than being stuck with one!

I own two cards, a 780 and a 690... and I was dying to sell them and get a Quadro.. but now I realize.. thank God I didn't take that step. I would definitely have gotten an even cheaper Quadro (not in terms of price, in terms of performance)...

Posted on 2017-02-23 22:22:02
Niko Nikolov

Your test is only half finished for me. Good, you have a big scene with multiple mesh copies and the camera flies around to test the viewport performance. BUT, while you are in that heavy scene, try to select one of the many meshes and manipulate/move its vertices or extrude. See how long it takes to do the task. Then repeat the same with a Quadro and I bet it would be instant.

Posted on 2017-03-24 23:58:40
kahhou

May I ask, can a GeForce 940MX run 3ds Max and AutoCAD?

Posted on 2017-03-23 12:37:06
Min AViz

Can you compare GTX vs Quadro? What is the difference?

Posted on 2017-03-27 04:09:10

We actually have several articles talking about Quadro and GeForce cards, but if you want the short story it is this: the Quadro series is NVIDIA's professional graphics card line, while the GeForce is the mainstream / gaming-oriented line. The Quadro cards cost more but have more specialized drivers, are more strenuously tested, and have more dedicated support. They are also more frequently certified by software manufacturers like Autodesk and Dassault to work properly with their applications. GeForce cards cost less, are updated more often (both hardware and drivers), and in many cases perform as well as Quadro cards - but sometimes they don't, and in applications where only a short list of cards are officially supported it is usually best to stick to that list.

If you'd like to read more, check out our articles: https://www.pugetsystems.co...

Posted on 2017-03-27 04:38:59
Jānis Veide

Just want to thank everyone at Puget for these tests! :)

Posted on 2017-11-17 07:05:14