

Read this article at https://www.pugetsystems.com/guides/164

AutoDesk AutoCAD 2013 GPU Acceleration

Written on October 10, 2012 by Matt Bach

Introduction


Edit 10/3/2013: Interested in how workstation cards perform in AutoCAD 2014? Check out our follow-up article: AutoDesk AutoCAD 2014 Professional GPU Acceleration


Unlike Adobe Photoshop and Premiere, AutoCAD 2013 does not have any specific features or effects that are GPU accelerated. Instead, almost everything that looks 3D uses the GPU in some manner. You still need to balance a powerful video card with a good CPU and RAM combination, but the video card you use for AutoCAD is an extremely important part of the performance equation.

Using the AutoDesk Certified Hardware webpage, we found that while almost every current NVIDIA Quadro and AMD FirePro video card is certified for AutoCAD, there are no current NVIDIA GeForce or AMD Radeon video cards certified.

In this article, we want to explore the performance differences between a wide range of different video cards, including the current generation AMD Radeon and NVIDIA GeForce cards that are not on the certified list. In order to do so, we will be using the Cadalyst 2012 v5.4 benchmark on AutoCAD 2013. This benchmark is a full system benchmark that tests 3D and 2D graphic performance as well as disk and CPU performance. For the purpose of this article, we will largely be focusing on the 3D and 2D results, but will also use the "Total Index" or overall score. More information on the benchmark and the individual tests can be found here.

Since visual aids are a great way to get a feel for a benchmark, we also recorded a single benchmark run using FRAPS to show exactly what this benchmark tests:

Test Setup

To make sure that the chipset and CPU did not affect our results, we used two separate testing platforms consisting of the following hardware:


To test a wide spectrum of GPUs, we tested the following cards (video driver version listed in parentheses):

NVIDIA GeForce (driver 306.23): GTX 580 1.5GB, GTX 680 2GB, GTX 670 2GB, GTX 660 Ti 2GB, GTX 660 2GB, GTX 650 1GB, GT 610
NVIDIA Quadro (driver 305.93): Quadro 4000 2GB, Quadro 2000 1GB, NVS 450 512MB
AMD Radeon (driver 12.8): HD 7970 3GB, HD 7870 2GB, HD 7750 1GB
AMD FirePro (driver 8.982.8.1): W9000 6GB, W8000 4GB, W7000 4GB, W5000 2GB
Intel (driver 8.15.10.2761): HD 4000 1GB


AutoCAD 2013 was configured according to the Cadalyst benchmark readme.txt and was run three times per video card to ensure accurate results. At the moment, neither AMD nor NVIDIA has an AutoCAD 2013-specific driver or plugin, so just the base driver was used.

Results

[Charts: 3D Graphics Index / 2D Graphics Index]

Before we examine the overall results, there are two things to point out. The first is something we did not even intend to benchmark: the Z77 system consistently scored higher than the X79 system. Considering that we used the fastest CPU currently available for each platform and identical RAM, we can conclude that the Z77 platform (using Ivy Bridge CPUs) outperforms the X79 platform (using Sandy Bridge-E CPUs) in AutoCAD 2013. Second, 2D graphics performance does not appear to be affected by the video card at all; while there is some variance in the results, it is minor enough to be normal testing fluctuation.

In the overall and 3D graphics scores, the NVIDIA cards performed roughly as we expected: the GeForce GTX 680 topped the charts, with the other GTX cards performing in line with their model numbers. One nice thing to see is that the previous-generation GTX 580 placed below even the GTX 660, which shows that the current generation of cards performs better in AutoCAD than the previous one.

The biggest surprise in our results is how poorly the AMD video cards tested. Relative to each other, the AMD cards are right where we would expect them to be, but they all fall well below their NVIDIA counterparts. Even the top-of-the-line AMD FirePro W9000 was only able to match the much cheaper NVIDIA Quadro 2000 and 4000. On the subject of workstation cards, the Quadro 4000 did OK, but we were surprised by how small a performance advantage it gave over the Quadro 2000.

The onboard Intel HD graphics did decently, actually performing better than the Quadro NVS 450. It's still near the very bottom of the charts, but the gap between it and the lower-end discrete cards was not as large as we expected. Don't expect to do any complex modeling with it, but if you just need it for very light tasks it should actually be able to keep up.

Desktop vs. Workstation Graphics

According to our benchmarking, the NVIDIA GTX cards are much faster than the NVIDIA Quadro or AMD FirePro cards, but what many do not realize is that FirePro and Quadro cards are not primarily about speed. The main feature of workstation cards is their double-precision performance, which allows the card to be many times more precise. As a rough comparison, consider the difference you would get computing the area of a circle with pi as just 3.14 versus 3.14159: the difference is small, but it can easily compound over time. In addition, the NVIDIA Quadro 5000/6000 and AMD FirePro W8000/W9000 use ECC memory for their video RAM, which is much more reliable than standard video memory.
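The circle-area analogy can be made concrete with a short sketch (plain Python for illustration, not anything AutoCAD-specific; the radius is an arbitrary example value):

```python
import math

r = 1000.0  # example radius, arbitrary units

area_rough = 3.14 * r**2       # pi truncated, like a low-precision calculation
area_precise = math.pi * r**2  # pi at full double precision

print(area_rough)                 # 3140000.0
print(area_precise)               # approximately 3141592.653589793
print(area_precise - area_rough)  # ~1592.65 square units of error
```

An error in the fourth significant figure looks harmless for a single operation, but it compounds across the millions of operations in a long modeling or simulation run.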

When gaming, precision is not really needed, since one small artifact every 1000 frames is not noticeable to the human eye. When you get into 3D modeling or simulation, however, one small artifact can cause big problems. In a professional environment, you want to be 100% sure that everything was completed properly the first time and that there are no small artifacts in the results. For example, if you are modeling a bracket that will be used in a million-dollar football stadium, you want to be 100% confident in the load simulation results. Workstation cards are really the best way to be confident in those results.

So the foremost question when it comes to desktop vs. workstation graphics is actually not which has the best benchmark performance, but whether speed or precision is more important to you. If you are a gamer who dabbles in 3D modeling, a GeForce or Radeon card will likely work great for you. If you are a professional who needs consistent results you can be 100% confident in, it is likely worth using the slightly slower Quadro or FirePro card for the assurance that there will be no minor problems with your model.

Another point to take into consideration is that NVIDIA Quadro and AMD FirePro cards are the only modern cards that AutoDesk has officially qualified for AutoCAD. Obviously the NVIDIA GeForce and AMD Radeon cards do work, but they technically are not endorsed by AutoDesk.

Conclusion

If you decide that a desktop-class card is suitable for your needs, the GeForce GTX 660 does a very reasonable job, almost matching the more expensive GTX cards. For the best possible speeds, the NVIDIA GeForce GTX 680 came out at the top of our 3D and overall benchmark results. The AMD Radeon cards unfortunately did not do as well as we hoped, with only the Radeon HD 7970 able to outperform even the low-end NVIDIA GT 610. At the moment, NVIDIA cards are the clear winner from a performance standpoint.


For workstation-class cards, the NVIDIA Quadro 2000 did very well, almost matching the speed of the Quadro 4000. It certainly is not as fast as the NVIDIA GTX cards, but when high precision is required, the Quadro 2000 is a great choice. Like the AMD Radeon cards, the FirePro cards did not fare well in our benchmarks; the FirePro W9000 is roughly equal to the Quadro 2000, yet is much more expensive.

So in conclusion, the NVIDIA GTX 660 is a great way to get close to the performance of the GTX 680 at a substantially lower cost. If you are a professional who needs consistent, precise results, the NVIDIA Quadro 2000 is a great balance of performance and cost.

Tags: AutoCAD
3drawing

Hi

Thanks a lot for this test, it's good! And it's very relevant....

But I see some problems in the test program, and I also have a bit of doubt about whether the computer is set up right with drivers. Did you install the full DirectX drivers from Microsoft, or only use the drivers from AutoCAD? Did you set up the graphics card right for AutoCAD?

For me, the 3D model in the test program is too small; it does not challenge the graphics card. That is also why we don't see any differences...
Besides that, it also uses DISPSILH (display silhouette in wireframe), which is only CPU accelerated, so it is not fair in a GPU test to turn DISPSILH on for shaded and realistic modes. You say it is testing in realistic mode and I do NOT see any realistic 3D. Yes, I know the "realistic" function is turned on, but there is still nothing realistic in the test!!?? And remember to turn DISPSILH off!

In the 2D test part: has WHIPTHREAD been set right in AutoCAD? It should be set for multicore support....?

It's like testing an old game on a very expensive graphics card and a less expensive graphics card: there is not really a big difference. Or a game that doesn't support multicore on a single-core computer versus a quad-core computer: there is also no difference. We have to turn on multicore support on the CPU (WHIPTHREAD), and "realistic" has to be used right for the GPU test.

Hope my comments give some ideas for updating the test program...

Best regards

Peter, a 3D operator on Autodesk programs....

Posted on 2012-11-18 09:49:15
joneru

I hope this review will be updated based on the 3drawing's comment for a more graphics intensive 3D model.

Posted on 2012-12-29 03:22:01
Titi95

thank you for this test!
Can you test the Quadro 5000 card to see how it compares?

thank you in advance!

Posted on 2013-01-09 15:51:12

We are planning on testing the Quadro 5000 and a few other cards (including Tesla and SLI/Crossfire configurations), but there are a few things we are waiting on before we run those tests. We also have quite a bit going on in-house at Puget at the moment, but if I had to guess when we will be starting that testing I would guess sometime mid-late February.

Posted on 2013-01-09 17:51:03
Meeglz

Hi! Can you test GPU acceleration in Autodesk Inventor?

Posted on 2013-01-11 09:54:22

Relative performance between the different cards should be pretty much the same for Inventor. Most of the difference between Inventor and AutoCAD (as I understand) has to do with things that are more CPU intensive. Load simulations and collision detection are almost exclusively CPU-based so a better video card won't help you much.

The one thing with Inventor is that you can have very large assemblies, and you may need a good GPU to display everything smoothly in realistic or shaded modes. Especially if you are displaying shadows, reflections, and complex lighting, a better video card definitely helps make rotating the model much smoother. But make sure you have plenty of CPU power to back it up as well; the smoothest transitions don't mean much if the CPU isn't able to keep up.

Posted on 2013-01-11 20:40:33
duf83d

Matt Bach said:

We are planning on testing the Quadro 5000 and a few other cards
(including Tesla and SLI/Crossfire configurations), but there are a few
things we are waiting on before we run those tests. We also have quite a
bit going on in-house at Puget at the moment, but if I had to guess
when we will be starting that testing I would guess sometime mid-late
February.

And where are the benchmarks of the Quadro 5000? I am still interested in the performance of AutoCAD accelerated by iGPUs and dedicated cards, both for regular consumers and for professionals (Quadro and something from AMD, I think).

Posted on 2013-06-22 09:45:41

We have a whole host of modern Quadro and FirePro video cards in now, for further testing of this sort, but it has been delayed due to the launch of several new technologies (Haswell, GeForce 700-series, new Intel SSDs, etc) which have been keeping our Labs department busy with product qualification and development.

Posted on 2013-06-23 22:23:36
Frank

Check Tom's for Inventor benchmarks. They are drastically different: Inventor uses DirectX, which is better for "gaming" cards, while AutoCAD uses OpenGL, which is more suited to "workstation" cards.

Posted on 2014-04-01 17:45:31
Titi95

It's good I did my test between the Quadro 5000 and GTX 680 (OC 4GB); the GTX works much better in AutoCAD 3D.

My next planned purchase, very soon, for a new workstation will be the GTX Titan ...

Posted on 2013-06-28 15:37:36
oriuken

Hi! I've just bought the GTX 660 Ti 2GB but still can't get a decent orbit in AutoCAD 2013, 2012, or 2007! (Yes, I even tried that one.) I have an i7 2600, 16GB RAM, 2TB HD. I installed the latest NVIDIA drivers, DirectX 11, and AutoCAD SP 1.1.
So I don't know what I might be doing wrong. Could you please help me? Thank you!

Posted on 2013-02-14 11:44:53
oriuken

also I have an asus P8P67-M PRO (P67 chipset).

Posted on 2013-02-14 11:54:48
Jon

Are the "points" that are in the graph linear?

If so, that would mean the NVIDIA GTX 680 is only 35% faster than an onboard HD 4000?!?!! A GTX 680 should be at least 10X (1000%) as fast if it were using the full potential of the card. To get only a 35% increase seems to imply that the GPU is used so little that it's barely worth investing in one at all.

Comparison chart for reference: http://www.notebookcheck.ne...

Posted on 2013-02-19 08:16:05

The graphs are indeed linear, but please keep in mind that the tests we used were not limited to testing just the graphics cards. They are also affected by the CPU, RAM, etc - and since those didn't change, we are seeing here only the effect of changing the video card on overall performance.

And in a way you are right: the GTX 680 wouldn't be worth it, if all you are doing is AutoCAD! However, you can get effectively the same level of performance from the GTX 660 for half the price, while going lower than that would start to impact performance negatively.

Also, on the point of a GPU being "barely worth investing in": you noted that you get up to a 35% increase, and that is very much worthwhile! Let's say you had a system that was $1000 without a video card, but you added a GTX 660 for $300. That is a 30% price increase, and for it you get a 35% performance boost; that is an excellent improvement!
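That reasoning can be written out as a quick back-of-the-envelope calculation (the $1000 and $300 figures are the hypothetical ones from this comment, not current prices):

```python
base_price = 1000.0  # hypothetical system price without a discrete GPU
gpu_price = 300.0    # hypothetical GTX 660 price
perf_gain = 0.35     # ~35% higher benchmark score vs. onboard graphics

price_increase = gpu_price / base_price  # 0.30, i.e. 30% more money
perf_per_dollar = (1 + perf_gain) / (1 + price_increase)

print(f"price increase:   {price_increase:.0%}")           # 30%
print(f"performance gain: {perf_gain:.0%}")                # 35%
print(f"relative perf per dollar: {perf_per_dollar:.3f}")  # ~1.038, a net win
```

As long as the performance gain outpaces the price increase, the ratio stays above 1 and the upgrade improves overall price/performance.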

Posted on 2013-02-19 17:50:45

That's a fair and valid point. We're just coming at it from different sides.

The AutoCAD-glass is half-full (or I'd argue 1/10th full ;-) perspective is that in the end there is a measurable increase which is a very true and appreciable gain.

I was just surprised to learn that the vast majority of the considerable power a GPU provides sits idle as AutoCAD is unable to take advantage of it. So, yes, my shock came from the GPU-glass is 90% empty point of view.

Another performance related point is that AutoCAD is also predominantly single-threaded. So in much the same way that a good chunk of the GPU remains untapped, if you have a multi-core system most of it will remain untapped by AutoCAD as well.

Autodesk Knowledgebase article: http://usa.autodesk.com/ads...

I also did not thank you for providing these excellent benchmark results! It's exactly the information I was looking for as I'm in a position to build a new system for an engineering friend that wants to optimize for AutoCAD use.

If you're looking for maximum performance at any price, then i7s, GTX680s and the like will, as the benchmarks illustrate, increase performance. But, in terms of most bang for the buck I don't feel the marginal gains in performance justify the price.

A screaming fast single core system would benefit AutoCAD the most. With that in mind, a mid range CPU with fewer cores that you can overclock would perhaps provide similar gains in performance for hundreds less.

If you decide to run more AutoCAD tests in the future (and I hope you do!) maybe adding a comparison between the equivalent of an i5-3570K/HD4000 vs i7-3770/GTX680 would be interesting.

Once again, I do appreciate the benchmarks as I was unable to find this information elsewhere!

Posted on 2013-02-19 19:50:54
Ívar Arason

I have run tests with several CPUs and GPUs: AMD 1055T, Intel 3570 and 3770. The GPUs are a GT640, GTX 660 OC, and Radeon 7850. The NVIDIA cards shine in comparison to the Radeon in all graphics capabilities, but the strange thing is that the PC with the i5 3570 and GT640 with a soft-modded Quadro driver performs almost the same as the 3770 with the GTX 660 OC card; the difference is negligible. I have also tested a laptop with a 3630QM and GT650M; soft-modding the Quadro driver for this laptop increased overall graphics performance a lot.

Posted on 2014-01-19 02:47:09
Stijn Liekens

I have a MacBook Pro Retina with a GT650M and would like to softmod it to a Quadro K2000M or K1100M. Any idea how to do so, and whether Mac OS will like it? (For Adobe CC and SolidWorks, on the Windows partition.)
Thanks in advance

Posted on 2015-12-18 14:40:18
Ívar Arason

I am not sure how Windows running on a Mac will react to a modded driver, but it is no problem to test it. I assume that you are intending to test this while running Windows in Boot Camp, but I am pretty sure it will not work if you are running Windows in a virtual machine like Parallels.

Posted on 2015-12-19 19:28:29
Stijn Liekens

I am indeed not planning to use a VM on the MacBook, but what I would like to know is how to softmod the GT650M into a Quadro K2000M or K1100M (could you perhaps send me your modded driver? I assume it will be the same, as this is the same card in a different laptop).
And big thanks for the reply

Posted on 2015-12-20 22:31:39
Ívar Arason

Windows running in a virtual machine on a Mac is not using windows drivers, so you can not use a modded driver for that setup. The drivers come from the virtual machine and the Mac operating system (Mac OS X)

Posted on 2015-12-21 12:32:58
Stijn Liekens

I AM running Windows installed via Boot Camp. And NO VM is used. So how do I mod the drivers?? As you seem to be the only person I have found who has successfully softmodded a GT650M. SO HOW ????

Posted on 2015-12-21 22:24:27
Ívar Arason

You need to download the driver setup package from NVIDIA for the Quadro card you want to mimic.
Then you need to find the Device ID of the graphics card in your computer; you find it in Windows. The ID is then used when you edit the "info" file (*.inf) in the driver setup package; when you see the .inf file you will immediately understand what this is about. This "mod" simply fools the setup package into installing the driver on what NVIDIA considers unsupported hardware, but actually that is somewhat bullshit, since the hardware in gaming cards and Quadro cards is essentially the same, i.e. the same GPU. But that is not the whole story: the much more expensive Quadro cards do differ, though not by much. The Quadros have different memory and double-precision calculation, but that is not important for most users.
After editing the .inf file you should be able to install the driver and have the operating system treat the graphics card as a Quadro card. I am not saying this will work on a Mac running Windows in Boot Camp, but it's worth testing; if not, you just install the appropriate driver again.
http://archive.techarp.com/...
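The mechanical part of the edit described above can be sketched in a few lines of Python (a hypothetical helper for illustration only; the actual file names, section names, and device-ID lines vary between NVIDIA driver packages, so check your own package before relying on any of this):

```python
def add_line_to_section(inf_text: str, section: str, new_line: str) -> str:
    """Insert new_line directly after the [section] header of an .inf-style file."""
    out = []
    for line in inf_text.splitlines():
        out.append(line)
        # .inf section headers look like [SectionName]; match case-insensitively
        if line.strip().lower() == f"[{section}]".lower():
            out.append(new_line)
    return "\n".join(out)

# Synthetic example: "SomeDeviceSection" is a placeholder, and the device ID
# is the GT 650M ID quoted elsewhere in this thread.
sample = "[SomeDeviceSection]\n%existing% = Section001"
patched = add_line_to_section(
    sample,
    "SomeDeviceSection",
    'NVIDIA_DEV.0FD5.00F2.106B = "NVIDIA GeForce GT 650M"',
)
print(patched)
```

The same edit can of course be made by hand in a text editor, which is what most softmod guides describe.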

Posted on 2015-12-22 01:02:14
Stijn Liekens

Ok, in theory it sounds quite simple, but in practice it is not.

BTW I'm only testing in Boot Camp WINDOWS, NOT OS X.

I have searched for .inf files and there are A LOT... So do you know which one I need to modify?

my GT650M's device ID is: NVIDIA_DEV.0FD5.00F2.106B = "NVIDIA GeForce GT 650M"

driver package: 361.43-notebook-win8-win7-64bit-international-whql, from NVIDIA

but I do not know what the Quadro K1100M or K2000M IDs are, or in which file(s) to modify them.

Do you know this?
Thanks in advance

P.S. Are you sure no hardware straps are located on the card?

Posted on 2015-12-22 15:57:25
Ívar Arason

Ok, in theory it sounds quite simple, but in practice it is not.

Well simple or complicated, computers are complicated and so is software that operates and runs this equipment.

Yes, this would only work in Bootcamp and at least not in Parallels.

The structure of the setup package has changed, you download the driver package and the download is a self extracting file that will extract a folder with all the files needed.

After extracting, you will find a file in the root of the package folder named ListDevices; you edit this file by adding the device ID of your NVIDIA card, just put it in all sections (I think).

Now you can try and see if the setup will install the drivers. If not then there might be some edits needed in the *.inf files but I am not sure.

The NVIDIA 650M has a GK107 GPU, and one of its counterparts is the Quadro K2000M (among several others), so you should most likely be able to use the K2000M driver, since it has the same GPU.

The driver package contains several files used by the operating system to handle the graphics card. When installing, the setup software simply checks whether a device ID in the list in the ListDevices file is compatible with the device ID found in the computer; if so, it will install in good faith. So to make the setup software have this good faith, you simply edit the list of compatible devices and voila, you should be done.

https://en.wikipedia.org/wi...

http://www.nvidia.com/downl...

Posted on 2015-12-22 16:50:18
Stijn Liekens

Ok, after some experimenting I found out only the nvamwi.inf had to be modded, and voila, now I have a Quadro K1100M. But after some testing I did not see an increase in performance in the SolidWorks benchmarks?? I will keep trying with other versions of the driver (i.e. performance, AutoCAD, etc.) and I will also try to install the K2000M drivers :)

Posted on 2015-12-22 23:36:52
Ívar Arason

For SolidWorks there are other hacks that you might want to take a look at. Well, the world is a mafia with one goal: to get into people's pockets, find their wallet, and punch a hole in it. In SolidWorks, Dassault decided to prevent RealView from working unless you buy the more expensive Quadro, but a clever guy has hacked this feature and made the so-called "realhack" that enables RealView in SolidWorks on NVIDIA gaming cards.
As for the NVIDIA GT650M card and AutoCAD: on my laptop with the GT650M there was a noticeable performance gain using the Quadro driver, but on a desktop the gain was negligible.
You mentioned "hardware straps": all devices have this feature. It is perhaps not a strap but an ID or fingerprint to identify the component; on graphics cards the ID is set by resistors whose resistance makes up the Device ID. Some guys have broken the code on the cards (NVIDIA at least) and some information is available on the internet, so one can replace the resistors on an NVIDIA gaming card and, so to say, convert it to its corresponding Quadro sibling. But this is not the whole story, since the memory also differs (at least as I understand it): the "pro" versions, Quadro (NVIDIA) and FirePro (AMD), use ECC memory similar to what is used on server-grade computers, and to my understanding this enables double-precision calculation, i.e. pro precision (in some cases). To what degree this matters is, to my understanding, not important for general use of design software.
So, the main difference in performance between gaming and pro graphics cards is, so to say, speed vs. precision. The gaming cards are optimized with drivers for speed in gaming and the pro cards are optimized toward precision, with drivers for precision; both versions have identical GPUs but different memories.

http://solidworks.burkesys....

Posted on 2015-12-22 23:56:56

This is actually a very timely discussion as we just launched a series of articles about performance in Solidworks. One focused on just Quadro cards: https://www.pugetsystems.co... and one looking at the performance of GeForce cards ranging from a GTX 950 all the way up to a GTX Titan X: https://www.pugetsystems.co...

We did a registry edit to enable RealView on the GeForce cards, but the big surprise was that using the "Shaded w/ Edges" view mode results in absolutely terrible performance with GeForce cards. When just using "Shaded" they did OK, but when you turn on edges even a Quadro K620 will often beat the highest-end GeForce cards. My guess is that it is something to do with the firmware on the Quadro cards rather than a driver-based optimization, so you would have to physically mod a GeForce card and flash the firmware in order to get performance equivalent to a Quadro card.

Just as a side note, the higher double precision performance of Quadro cards shouldn't be a factor here. If it was, the performance of GeForce cards would be even worse than it already is.

Very interesting that AutoCAD seems to work just fine with GeForce cards (often faster than Quadro), while for Solidworks it looks like using a Quadro card is an absolute must.

Posted on 2015-12-23 01:35:18
Ívar Arason

Hello Matt,
I am not an expert in graphics cards, but I find the results from the SolidWorks performance test quite interesting.
Just to discuss the hardware briefly:
For the Quadro cards in the test, one would expect an ocean between the performance results of all the Quadro cards after looking at the hardware specs, but the test results show much less difference in performance than one would expect. The K620 and the K2200 seem to be the same cards, but the K620 has DDR3 memory and the K2200 has GDDR5. The M4000 would be expected to be way above the former two. Just this comparison makes me wonder whether SolidWorks relies much on the graphics card in these tests at all, although the comparison with the GeForce cards implies that a comparison with Intel HD xxxx and AMD would perhaps clear the picture somewhat, although different card BIOS or firmware can play an important role.
Let's now turn to the GTX cards: the Titan X and the top-tier 980s should dwarf the 950 in performance no matter what one throws at them, but this is not the case in this test. A difference of less than 100% implies that this test is simply not showing correct results, or what?
The difference in the hardware of the top cards, both Quadro and GTX, is huge in all aspects: bandwidth, size of memory. Note that the two lower Quadros are PCIe 2.0 where all the others are PCIe 3.0.
But I admit, the firmware is a good candidate explanation, though I suspect there are also other issues.
The picture is a snapshot from this site https://en.wikipedia.org/wi...

Posted on 2015-12-23 03:11:53
Stijn Liekens

After installing the Quadro K2000M drivers I immediately saw a 25% decrease in render time, a nice bonus for changing 3 lines of text. But I will be looking into this hack you mention. And as far as ECC memory goes, I would love to have it, but that would involve soldering RAM, and I just don't have the equipment for that.

I'm also looking into changing the firmware, if possible, to squeeze a little more power out of the laptop.

Posted on 2015-12-26 23:21:27
Ívar Arason

(y) I look forward to hearing the results from replacing the memory, but don't say I said it is possible.

Posted on 2015-12-27 01:45:01
Leko

Guys,
I have switchable graphics. So I turned on the better graphics card, and even then, when I check the performance tuner log, it says that I'm using the Intel HD 3000 graphics....
Any help? Thanks

Posted on 2013-03-13 21:26:07

Since you have an Intel CPU, I'm going to assume you have an NVIDIA video card you are trying to use. What I would recommend is going into the NVIDIA Control Panel, then "Manage 3D Settings", then "Program Settings". From here, select AutoCAD from the dropdown (or add it if it is not in the dropdown). Then under option #2, set the graphics to the "High-Performance NVIDIA Processor" and hit apply. This should make your system use the NVIDIA graphics whenever AutoCAD is running.

Posted on 2013-03-13 21:44:25
Leko

Thank you for your reply Matt.
Unfortunately, I am using AMD Radeon 6770M.
Thank you

Posted on 2013-03-14 01:41:04

Ah, OK. In that case, check out this document: http://h10025.www1.hp.com/e... . It's technically for HP laptops, but the AMD driver should be the same no matter what laptop you have. Specifically, check out the "Changing the switchable graphics settings in Dynamic Mode" section.

Posted on 2013-03-14 18:28:37
Leko

Thank you ! :)

Posted on 2013-03-14 21:13:17
Guest

Hi, we have several Quadro 2000s and 4000s and have never been able to do serious work with AutoCAD DWGs containing 3D solids, even very small ones. We discussed this with NVIDIA engineers and Autodesk, and it looks like there is a driver problem; NVIDIA Quadros are simply not suited to AutoCAD 2013.

Posted on 2013-04-07 10:11:35

That is very strange, as both the Quadro 2000 and 4000 are on the official certified hardware list for AutoCAD 2013:

http://usa.autodesk.com/ads...

In fact, not only are they certified but they are also in the 'recommended' category. Further, our benchmark testing above indicated that they work pretty well too.

Have you made sure you are using the specific driver release that AutoCAD was certified on? The link above has details, if you click on the names of the cards you are using, but it looks like driver 276.42 is the one to use for this software.

Posted on 2013-04-07 20:31:21
likeAboss

Is it possible to convert the scores to money? If it costs the company $25 an hour for the employee and by getting him an upgraded computer, how long will it take to recoup that money?

Posted on 2013-05-15 20:15:09
Craig Shupe

Ok, I am a little late to this article, but I just got a pretty high-end machine: a 3.2GHz X79 6-core with an NVIDIA GeForce GTX 770 and 32 GB 2000 Pin memory, and I am having more issues with hesitation than I did with my previous system with a low-end i7 and a Quadro 600. I am getting very upset; is there anyone who could help me with some settings or something that I don't have quite right?

Posted on 2013-08-27 21:01:16

Is this a system you got from us here at Puget Systems? If so, contact our tech support folks and we can at least look at the hardware situation and make sure nothing is physically wrong.

If I had to guess, though, I'd suspect that it may be the transition from a professional-grade Quadro card to the consumer-grade GeForce. While the GeForce have as much raw power as the Quadro cards, they aren't optimized for some of the things that professional design applications do. For some users they work out just fine, but for more demanding CAD / CAM applications and the like a Quadro is highly recommended.

Posted on 2013-08-27 21:04:59
Craig Shupe

No, it's not one I got from you guys. It is weird; I can hardly even select all of the objects in a simple drawing. Kind of getting irritated. Any suggestions on settings in AutoCAD or in the NVIDIA Control Panel that would best optimize its performance?

Posted on 2013-08-27 21:09:23

Hmm, I'm afraid I don't have enough hands-on experience with AutoCAD to know if there might be something you can adjust to help... but it definitely doesn't sound like things are working right, so I'd reach out to whatever company you purchased the system from. You might also try contacting AutoCAD for support, but since you are running a video card that isn't on their approved list there is a good chance they won't be able to help too much either.

Oh, you could always try putting the Quadro 600 from your old computer in, to see if that helps. If it does, that would lend support to my theory above.

Posted on 2013-08-27 21:14:55
Craig Shupe

Thanks I will do that.

Posted on 2013-08-27 21:16:20

Sorry, I had actually misread your post - I thought you said "No its one I got from you guys" (I missed the 'not'). So our support folks actually won't be able to help... though if you are local here in the Seattle area we do offer local repair services at our office, so you could bring the system in for diagnosis.

Again, sorry about the misread!

Posted on 2013-08-27 21:19:25
Yanoi Astaroth Gothkitten

I wonder if there are specific AutoCAD drivers for my GTX 670 2GB and its boost speed. I mainly use it to design handguns, rifles, and SMGs.

Posted on 2013-09-15 07:21:33

GeForce cards do not have specialized drivers available; from NVIDIA, only Quadro cards get application-specific drivers.

Posted on 2013-09-16 03:13:37
abc

Hello,

Thanks for your hard work. I have a GTX 670 and it runs great with AutoCAD 2013 64-bit, except for high-resolution renderings. Shouldn't AutoCAD be using more Windows RAM/memory? It barely uses any, and the renderings take forever, while the CPU runs at 100%.

Autodesk forum post with full details and PC specs:

http://forums.autodesk.com/...

Posted on 2013-09-16 01:46:06

Programs will only use as much memory as they need, and 64-bit programs can use as much as you have available if they need it. So if you aren't seeing high memory usage, it simply isn't needed. If your CPU is pegged at maximum, that is probably the limiting factor.
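To put a number on the 64-bit point: a 32-bit process is capped at 4 GiB of virtual address space no matter how much RAM is installed, while x86-64 typically exposes 48 usable address bits, far more than any machine's memory. Quick arithmetic in Python:

```python
# Per-process virtual address space limits.
GiB = 2 ** 30
TiB = 2 ** 40

addr_space_32bit = 2 ** 32   # a 32-bit pointer can address at most 4 GiB
addr_space_64bit = 2 ** 48   # x86-64 CPUs typically expose 48 usable bits

print(addr_space_32bit // GiB)  # → 4 (GiB)
print(addr_space_64bit // TiB)  # → 256 (TiB)
```

So a 64-bit AutoCAD build is free to use all 16GB in a machine like yours; low usage just means it doesn't need more.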

You can get some more details on CPU, GPU, and drive usage, though. Pull up Windows Task Manager for CPU (and RAM) usage, and from within Task Manager you can open Resource Monitor for even more detail as well as hard drive usage info. For the video card, a free program called GPU-Z will give you both GPU load and GPU memory usage info.

Posted on 2013-09-16 03:15:51
abc

Thanks for your response. I have pasted the full details of my Autodesk post below.

I am rendering in AutoCAD 2013 and it is very slow given the specs of my PC.

With the AutoCAD file open at idle, Windows Task Manager states 344 MB memory use for AutoCAD.exe

When I start the render, AutoCAD then uses about 430 MB of memory, so it barely uses any extra memory! Approx. 86 MB of additional RAM.

I have validated with Kingston that my 16GB of DDR3 memory is correctly installed and operational. The PC is fast and I don't have any issues with other programs.

All up, the total memory use for my PC during an AutoCAD render is only 14%.

PC Specs:

1/ OS: Windows 8 64-bit

2/ Graphics Card: GeForce GTX 670 with Driver 320.49 (01/07/13)

3/ Memory / RAM: Kingston 16GB (8GBx2) DDR3 1600 MHz, installed in white slots DDR3_1 and DDR3_2; blue slots DDR3_3 and DDR3_4 are vacant.

4/ Motherboard: Gigabyte GA-Z77-D3H

5/ Hard Drive: Samsung 840 Series 500GB SSD

6/ CPU: Intel i7-3770K

Settings:
1/ AutoCAD options, Hardware Acceleration has been enabled.

2/ The WHIPTHREAD system variable is set to 3 to enable multi-threading (PS: this works; the CPU runs at 100% during renderings). Tried 0 as well.

3/ Have installed the latest NVIDIA Control Panel and tried numerous '3D settings', including adding acad.exe to the list, 'Let the 3D application decide', and other presets.

4/ In the render presets there is a setting under Processing, Memory Limit; by default this is 1048. I have tried doubling and tripling this setting, etc.; it makes no difference.

5/ Windows Virtual Memory: by default this is set to 'Automatically manage paging file size for all drives'. I also tried setting a custom size with the initial size at 24,576 and the maximum size at 49,152, and also tried 4,000 / 8,000 and 8,000 / 16,000. Note I have 16GB of RAM installed. Does anyone know how to set this up properly? To my knowledge, if you have ample RAM, you just set it to Automatic?

6/ Have AutoCAD 2013 SP2 installed.
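For what it's worth, the custom pagefile sizes tried in point 5 follow the classic rule of thumb of 1.5x installed RAM for the initial size and 3x for the maximum; a quick sketch of that arithmetic (the helper name here is made up for illustration):

```python
def pagefile_sizes_mb(ram_gb, initial_factor=1.5, max_factor=3.0):
    """Rule-of-thumb paging file sizes (MB) for a given amount of RAM (GB)."""
    ram_mb = ram_gb * 1024
    return int(ram_mb * initial_factor), int(ram_mb * max_factor)

print(pagefile_sizes_mb(16))  # → (24576, 49152), the values tried above
```

That said, with plenty of RAM the automatic setting should generally behave fine.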

Conclusion?

1/ Maybe NVIDIA purposely limits AutoCAD RAM usage on their GeForce cards / drivers to push customers toward their Quadro cards? Can anyone confirm whether Quadro cards take advantage of a PC with a lot of Windows RAM installed?

2/ Maybe there is an NVIDIA Control Panel setting that will work? Or is it possible (and straightforward) to use a Quadro driver?

3/ Is there an AutoCAD RENDER or OPTIONS setting to increase RAM usage?

4/ Kingston's response: "If everything works fine in AutoCAD except for rendering, it would probably be the hard drive or AutoCAD."

5/ Maybe this is an AutoCAD limitation? I.e., AutoCAD relies 100% on GPU memory, not Windows RAM?

Posted on 2013-09-16 06:13:51

I had seen that info on the link you sent, but it is good to have it here in case anyone else has ideas. I would still encourage using the tools I mentioned to check things beyond just the CPU and RAM, to see if you can spot any other limitations on the hardware side. If not, it is probably just being limited by your CPU, which you indicated was running at 100% during renders.

Posted on 2013-09-16 06:17:45
Maria

One detail that did not get mentioned is how much memory is built into the graphics card itself. High-end programs these days typically want around 2GB of VRAM on the graphics card.

Posted on 2014-01-03 01:16:08
Ívar Arason

Look up the WHIPTHREAD system variable and set it to 3; this enables a few features in AutoCAD to run multithreaded, though only a very few.

Posted on 2014-01-19 02:54:29
Ívar Arason

This seems to be an everlasting discussion, but a good one. Now, I have a Radeon 7850 card and a GTX 660 OC. The GTX 660 is rated higher than the Radeon 7850, but the Radeon actually performs slightly better in my computer. BUT when I mod Quadro drivers and install them with the GTX 660 OC card, it speeds up to some degree. Comparing the Quadro specs to the GeForce specs, it is clear that the GPUs are the same, i.e. each GeForce card seems to have a Quadro counterpart, but the device IDs are different, and of course the Quadros have ECC memory (at least some of them). I think this also goes for the AMD cards, i.e. Radeon cards also have a FirePro counterpart or twin. I have not tried modding a FirePro driver for my Radeon yet.
I would like to see some results from testing the gaming cards with modded drivers for comparison; that would be very interesting to see :)

Posted on 2014-01-08 14:51:54
Krisztián

I find it highly unlikely that any load test simulation would be off by any percentage because of the card used... I think that was very unprofessional of the writer. It is quite obvious that any algorithm would notice if the GPU is not precise enough and use the CPU instead; there are failsafes for that in a professional, non-beta product. I feel this could only be a problem for final renderings of intricate products, where there could be edge or occlusion glitches that you only notice after the fact, etc.
Correct me if I'm wrong.

Posted on 2017-09-05 13:55:47
Chris

Hi. Would a last-generation i3/i5 laptop-class CPU work well with this? Also, the iGPU should be better than a GTX 650 or so. I want it for my mom; she needs a laptop for college work.

Posted on 2017-10-04 07:07:55