Read this article at https://www.pugetsystems.com/guides/751

Why you should use a Quadro video card in Solidworks 2016

Written on December 22, 2015 by Matt Bach


Currently, Solidworks officially supports only NVIDIA Quadro and AMD FirePro discrete video cards, so if you want to use a supported card (which we highly recommend) you will need a workstation-class GPU. However, if you are a student learning Solidworks or are otherwise unable to use a Quadro card, it is entirely possible to use a GeForce or Radeon card instead. You may not be able to get support from Solidworks if you run into a problem with the software (which makes this a very bad idea in a professional environment), but if that does not matter to you, it can be a way to get started with Solidworks on a computer you already own.

In this article we will be looking at a range of NVIDIA GeForce cards and comparing them to the three most popular NVIDIA Quadro cards for Solidworks. To fully test the cards, we used a workaround to enable features like RealView that are normally disabled unless you have a workstation card. This is absolutely something we don't recommend doing if you are using Solidworks professionally, but it will help us determine the full performance differences between Quadro and GeForce cards. If you want to skip over our individual benchmark results, feel free to jump ahead to the conclusion section.

Test Setup

For our test system, we used the following hardware:

Testing Hardware
Motherboard: Asus Z170-A
CPU: Intel Core i7 6700K 4.0GHz Quad Core 8MB 95W
RAM: 4x Crucial DDR4-2133 4GB (16GB total)
GPU: GeForce GTX Titan X 12GB (~$970 - 6144 GFLOPS)
GeForce GTX 980 Ti 6GB (~$650 - 5632 GFLOPS)
GeForce GTX 980 4GB (~$520 - 4616 GFLOPS)
GeForce GTX 970 4GB (~$350 - 3494 GFLOPS)
GeForce GTX 960 4GB (~$240 - 2308 GFLOPS)
GeForce GTX 950 2GB (~$160 - 1573 GFLOPS)

PNY Quadro M4000 8GB (~$740 - 2573 GFLOPS)
PNY Quadro K2200 4GB (~$600 - 1339 GFLOPS)
PNY Quadro K620 2GB (~$200 - 812.5 GFLOPS)
Hard Drive: Samsung 850 Pro 512GB SATA 6Gb/s SSD
OS: Windows 10 Pro 64-bit
Software: Solidworks 2016 SP 0.1

Our test platform is built around an Intel Core i7 6700K, as that is the CPU that should give the best possible performance in Solidworks when rotating models. For the video cards, we will be testing the full range of current GeForce GTX cards as well as the three most popular Quadro cards for Solidworks. In the chart above we listed both an estimated price for each card and its peak theoretical single-precision floating point performance, so it will be very interesting to see whether the driver and firmware differences between the product lines make a Quadro faster than its rough GeForce performance equivalent.
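As a quick illustration of raw value, the estimated prices and peak single-precision GFLOPS from the hardware table above can be reduced to a dollars-per-GFLOPS figure. This is only a sketch of theoretical compute per dollar (the card names and numbers below are copied from the table; it says nothing about actual Solidworks performance, as the results will show):

```python
# Rough price-per-GFLOPS comparison using the estimated prices and peak
# theoretical single-precision GFLOPS listed in the hardware table above.
cards = {
    "GTX Titan X":  (970, 6144),
    "GTX 980 Ti":   (650, 5632),
    "GTX 980":      (520, 4616),
    "GTX 970":      (350, 3494),
    "GTX 960":      (240, 2308),
    "GTX 950":      (160, 1573),
    "Quadro M4000": (740, 2573),
    "Quadro K2200": (600, 1339),
    "Quadro K620":  (200, 812.5),
}

for name, (price, gflops) in cards.items():
    # Lower dollars-per-GFLOPS means more raw compute per dollar
    print(f"{name:13s} ${price / gflops:.3f}/GFLOPS")
```

On raw numbers the GeForce cards come out well ahead per dollar, which is exactly why the driver-level differences measured in this article matter.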

Remember that Quadro (and FirePro) cards are the only officially supported video cards, so in order to fully test the performance differences we had to use a bit of a workaround to allow us to compare the two product lines with features such as RealView enabled. However, we want to be clear that this workaround isn't something we would advise you to do on your own machine.

To perform the actual benchmarking, we used a mix of AutoIt scripts and Solidworks macros to set the different quality settings, load the relevant model, and record the average FPS while rotating the model. Note that we tested with different LOD settings, but we found the difference to be marginal so to keep things simple we will only be reporting the results with LOD off (which usually results in a small drop in FPS). There are a number of different ways we could have recorded the FPS, but we opted to simply use a macro with a timer to rotate the model 45 degrees to the left and right for a set number of frames. From the number of frames and the total time it took to render those frames, we are able to determine the average FPS (frames per second). One key factor is that we made sure that every model started with the view set to front-top so that any reflections and shadows would stay in view while the model was being rotated.
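The frames-divided-by-time approach described above can be sketched as a small timing helper. This is only an illustration of the arithmetic, not our actual AutoIt/macro code; `render_frame` is a hypothetical stand-in for one Solidworks rotation step:

```python
import time

def average_fps(render_frame, num_frames=200):
    """Render a fixed number of frames and derive the average FPS from the
    total wall-clock time, mirroring the macro-with-a-timer approach above.
    render_frame is a placeholder callback standing in for one rotation step."""
    start = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
    elapsed = time.perf_counter() - start
    # Average frames per second = frames rendered / seconds elapsed
    return num_frames / elapsed
```

Measuring over a fixed number of frames (rather than a fixed time) keeps the amount of rotation identical across cards, so slow and fast cards render exactly the same workload.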

For our test models, we chose the following models from GrabCad.com, which give us results across a range of complexities based on the total number of parts and the number of triangles:

Steam Engine w/ Horizontal Beam
by Ridwan Septyawan
80 parts - .26 million triangles

Spalker
by Andy Downs
364 parts - .5 million triangles


Audi R8
by ma73us
434 parts - 1.4 million triangles


One note we would like to make: if you do not know how many triangles the models you work with have, the easiest method we know of is to simply save the model as an .STL file. During the save process, a window should pop up with information about the model, including the number of files, the file size, and the number of triangles.
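If you would rather not go through the save dialog, the triangle count can also be read directly from a binary STL file: the format is an 80-byte header followed by a little-endian uint32 holding the triangle count. A minimal sketch (ASCII STL files would need to be parsed line by line instead):

```python
import struct

def stl_triangle_count(path):
    """Read the triangle count from a binary STL file.

    Binary STL layout: 80-byte header, then a little-endian uint32
    triangle count, then 50 bytes per triangle record."""
    with open(path, "rb") as f:
        f.seek(80)                       # skip the 80-byte header
        (count,) = struct.unpack("<I", f.read(4))
    return count
```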

1080p Results

Steam Engine with Horizontal Beam and Centrifugal Pump

To start things off, we are going to look at the simplest of our test models, which has only 80 parts and .26 million triangles.

With just the "Shaded" view mode, the GeForce cards did OK, although without RealView not even the most expensive GeForce card was better than a low-end Quadro. With RealView on, the lower-end GeForce cards were actually a bit better than the Quadro K620. However, to match the Quadro K2200 you would need a GeForce GTX 980 Ti, which is actually a bit more expensive. At the upper end, none of the GeForce cards we tested were able to match the performance of the Quadro M4000.

Where the results get interesting is when we used the "Shaded w/ Edges" view mode. With edges enabled, we saw absolutely terrible performance from the GeForce cards. Even a GeForce GTX 980 Ti or Titan X was nowhere near the performance of a Quadro K620, which costs less than a third the price.


Spalker

The Spalker model contains about 4.5 times as many parts and twice as many triangles as the previous model. Our results are somewhat similar to what we saw with the Steam Engine model, although in the "Shaded" view mode the GeForce cards performed much worse than a Quadro K620 without RealView. With RealView, all of the GeForce cards only barely outperformed the K620. Interestingly, all the GeForce cards performed roughly on par with each other, with the exception of the GTX 950.

Switching to the "Shaded w/ Edges" view mode, we again see massive performance drops with the GeForce cards to the point that the Quadro K620 handily beats even the highest end GeForce card.

Audi R8

While the Audi R8 model doesn't have many more parts than the Spalker model, it does have about 3 times the number of triangles making it the most complex model we will be testing.

In "Shaded" mode without RealView, the results are pretty similar to what we saw with the other two models. While the GTX 980 Ti is able to match the Quadro K620, all of the other cards see lower performance. With RealView enabled, however, most of the GeForce cards actually performed about on par with the Quadro M4000. Interestingly, beyond the GTX 960 we did not see a significant increase in performance.

With "Shaded w/ Edges", the pattern of poor performance with GeForce cards continues. Once again, even the Quadro K620 will very handily outperform any GeForce card.

1080p GeForce benchmark in Solidworks 2016

4K Results

Steam Engine with Horizontal Beam and Centrifugal Pump

Even with our simplest model, we start to see a bit better performance out of the GeForce cards at 4K resolution. While at 1080p we were mostly just matching the performance of the Quadro K620, now the GeForce cards are often outperforming the Quadro K620 and at times even the Quadro K2200.

Starting once again with "Shaded" mode, all the GeForce cards give performance that falls right between the K620 and K2200 without RealView. With RealView enabled, performance is even better, with all the GeForce cards outperforming the Quadro K2200, although none were able to match the Quadro M4000.

With "Shaded w/ Edges", even the highest end GeForce card still performs worse than the Quadro K620 without RealView enabled. With RealView on, the performance ranges from a bit worse than the Quadro K620 to a bit worse than the Quadro K2200 depending on the model of GeForce card.


Spalker

With this somewhat more complex model, we again see performance higher than the Quadro K620 in the "Shaded" view mode. With RealView disabled, the GeForce cards at best matched the Quadro K2200, with most of them falling somewhere between the K620 and K2200 in terms of performance. With RealView on, however, only the GTX 950 performed worse than the Quadro K2200, with all the others performing a bit better (although still not at the level of the Quadro M4000).

Switching to "Shaded w/ Edges", the GeForce cards still mostly outperformed the Quadro K620, although none of them were able to match the Quadro K2200.

Audi R8

With the Audi R8 model, the results are a bit different than what we saw previously. Starting again with the "Shaded" view mode, we saw performance that ranged from matching the Quadro K620 to being a bit less than the Quadro K2200 without RealView. With RealView enabled, performance was actually significantly better, with even the GTX 950 outperforming the Quadro K2200. At best, however, even the highest end GeForce card was only able to match the Quadro M4000.

Using "Shaded w/ Edges", performance with the GeForce cards again takes a nosedive. Without RealView, none of the GeForce cards were able to perform even close to the Quadro K620. With RealView, however, the GTX 970/980 roughly matched the Quadro K620 while the higher end GeForce cards performed right between the K620 and K2200.

4K GeForce benchmark in Solidworks 2016

Conclusion

One thing is obvious from our results: Quadro cards overall perform much better in Solidworks than GeForce cards. At 1080p, not even the highest end GeForce card was able to match the very affordable Quadro K620. A GeForce GTX 980 Ti or Titan X should still be able to give you about 60 FPS for medium models (around .5 million triangles) and about 30 FPS for very complex models (around 1 million triangles), but you would see significantly higher performance with a Quadro card of roughly the same price. If you use a 4K display, however, many of the GeForce cards are able to outperform the Quadro K620, although given that the K620 only has 2GB of VRAM we would never recommend using it with a 4K display in the first place. Really, the Quadro K2200 is the lowest end card we would recommend for 4K, and that card should always outperform even the highest end GeForce card.

If you only use the "Shaded" view mode, the GeForce cards usually give performance somewhere in-between the Quadro K2200 and Quadro M4000 at both 1080p and 4K resolutions. What is interesting is that at 1080p the performance difference between the different GeForce models is actually very minimal. At 4K, however, there is a benefit to having up to a GTX 970, but beyond that the performance does not increase significantly.

Using the Audi R8 model specifically (since that is the most complex model), we can make the following rough estimates of the performance of GeForce cards compared to Quadro cards, averaging the results with RealView both on and off:

Shaded view mode            1080p                   4K
GeForce GTX 950 2GB         Equivalent to K2200     18% slower than K2200
GeForce GTX 960 4GB         5% faster than K2200    10% slower than K2200
GeForce GTX 970 4GB         3% faster than K2200    Equivalent to K2200
GeForce GTX 980 4GB         3% faster than K2200    5% slower than K2200
GeForce GTX 980 Ti 6GB      9% faster than K2200    7% faster than K2200
GeForce GTX Titan X 12GB    5% faster than K2200    2% slower than K2200

Shaded w/ Edges view mode   1080p                   4K
GeForce GTX 950 2GB         40% of K620             32% of K2200
GeForce GTX 960 4GB         45% of K620             38% of K2200
GeForce GTX 970 4GB         54% of K620             52% of K2200
GeForce GTX 980 4GB         60% of K620             52% of K2200
GeForce GTX 980 Ti 6GB      65% of K620             62% of K2200
GeForce GTX Titan X 12GB    66% of K620             62% of K2200
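The "X% faster / slower" entries in these tables are simply ratios of average FPS. As a minimal sketch of that arithmetic (the FPS numbers below are made up for illustration, not our measured results):

```python
def relative_performance(card_fps, reference_fps):
    """Express a card's average FPS relative to a reference card, as in
    the tables above: positive means faster, negative means slower.
    Inputs would be the average of the RealView-on and RealView-off runs."""
    return (card_fps / reference_fps - 1.0) * 100.0

# Hypothetical numbers only: a card averaging 54 FPS against a reference
# card averaging 50 FPS comes out as a positive (faster) percentage.
print(round(relative_performance(54, 50)))
```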

Overall, with just the "Shaded" mode the GeForce cards mostly performed within 10% of the Quadro K2200. However, the big thing we found is that Quadro cards have significantly improved performance when using the "Shaded w/ Edges" view mode, to the point that even a Quadro K620 will often give significantly better performance than the highest end GeForce card.

The reason behind this is not easy to determine, although our best guess is that it has to do with the firmware and driver optimizations used with Quadro cards. Either way, if all you have is a GeForce card then Solidworks should function OK - although you will not be able to use features like RealView - but we would very highly recommend upgrading to an appropriate Quadro card as soon as possible. Especially if you use Solidworks professionally, the extra performance (not to mention simply using a supported card) means you should almost never consider using a GeForce card instead of a Quadro card.

Tags: Solidworks, GPU, Video Card, GeForce
AC

Nice article, any chance of a comparison to Firepro & Radeons?

Posted on 2016-03-31 05:35:24
John Leichty

I'd also be interested in that, particularly to see if the Radeons have the same shortcomings in the "Shaded w/ Edges" view mode.

Posted on 2016-04-02 19:23:14
Erik Charrier

Yes they do - especially with nicer eye candy settings enabled...

I've got an R9 390 as my secondary video card in a workstation with a pair of 4k displays.

I ran some usability comparison tests last year on SolidWorks 2016 and didn't see much of a difference between the r9 390 and k620. In some cases, I think the extra "muscle" in the r9 was outperforming the k620 in terms of usability. However, SolidWorks 2017 made some major updates to the display software that only seem to help the professional cards. In particular, transparency and edges with transparency feel like they've had an order of magnitude improvement.

My test file was a little over 0.2 million triangles with the .STL export methodology used in the article.

Posted on 2017-01-06 23:12:05
John Leichty

Great article, thanks for doing this comparison. I'm tempted to use a GeForce card since they're significantly cheaper and also offer the 2x DVI outputs I need.

Posted on 2016-04-02 19:22:09
dvm

One question for laptops regarding intel vs nVidia/Radeon: apart from intel reduced performance, are there other issues related to drivers compatibility? graphical misrepresentations or things like that?

Posted on 2016-04-03 12:26:05
cs744

For performance you did not correct enough. To gain the performance of a Quadro card in opengl you must change setting in the Nvidia driver profiles. You need to use Nvidia Inspector to change the profile information. The result will be performance much better than a quadro card of the same price.
Nvidia defaults to:
Profile "Dassault Systemes SolidWorks"
ShowOn Quadro

If you change this to:
Profile "Dassault Systemes SolidWorks"
ShowOn Geforce
ProfileType Application
Executable "solidworks/circuitworksfull/circuitworks.exe"
Executable "solidworks/driveworksxpress/driveworksxpressdt.exe"
Executable "solidworks/solidworks edrawings/emodelviewer.exe"
Executable "edrawings/emodelviewer.exe"
Executable "solidworks/photoview/photoview360_cl.exe"
Executable "solidworks/photoview/photoview360.exe"
Executable "photoview 360/photoview360.exe"
Executable "solidworks/sldbenchmarking/sldbenchmark.exe"
Executable "i386_sldworks.exe"
Executable "sldworks.exe"
Executable "solidworks/sldrx/sldrx.exe"
Setting ID_0x106d5cff = 0x00000000
Setting ID_0x10d48a85 = 0x00000034 UserSpecified=true
Setting ID_0x10f9dc81 = 0x00000011
Setting ID_0x10f9dc84 = 0x00000100
Setting ID_0x10fc2d9c = 0x00000000 UserSpecified=true
Setting ID_0x202fd652 = 0x00000001
Setting ID_0x202fe114 = 0x00000001
Setting ID_0x20320ce4 = 0x00000001
Setting ID_0x203691bb = 0x00000008
Setting ID_0x2045959a = 0x00000001
Setting ID_0x205f7e3b = 0x00000000
Setting ID_0x209fd306 = 0x003d1028
Setting ID_0x20c1221e = 0x00000001
Setting ID_0x20cbffc4 = 0x00000001
Setting ID_0x20d518cb = 0x00000001
Setting ID_0x20d59eda = 0x00000001
Setting ID_0x50528ab3 = 0x00000001

It will act more like a quadro card for Solidworks to use.

You should look into it.

Posted on 2016-04-03 20:00:01
Bryan Bravo

You should definitely make a youtube video of this process you would be saving companies a lot of $$$

Posted on 2016-05-03 14:09:45
Anonianus

Did you find more information?

Posted on 2016-06-01 15:09:49
Andres Eljadue Tarud

the best I can obtain with my budget is an i5 4690 and a GTX 750, is it enough? I'm not rendering

Posted on 2016-09-29 03:08:45
Wally Banger

I just downloaded the Nvidia Inspector and tried to change some of this stuff. It has a Profile for the Dassalt software but I couldn't find the "ShowOn Quadro" entry. Also, when I hit "update profile" it didn't show the changes. Any more info would be very much appreciated...

Posted on 2016-10-07 21:49:01
BFeely

Could NVIDIA and/or Solidworks sue you for this?

Posted on 2016-11-05 13:52:39
Zix Raito

can you plz make a video on youtube to show me and other people how to do that

Posted on 2016-11-15 12:05:41
Huy Tran

Finally figured it out that you use the Geforce 3D Profile Manager.exe to export and import the profiles in text form so you can directly copy and paste the above. I did run into an issue where I exported the default list, to add in this profile, and it would not allow me to import it. But it would allow me to import a new text with this profile only, or re-import the original list, but not a modified list with this new profile in it.

*I understand now you have to replace the profile, since you can't have two profiles referencing the same application. That wasn't immediately clear to me as I didn't think they're all applied at once. I thought only one profile is working at a time.

Posted on 2017-06-08 23:11:12
Roger Klado

Is there an inspector profile edit hack that would allow my GeForce cards ( gtx 480 or gtx 1080 ) to be recognized by Autodesk Maya as fulfilling the Quadro requirement for 3d vision active shutter viewport viewing?

I am considering breaking down and getting an old Fermi or Kepler quadro but do not want to try sharing as much on the same system. ( at worst I imagine I might have to enable and disable depending on which card I would want to leverage )

Now that Maya supports directX that Pro quadro requirement is the bane of my desires for a pure stereoscopic modeling pipeline that would truly reflect the end user experience instead of referring to as much as an after thought. Even working on VR projects the stereo disrespect is very discouraging and short sighted.

Posted on 2017-07-01 11:28:11
Sam Curlett

What if you run 2x GTX 970 in SLI mode - how will that affect running solidworks?

Posted on 2016-05-16 10:39:32

Solidworks doesn't support multiple video cards, so you should not see any increase in performance with two cards

Posted on 2016-05-16 15:31:34
Robert Shield

Nice Article.
I'm looking at doing a about $1200 rig for both gaming and Solidworks. I intend on doing relatively light gaming and solidworks usage, but would like to have a powerful enough card that could handle more significant workloads in the future.
I was planning on going with the GTX 1070, but would like to know some other people's opinions as well.

Posted on 2016-07-27 15:35:19

Which side of things (gaming or SW) is more important to you? You'll basically have to pick the video card accordingly. If you want to go more on the gaming side, the 1070 is a good choice... just be aware that it may not do as well in SW, and isn't certified for that program. If you are more concerned with performance in SW, the Quadro M2000 should be in the same general price range as the GTX 1070 but would get you official SW compatibility.

Posted on 2016-07-27 15:42:44
Robert Shield

Thanks for the reply.
At this point, I think that I will be spending more time in Solidworks, but when I do game, I think that I will be pushing the machine harder. In solidworks, I think I am going to stay under 500 parts in each model, and, as of right now, I am focusing on High school robotics (FRC) which are relatively simple models.
I think I am going to go with the 1070.
What are the advantages of having official SW compatibility?

Posted on 2016-07-27 15:52:28

Mostly the ability to contact Solidworks if something isn't working right and get their tech support. If you are running a non-certified card then they will likely not help you, which can of course be frustrating if something isn't working (not displaying correctly, performing slowly, etc).

Posted on 2016-07-27 16:13:19
Dan Elliott

I recently purchased a workstation with the Quadro M4000 card and the Intel i7-6700 processor. Any tips or suggestions to fully optimize my new setup? It seems my Siemens NX 9 still acts sluggish even though it's using just a quarter of my memory and under 15% of my CPU.

Posted on 2016-08-12 20:44:10
jaybird

What speed ram are you using?
Also ssd or mechanical for the drive running solid works?

Posted on 2016-09-09 13:42:20
Peter Gough

Hi, I found that if you view individual SOLIDWORKS cores you often find it is running at 100 percent on one, as parametric modelling is done in series, only rendering best uses more cores

Posted on 2018-05-28 07:11:25

Yes, Peter - SW uses pretty much one core at max load when modeling, as you noted. Many other modeling applications have similar behavior as well. For that part of the workload, then, the number of cores doesn't matter... but things like clock speed, instructions per clock, and any sort of turbo boost modes all impact performance. Other aspects of SW are able to use multiple cores, though: simulation and rendering, for example. You might want to check out some of our SW articles that are focused on CPU performance instead of graphics cards, if you are interested in that sort of information.

Posted on 2018-05-29 16:24:20
Frank

For the "Visualize" function in SW2016, is the performance only related to the number of CUDA cores or not?
cause I'm considering a GTX1080 for solidworks and some games :)

Posted on 2016-08-19 10:25:07
Avatar horrido

There is really no excuse for not properly supporting consumer grade cards. It's really a disservice to SW users. Really, how much FPS do you need when rotating a model? To top it off, some of these cards are showing low FPS only when driving ultra high rez displays (like 4k) with cards that don't have adequate memory. Current gen cards all have ample memory, the smallest being 3gb (which are budget cards). A GTX 1060 w/6gB should be PLENTY to drive SW, and can be had for under $200.

Posted on 2016-09-29 15:31:00
Caleb

Im building a workstations strictly for Solidworks and other work related programs (Word, Excel, Draftsight etc.) We have some very complex models that we work with and currently we have a GTX 680 in our computers. It works fine except on some models it rotates very slowly and is sometimes annoying when you are trying to get something done quickly. If we decide to go to a M4000 or M2000 in the new system what kind of performance gains will we see? will models load faster? Or will this only effect rotation and things like that? trying to find a straight answer on this thanks!

Posted on 2016-12-21 14:30:55

That's a really hard question to answer because large models are often actually CPU-limited so a faster video card will not always give better performance. Assuming the video card is what is holding you back however, we should see a really nice jump in FPS going from a GTX 680 to a M4000 (I would skip over the M2000 if you work with very complex models). Again, very hard to give a definite answer, but at a guess it could be anywhere from 50-150% higher FPS.

A better video card will only impact the FPS when rotating models, so things like opening, saving, or anything else shouldn't be impacted by the video card.

As an aside, I'm actually just getting started on re-doing our testing for SW 2017 and I've been looking for a good, complex model or two to add to our testing. If you think you might be a good fit (and there isn't too much red tape around the data), toss me an email at labs@pugetsystems.com . Then I could really tell you what kind of performance gain you might see!

Posted on 2016-12-21 19:24:41
Bojan Simonovic

I know why you should not. In all 3D CAD software I have problems selecting edges because of my Quadro K620... When the mouse is on an edge it's not highlighted, but only when I move the mouse pointer fully off the edge is it highlighted. And even when it is highlighted it's so sensitive that even mouse clicking can cancel the highlight. Since I installed the K620 I work 2 times slower... PS: putting all settings at maximum, straighten edges, doesn't solve the problem, it's not a quality issue...

Posted on 2016-12-22 09:43:46
Erik Charrier

The performance comparison seems to vary by SolidWorks version so a yearly update might be a good idea.

I use a gaming card to drive a second 4k display and found your conclusions matched my experience with SolidWorks 2016 between a k620 and r9 390. At times the r9 would give a better user experience. However, SolidWorks 2017 made some significant internal updates to rendering - especially for edges and transparency. Now the r9 becomes unusable on display settings that the Quadro has no trouble with.

Posted on 2017-01-06 23:25:06
Dëëpãk Rãwãl

can anyone suggest me a laptop for soldworks available in india? bugget would be below 1lakh. thank you..!!
minimum system specs:
i5 or i7
ram 8gb
graphic card 2gb
windows 8.1 or windows 10
storage 1TB

Posted on 2017-01-15 08:31:33
disqus_SQOztMVYcy

I have to say that I really enjoyed your article here and found it extremely useful. Though I have a bit of a problem. You specifically wrote this to help guide people on upgrading their graphics cards for solidworks without having to spend top dollar on the workstation cards that are approved and supported. I followed your recommendations and actually bought a GTX 960, although Solidworks will not even recognize it and is only running on my integrated graphics. This has completely crippled my work and now I have to hunt down a correct card and see if I may be able to get my money back for the card that I just bought. I am more than a little frustrated with the false hopes that were given here.

Posted on 2017-02-26 07:07:15

I'm sorry if you got the wrong impression from this article, but it was never intended to encourage people to use a GeForce instead of a Quadro for any serious Solidworks usage. Here are some of the key quotes from the article:

"Currently, Solidworks officially supports NVIDIA Quadro and AMD FirePro discrete video cards so if you want to use a supported card (which we highly recommend doing) you will need to use a workstation-class GPU. However, if you are a student learning Solidworks or otherwise unable to use a Quadro card, it is entirely possible to use a GeForce or Radeon card instead."

"This is absolutely something we don't recommend doing if you are using Solidworks professionally, but it will help us determine the full performance differences between Quadro and GeForce cards."

"One thing is obvious from our results: Quadro cards overall have much better performance in Solidworks than GeForce cards. At 1080p, not even the highest end GeForce card was able to match the very affordable Quadro K620."

"Overall, with just "Shaded" mode the GeForce cards mostly performed within 10% of the Quadro K2200. However, the big thing we found is that Quadro cards have significant improved performance when using the "Shaded w/ Edges" view mode to the point that even a Quadro K620 will often give significantly better performance than the highest GeForce card."

Posted on 2017-02-27 07:44:45
Rick Yarussi

Is there a risk that SolidWorks won't even recognize a GeForce, as one person commented below?

Posted on 2017-03-07 20:44:47

We have not experienced that, but if you are referring to the comment from "disqus_SQOztMVYcy" dated 10 days ago then I think a couple of things might be going on:

- If a desktop system he may have added a GTX 960 but connected his monitor(s) to the onboard Intel graphics outputs on his motherboard instead. In that situation any programs running will use the graphics processor to which your monitors are connected, and thus not utilize / recognize the GeForce card at all.

- If a laptop system, with both NVIDIA and Intel graphics, there may be something going on with the NVIDIA Optimus software that is supposed to manage which graphics processor is active at any given time. A lot of folks run into problems with that when gaming as well, if the software is for some reason running the Intel graphics when it should be making NVIDIA the active processor.

Even with the GeForce card properly detected, though, performance is generally far below a Quadro card. Because of that, plus SOLIDWORKS certification for certain video cards, we strongly recommend sticking with a Quadro card for this application.

Posted on 2017-03-07 20:50:21
Rick Yarussi

I need a PC for SolidWorks and Zemax. I won't be rendering in SW. Just basic stuff. So I don't think I need the "Shaded with Edges" mode, or fancy shading, etc. Given all that is there still any reason to go with a Quadro?

Posted on 2017-03-07 20:49:01

I would still tend toward a Quadro card. In fact, we usually recommend Quadro cards for Zemax systems as well - since we know that many folks running OpticStudio are also using SOLIDWORKS, AutoCAD, or a similar program alongside it. You might want to check out our recommended systems for that software, if it is a bigger factor in your workload than SOLIDWORKS:


Posted on 2017-03-07 21:08:57
Rick Yarussi

Seems like it depends on what you're doing... If you're using simple models in SW, and don't need Shaded w/Edges or photorealistic rendering, then wouldn't GeForce be faster (for a given card price)? It has more GFLOPS per $.

GeForce always gives more FPS in gaming. So I'm trying to understand why that would be different for SolidWorks. I think the difference is in the features like the Shaded w/Edges which are more computationally intensive?

Posted on 2017-03-08 01:47:09

I'm honestly not sure why, but SOLIDWORKS is an oddball in terms of GPU performance. You are right that GeForce cards have more raw calculation performance per dollar, but even in just Shaded view (without Edges, Realview, or Ambient Occlusion) on a small model a low-end Quadro card still beats a GeForce of similar price. As you ramp up the model size or screen resolution, the lowest-end Quadro card we tested (the K620) falls off in performance - but that is simply because it has less video RAM. If you can swing it, I would strongly recommend the Quadro M2000 as a baseline card for SOLIDWORKS... or possibly one of the new P-series Quadro cards, which should be coming out this month.

Posted on 2017-03-08 17:51:52
Francisco Gonzalez

so have a pc with a gtx 1080 for star citizen and some games. i also run my student solid works and it works ok. but lately ive been making bigger assemblies with more complex parts and im noticing the strain. so i was considering adding a quadro lower level used quadro. but one of my friends said his company has some used k5000s i can get for cheap from the owner. so i was thinking aboout getting one and using it on its own screen to hold my larger assemblies and sims. and the gtx on one or two other screens for looking at web pages or opening individual SW parts. can i do this? (only ever used one card before). if i stretch a window of SW from the screen with the quadro over to the gtx what will happen? o_O

Thank you for any help or knowledge.

Posted on 2017-03-29 03:13:07

We have had okay results when mixing GeForce and Quadro cards, though not that specific combination (a Kepler Quadro with a Pascal GeForce). Dragging 3D applications between them will probably *not* work, though. I know that doing the same with onboard (Intel) graphics and a dedicated video card causes programs using graphics acceleration to crash.

Good to see a fellow Citizen, by the way :)

Posted on 2017-03-29 05:52:06
Avatar Francisco Gonzalez

So as long as I keep loaded applications on the screen I opened them in, there's no problem? Or would I need something like a P4000 to play nice with the 1080? (I can afford that with a bit of savings.) And how does it stack up to the K5000? Should I just have one card in at a time? What would you recommend?

Posted on 2017-03-29 06:28:53

The P4000 would be faster than the K5000, but if you can get a K5000 cheaply then that may not be a bad option. I'm honestly not sure if there are any issues with using an old Kepler-based Quadro alongside a Pascal-based GeForce, though; I don't think we've ever mixed cards that far separated from each other, but that doesn't mean it won't work. Will your friend let you borrow one to try before buying?

Posted on 2017-03-29 16:00:46
Avatar Francisco Gonzalez

I do hope so. I'll see if the IT guy will lend it to me for a bit, and I'll try to get back to you guys about how it goes.

Posted on 2017-03-29 19:24:42
Avatar Francisco Gonzalez

If they're the same architecture... can't they SLI? Can I have one beef up the other depending on whether I'm doing work or play? For some reason I feel like I said some heresy. 😅 Thank you so much for the help.

Posted on 2017-03-29 06:35:06

Definitely no SLI across different series of cards - you don't even want to do that across different models within the same series / product line.

Posted on 2017-03-29 15:59:11
Avatar Hannes Fischer

Very enlightening, any suggestions with regards to Quadro vs FirePro cards?

Posted on 2017-05-21 10:12:51

We typically use Quadro cards ourselves because, in our experience, the driver support is better: bugs tend to get fixed faster by NVIDIA than by AMD. FirePro (which has actually been re-branded to "Radeon Pro" now) can work perfectly fine from a performance standpoint, but because we put such an emphasis on reliability and stability we stick with Quadro.

Posted on 2017-05-22 15:48:33
Avatar Andrew

Any chance of updating this post with the pascal cards?

Posted on 2017-10-11 22:22:35
Avatar salamanka

Any chance of updating this post with the new Titan Xp? Also, I was wondering if the Titan Xp can benefit from RealView mode, or only certified GPUs can?

Posted on 2017-10-19 20:09:19

RealView won't be available on a Titan Xp, I'm sorry. I have heard of workarounds, often involving registry editing, to try and get RealView working on non-certified cards... but I would not recommend that, and the performance may not be very good even if you do get it working. For the price of a Titan Xp, though, you can get a very good Quadro card, like the P4000. That may not have the raw horsepower of a Titan, but it will do very nicely in SOLIDWORKS and similar applications.

Posted on 2017-10-19 21:02:21
Avatar Kristjan T.

This is a very interesting article. I was reading about an update that increased SolidWorks performance on Titan X and Xp cards.

Is it possible for your team to test it and evaluate the results?

Best wishes

Posted on 2018-01-07 19:05:25

We will be doing updated Solidworks testing, probably on version 2018, in the near future. I imagine we'll include the latest Titans in the mix, which would be the Titan Xp and Titan V.

Posted on 2018-01-08 21:33:01