Read this article at https://www.pugetsystems.com/guides/1147

Agisoft PhotoScan 1.4.1 - Multi GPU Scaling

Written on May 2, 2018 by William George


PhotoScan is a photogrammetry program: an application that takes a set of images and combines them to create a 3D model or map. This article is part of a series looking at how different aspects of computer hardware affect PhotoScan performance. For more information on this software, or to see the other entries, check out our introductory article.

PhotoScan makes use of the video cards in a computer to assist with the computation of certain steps. As such, both the model of video card used and the number of them present in a system can have an impact on the amount of time those steps take. In this article, we take a look at how multiple GeForce GTX 1080 Ti cards scale in performance across a few CPU platforms - different chipsets supporting up to 2, 3, or 4 GPUs.

Methodology and Test Hardware

The primary GPU accelerated steps within PhotoScan are aligning photos and building a dense point cloud (a 3D representation of physical points that are seen from multiple angles in different photos). We have already looked at how different GPU models compare in these calculations, so now we are going to take the top-performing card from that roundup - the GeForce GTX 1080 Ti - and stack multiple cards in a single system to see how they scale. Different CPUs and chipsets also support different numbers of video cards, so we will look at 1-2 cards on Z370 (Intel Coffee Lake), 1-3 cards on X299 (Intel Core X), and 1-4 cards on X399 (AMD Threadripper). The goal here is not to find the best CPU for PhotoScan, which will be the focus of a future article, but instead to see how much performance gain there is from each additional card. In keeping with the rest of this series, all tests were conducted with "High" quality settings.
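Scaling results like these boil down to two simple numbers: speedup (single-GPU time divided by multi-GPU time) and parallel efficiency (speedup divided by GPU count, where 1.0 is perfect scaling). A minimal sketch of that arithmetic, using made-up timings purely for illustration rather than our benchmark data:

```python
def speedup(t1: float, tn: float) -> float:
    """Single-GPU time divided by n-GPU time."""
    return t1 / tn

def efficiency(t1: float, tn: float, n: int) -> float:
    """Speedup per GPU: 1.0 means perfect scaling."""
    return speedup(t1, tn) / n

# Hypothetical Build Dense Cloud times (minutes) for 1 and 2 GPUs:
t1, t2 = 60.0, 36.0
print(f"speedup:    {speedup(t1, t2):.2f}x")      # 1.67x
print(f"efficiency: {efficiency(t1, t2, 2):.0%}")  # 83%
```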

If you would like more details about the full hardware configurations we tested on, or the image sets we used within PhotoScan, see the configuration details in the original article.


Benchmark Results

Here are the results for 1-2 GPU scaling on the Z370 platform, with an Intel Core i7 8700K processor:

And here are the results for 1-3 GPUs on the X299 platform, with Intel's Core i9 7960X:

And finally, here are the results for 1-4 GPUs on the X399 platform, using AMD's Threadripper 1950X processor:


We can see in the graphs above that there is a substantial improvement in performance when going from one to two GPUs, on all three of these platforms. There is a smaller, but measurable, increase with a third card as well. However, there is almost no change going from three to four video cards on the Threadripper system (the only one we tested here which supported that many video cards). That seems to be pretty straightforward, though having seen these results I would like to test another quad-GPU system to see if the trend holds. This will likely happen when we get to testing dual CPU systems with Intel's new Xeon Scalable processors, which I hope will be soon.
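This pattern of diminishing returns is what Amdahl's law predicts: if only a fraction p of the total work parallelizes across GPUs, while the rest is serial (CPU-bound steps, I/O, and so on), the speedup on n GPUs is 1 / ((1 - p) + p / n). A quick sketch with an assumed parallel fraction, chosen for illustration and not fitted to our measurements:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: speedup on n devices when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# Assume 75% of the workload is GPU-parallel (illustrative, not measured):
p = 0.75
for n in (1, 2, 3, 4):
    print(f"{n} GPUs: {amdahl_speedup(p, n):.2f}x")
# 1.00x, 1.60x, 2.00x, 2.29x - each added GPU helps less than the last
```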

One thing to keep in mind with the results shown above is that we did our tests at "High" quality. There is an option above that for building the dense cloud, "Ultra High", which will take substantially longer. Many users might also work with larger image sets, which also take more processing time. Because of that, the performance difference between video cards will vary somewhat based on your exact workload - but in general, having more video cards (up to 3 at least) should have a measurable impact on PhotoScan calculation times.


Conclusion

If you run PhotoScan with either High or Ultra High quality in the Build Dense Cloud step, on a single socket CPU workstation, then having 2 or possibly 3 video cards will be worthwhile. The fourth video card in our tests did not substantially impact performance, and even the third card is debatable: I would say it is probably more important if you like to build the dense cloud at Ultra High settings, since that substantially increases the processing time and therefore could benefit more from the time savings of a third GPU.

Photogrammetry Workstations

Puget Systems offers a range of powerful and reliable systems that are tailor-made for your unique workflow.

Configure a System!

Labs Consultation Service

Our Labs team is available to provide in-depth hardware recommendations based on your workflow.

Find Out More!
Tags: Agisoft, PhotoScan, Multi GPU, Video Card, NVIDIA, GeForce, Pascal, Performance, Scaling, Comparison
Håkon Broder Lund

"Different CPUs and chipsets also support different numbers of video cards, so we will look at 1-2 cards on Z370 (Intel Coffee Lake), 1-3 cards on X299 (Intel Core X), and 1-4 cards on X399 (AMD Thunderbolt)."

Small typo there. Should be Threadripper, not Thunderbolt :)

Posted on 2018-05-02 23:57:33

Ack, good catch! Fixed :)

Posted on 2018-05-03 16:14:38
Glib G

William, thanks for your ongoing hard work, much appreciated :) I was curious how 2x (or more) GTX 1060 6GB cards compare to a single GTX 1070 or GTX 1070 Ti? I know PhotoScan doesn't need SLI-specific setups to utilize multiple cards, so it's possible two GTX 1060s could outperform them and give better performance/value for their price. Your thoughts? Thanks.

Posted on 2018-05-15 16:10:26

I haven't specifically tested multiple GTX 1060 cards, but if they scale similarly to the GTX 1080 Ti cards then two 1060s should be about equal to a GTX 1070 Ti (or 1080). Adding a third would likely put performance around the level of a single 1080 Ti. Here are comparisons between those various individual cards:


However, looking at prices, I believe a 1070 Ti should cost less than two 1060s (at least the 6GB version, that I have tested). Going with a single card means less complexity and heat as well, plus it could leave room (in a large enough chassis) for a second and potentially even third 1070 Ti to be added later... whereas you are sort of stuck if you max out on 1060 cards to start with.

Posted on 2018-05-15 16:20:15
Glib G

Thanks for your input. I suppose it does make more sense to get a GTX 1070 Ti vs 2x GTX 1060 given the considerations you noted and the price. Thanks again for your advice :)

Posted on 2018-05-15 16:54:17

Thanks for the comparison and the work you did. It would be nice to have baseline results without a GPU for each platform, to see whether it's worth adding extra GPUs or instead going for a faster CPU.

Posted on 2018-08-12 19:29:07

Not having a video card at all is a huge performance loss. We haven't looked at how different CPUs do on their own, but we did include CPU-only results on another article: https://www.pugetsystems.co...

As those charts show, the Build Dense Cloud phase can take 3 to 5 times longer without a video card even on a very powerful CPU.

Posted on 2018-08-12 22:51:14

Thank you, that's what I was looking for. It seems at least a single mid-range GPU is mandatory for decent workload times. There isn't such a big difference after that between having a high-end GPU or multiple GPUs.

Posted on 2018-08-12 23:59:55

Yeah, a mid-range CPU and GPU together will do a decent job - everything above that is smaller, incremental improvements... though to folks who use PhotoScan frequently the time savings might well be worth it :)

Posted on 2018-08-13 16:45:40

Does this require an SLI bridge? I understand that some compute loads don't need it.

Posted on 2018-10-05 10:43:54

No, it does not :)

No compute workloads use SLI: it is only used for combining GPUs to increase rasterization performance... mostly in games, but potentially in some other applications which display real-time 3D graphics.
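Since PhotoScan addresses each GPU independently, the set of devices it uses is controlled by a bitmask (the gpu_mask setting exposed in its Python scripting API and as a --gpu_mask command-line switch, as I understand Agisoft's documentation; treat those PhotoScan-specific names as an assumption to verify for your version). The bitmask itself is plain bit arithmetic:

```python
# Build a GPU device bitmask: bit i set -> device i enabled.
def gpu_mask(enabled_devices):
    """e.g. devices [0, 1] -> 0b11 == 3."""
    mask = 0
    for i in enabled_devices:
        mask |= 1 << i
    return mask

# Enable the first three of four GPUs (device indices 0, 1, 2):
mask = gpu_mask([0, 1, 2])
print(bin(mask))  # 0b111
# In PhotoScan's scripting API this would be applied roughly as
# (assumption - check Agisoft's docs for your version):
#   PhotoScan.app.gpu_mask = mask
```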

Posted on 2018-10-05 16:42:36
Alexander Liebert

Does it make a difference if I use an external GPU or an "internal" GPU? Are there disadvantages to using an external GPU?

Posted on 2019-02-12 20:25:12

External GPUs have greatly limited bandwidth - usually PCI-Express 3.0 x4, at best, if using something like Thunderbolt 3 to connect to the main computer. As such, there will be some performance loss... just like there would be if the GPU was internal but running on a slower PCI-Express slot. We looked a little bit at that in a previous article:


Posted on 2019-02-12 20:43:04
Benjamin Van Het Bolscher

Would love to see some mixed GPU performance too. Maybe a 1080 Ti coupled with a 1070 or 1060, etc., for people looking to upgrade as older generations get cheaper or show up second hand.

Posted on 2019-06-06 05:28:46

I am currently wrapping up GPU testing on Metashape (the new name for PhotoScan), and will have results for the whole RTX series from NVIDIA and three AMD Radeon cards as well. I did also include a pair of RTX 2080 Ti cards, to double-check GPU scaling again (which this article covered in more depth).

Unfortunately I don't think I will have the opportunity to go back and test a wide combination of different GPUs - there are just too many potential combos, and we need to move on to other testing. However, if there is any concern that mixing GPUs might not work at all for some reason then I could at least try one or two non-identical pairings. Maybe a 2000-series card with a 1000-series? And / or an AMD + NVIDIA combo?

Posted on 2019-06-06 17:52:34

Actually, scratch that AMD + NVIDIA idea... too much of a risk for Windows to throw up or other weird things to happen, and even if it does work I don't want to make it look like we would sell that combination or support it. The folks downstairs in tech support would [figuratively] kill me :)

Posted on 2019-06-06 18:07:51