Read this article at https://www.pugetsystems.com/guides/1330

NVIDIA NVLink Bridge Compatibility Chart

Written on January 15, 2019 by William George


NVLink is a technology from NVIDIA for creating a high bandwidth link between two compatible video cards. It can be used for many things, from basic SLI for faster gaming to potentially pooling GPU memory for rendering large and complex scenes. What NVLink can be used for depends on how software developers write their applications, and there is a lot of exciting potential for this technology.

However, NVLink requires a physical bridge between the cards in order to enable these capabilities - and NVIDIA isn't transparent as to which models of the NVLink bridge will work with which video cards. Several Quadro video cards support NVLink, along with some of the new GeForce RTX 20-series cards and the Titan RTX. NVLink bridges are available with Quadro or GeForce branding in various sizes.

If you have already set up a system with NVLink, you can find instructions for enabling and testing it in a separate article.

Compatibility Matrix

Here is a chart of all the NVLink compatible video cards, as of publishing time, along with the various NVLink bridges that are available:

| ↓ Video Card \ NVLink Bridge → | Quadro GP100 2-Slot Pair | Quadro GV100 2-Slot Pair | Quadro RTX 5000 2/3-Slot | Quadro RTX 6000 2/3-Slot | GeForce RTX | Titan RTX |
| Quadro GP100 | Works (Tested) | Should Work | Does Not Work | Does Not Work | Does Not Work | Does Not Work |
| Quadro GV100 | Does Not Work | Works (Design) | Does Not Work | Does Not Work | Does Not Work | Does Not Work |
| Quadro RTX 5000 | Does Not Work | Does Not Work | Works (Design) | Does Not Work | Does Not Work | Does Not Work |
| Quadro RTX 6000 | Does Not Work | Should Work | Does Not Work | Works (Tested) | Works (Tested) | Should Work |
| Quadro RTX 8000 | Does Not Work | Should Work | Does Not Work | Works (Design) | Should Work | Should Work |
| Titan RTX | Does Not Work | Should Work | Does Not Work | Works (Tested) | Works (Tested) | Works (Design) |
| GeForce RTX 2080 | Does Not Work | Should Work | Does Not Work | Works (Tested) | Works (Tested) | Should Work |
| GeForce RTX 2080 Ti | Does Not Work | Should Work | Does Not Work | Works (Tested) | Works (Tested) | Should Work |

For the combinations marked "Works", we note whether we have actually tested the pairing or whether it is simply assumed to work because NVIDIA designed it that way. For example, the Quadro RTX 6000 NVLink Bridge is clearly designed to work with the Quadro RTX 6000, as well as the upcoming RTX 8000 - but we only have RTX 6000s in-house to test currently. Likewise, the RTX 5000 has its own bridge designed for its smaller NVLink connector - so it should work, by design. "Should Work" indicates combinations we believe will work based on other testing we have done, but for which we don't have samples on hand to verify. "Does Not Work" should be self-explanatory.
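For anyone who wants to script against this chart, here is a minimal sketch of it as a lookup table. The `COMPAT` dict and `bridge_compatible` helper are purely illustrative names (not any NVIDIA API); the verdicts are copied from the chart above.

```python
# The compatibility chart encoded as a lookup table.
# Values: "tested", "design", "should"; missing entries mean "Does Not Work".
COMPAT = {
    "Quadro GP100":    {"GP100 Pair": "tested", "GV100 Pair": "should"},
    "Quadro GV100":    {"GV100 Pair": "design"},
    "Quadro RTX 5000": {"RTX 5000": "design"},
    "Quadro RTX 6000": {"GV100 Pair": "should", "RTX 6000": "tested",
                        "GeForce RTX": "tested", "Titan RTX": "should"},
    "Quadro RTX 8000": {"GV100 Pair": "should", "RTX 6000": "design",
                        "GeForce RTX": "should", "Titan RTX": "should"},
    "Titan RTX":       {"GV100 Pair": "should", "RTX 6000": "tested",
                        "GeForce RTX": "tested", "Titan RTX": "design"},
    "GeForce RTX 2080":    {"GV100 Pair": "should", "RTX 6000": "tested",
                            "GeForce RTX": "tested", "Titan RTX": "should"},
    "GeForce RTX 2080 Ti": {"GV100 Pair": "should", "RTX 6000": "tested",
                            "GeForce RTX": "tested", "Titan RTX": "should"},
}

def bridge_compatible(card: str, bridge: str) -> str:
    """Return the chart's verdict for a card/bridge pair."""
    status = COMPAT.get(card, {}).get(bridge)
    return {"tested": "Works (Tested)", "design": "Works (Design)",
            "should": "Should Work"}.get(status, "Does Not Work")

print(bridge_compatible("Titan RTX", "GeForce RTX"))  # Works (Tested)
```

This is just the chart as data - it will not detect anything at runtime, but it keeps the tested/designed/assumed distinction machine-readable.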


The main takeaways from our testing are as follows:

  • The older Quadro GP100 cards were equipped with the first generation of NVLink, and had less bandwidth per link than newer models. As such, the Quadro GP100 bridges do not work with newer cards.
  • The Quadro GV100 cards use the second generation of NVLink and, like the GP100, have two NVLink connectors per card. The newer RTX-series bridges can fit on one set of connectors, but their plastic housings are physically larger and cannot be paired up to fully connect both sets of NVLink connectors. That means newer RTX bridges won't fully work on GP100 or GV100 cards.
  • Quadro RTX 5000 cards have a smaller NVLink connector than all the other cards, requiring a completely separate set of NVLink bridges. Because of this, their bridges do not interchange with any of the other existing video cards, whether RTX generation or older. This seems to be a deliberate design decision by NVIDIA, but it is strange, since the GeForce RTX 2080 has the same lower NVLink bandwidth (50 GB/s total, versus 100 GB/s on the RTX 6000, 2080 Ti, and Titan RTX) yet uses the same full-length connector.
  • Quadro RTX 6000 and GeForce RTX NVLink bridges may look different on the outside, but they appear to be functionally identical. We've tested both types and found they work across most Quadro, Titan, and GeForce cards in this generation. We cannot yet verify that on the Quadro RTX 8000, because they are not yet shipping, but we have tested on a pair of RTX 6000s and all four bridge variants work. Likewise, we have tested them on GeForce RTX 2080, 2080 Ti, and Titan RTX successfully.
  • It is worth noting, however, that the LED logo on GeForce RTX bridges does not light up when used on Quadro RTX cards. The Titan RTX bridges also appear to have an NVIDIA logo, but as we don't have them in hand I am not sure if it lights up.


The good news here is that, if you are using hardware from the RTX generation - except for the Quadro RTX 5000 - it looks like you can use a GeForce, Titan, or Quadro branded bridge without needing to worry about compatibility. That is particularly helpful since they come in different sizes: 2- and 3-slot for Quadro, and 3- and 4-slot for GeForce / Titan. I'm not sure why NVIDIA didn't just make 2-, 3-, and 4-slot solutions that were brand-agnostic... but at least there don't seem to be any weird limitations on which bridges work with which cards. They are also all the same price, as of publishing time, if purchased directly from NVIDIA. The older GP100 and GV100 are a different story, and if you are using those you should just stick with the paired bridges built for each model, respectively.

Additional Resources

If you want more info about NVLink in Windows, check out another article we published on the topic.

If you need a walk-through on enabling NVLink in Windows, or want to make sure it is working, we have an article covering those topics too.

If you are interested in NVLink, Linux, and machine learning, check out Dr Kinghorn's HPC Blog.

For a list of all articles where we've talked about NVLink, use our website's search function.

Tags: NVIDIA, NVLink, Bridge, SLI, Windows 10, GeForce, RTX, 2080, 2080 Ti, Titan, Quadro, 5000, 6000, 8000, GP100, GV100

I just found out that NVIDIA has gone and made Titan RTX NVLink bridges as well - which look just like the GeForce ones, but gold instead of silver (to match the color of the Titan RTX's cooler). I thought about updating the chart above, but if I add any more columns it will not fit in the same width... so I'm just adding a note here that I fully expect it to work in all the same places the GeForce RTX NVLink bridge does - which is to say, across all current RTX series cards.

Posted on 2019-01-25 22:05:55

I also just found out that NVIDIA pulled a stupid and made the Quadro RTX 5000's NVLink connector a smaller size. We knew it was already going to be a 1-link (50 GB/s) connection, rather than 2-link (100 GB/s) like the RTX 6000 and 8000. However, that same split exists on the GeForce side: RTX 2080 is 50 GB/s and 2080 Ti has the full 100 GB/s... but they have the same connector and share the same bridges. I don't know what led NVIDIA to change that for the Quadro RTX 5000, especially since it means yet another set of NVLink bridges (with the smaller connector). I am going to go ahead and update the chart above with this, since it is a pretty major issue.

Posted on 2019-01-25 23:49:39
mathieu soleil

I think it would be worth updating the article with those comments as well as the chart (resizing the column widths if needed), because a good portion of readers won't read the comments, only the article, and quickly at that. Also, it would be very interesting to verify that a GV100 NVLink bridge works with 2x 2080 Ti, in particular because of 1) the absence of a 2-slot size bridge among the consumer NVLink bridges, 2) the 100 GB/s speed, and 3) the cost difference.

Posted on 2019-01-26 15:49:44

I did update the chart, as well as some of the descriptive text in the article, based on the new info about the smaller Quadro RTX 5000 connectors and the Titan RTX bridges. I would be happy to test with the GV100's bridges, if we had any... but we don't, and they are so expensive (~$500!) that it isn't worth ordering them in. If you just need a 2-slot bridge, get the one designed for the Quadro RTX 6000 - we have one and have tested that it works with the GeForce RTX 2080 & 2080 Ti.

Posted on 2019-01-28 17:09:05
Graham P

Thanks for the article William. Do you have any feedback on the HB version of the Quadro RTX NVLink 2-Slot for use with 2 x RTX2080Ti cards as it appears to be the only 2-slot model I can source locally?

Posted on 2020-07-24 06:26:07

Yeah, as long as you are using the "HB" one - which is listed as being for the RTX 6000 & 8000 - that is the version that should work with the GeForce RTX cards. The RTX 5000 version is physically shorter, and will not work.

Posted on 2020-07-24 16:58:03
Heresy Ku

I tested a Titan RTX NVLink bridge on two GeForce RTX 2080 Tis, and it works.

Posted on 2019-02-27 09:08:52

Thank you for the heads-up! :)

Posted on 2019-02-27 21:57:11
Trevor Standley

Wondering if you can bridge a single 2080ti with a single RTX Titan.

Posted on 2019-04-09 00:55:26

You could physically put a bridge between them, but Windows won't allow you to enable SLI across different cards like that - so at least in Windows, you could not use any of the functionality of NVLink (since enabling SLI is what turns it on).

Posted on 2019-04-09 17:33:34
Trevor Standley

I don't care about SLI at all. I'm doing deep learning.
Why would I want to combine an RTX Titan with a 2080ti? Because the Titan has more memory. I can use that memory for running my two 4k displays in xorg and still have more than the 2080ti left over so my deep learning can use all the memory of that card when the two are used in a pair. The NVLink bridge really does speed up deep learning because a large portion of the time is spent communicating the parameters after each minibatch. But I've only ever tried it with 2 2080ti's. And when I do that I have the problem that Xorg is using several gigabytes of vram on the main GPU. Also, the way I am using Apex for fp16 training leads to an imbalance of vram usage. The main GPU gets hit much harder.

Posted on 2019-04-09 18:21:34

I can't speak to whether it would work on Linux, since that OS does not require putting the cards in SLI for NVLink to work. So... it might? I don't think we've tried mixing cards in NVLink on Linux, but that would be Dr Kinghorn's realm (I am not a Linux guy, myself).

Posted on 2019-04-09 18:23:34
Trevor Standley

Just tested it. It works with a 2080 ti and an RTX Titan. Shaves about 8% off the training time, and I get 24 GB on card 1 (which needs more to run my gui and for the way I train my models).

Posted on 2019-06-05 05:26:30

That is awesome! :)

Posted on 2019-06-05 16:03:36

Hey! Thanks for testing! I am interested in the same setup.
I just want to know how, with the NVLink bridge installed but without enabling SLI, the two cards communicate with each other. What is the difference between having the NVLink bridge versus just using two separate cards together for GPU rendering or deep learning?

Thanks a lot!

Posted on 2020-04-11 06:59:17
Trevor Standley

I use this with ubuntu, so SLI isn't even something you can enable.

Posted on 2020-04-12 02:53:00

Nice!!! So you just place the NVLink bridge and the system links the memories of both? Just trying to understand how it works. I will do the same, Titan RTX plus 2080 Ti, and I need it for rendering, so I am not interested in SLI!

Posted on 2020-04-12 12:59:04

In Linux, if the NVLink bridge is connected and the cards are compatible (I didn't know you could mix the Titan RTX and 2080 Ti - that is cool!) then it enables direct peer-to-peer (P2P for short) communication between them. That alone does not combine their memory or anything, it just lets them talk directly to each other (and access data stored in memory on the other card) much more quickly than if they had to talk over the PCI-Express bus. This in turn can provide performance benefits in some applications, and can be used to enable things like "sharing" (in a sense) of memory pools to tackle larger projects than a single card could fit on its own.

The main difference between doing this on Linux vs Windows is that on Windows you have to turn on SLI in order to enable that P2P functionality (even after installing the NVLink bridge)... and the Windows drivers have changed over the years, with some weird behavior and restrictions on card combinations in recent driver releases: https://www.pugetsystems.co...
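For Linux users who want to confirm P2P is actually active, NVIDIA's p2pBandwidthLatencyTest CUDA sample prints a connectivity matrix (like the output pasted elsewhere in these comments). As a rough illustration, captured output could be parsed like this - the `parse_p2p_matrix` helper and the sample text are assumptions for the sketch, not part of any NVIDIA tool:

```python
def parse_p2p_matrix(output: str) -> dict:
    """Extract the P2P Connectivity Matrix from p2pBandwidthLatencyTest output.

    Returns {(src, dst): bool}, where True means src can access dst's memory.
    """
    lines = output.splitlines()
    # Find the matrix header, then skip it and the "D\D" column-label row.
    start = next(i for i, line in enumerate(lines)
                 if "P2P Connectivity Matrix" in line)
    result = {}
    for line in lines[start + 2:]:
        parts = line.split()
        if not parts or not parts[0].isdigit():
            break                       # past the end of the matrix rows
        src = int(parts[0])
        for dst, flag in enumerate(parts[1:]):
            result[(src, dst)] = flag == "1"
    return result

# Assumed two-GPU example in the tool's output format.
sample = """\
P2P Connectivity Matrix
     D\\D     0     1
     0       1     1
     1       1     1
"""
print(parse_p2p_matrix(sample)[(0, 1)])  # True
```

If any off-diagonal entry comes back False, the bridge is either unsupported for that pair or (on Windows) SLI has not been enabled yet.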

Posted on 2020-04-13 18:19:28
Trevor Standley

In no operating system will the bridge make the two cards appear like one with the combined total memory. That's just not how it works unfortunately. The NVLink bridge simply isn't nearly as fast as the attached memory, so that isn't even technically feasible.

Posted on 2020-04-13 22:20:05
King Solomon

Does NVLink work with the Titan X?

Posted on 2020-01-27 01:32:53

If you are referring to the old Titan X from "Pascal" generation of GPUs, then no - that did not support NVLink.


It did support SLI but that is usually limited in functionality to speeding up frames in games that support it.

Posted on 2020-01-27 17:17:21
Donald Kinghorn

Hi Trevor, I've done a bunch of testing for ML/AI workloads ... see https://www.pugetsystems.co...

However, I never tried mixing GPUs with NVLink, and I don't recommend it. It could(?) work, but the differing memory sizes could cause trouble. My gut feeling is that it will not be a good thing even if it does work... I did a search and didn't come up with anything definitive. I really think the memory mapping would cause issues.

If you have both of these (great!) cards then you might have good luck running different training runs on each card at the same time. I'm referring to hyper-parameter tuning and such ...

I'll keep your question in mind the next time I am doing testing ... best wishes --Don

Posted on 2019-04-09 19:20:15
Trevor Standley

Thanks. If it did work, I see no reason it wouldn't be a good thing. The two cards are almost the same speed, and the extra memory of the RTX card is exactly what I need. I already use them without a bridge, and it works great. Experiments I did with two 2080ti's in a different machine showed that NVLink really helps. Maybe I'll try moving the bridge to my local machine one day.

Posted on 2019-04-09 19:32:46

If you do give it a try, please let us know how it goes :)

Posted on 2019-04-09 20:25:35
Donald Kinghorn

Yes! The only way to really know is to try it. I hear you about the memory on RTX Titan having 24GB is nice. The next time I am doing appropriate testing I will try it.

If you do a test moving the bridge from your other machine please post your results back in a comment here or maybe better on

That would be really interesting ... and like I said when I get a chance to try it I will do the same thing. Thanks --Don

Posted on 2019-04-10 19:39:14
Trevor Standley

Just tested it. It works with a 2080 Ti and an RTX Titan.

Posted on 2019-06-05 05:25:05
Donald Kinghorn

Nice! Thanks for the feedback

Posted on 2019-06-05 16:26:53
Anh Nguyễn Trần Bảo

Did you have any Quadro RTX 8000s? If yes, then test them with NVLink bridges to make this complete. (This is because motherboards for the Xeon W-3175X have 2 PCIe 3.0 x16 slots and 2 PCIe 3.0 x8 slots (mechanically x16) placed alternately, all dual-slot.)

Posted on 2019-04-20 04:35:26

We have only one RTX 8000, so I can't test NVLink with it, but it is basically the same as the RTX 6000 just with twice the VRAM.

Posted on 2019-04-20 04:41:20
Anh Nguyễn Trần Bảo

And, NVIDIA does make Quadro RTX 8000 NVLink bridges. That means the result could be different.

Posted on 2019-04-20 11:05:14

The Quadro RTX 6000 and 8000 share the same bridge, though:


So given how widely inter-compatible most of the other bridges are, I cannot imagine the one specifically designed for the 6000 & 8000 working on the 6000 but not the 8000. And indeed, all the other bridges that worked on the 6000 in our testing should also function on the 8000 without issue :)

Posted on 2019-04-22 15:50:25

Ugh, it looks like NVIDIA changed their plans and *did* in fact make two separate sets of NVLink bridges for the RTX 6000 and RTX 8000. The differences are purely cosmetic: the RTX 8000 bridge is reflective, to match the styling on the heatsink shrouds those cards come with.

Okay, rant time: It is ridiculous that NVIDIA is making 8 functionally identical parts when they could simply have 3. All that is needed is a 2, 3, and 4-slot NVLink bridge for the normal cards (GeForce RTX 2080, 2080 Ti, Titan RTX, RTX 6000, and RTX 8000) since they all share the same NVLink connector. Instead they have made two each for GeForce, Titan, RTX 6000, and RTX 8000 - with the only difference being styling, so that they match the look of the cards. I guess this is what happens when windowed chassis are so popular: you end up with form more important than function. The RTX 5000 legitimately needs its own bridges, since they made its connector smaller, but that too was unnecessary: the RTX 2080 also has a more limited NVLink connection, but it still uses the full-size connector so the same could have been done with the RTX 5000. Bah!

Posted on 2019-04-24 16:10:49
Anh Nguyễn Trần Bảo

So..can you try it now? This is to ensure the compatibility of bridges for Quadro RTX 8000s.

Posted on 2019-08-03 02:47:55

I don't have access to two RTX 8000s at the moment - they are expensive, and we try not to keep them in stock but rather order them as needed. However, we did have a couple in a while back, and as far as I can recall they worked just fine in NVLink with the various RTX-generation bridges we have around. All of the NVLink bridges from this generation (RTX) appear to be completely interchangeable across cards - with the lone exception of the Quadro RTX 5000 and its matching bridge, which has a shorter edge connector. If you are buying brand-new cards and a bridge to go with them, though, getting one with a matching name to the cards you purchase will mean that the visual appearance matches the cards (if you care about that sort of thing).

Posted on 2019-08-07 18:04:07
Donald Kinghorn

Just to give you a little more reassurance that the RTX bridges inter-operate: last week I tested four RTX 2070 Supers with NVLink, using one bridge from a box labeled for the Quadro RTX 8000 and one labeled for the Quadro RTX 6000. No problems...

Posted on 2019-08-07 18:47:25
Paulo Gouveia

Hello guys
I'm thinking of building a system with 4x Quadro GV100 connected in pairs with 2 NVLink bridges.
Any expected issues with this config?

Posted on 2019-05-17 11:29:12

What OS? In Windows, at least last time I tried, the GP100 & GV100 cards required a separate GPU for video output when put into NVLink.

Posted on 2019-05-17 15:18:20

Man, thanks!! Great resource. Really looking for a 2-slot for those of us with mATX setups.

Posted on 2019-06-22 05:21:28

Sorry for bumping an old post, but has anybody managed to make NVLink work without attaching a display? I have three RTX cards, and I would like to use one card (x16) attached to my display and the other two (x8, x16) linked via NVLink, functioning as pure compute cards (without attaching a display to either of these other two cards).

Posted on 2019-07-21 19:04:38
Donald Kinghorn

That's a completely reasonable thing to want to do, but ... If you are using Windows I'm not sure it will work! (It's not a problem if you are going to use Linux) William may have a more definitive answer. Windows ties the SLI driver to NVLINK and SLI *has* to be enabled. So if you know that you can enable SLI in that config without a display on the paired cards then yes, otherwise no.

It looks like the new 2070 "super" has NVLINK. I hope to be testing those soon and I want to do Win10 and Linux for that. William will be testing too. We will keep this config case in mind.

Posted on 2019-07-22 15:45:50

If memory serves - and it has been several months since I played around with this - Windows won't allow you to put cards in SLI unless there is a monitor attached to one of them. So you might try connecting a monitor to one of those secondary cards (making sure to also have a NVLink bridge in place) and then see if you can put them in SLI. Once that is done, the big question is whether they will *stay* in SLI when the monitor is disconnected :/

I may give this a try in the near future, just to see how it goes. If I am able to, I will post back here with the results.

Posted on 2019-07-22 17:34:35

Thank you both for the replies. With only one NVLink allowed in regular desktop systems, it might make sense for users to get three cards, add NVLink to two of them while keeping one extra card as the main OpenGL GPU, and get full viewport performance while the renderers are firing away on the NVLinked ones.

@William: With regards to your question of SLI staying enabled once you detach the monitor: the answer is somewhat of a yes and no. When I detached the monitor with Windows running, the Nvidia panel reported that I needed to change the setting (in the SLI section) because - I don't remember the exact words - the configuration had changed. Something like that.
Task manager continued to report the NVLinks as enabled.
I didn't reboot the system to check if the situation would change.

I managed to make it work with a little bit of Windows/Nvidia trying to override each other.
The solution is four words: dummy display port adapters - to make Windows/Nvidia think a monitor is attached, thereby offering the ability to enable SLI and thus NVLink.

Will post my steps as soon as my network stops acting up.

Posted on 2019-07-22 22:29:03

Awesome, I'm glad you found a work-around! I never considered that sort of setup in my development of this tool, but I would love to know if it reports your configuration properly (two GPUs in NVLink and one not): https://www.pugetsystems.co... (wow, that is a long URL!)

Posted on 2019-07-22 22:41:41
Donald Kinghorn

Awesome! I stopped in at Labs and we were talking about your question and Matt suggested using dummy display port adapters! :-)

Posted on 2019-07-22 23:07:03


I checked NVLink status with the test program @William posted, and the result is affirmative.

Here is the result in text since I cannot post unless I have an account at this site :
Link Enabled
NVLink appears to be enabled and functioning on this system!

Here is the average total bandwidth over each detected NVLink:
NVLink Pair:
Bandwidth: 92.775 GB/s

Posted on 2019-07-23 17:04:00

• Here is how I have set up the 3 Titan RTXs.

◇ I have an ASUS x299 SAGE MB with 7 PCI-e Slots.
▪ The configuration of the slots is as follows:
• 1 - x16
• 2 - x8
• 3 - x16
• 4 - x8
• 5 - x16
• 6 - x8
• 7 - x16

▪ I have installed the RTX cards in slots 1(x16),4(x8) and 7(x16)
▪ Slot 1 RTX is attached to my main display
▪ Slot 4 RTX card has a dummy DisplayPort/DVI plug, because a monitor is required to be attached to one of your NVLinked cards for NVLink to work. The dummy plugs mitigate that issue by essentially emulating a monitor.
• No drivers were required for the dummy plug in Windows 10 pro
▪ Slot 7 card is headless (no monitors or dummy plugs attached)
▪ Slots 4 and 7 are connected by a GeForce 3-slot NVLink bridge (Titan NVLink bridges were sold out).
• 3 slot was chosen because there is a gap of one slot between the cards.
◇ Nvidia's FE RTX cards, including Titan RTX are thicker than double slot cards because of their non-blower designs. This necessitates keeping a gap of one slot between the cards, assuming the PCI-e slots are placed in single slot gaps.

• Next when I booted into Windows, I enabled SLI in the Nvidia Control Panel.
• That's when the quirks started.
◇ Nvidia prompted me to close some apps. I chose to 'end task' these applications using the task manager.
◇ Yet after I was done, Nvidia displayed a 'none' entry in the list and the continue button refused to respond.
▪ A Google search later, it seems a process named "ApplicationFrameHost.exe" was the culprit. I killed it using the task manager.
▪ Again, Nvidia continued to display a 'none' entry.
▪ I killed the NVidia control panel using the task manager, reopened it and pressed apply again.
▪ This time, it worked and SLI was enabled...
▪ except..
▪ now Windows decided that the 'dummy' monitor was the main display and would just shift all open windows to this non-existent monitor.
▪ But pressing the Win+P key did the trick (you might have to press it a few times). Once I had gotten my monitor registered as the main display, I disabled the non-existent monitor (emulated by the dummy plug).
◇ Side note :
▪ It seems Nvidia counts the slots bottom-up (or might it be giving precedence to NVLinked cards over non-NVLinked ones?). So while Windows showed the cards from top to bottom (slot 1 = GPU 0 and so on), the Nvidia Control Panel showed the bottom-most card as GPU 0 and so on.
▪ Weird

Posted on 2019-07-23 17:14:31

That is a great guide / walk through, thank you for sharing all this!

Posted on 2019-07-23 17:40:28
Donald Kinghorn

I want to second that ... thanks for posting back!

Posted on 2019-07-24 01:56:29
Pierre Tisseur

Do you know how this motherboard can manage 8*8+16 PCIe lanes?

Posted on 2019-09-02 16:04:27

Awesome! I'm glad that little utility worked properly in this situation. Its core is a NVIDIA-made utility, I just put a script around it to make it easier to use and to display results in a more friendly way :)

Posted on 2019-07-23 17:39:21

William M George Hi there. I have two NVIDIA TITAN RTX and I am looking to NVLink them. I wanted to buy one of those customized Bridges instead of NVIDIA's ones.

The one I am looking to buy is the ASUS ROG NVLink - 4 Slot Bridge.

I have realized that ASUS's website says 50GB/s instead of 100GB/s, but they claim that their bridge supports the RTX 2080 Ti, which should run at the same speed as the TITAN RTX does with NVIDIA's own respective bridges.

Do you know if I can buy this Bridge without concern? Or should I contact ASUS to clarify? Thank you!

Posted on 2019-08-16 16:54:33

Oh man... other OEMs are getting in on the NVLink bridge action too? Assuming they follow NVIDIA's spec, I don't see how they could *not* work - but I bet if you tried to contact Asus they would stick with the line from their website: "Supported Graphic Cards: RTX 2080 Ti, RTX 2080" (https://www.asus.com/us/Gra...

You are right that the 2080 Ti has the same number of NVLink connections as the Titan RTX, so if it works for one it should work for the other. I guess I would just recommend buying from somewhere with a good return policy, in case it doesn't work, and using a tool like our NVLinkTest to make sure it is working when you get it set up:


Posted on 2019-08-16 17:00:13

Thank you! Unfortunately I will have to buy this one from ASUS because it is the only type of NVLink Bridge that I can get down here in South America.

I attempted ordering NVIDIA's very own TITAN RTX bridge from their website, but it only ships to North America. Import costs from eBay don't help either.

I have access to only those OEM bridges. I think I will order the ASUS one and test with your tool to see if it works at the same bandwidth as NVIDIA's own.

Posted on 2019-08-17 00:07:00

Ah, that makes sense. Well, I hope it works out for you! Best of luck :)

Posted on 2019-08-19 16:10:02

Hello George! Here is an update. I believe it is safe to say that the OEM bridges (at least the ASUS one) are indeed compatible with the TITAN RTX, despite what it says on the box.

8/23/2019 4:39:11 PM

NVLink appears to be enabled and functioning on this system.

[P2P (Peer-to-Peer) GPU Bandwidth Latency Test]
Device: 0, TITAN RTX, pciBusID: 65, pciDeviceID: 0, pciDomainID:0
Device: 1, TITAN RTX, pciBusID: b4, pciDeviceID: 0, pciDomainID:0
Device=0 CAN Access Peer Device=1
Device=1 CAN Access Peer Device=0

***NOTE: In case a device doesn't have P2P access to other one, it falls back to normal memcopy procedure.
So you can see lesser Bandwidth (GB/s) and unstable Latency (us) in those cases.

P2P Connectivity Matrix
D\D 0 1
0 1 1
1 1 1
Unidirectional P2P=Disabled Bandwidth Matrix (GB/s)
D\D 0 1
0 526.56 5.47
1 5.89 533.09
Unidirectional P2P=Enabled Bandwidth (P2P Writes) Matrix (GB/s)
D\D 0 1
0 529.30 46.93
1 46.97 538.05
Bidirectional P2P=Disabled Bandwidth Matrix (GB/s)
D\D 0 1
0 549.47 9.10
1 8.99 543.74
Bidirectional P2P=Enabled Bandwidth Matrix (GB/s)
D\D 0 1
0 538.03 92.26
1 93.68 540.73
P2P=Disabled Latency Matrix (us)
GPU 0 1
0 2.45 105.95
1 101.18 2.48

CPU 0 1
0 2.00 47.86
1 47.29 1.93
P2P=Enabled Latency (P2P Writes) Matrix (us)
GPU 0 1
0 3.79 1.33
1 1.34 5.03

CPU 0 1
0 2.05 1.19
1 1.11 1.88

NOTE: The CUDA Samples are not meant for performance measurements. Results may vary when GPU Boost is enabled.



Posted on 2019-08-23 19:44:07

Awesome - that is excellent news, and it reinforces my belief that these are pretty much universal (save for the smaller NVLink edge connector on the Quadro RTX 5000 and the older version of the bridge from the GP100). Thank you for posting your results :)

Posted on 2019-08-23 19:49:16
Thiago Ribeiro da Motta

Wait, are you really getting about 93 GB/s transfer speed with the ASUS ROG 4-Slot NVLink? It seems I'm only getting about 50 GB/s with this setup, which I figured was caused by the ASUS NVLink, since, as stated by ASUS itself:

The all-new ROG NVLink Bridge connects two GeForce RTX NVLink SLI-ready graphics cards with a 50 GB/s link.

Is there something I have to do in order to enable speeds up to 100GB/s? Also, only half of the 4 supposed links are visible, which seems really odd already.

nvidia-smi nvlink -s
Link 0: 25.781 GB/s
Link 1: 25.781 GB/s
Link 0: 25.781 GB/s
Link 1: 25.781 GB/s

nvidia-smi nvlink -c
Link 0, P2P is supported: true
Link 0, Access to system memory supported: true
Link 0, P2P atomics supported: true
Link 0, System memory atomics supported: true
Link 0, SLI is supported: true
Link 0, Link is supported: false
Link 1, P2P is supported: true
Link 1, Access to system memory supported: true
Link 1, P2P atomics supported: true
Link 1, System memory atomics supported: true
Link 1, SLI is supported: true
Link 1, Link is supported: false
Link 0, P2P is supported: true
Link 0, Access to system memory supported: true
Link 0, P2P atomics supported: true
Link 0, System memory atomics supported: true
Link 0, SLI is supported: true
Link 0, Link is supported: false
Link 1, P2P is supported: true
Link 1, Access to system memory supported: true
Link 1, P2P atomics supported: true
Link 1, System memory atomics supported: true
Link 1, SLI is supported: true
Link 1, Link is supported: false

nvidia-smi nvlink -p
Link 0: 00000000:82:00.0
Link 1: 00000000:82:00.0
Link 0: 00000000:02:00.0
Link 1: 00000000:02:00.0

nvidia-smi nvlink --status -i 0
Link 0: 25.781 GB/s
Link 1: 25.781 GB/s

NVIDIA-SMI 418.56 Driver Version: 418.56 CUDA Version: 10.1


Posted on 2019-09-17 18:15:44

What GPUs are you using? Only the 2080 Ti, Titan RTX, and Quadro RTX 6000 and 8000 can do the full speed; they have two NVLink links each. The vanilla 2080 and 2070 Super only have one, so they will only get 50 GB/s. I'm not sure about the 2080 Super yet.

Posted on 2019-09-17 18:37:56
Thiago Ribeiro da Motta

Using 2x Titan RTX. I ended up adding more info to the first post to make things clearer.

Posted on 2019-09-17 19:36:39

I just looked over your additional info (sorry for the slow reply, I've been out of the office). I am wondering if the measurements you are looking at might only be considering unidirectional bandwidth? Or, alternatively, if the bridge you have is either not quite seated correctly or is defective - only allowing one of the two sets of NVLink connections to work?

I would try the following:

- Reseat the whole bridge, with the system turned off (take it off and put it back on again, to make sure it is in place properly)

- If you are running Windows, try the NVLink test tool we have put together (I am more familiar with the way it formats results): https://www.pugetsystems.co...

Posted on 2019-09-23 18:16:31
Ziling Zhang

I have the same setup as you: 2x Titan RTX and a 3-slot GeForce RTX NVLink bridge, and I saw 25 GB/s per link and 50 GB/s total too. It could be that my motherboard's PCI-E lane distribution is x16 and x8 for the two cards (optimal is x16/x16, but those slots are only two apart and run too hot for these cards). Is it possible that the PCI-E lane count influences the NVLink speed?

Posted on 2019-11-10 11:05:19
Donald Kinghorn

I think what you are seeing from the nvidia-smi output is correct. You have 25 GB/s on each of the two links, which is 50 GB/s total (in one direction!). That means your bidirectional bandwidth is ~100 GB/s. NVLink is a duplex link.

Take a look at https://www.pugetsystems.co...
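(To spell out the arithmetic in that explanation, here is a trivial sketch; the 25 GB/s per-link figure is taken from the nvidia-smi output quoted earlier in the thread.)

```python
# Bandwidth arithmetic for 2x Titan RTX (TU102, two NVLink connections)
per_link = 25.0        # GB/s per NVLink connection, per direction (from nvidia-smi)
links = 2              # Titan RTX exposes two NVLink connections

unidirectional = per_link * links   # what the per-link nvidia-smi figures sum to
bidirectional = unidirectional * 2  # NVLink is duplex: both directions at once

print(unidirectional, bidirectional)  # 50.0 100.0
```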

Posted on 2019-11-11 18:37:44
kul singh

Ryzen 9 3900X
2x RTX 2070 Super
SSD 500GB
HDD 2TB
RAM 3200MHz 32GB
Is this system good for both 4K video editing and 3D rendering, sir?

Posted on 2019-09-21 08:39:03
Donald Kinghorn

Bidirectional bandwidth is ~100 GB/s for TU102, i.e. the 2080 Ti, Titan RTX, and Quadro RTX 6000/8000.
For TU104, which is the 2080 and below, it is ~50 GB/s: TU102 has 32 NVLink I/O lanes while TU104 has 16.

It looks like the 2080 Super is TU104 ...
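(For reference, those per-chip figures can be summarized as a small lookup table. This is an illustrative sketch only; the card-to-chip mapping below is my own summary of the discussion in this thread, not an official list.)

```python
# Rough NVLink bidirectional bandwidth by Turing chip, per the figures above.
CHIP_LINKS = {"TU102": 2, "TU104": 1}   # NVLink connections per chip
CARD_CHIP = {                           # assumed mapping, from the discussion
    "RTX 2080 Ti": "TU102",
    "Titan RTX": "TU102",
    "Quadro RTX 6000": "TU102",
    "Quadro RTX 8000": "TU102",
    "RTX 2080": "TU104",
    "RTX 2080 Super": "TU104",
}

def bidirectional_gbps(card, per_link=25.0):
    """Bidirectional bandwidth = links x per-link rate x 2 directions."""
    return CHIP_LINKS[CARD_CHIP[card]] * per_link * 2

print(bidirectional_gbps("Titan RTX"), bidirectional_gbps("RTX 2080"))  # 100.0 50.0
```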

Posted on 2019-09-17 22:21:28

Do you know if the GV100 bridge will work with the Tesla V100? The connectors look identical and the cards are very similar, so you'd imagine so, but even NVIDIA customer support doesn't seem to know which NVLink connectors are designed to work with the V100.

Posted on 2019-12-11 14:43:53

We don't have any Tesla cards here to test, but my understanding is that the V100 and GV100 are almost the same - just with the V100 lacking any video outputs. So I think your best bet would be the bridges from the GV100, but I'd try and get them from somewhere with a good return policy just in case :)

Posted on 2019-12-11 17:15:15

Thanks. I'll report back if the GV100 bridge works.

Posted on 2019-12-12 14:35:47

I have had confirmation from PNY that the V100 supports the GV100 bridge (P/N NVLINKBRIDGE2-PB). However, upon trying to install a setup with these parts, I've found that the NVLink connectors on the cards are blocked by the backplate and side cover. There's no mention of this problem in the manual for the bridge, as it is geared toward the GV100 only. I'm waiting to hear back from PNY support. https://uploads.disquscdn.c...

Posted on 2020-01-27 10:09:53

Oh, that is a bummer! They just didn't cut out the heat spreaders (and maybe the heatsink shroud too, from the look of it?) in such a way that the bridge could be installed.

Posted on 2020-01-27 17:18:41
Wojciech Fus

Can you confirm whether the Quadro RTX 6000 2/3-Slot bridge will still connect two 2080 Tis successfully?

I heard drivers might have disabled the NVLink functionality on 2080 Tis.

Thanks a lot for help!

Posted on 2020-10-15 10:55:57

Within the last month I tested a pair of RTX 2080 Ti cards in SLI / NVLink using a GeForce-branded 3-slot NVLink bridge and multiple recent driver revisions. It worked just fine. I haven't specifically tried a Quadro bridge recently, and don't have one handy, but electrically they should be the same as far as I am aware. They certainly worked in the past, as shown above :)

What does *not* work anymore, and hasn't for almost a year now, is *dual* NVLink in Windows (four cards in two pairs). This used to work, but more recent drivers have messed it up: https://www.pugetsystems.co...

Posted on 2020-10-15 16:23:37
Dylan Dale

So it would be possible to use NVLink on an RTX 2080 Super matched with some Quadro cards? Would the Quadro RTX 4000 be able to link to the 2080?

Posted on 2021-05-02 01:01:12

No. The chart above is intended to show that some of the RTX 20 Series and Quadro RTX series cards can use various NVLink bridges (even ones that aren't a match for their specific brand), but the actual pair of cards connected by such a bridge has to be an exact match. So two RTX 2080 Super cards together, or two Quadro RTX 6000 cards, but not one of each. Moreover, the RTX 4000 isn't NVLink capable at all (the RTX 5000 is, but it uses a smaller, custom NVLink bridge that is *not* interchangeable with any of the others). Hopefully that clarifies things for you :)

Posted on 2021-05-02 22:06:27
Travis Santelmann

I cannot get my 2080 Tis in SLI to function with the (RTX 6000/8000 2-Slot) NVLink bridge.

This is my system.

7980XE direct die @ 5GHz
EVGA X299 Dark motherboard
DDR4 4000MHz CL15
(2) 2080 Tis, both on the same BIOS

Both cards work, and both function 100% individually. I can see them in Windows Device Manager, and the NVIDIA Control Panel even says 2x 2080 Ti. I have tested games on the cards individually to confirm function.

However, I cannot enable SLI in the NVIDIA Control Panel at all. I performed a clean Windows install after DDU cleaner did not work. Then I moved on to multiple different NVIDIA drivers, and after all these steps failed to fix my issue, I moved on to testing (3) different motherboard BIOSes for my X299 Dark lol. I was working in a tight build, so I had to go with a Quadro RTX 6000/8000 2-slot NVLink bridge unfortunately, and now I am regretting my decision about that. Both cards are even getting PCI-e 3.0 x8, and as I said before, both cards work fine individually in games and applications (just not together in SLI).

And I still can't enable SLI. No option exists in the driver whatsoever.

I may have to go with the standard GeForce RTX NVLink 4-slot bridge and just rearrange my custom loop to make that card spacing fit. Then I would even get x16/x16 too, instead of x8/x8 and no SLI support at all lol.

The RTX 6000/RTX 8000 2-slot bridge is not working in SLI. Buyer be warned. I wanted to share so you don't waste hours building a system into a custom water loop, only to realize it doesn't work.

Posted on 2021-06-22 19:49:23

Oh wow, I'm sorry that didn't work out for you! It sounds like you've already tried all of the troubleshooting steps I would have advised as well. I'd be really curious to hear if the option to turn on SLI is there if / when you try the 4 slot bridge, or if there is something else going on that is interfering with it entirely. :(

Posted on 2021-06-25 20:07:23
Travis Santelmann

I actually fixed it. I didn’t have the bridge plugged in all the way lol.

Posted on 2021-06-26 15:55:50
Travis Santelmann


The RTX 6000/RTX 8000 NVLink 2-slot bridge DOES WORK!!!

I was having issues before, and after days of troubleshooting, it turns out that my NVLink bridge just wasn't plugged in all the way..

Amazing how we can look past the simplest of things.

(User error was the cause of all the issues)

The RTX 6000/RTX 8000 2-slot bridge does work with 2080 Tis in SLI!!


Posted on 2021-06-26 01:34:18

That's great! We've all been there (user error) :)

Posted on 2021-06-28 17:42:13