Read this article at https://www.pugetsystems.com/guides/1266

How to Enable and Test NVIDIA NVLink on Quadro and GeForce RTX Cards in Windows 10

Written on October 26, 2018 by William George


NVLink is a technology from NVIDIA for creating a high bandwidth link between two of their video cards. It can be used for many things, from basic SLI for faster gaming to potentially pooling GPU memory for rendering large and complex scenes. What NVLink can be used for depends on how software developers write their applications, and there is a lot of exciting potential for this technology.

However, as of the time we are publishing this, there is no easy way to tell if NVLink is enabled and functioning within Windows 10. Further, the way to enable NVLink is different depending on what sort of video cards you have. We have put together this guide to help: covering the different ways to enable NVLink, along with a small utility to easily test and make sure it is working as expected.

If you have already set up your system for NVLink, you can skip ahead to download and run the NVLinkTest utility.

Installing the Physical NVLink Bridge

Regardless of what cards you are using, for NVLink to function they must be connected by a physical NVLink bridge (or in some cases, two bridges). The bridge needs to be the right size to reach between the cards, which can potentially be 2, 3, or 4 PCI-E slots apart.

In addition to being the right size, the bridge you use also needs to be compatible with the video cards. For example, older Quadro GP100 bridges do not work on the newer GeForce RTX series cards. On the other hand, the GeForce bridges do seem to work on the Quadro GP100 cards - except that they are physically larger, so you can only fit one bridge instead of the two that the GP100 is designed to use. The safest option is to stick with bridges specifically designed for the video cards you are using. Some other combinations may work, but are not likely to be officially supported by NVIDIA.

Here are examples of what NVLink bridges can look like, as well as what it looks like to have them installed on cards:

NVLink Bridges from Quadro GP100 and GeForce RTX cards

NVLink Bridge installed on two GeForce RTX 2080 cards

Enabling NVLink on GeForce and Quadro RTX Video Cards

Once the physical link is securely installed, enabling NVLink on GeForce and Quadro RTX series cards is quite simple - though not obvious. While NVLink itself is never mentioned in the NVIDIA Control Panel (so far as we could find), all you need to do is enable SLI. To do so, open the NVIDIA Control Panel, go to "Configure SLI, Surround, PhysX" under "3D Settings" in the menu tree on the left, select "Maximize 3D performance" under "SLI configuration", and click Apply. Here is what that should look like once SLI is enabled:

NVIDIA Control Panel Screenshot Showing SLI Enabled on GeForce RTX 2080 Video Cards

SLI itself is mostly used for gaming, and in the past we have recommended avoiding it for other applications - but if you want to use NVLink, this is the way to enable it on GeForce RTX 2080 and 2080 Ti cards, as well as the Titan RTX and Quadro RTX 5000, 6000, and 8000 models. Once enabled, you can test to make sure it is working.
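If you are comfortable on the command line, NVIDIA's driver also reports per-link state via `nvidia-smi nvlink --status`. Here is a minimal sketch (in Python) of parsing that output to count active links per GPU - the exact text format varies between driver versions, so treat the parsing details as an assumption for illustration:

```python
import re

def count_active_links(nvlink_status_output: str) -> dict:
    """Count active NVLink lanes per GPU from `nvidia-smi nvlink --status` text.

    Assumes (illustrative) lines like:
        GPU 0: GeForce RTX 2080 (UUID: GPU-xxxx)
             Link 0: 25.781 GB/s
    Inactive links print something other than a GB/s figure.
    """
    links = {}
    current_gpu = None
    for line in nvlink_status_output.splitlines():
        gpu_match = re.match(r"\s*GPU (\d+):", line)
        if gpu_match:
            current_gpu = int(gpu_match.group(1))
            links.setdefault(current_gpu, 0)
            continue
        # A link reporting a bandwidth figure is considered active
        if current_gpu is not None and re.search(r"Link \d+:\s*[\d.]+ GB/s", line):
            links[current_gpu] += 1
    return links

sample = """\
GPU 0: GeForce RTX 2080 (UUID: GPU-aaaa)
     Link 0: 25.781 GB/s
GPU 1: GeForce RTX 2080 (UUID: GPU-bbbb)
     Link 0: 25.781 GB/s
"""
print(count_active_links(sample))  # {0: 1, 1: 1}
```

If every GPU reports zero active links after enabling SLI, re-seat the bridge and re-check before digging into drivers.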

Enabling NVLink on Quadro GP100 and GV100 Cards

NVLink originally debuted on the Quadro GP100 video cards, and is also found on the GV100. Both of these models require a pair of NVLink bridges for full performance, along with a more complex setup process than simply turning on SLI:

1) A third video card needs to be installed - ideally a Quadro from the same generation as, or newer than, the cards you are bridging. This is required for video output, because enabling NVLink on the GP100 and GV100 cards will turn off their video outputs.

2) With both physical bridges installed and the third card connected to your monitor(s), open up the Windows command line. An easy way to do this is to right-click on the Start icon (not the normal left-click) and select Command Prompt.

3) Once that is open, navigate to "C:\Program Files\NVIDIA Corporation\NVSMI"

4) Run "nvidia-smi.exe -L" to see which numbers (starting with 0) are assigned to the two video cards you want to bridge.

5) For each of those cards, run "nvidia-smi.exe -i # -dm TCC", where # is the number of the GPU you wish to have in NVLink.

6) Once you have successfully run that command on both cards, reboot the system and test to see if NVLink is working. As mentioned before, TCC mode disables direct video output from those GPUs - so you need to have any monitors hooked up to additional cards.
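The steps above can be sketched as a small helper script. This example (Python) does not run anything itself: it parses sample `nvidia-smi -L` output and prints the TCC-mode commands you would then run by hand. Selecting the bridged GPUs by model name is an assumption made here for illustration:

```python
import re

def tcc_commands(smi_list_output: str, bridged_names=("GP100", "GV100")):
    """From `nvidia-smi -L` output, build the `-dm TCC` commands for the
    GPUs being bridged. Lines look like:
        GPU 0: Quadro GP100 (UUID: GPU-xxxx)
    """
    commands = []
    for line in smi_list_output.splitlines():
        m = re.match(r"GPU (\d+): (.+?) \(UUID", line)
        if m and any(name in m.group(2) for name in bridged_names):
            commands.append(f"nvidia-smi.exe -i {m.group(1)} -dm TCC")
    return commands

sample = """\
GPU 0: Quadro GP100 (UUID: GPU-aaaa)
GPU 1: Quadro GP100 (UUID: GPU-bbbb)
GPU 2: Quadro P2000 (UUID: GPU-cccc)
"""
for cmd in tcc_commands(sample):
    print(cmd)
```

In this example, GPU 2 (the third card kept for video output) is deliberately left in its default display mode.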

This is how TCC is enabled on Quadro GP100s via the command line in Windows 10.

Download NVLinkTest to Verify NVLink Functionality

As of the publication of this article, there is no way to check NVLink status in the NVIDIA Control Panel. However, NVIDIA does supply some sample code in their CUDA Toolkit which can check for the peer-to-peer communication that NVLink enables and even measure bandwidth between video cards. You can download that toolkit, install Visual Studio, compile the sample code, and then run it - or...

We have compiled one of those sample programs and put together a simple GUI to make it easy to run in Windows 10. Just click the Download button below, save the linked ZIP file, and extract its contents. Inside you will find three files: a Readme.txt, NVIDIA's p2pBandwidthLatencyTest.exe, and our NVLinkTest.exe. Run that last program and it will report whether NVLink is working or not.
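Under the hood, this amounts to interpreting the numbers p2pBandwidthLatencyTest prints. A rough sketch of the same idea in Python: compare peer-to-peer bandwidth against what plain PCI-E could deliver. The matrix format and the ~30 GB/s threshold here are assumptions for illustration, not the utility's actual logic:

```python
def nvlink_active(p2p_matrix_text: str, threshold_gb_s: float = 30.0) -> bool:
    """Return True if any off-diagonal P2P bandwidth exceeds the threshold.

    Assumes a bandwidth matrix in GB/s, roughly as p2pBandwidthLatencyTest
    prints it:
        D\\D     0      1
        0  530.11  44.87
        1  44.95  531.02
    PCI-E 3.0 x16 tops out near 13 GB/s per direction, so card-to-card
    figures well above that suggest the cards are talking over NVLink.
    """
    rows = []
    for line in p2p_matrix_text.strip().splitlines()[1:]:  # skip header row
        rows.append([float(v) for v in line.split()[1:]])  # skip row label
    for i, row in enumerate(rows):
        for j, bw in enumerate(row):
            if i != j and bw > threshold_gb_s:
                return True
    return False

sample = """\
D\\D     0      1
0  530.11  44.87
1  44.95  531.02
"""
print(nvlink_active(sample))  # True
```

The huge diagonal values are each card copying to itself; only the off-diagonal (card-to-card) numbers matter for judging the link.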

9/16/2021 Update - We have added a CUDA 11 version of this utility, for use with newer NVIDIA drivers.

Download - CUDA 10
Download - CUDA 11

NVLinkTest Screenshots

Here are screenshots showing what NVLinkTest looks like while running. The first shows the intro screen, the second shows a successful run - where NVLink is enabled - and the remaining two show the errors you will get if CUDA is not available (which means you don't have an NVIDIA GPU or the NVIDIA drivers are not correctly installed) or if you do not have any active NVLink connections.

Intro screen with basic instructions

NVLink enabled - success!

Error if CUDA is not available

Result if NVLink is not enabled

Additional Resources

If you want more info about NVLink in Windows, check out another article we published on the topic.

If you are interested in NVLink, Linux, and machine learning, check out Dr Kinghorn's HPC Blog.

For a list of all articles where we've talked about NVLink, use our website's search function.


Perry White

I've been watching the p2pBandwidthLatencyTest.exe process run for 15 minutes or so. How long should this run before results are given?

Posted on 2019-01-30 17:35:13

Only a few seconds. Did you run it from the utility I linked to above, or directly from the command line? Also, do you have the latest NVIDIA drivers installed?

Posted on 2019-01-30 18:02:34

I have asus rtx 2080 and 2080 super I do NVlink and is not enable WHY ?

Posted on 2019-12-26 18:01:19

The 2080 and 2080 SUPER are different cards, and I think NVLink requires a pair of identical cards.

Posted on 2019-12-26 19:50:38

It's a simple card car with little improvements.
It's really stupid .

Posted on 2019-12-26 20:21:40

can I nvlink 2x RTX 2080 super ?

Posted on 2019-12-27 22:21:05

Yes, any two of the same model card that supports NVLink (which the 2080 SUPER does). I recommend using single fan, blower-style cards for multi-GPU configurations.

Posted on 2019-12-27 22:35:02

not sure if someone here can help me but i installed my two gpus, enabled nvlink, and confirmed with the nvlink test program. but when i play games, particularly rainbow six siege which according to several nvlink tests and reviews appears to scale very well, i actually get worse frames on nvlink setup compared to single gpu setup. is there some additional settings that i'm missing? or how can i troubleshoot this so i can get nvlink to work properly?

Posted on 2020-01-09 07:48:35

Hmm, it has been so long since I've actually done any game-focused testing that I am not sure what to recommend in this situation. I play games myself, but have never owned a dual GPU / SLI / NVLink setup. Maybe try asking in more gaming-centric online forums, like on Reddit, or on Twitter or YouTube channels where dual GPU is featured regularly... like JayzTwoCents?

Posted on 2020-01-09 18:43:07

thank you for the reply, i've been looking all over trying to find a forum where this might be more appropriate but to no avail, so i figured i'd give it a shot here since the review article was thorough and a great read. given the plethora of reviews out there citing such great scaling with NVLink i feel like it would be a simple error on my part, either in setting it up in nvidia control panel or something but alas, perhaps it was not meant to be :) thank you for your reply, greatly appreciated!

Posted on 2020-01-09 22:53:07

Hey this is me again.
I bought two RTX 2080 super and nvlink and it doesn't sync .
what can I do ?

Posted on 2020-01-15 18:59:46

Can you describe what you mean by "it doesn't sync"? What steps have you gone through - both in terms of hardware setup / installation and drivers / settings? And what OS are you running?

Posted on 2020-01-15 19:02:15

I have a windows 10 .
I have a motherboard ASUS Z390 Rog Maximus XI Hero ATX .
2 RTX 2080 super and nvlink everything is connected and work .
I checked for updates to Nvidia and windows (No updates)
I go to control panel and no setting appears sli.
I downloaded the software for sticky and it seems not to work .
try aura sync and all sync but 1 of card no sync .

Posted on 2020-01-15 19:32:46

It's like not seeing the second card

Posted on 2020-01-15 19:34:11

Hmm, do two cards show up in Windows' Device Manager? And in the NVIDIA Control Panel?

Do both cards have their own power connections from the power supply?

Posted on 2020-01-15 19:47:01

Both cards have connections for power supply
My power supply 1000 w
Can't see the card in Control Pane / task manger .

Posted on 2020-01-15 20:04:50

Have you installed the latest NVIDIA drivers, from their website? If not, I would recommend doing that - and then checking device manager again to see what shows up. If both cards are properly seated in their PCI-E slots, powered up, and working then there should be two entries in Device Manager, under the Display Adapters section. Maybe take a screenshot of that after making sure the latest drivers are installed and post it here?

Download link: https://www.nvidia.com/Down...

Posted on 2020-01-15 20:09:06

Make sure the the NVLink is snapped on firmly. Make sure you actually see lights on the nVlink adapter

Posted on 2020-02-04 21:39:27

It is worth noting that Quadro NVLink bridges don't have lights, and if you use a GeForce or Titan bridge on a pair of Quadros I don't think it will light up (I am not certain, but that is what sticks in my head from our previous testing).

Posted on 2020-02-04 21:59:33

but if I turn my gtx 1080 he see aboth

Posted on 2020-01-15 20:08:00

I'm sorry, but I didn't understand this message - can you rephrase it?

Posted on 2020-01-15 20:11:30
Ankur Chatterjee

i m really curious after watching linus tech tips vedio where the 2 most expensive nvidia losses its display output capabilities while NVlinked n instead of using a same generation card used just as display output(other than the 2 quadros ) can we use the display output from the motherboard itself then also will the nvlink work??

Posted on 2020-07-03 05:20:02

I've never actually tried that, mostly because the systems I usually deal with NVLink on don't have onboard video capabilities. If I end up using it on a system that does, I'll try to remember to give that a shot.

Posted on 2020-07-06 17:12:09
Ankur Chatterjee

its just waste of graphics card as a sole purpose of display out n all the cpu's with integrated graphics are lower tier so it will be useless the solution i found is using a pass through vedio capture card more specifically ELGATO 4K PRO MK.2 so it will be much appreciated if ur team test this out n i will be egarly waiting for the results n your reply cuz i dont want to waste money as personally owning a rig is expensive n also their is no other option for consumer motherboards supporting RTX quadro 8000 sli.

Posted on 2020-07-12 11:22:52

I don't think I understand what you are suggesting. The Elgato capture cards are for taking in an external video signal - for example from a camera, console, or another computer - so that it could be recorded or streamed via the computer hosting the Elgato. We use them in the streaming PCs we sell, and I own one of their external models for similar use at home. But they can't provide video output on their own, as they aren't a graphics card.

Can you perhaps rephrase your idea in another way, so I can better understand?

Posted on 2020-07-13 20:16:03
Ankur chatterjee

so lemme try again , i was thinking of passing through the vedio output With the help of my INTERNAL ELGATO 4K PRO MK.2 PCIE CAPTURE CAR(i.e vedio signal from quadros into PCIE bus to elgato to a monitor) from the same PC having the 2 quadros 8k i hope you understand what i m try to experiment cuz it will be great to solve the problem of no display via nvlinking the quadros and i was also trying display options in the bios n set it to motherboard instead of PCIE devices n it worked sorta (RESOLUTION N BANDWIDTH ISSUES),so in conclusion i still don't have any solution for using the quadros in NVlink without the help of another graphics card of same generation as solely as a vedio output

Posted on 2020-07-22 21:24:22

I think I understand what you are suggesting, but I don't think it will help. The Elgato capture cards don't create a video signal of their own, they simply take in an input (it could conceivably be from a video card in the same system), allow recording / streaming of that input, and then pass it along as an output as well. If you don't have a working video output from the system to begin with, then the Elgato isn't going to help.

However, if you are using a pair of Quadro RTX 8000 video cards in SLI / NVLink then you should have working video outputs from one of them. I'm still not entirely clear on how LTT chose to set their cards up, but in my own experience with Quadro RTX cards (*not* the older GP100 & GV100, those behaved differently) I was able to get NVLink working simply by connecting them with a NVLink bridge and then enabling SLI in the NVIDIA Control Panel. That should leave the video outputs on the "primary" card in the SLI pair completely active & usable.

Is that how you are enabling NVLink on your configuration, or are you doing it a different way?

Posted on 2020-07-23 17:26:02
Ankur chatterjee

yeah u solved my problem thanks sir , thou i m pretty sure you used a intel cpu because when i used the threadripper 3990x it dint but when i used the intel xeon 3175x it worked fine moreover i guessed they used a non certified NVLINK server MOBO by nvidia, anyways i appreciate ur time on this THANKU VERY MUCH

Posted on 2020-07-27 05:33:15
Bill H

I am having a very tough time finding a nvlink for my RTX2080Ti cards. Could I implore anyone who might have one to sell or where I could find one please?

Posted on 2020-10-10 01:59:00

In case anyone out there does have a bridge available for sale, it might help to know what size you need. How far apart are the slots in which your pair of RTX 2080 Ti cards are installed? (2, 3, or 4 slots - counting from the one where the first card is installed, as slot 0, to the one where the second card is located)

Posted on 2020-10-12 17:10:56

I have a pair of 2080 Supers connected with a Titan RTX bridge (which I thought should work? The LED logo lights-up...) but I just can't get the system to recognise the connection - the control panel just says "configure sound, PhysX" (no mention of SLI) and your test utility linked above doesn't detect anything. Using the latest 456.71 studio driver. Any suggestions most appreciated.

Posted on 2020-10-23 14:21:07

Hmm, I haven't tested SLI / NVLink in the very latest driver (the one you mentioned) but I did verify that it worked on a pair of 2080 Ti cards in release 456.38 (the one that accompanied the 30 Series release). I am not in the office today, but sometime next week I can probably give it a shot on the new driver.

If you have some time to play around with things, though, the top two things to try would be:

- Reseating the NVLink bridge, to make sure it is firmly in place on both cards (only do this with the system powered off, of course)

- Try an older driver revision, in case it is something NVIDIA changed in the last release. They messed up multiple NVLink pairs in a driver release a while back, so they could easily mess up single pairs now if they wanted (see: https://www.pugetsystems.co...)

Posted on 2020-10-23 17:06:56

Thanks so much for the advice.

I will try re-seating the bridge and then some older drivers (particularly 456.38). I have tried 456.71 and 432. I saw posts complaining that NVlink didn't work after 419, so I tried that too (had to hack the inf files because it gave the 'no matching hardware' error, the driver then tried to install but failed).

Do I need an 'SLI enabled' motherboard? There seems to be conflicting advice on this point. (currently I'm using an AMD B550 chipset).

Thanks again, I really appreciate the help.

Posted on 2020-10-24 00:41:11

Yeah re-seating did nothing, neither did other driver versions. I think I may just have the wrong motherboard. Lesson learnt.

Posted on 2020-10-26 09:59:00

I'm sorry that didn't help! I'm honestly not sure about the motherboard - I've tested NVLink on Intel Core & Core X and AMD Threadripper... but never Ryzen that I can remember, and definitely not the B550 chipset. That is a really interesting question, though; I may look into that.

Posted on 2020-10-26 17:00:58

Yeah, I read some articles saying that NVlink was handled entirely by the cards and the bridge, and that your motherboard was irrelevant so I didn't think to double-check, but I suspect that's not the case. Guess that's my bad. Thanks for the help just the same!

Posted on 2020-10-27 04:19:54

I don't have a B550 board available to test on, but I just checked dual RTX 2080 Ti cards with a NVLink bridge on X299 (Core i9 10980XE) using driver 456.71 and they went into SLI and tested correctly with our NVLink utility. I can't test on a pair of 2080 Super cards, as we only have one here in our lab, but I've never seen 20 Series cards behave differently from each other... so I suspect your conclusion about it being the motherboard may be right. It is worth noting that the NVLink bridges sometimes really have to be pressed into place rather hard - it can look like they are seated when they are not. I know you already checked that, but I thought I'd mention it just in case. Although you did say the logo lights up, so it is getting power and must at least be partially on... :/

Out of curiosity, what exact motherboard do you have? And which slots are the two video cards in? And lastly (sorry for the barrage of questions): what size is the NVLink bridge you are using?

Posted on 2020-10-29 18:55:27

Apologies for the slow reply, and thanks again for showing an interest, I appreciate it.

It's an Asrock B550 Phantom Gaming 4, so pretty much the bottom-of-the-line board (spent all my money in GPUs since GPU performance is what I need for rendering) so it's no surprise it doesn't support a lot of high-end features. That said I haven't found a single B550 board yet that does claim to support SLI.

The board only has two PCIe slots - one Gen 4 and one Gen 3, both 16x (which is at least better than my Gigabyte X570 UD board that only offers x4 on the second and third PCIe ports!). I'm using a 4-slot Titan RTX NVlink bridge.

I also have an older X58 board with twin 20 series cards in it that does support SLI, but I don't know whether it's worth hunting-down a rare (and increasingly expensive) 2-slot bridge for such an old system...

I'm not expecting a performance boost from NVlink, but I am interested in the fact that NVlink allows the cards to pool memory which is desirable for large/complex scenes.

Posted on 2020-11-11 01:12:23

I finally had a moment to look up that board, and I think I may see the problem. The second PCI-Express x16 length slot is really only a PCIe Gen 3 x4 connection... which means it is coming from the B550 chipset, while the main x16 slot is PCIe 4.0 and coming from the Ryzen CPU itself. I bet the fact that both slots aren't originating from the same PCIe controller location is part of why this isn't working for you.

I'll still try to give this a shot on one of the B550 boards we are carrying, when I can get my hands on a sample. It is the Gigabyte B550 Vision D, and the two x16 length slots both come from the Ryzen CPU (so they act as one PCIe 4.0 x16 or two x8, if both slots are occupied). I expect that will allow use of SLI / NVLink, and hope to test it soon.

I'm sorry to be the bearer of bad news :(

Posted on 2020-11-14 00:46:42

Well I think the writing was on the wall for my board, but like i said it's bare-bones mobo anyway.

Interested to see how the Gigabyte B550 Vision D performs though.

Posted on 2020-11-17 03:03:09
Justin M

Thanks a bunch for this article. Just wanted to confirm that the GV100 bridge works with GP100s. However, SLI does not need to be enabled (and can't be) for NVLink to work. I'm getting a 50 GB/s increase in P2P bandwidth with a single bridge using the above test. This is with Quadro drivers ver. 461.09 and TCC enabled.

Posted on 2021-01-22 18:27:33

Nice! Yeah, on those particular cards my experience was that you have to use the TCC method, rather than the SLI method (which is used on GeForce and "normal" Quadro cards). I'm glad this article was helpful :)

Posted on 2021-01-22 20:42:09
Tane van der Boon

Hi William M George, thanks for the article and answers. Sorry if I am stupid and missed this, but I wanted to confirm if NVLink supported memory pooling for the RTX 8000 series cards? For the deep learning workloads I am using, ideally the system (Nvidia-smi) would see this as a single GPU with a pooled set of memory that can be utilised. Is this possible? and if not supported by the RTX 8000 are there other cards that do support it? Thanks!

Posted on 2021-03-09 19:13:45

NVLink on its own does not pool memory. All it does is enable high-speed, peer-to-peer communication between the linked video cards. To actually do anything with that link, like sharing memory resources, software has to be specifically written to detect the connection and utilize it in whatever way the developers want to. As far as I am aware, that description applies to all cards that have NVLink support (including the RTX 8000).

Posted on 2021-03-09 21:54:55
Marek Kovac

Is it possible to use Quadro cards with TCC enabled for openCL computing? To be more specific, I'd like to merge 2x A5000 to get 48GB of ram for Houdini simulations. Not sure if Houdini can see the cards if they are in TCC mode. There is a way to assign specific GPU inside "houdini.env" file to openCL compute only, problem is, I'm not sure, if windows see TCCed GPUs as one device to assign.

Thanks for any info :)

Posted on 2021-08-02 13:03:21

So I can't really answer this question for you, as I don't have any personal experience with Houdini, but I can give you some info that may help:

1. First off, I haven't tried TCC mode on newer Quadros like the RTX A5000s, so don't know if they work in that mode - or even if they do, whether that mode will enable NVLink (assuming you bridge the cards physically) like it did with the older GP100 and GV100 models.

2. Beyond that, I also don't know if Houdini will "see" the cards in TCC mode or not. I suppose you could test it out, but please note that GPUs in that mode no longer provide display output - so you would need a third video card in order for that to work at all.

3. Moreover, just enabling TCC doesn't do memory pooling: you would also need the cards to be NVLink'd (as alluded to in #1 above) and then on top of that Houdini itself would need to be able to recognize the NVLink connection and make special use of it for you to see any benefit. Again, without hands-on time with that software I am not sure if it has such capabilities or not.

Posted on 2021-08-02 16:16:05
Marek Kovac

Thank you for sharing the info!
I'm aware that I need a third GPU for this. Also I thought that with TCC mode and NVLINK enabled, GPUs will merge into one device so to speak, otherwise I don't see a difference betweet Quadro and Geforce GPUs, because even 3090 (not supporting TCC) can memory pool if software (like Redshift) utilize it.
I was thinking that TCC is doing the memory pool under the hood so that software do not have to worry about it and it just sees One device with 48GB of RAM. I might be wrong.

Posted on 2021-08-02 16:38:26

"I was thinking that TCC is doing the memory pool under the hood so that software do not have to worry about it and it just sees one device with 48GB of RAM."

So far as I am aware, that is not the case. TCC mode is rather weird, and the only practical benefit I am aware of from it is (or at least, was) letting Quadro cards get NVLink'd without needing to use SLI mode (like GeForce cards require, and which can be very finicky). I'm sure there is more to it than that, but I don't believe that automatic memory pooling is part of it.

Posted on 2021-08-02 17:18:05
Marek Kovac

Hm that is interesting, will have to read more about it, I can't find any clear information about it all, it's like they make it confusing on purpose..3090 has nvlink but only SLI, it is basically Titan card but no TCC.. :D oh damn you nvidia

Posted on 2021-08-02 21:52:30
Adam Hendry

Can you please provide another "NVLinkTest.exe" for CUDA 11 and/or provide the source code so users can rebuild the tester for new toolkits?

Posted on 2021-08-20 18:49:51
Adam Hendry

Please create a new program for CUDA 11. It would help if you could provide users the source code to your program (perhaps hosted on GitHub) so we can update it for newer versions of CUDA. It should be noted that the newest NVIDIA graphics drivers (which ship with NVIDIA Control Panel) no longer have the "Configure SLI, Surround, PhysX" link under "3D Settings", especially for Quadro cards. Instead, simply setting TCC mode with nvidia-smi.exe is sufficient. However, since your program only supports CUDA 10, it gives the false impression that NVLink only works on CUDA 10. It works on CUDA 11, but your checker program must be updated.

Posted on 2021-08-23 23:09:41

I'm looking into this - unfortunately the new CUDA 11 utility formats results quite differently, so it will take some more work to get my utility to work with it properly.

Posted on 2021-09-09 18:01:01

Okay, can you try this utility? It turns out I didn't need to update our script (so far as I can tell) but rather just compile the new version of p2pBandwidthLatencyTest from the CUDA 11 samples.


I should also note that when I was working on this today I tested it out on a pair of RTX A5000 cards on a Threadripper Pro system, and I still had to put them into SLI via the NVIDIA Control Panel in order for NVLink to function. I think you could use TCC as well, but my understanding of that (from the last time I tried it) was that it would disable the video outputs on the affected cards. Do you run a third card for video output?

Posted on 2021-09-15 23:27:17