Read this article at https://www.pugetsystems.com/guides/1221

Setting Graphics Card Software to Display 10-bit Output

Written on August 17, 2018 by Jeff Stubbers


True 30-bit color output (10 bits per channel for R, G, and B) in editing programs that access 10-bit per channel through OpenGL can currently only be attained with workstation graphics cards, such as the AMD Radeon Pro or Nvidia Quadro lines.

However, after our article An Introduction to Understanding 8bit vs 10bit Hardware, there have been questions about how to enable DirectX 10-bit once you have ensured you have the correct hardware. This is a quick guide to setting up 10-bit output for full-screen DirectX programs - such as games - through your graphics card software, once you have both a 10-bit per channel capable graphics card (Nvidia Quadro / AMD Radeon Pro, and some Nvidia GeForce / AMD Radeon cards) and a 10-bit per channel monitor connected to that graphics card. Please note that at this time it is important to connect your 10-bit monitor to your graphics card output with a DisplayPort cable for the best possible image to be displayed.
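For a sense of scale, the difference between 8 bpc and 10 bpc can be worked out directly. A minimal Python sketch of the arithmetic (illustrative only; the function name is our own):

```python
# Distinct levels per channel and total displayable colors
# for 8-bit vs 10-bit per-channel RGB output.

def color_stats(bits_per_channel):
    """Return (levels per channel, total RGB colors)."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

for bpc in (8, 10):
    levels, total = color_stats(bpc)
    print(f"{bpc} bpc: {levels} levels/channel, {total:,} total colors")
```

8 bpc yields 256 levels per channel (about 16.7 million colors), while 10 bpc yields 1,024 levels per channel (about 1.07 billion colors) - four times the tonal resolution in each channel.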

Nvidia 10-bit Setup

For Nvidia Quadro graphics cards:
1. Right-click on an empty part of your desktop to open the context menu, then choose "NVIDIA Control Panel."

2. From the left column, choose "Display - Change resolution."

3. From the right column, under "3. Apply the following settings.", select the radio button for "Use NVIDIA color settings."

4. From the "Output color depth:" drop-down menu, select (10-bit per channel RGB) "10 bpc."

5. In the bottom right, click the "Apply" button to accept the changes. That's it!

AMD 10-bit Setup

For AMD Radeon Pro graphics cards:

1. Right-click on an empty part of your desktop to open the context menu, then choose "AMD Radeon Pro and AMD FirePro Settings."

2. From the "AMD RADEON PRO AND AMD FIREPRO SETTINGS" window, select the "Display" button.

3. From the following window, in the lower left, select the "Color Depth" dropdown, and choose "10 bpc". That's it!


It is worth noting that these specific changes are designed to enable 10-bit for programs that utilize DirectX. The majority of programs that benefit from DirectX 10-bit appear to be games at this time, so these changes are also possible on some "consumer desktop class" Nvidia GeForce and AMD Radeon graphics cards, limited to those DirectX programs. See this article on enabling 10-bit per channel in Photoshop, which uses OpenGL rather than DirectX for 10-bit per channel color. 10-bit through OpenGL currently requires workstation-class graphics cards (AMD Radeon Pro / Nvidia Quadro), and will not work with consumer-class Nvidia GeForce and AMD Radeon graphics cards until the graphics card manufacturers decide to make changes to enable this.
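To see why banding appears at 8 bpc but largely disappears at 10 bpc, consider quantizing a smooth horizontal gradient. A small Python sketch (the 1920-pixel width is just an illustrative monitor width):

```python
# Quantize a smooth 0.0-1.0 linear ramp to a given bit depth and
# count how many distinct output levels survive across the span.

def quantize_gradient(width, bits):
    """Map each pixel of a linear ramp to the nearest n-bit level."""
    levels = (1 << bits) - 1
    return [round((x / (width - 1)) * levels) for x in range(width)]

for bits in (8, 10):
    ramp = quantize_gradient(1920, bits)
    # At 8 bpc a 1920-px ramp collapses to 256 steps -> visible bands;
    # at 10 bpc there are 1024 steps, so transitions are far finer.
    print(f"{bits}-bit: {len(set(ramp))} distinct levels across 1920 px")
```

In other words, across a full-width gradient an 8-bit pipeline repeats each value over roughly 7-8 adjacent pixels, which is what the eye picks up as banding.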

I hope this helps!

Tags: Graphics card, Monitor, 8-bit vs 10-bit, Settings, 10 bit, 30-bit
Larry Page

This also works with consumer Nvidia cards. I did this with my 1060 GTX!

Posted on 2018-08-27 04:34:32

Interesting - 10 bit support is not supposed to be available on GeForce cards so far as I am aware. I have a 10-bit capable monitor at home myself, and tried this on my GeForce GTX 1080, and the only option I could select was 8bpc. :/

Posted on 2018-08-29 21:50:45
Luca Pupulin

Hi All,
10-bit output should be possible on most consumer (gaming) cards using DirectX, as noted in the article, whilst you need a professional-grade graphics card to display 10 bit per channel using OpenGL, as far as I am aware.

Posted on 2018-09-07 16:03:40
Jeff Stubbers

Correct - for 10-bit output (other than DirectX 10-bit output) a professional-grade "workstation class" graphics card such as the AMD Radeon Pro, or Nvidia Quadro would be required, along with a 10-bit monitor. More details on that will follow in an upcoming article.

Posted on 2018-09-07 16:21:45
Luca Pupulin

Hi Jeff,
thank you for your confirmation and for your articles;you guys at Puget are really awesome!


Posted on 2018-09-07 16:37:49
Usman Dawood

I know this is an old post, but I just tried this and I was able to select 10 bit for one of my monitors that connects via DP. Could this just be an error?

Posted on 2019-02-21 23:40:05

Consumer NVIDIA cards can do 12 bit if your panel supports it. I have done this with a Samsung MU8000 series 65" TV and NVIDIA 1050 Ti/1030 graphics cards. I have also been able to do 10 and 12 bit color on the above-referenced Samsung TV with AMD RX 560 4GB graphics cards. The problem for many people in getting higher bit depths is that their HDR TV very likely might be using an 8 bit panel that emulates 10 bit. If that is so, then the only option you will see in the card settings is 8 bit. The one limitation is that to run this high of a bit depth in 4:4:4 color you will be limited to 30 fps or lower. If you are willing to back off 4:4:4 then you can run 60 fps. Since I am only concerned with HTPC-type performance I don't mind the 30 fps limit. Also, I would recommend using an RX 5XX series card as it gives the option for both 10 and 12 bit, while NVIDIA cards only give a 12 bit option. Many panels will do 10 bit but not 12 bit.

One thing I have found is that using the DP option to connect usually provides access to higher fps options at higher chroma subsampling rates.

Posted on 2019-04-13 18:22:43
Oliver Banasiak

You write "and some GeForce cards". If I'd like to use 10 bit in AE, PP, and PS, is there a card (2080 Ti?) that can output 10-bit OpenGL in editing applications? So much confusion on this one! I hate the fact that I need a $5,000 card to use a feature that should be on the top-end RTX cards as well. I do 3D and GPU rendering so I need super fast cards, but the Quadro is limited in CUDA cores, so two cards simultaneously is the best option!

Posted on 2018-10-04 12:23:29
Jeff Stubbers

Hi Oliver,
Unfortunately GeForce graphics cards only offer 10-bit for DirectX programs - such as games. As you note, professional programs like Adobe Photoshop access 10-bit per channel through OpenGL - which unfortunately currently requires a workstation class card like the Nvidia Quadro line or AMD Radeon Pro cards. One of the main differentiating features between the GeForce and Quadro graphics cards has been that Quadro graphics cards offer 10-bit per channel through OpenGL, where the GeForce only offers 10-bit for DirectX, and 8-bit per channel through OpenGL. I wish the graphics card manufacturers would just enable 10-bit per channel for both lines of graphics cards. That would make things much easier for all involved.

Posted on 2018-10-04 14:22:04
Oliver Banasiak

Thanks for the reply. Then why is it stated in the article "(and some GeForce cards)"? Thanks again, I will research my needs a little further, since an RTX Quadro is out of the question because of its insane price point - for my applications, I'll get roughly the same performance out of the RTX 5000 as I do from an RTX 2080 Ti. Damn Nvidia.....

Posted on 2018-10-04 14:26:45
Jeff Stubbers

The article is about answering questions on how to enable DirectX 10-bit through the manufacturer's software: "After our article on An Introduction to Understanding 8bit vs 10bit Hardware, there have been questions about how to enable DirectX 10-bit after you have ensured you have the correct hardware." I see your point though, and will update to clarify this is a guide to enable 10-bit *for DirectX programs* to reiterate that point. Thank you!

Posted on 2018-10-04 14:36:36
Sven Hülsemann

I have a GeForce GTX 1050 Ti (connected to my Dell U2713H via mini-DP) and can see the 10 bit, but only in Nuke (9.0 v1)! It doesn't work in Photoshop or After Effects or anywhere else on my Windows 10 system. Tested it with a test image from https://github.com/jursonov....

Posted on 2019-06-24 22:46:14

Thanks for this incredibly informative article!
I am looking to get a 10-bit monitor, like the NEC PA272W-BK-SV, for color-critical work in Nuke (and occasionally Resolve).

I have a PC w/ 2 x GTX 1080Ti’s that I use primarily for C4D+Redshift. I have PCIe lanes available for one or two more cards.
Can I get one of the lower-end NVIDIA Quadros, like a P620, for getting a 10-bit signal path via DisplayPort to my monitor?
My other option was to get a Blackmagic Card or something similar, but I believe that converts RGB to YUV, or am I wrong?

Would software like Nuke or Resolve be throttled by having to send a 10-bit signal to a 2GB VRAM gpu? Or more precisely, am I limiting the use of the ‘beefier’ GTX cards by the software when adding another gpu to handle the video signal?

Posted on 2018-10-07 07:55:20
Jeff Stubbers

Yes, it is possible to add an Nvidia Quadro to a system with GeForce GTX graphics cards in order to get 10-bpc from the Quadro graphics card.

You should not see a performance hit in DaVinci Resolve so long as you set Resolve to only use the two GTX 1080 Ti graphics cards for compute in the Preferences. Otherwise the low amount of VRAM on the Quadro P620 graphics card would limit you.

Unfortunately, we have not tested Nuke so I am not quite sure how Nuke would react. I recommend contacting their support to find out how their program would respond.

The Blackmagic Decklink cards I am familiar with should allow you to set your choice of either RGB or YUV output. These types of cards are designed to provide a proper video signal, which is really what you want for critical color monitoring, and are highly recommended in the professional color grading field. The Quadro has 10-bit support, but video I/O cards like the Decklink are designed specifically for passing an unaltered video signal for film and broadcast television to your monitor, and are a great choice if you don't want to mix graphics cards, which could potentially slow things down.

Posted on 2018-10-09 00:21:33

Thank you very much for the response, Jeff!
That clears things up. I think I will stick with going the Blackmagic Decklink route... as what I'm looking for is the unaltered video signal.

Posted on 2018-10-09 00:31:08
Loro Husk

Hi fco.gee

Which decklink are you going to get for color critical work in 10 bit?

Posted on 2018-10-15 00:33:13

Hi Loro Husk,

I was looking at the BlackMagic DeckLink Mini Monitor 4K. It's $195 and has a 6G SDI and HDMI 2.0a output. For the NEC PA monitor I would use the HDMI.

According to BlackMagic's site this supports up to a 2160p30 signal w/ HDR, Deep Color (up to 12-bit), and metadata via the HDMI.

It also lists support for Resolve, Premiere, Photoshop, and Nuke, among other software... which is what I'm looking for.

I do want to call and confirm whether this will cover all of my needs.

The color space output specified by the Mini Monitor is Rec 601 and 709, while the NEC PA monitor can cover about 98.6% of the AdobeRGB color space, which is about 15% wider than Rec 709. But I believe that is where the monitor's 14-bit 3D LUTs come into play.

I don't know for sure though. Color science / theory is a complicated thing that I'm only now beginning to get some education on.

What I would like to figure out is whether the BlackMagic card limits the accuracy I would have if I was working in the AdobeRGB space or P3, and whether the same limitations would exist via a Quadro GPU. I don't really have a need for these color spaces at the moment, but I am trying to get a better understanding of how these things work.

Hope that was of some help.

Posted on 2018-10-16 18:46:03

The BlackMagic DeckLink Mini Monitor 4K doesn't support 10-bit HDR. You really wasted money. Of course, the packaging says it is supported, but it isn't true. So I didn't buy it. I'm also considering a Quadro or Radeon Pro now.

Posted on 2019-09-22 19:54:56
Alessandro Peter

I'm using that card to grade HDR10 content. It fully outputs HDR metadata to my LG OLED when editing in Resolve, and the TV fully recognizes it as HDR.

Posted on 2019-11-19 12:53:43

I heard it from the professional tech team of the official seller in Korea. I think your LG OLED might be turning on a fake HDR mode. Please check it again. I also found out how to grade HDR with my GeForce GPU using the NVIDIA Studio driver. We actually don't need any Quadro or Radeon Pro. Of course, it's not for professional work.

Posted on 2019-11-19 13:51:42
Alessandro Peter

So, better to believe the word of a Korean reseller and assume Blackmagic puts fake labels and advertisements on its products, while my TV fools me by showing HDR mode when it's just kidding about it.
I came here just to comment on your wrong statements about the card, because as I already said, it works as expected. I do grade HDR10, and the external monitor switches to HDR automatically due to the metadata sent over HDMI. The TV shows exactly the picture I see later on my other HDR devices (iPad Pro, iPhone 11 Pro... and of course the OLED TV itself).

Posted on 2019-11-19 14:53:01

Then, what is your product name? I meant "Blackmagic DeckLink Mini Monitor 4K does not correctly support HDR". The professional user I know posted on the link below. But as it's in Korean, please translate.

Posted on 2019-11-20 02:28:34
Alessandro Peter

Yes. That card. I've been using it since DaVinci 14 and it does output Rec. 2020 10-bit 2160p30. You have different options available in DaVinci for the monitoring output format. Never tried it with other software (e.g. Premiere Pro), but the card is certainly able to output HDR. Do you seriously believe that a giant company like Blackmagic would sell a product that doesn't support all the specs advertised?
You can simply post in their forum; the nice community of professionals will address all your doubts.

Posted on 2019-11-20 03:07:50

Ok, thank you for your kind answer. The person I mentioned may not know everything. :)

Posted on 2019-11-20 04:16:30
Marco Geração

Have you tried - by experience - using a Quadro and a GeForce card in the same PC? I'm not sure what you say is possible. And if it is, it will depend on the OS you're using.

Posted on 2018-11-09 14:54:02

Hey Marco, Jeff talked about that a bit a few posts up, but I can expand on it a bit. We have used mixed Quadro/GeForce at times, but unless you are only using software that you know plays well with mixed GPUs (like DaVinci Resolve), I would avoid it if you can. It technically shouldn't be a problem for most applications, but I wouldn't call it stable. When we have done it, it worked great for us before we shipped the system out, but little bugs and issues tend to pop up over time. It was usually caused by Windows or software updates getting confused about the drivers and resulted in software either giving an error when starting up or other random issues. I don't think we ever had an un-fixable issue, but it definitely was annoying each time and took some work to get fixed.

Posted on 2018-11-09 17:16:15
Marco Geração

Hi there, Matt! Thanks for the insightful reply.

Posted on 2018-11-09 17:24:10
Fahim Khan.

I am using a Titan Z for an LG 34" monitor and a K5000 for an Eizo RadiForce MX315W monitor. I know the Eizo has 10 bit. No matter how many times I change the settings to 10 bit for the Eizo monitor, it goes back to 8 bit. Is there a conflict between the Titan Z and the K5000, as I have them both in the same PC?

Posted on 2019-03-19 08:50:53
Jeff Stubbers

Hi Fahim, While it may be *possible* to run both a GeForce and a Quadro graphics card in the same system, it is recommended to avoid doing so since it can result in instability. If you feel you must go this route, I would remove all graphics card drivers and install only the Quadro drivers, if you are looking for 10-bit support from the Quadro card. If for some reason that does not allow the GeForce to work properly (it should not be an issue, but could), then I would recommend installing the GeForce drivers first, then the Quadro drivers second - to essentially overwrite common files, but leave the Quadro drivers available for the Quadro card. Again, this is not an ideal situation, and I would not recommend this kind of setup. Rather, I think you would be better off with just the GeForce, and then add a 10-bit I/O card like a Blackmagic Decklink for 10-bit support on your 10-bit monitor (keeping in mind these cards only display a photo or timeline while editing the file - otherwise the screen will be blank). If you are still experiencing issues, I recommend contacting the tech support team of your computer manufacturer. Wishing you the best!

Posted on 2019-03-21 00:50:32

I've often wondered if an eGPU box might be a possible work around.

Posted on 2019-04-06 11:55:16
Cuauhtemoc Vega Torres

The GTX and RTX series are now able to output 10 bit.

Posted on 2019-04-28 05:30:25

In OpenGL, or only in DirectX applications? I've not yet seen 10-bit support in OpenGL (which many professional / prosumer apps use) on Nvidia cards outside of the Quadro series.

Posted on 2019-04-28 05:33:13
Gianky Pitzalis

Good evening, and congratulations on all your interesting articles! If possible I would like some information. In my PC I have an RTX 2060. I am considering the purchase of a 10-bit 4K monitor to be used with the enabled Adobe software, and I was thinking of buying a Blackmagic Decklink. What I would like to do is use the Decklink only to connect the monitor, while the RTX 2060 handles the processing. Can they be used together? Is there a procedure to follow, for what I would like to do, to use both at the same time? From what I understand the Decklink uses HDMI for 10 bit, so how is the connection between the Decklink and the monitor made - for example, from HDMI to DisplayPort on the monitor? Thank you in advance! Greetings.

Posted on 2019-05-02 21:47:47
Jeff Stubbers

While it is possible to connect both a GeForce and a Blackmagic Decklink card to the same monitor, I don't think it would be a usable or enjoyable experience. You would have to toggle between the inputs on the monitor to get between your OS desktop and the Blackmagic Decklink output. While possible, that would seem nearly unusable practically speaking. The Blackmagic Decklink cards only display the photo or timeline you are editing, for color correction purposes. When you are not editing a photo or timeline, the output from that card will be blank - so you could not use it to drag a program over as an extended desktop. So it typically works best to either have a Quadro graphics card with a single monitor, or a GeForce graphics card for your main "work" monitor plus a separate color-accurate 10-bit monitor that you connect your I/O card (Blackmagic Decklink) to for color correction viewing. I hope this helps!

Posted on 2019-05-02 22:09:44
Gianky Pitzalis

Thanks for the reply! So if I understand correctly, for what I would like to do - use the Decklink to view the 10-bit output and the GeForce for data processing - I need two monitors. I also read in the "Hardware Recommendations" section that each time I would have to export the image to display it in 10 bit.
So your advice would be to buy a Quadro? Do the new RTX Quadros support 10 bit like the P series? Last question... can the RTX 2060 run an 8-bit 2K monitor?
Thanks again.

Posted on 2019-05-03 12:39:40
Jeff Stubbers

Correct. The GeForce graphics card would connect to a standard monitor for your programs. Then, the Decklink card would send the 10-bit per channel signal to a separate 10-bit monitor.
If you would prefer to use a single 10-bit monitor rather than a dual monitor setup, a Quadro graphics card would allow you to do that. Yes, the RTX Quadro offers 10-bit per channel (R,G,B), just like the Quadro P series. And yes, the RTX 2060 can support 8-bit 2K or 4K monitors. The specs for those cards should be listed on the individual manufacturer's website.

Posted on 2019-05-09 19:37:48
Gianky Pitzalis

Thank you so much!

Posted on 2019-05-12 20:46:34

Great answer I’ve never seen before, Jeff.

Posted on 2020-04-21 14:11:38
Zhen Jie Ho

Is there a conclusive list of current-gen GeForce cards that support 10 bpc output? I'm on a Zotac GTX 1070 Ti, but following the Nvidia 10-bit setup instructions above, there is only the 8 bpc option in the dropdown. Do I have to have a proper 10-bit/8-bit FRC monitor for that option to be available in the dropdown? I was assuming it would be available despite the hardware limitations, since 10 bpc is just DirectX compliant.

Also, many thanks for this article. It has been very helpful in deciding whether to go workstation-class (which IMO is a stupid proposition considering the prices of high-end enthusiast-class cards), or a 10 bpc enthusiast-class card with a separate 10-bit I/O card (which I'm now going with).

Posted on 2019-05-21 03:24:21
Carlos A. Pinto

Is the AMD Radeon Pro 580X that comes with the new Mac Pro, capable of 10-bit color?

Posted on 2019-06-11 18:40:53

You would need to ask Apple. The Radeon Pro line should support 10-bit displays, but Apple often gets custom versions of CPUs and GPUs made just for them. Only they will be able to tell you for sure (if the person you end up talking to even knows what 10-bit display output is).

Posted on 2019-06-11 18:57:52
Jon H

Perhaps pertinent, broader NVidia support: https://www.anandtech.com/s...

Posted on 2019-08-04 20:38:06
Jeff Stubbers

As noted at the top of this article highlighted in blue, Nvidia has updated the GeForce graphics card "Studio" driver to now enable 10-bit per channel (30-bit). So an Nvidia Quadro or 10-bit per channel I/O card should no longer be necessary to enable 30-bit (10-bit per channel) color.
Again, Nvidia's article link: https://www.nvidia.com/en-u...

Posted on 2019-08-20 17:26:21

How do I get the settings to stick for Nvidia cards? Whenever I restart, the settings revert back to default.

Posted on 2019-09-10 08:26:45
Jeff Stubbers

Thank you for asking. Unfortunately, I don't know the solution to the settings resetting themselves, but I would recommend contacting Nvidia regarding this question, as they would be the ones that should know, since it is their driver that is setting these selections. https://www.nvidia.com/en-u... I checked with them, and they wanted to make sure the latest OS updates were installed, then suggested removing the GPU driver and re-installing. YMMV. I hope this helps!

Posted on 2019-09-11 20:43:45

I know this is an old post, but I have a new question. I'm testing a Gigabyte Aero 17 HDR laptop, which has a 10-bit HDR display built by AU Optronics and NVIDIA RTX 2070 Super Max-Q GPU that uses either Intel's Optimus or Advanced Optimus tech (not sure which). Because that system routes the image through the integrated Intel graphics (I believe), you can't set NVIDIA resolution parameters through the NVIDIA Control Panel app. That means the method you've outlined won't work for the laptop's own display. The question is, how do I actually set or check that the panel is set to display 10-bit color?

Posted on 2020-07-07 14:35:35
Jeff Stubbers

Hi steventimothy. I appreciate the question. If your system has an Nvidia graphics card, I am surprised the Nvidia Control Panel would not be installed. Unfortunately, we do not offer laptops, so I cannot check for you. I do recommend contacting your laptop manufacturer though, as they should be able to show you how to check and set that on their system. One check you could perform is to right-click on your Windows 10 desktop and choose "Display Settings" from the context menu. From the Display settings dialog, scroll down and choose "Advanced display settings": this will show your resolution, refresh rate, *bit depth*, and color format - including the bit depth currently set. Beyond that, I recommend contacting your laptop manufacturer for more information on their hardware, as they would be the most familiar with it. I hope this helps!

Posted on 2020-07-07 15:02:55