
Setting Graphics Card Software to Display 10-bit Output

Written on August 17, 2018 by Jeff Stubbers

Introduction

At this time, true 30-bit color output (10 bits per channel for R, G, and B) in editing programs that access 10-bit per channel through OpenGL can only be attained with workstation graphics cards, such as the AMD Radeon Pro or Nvidia Quadro lines.

However, since our article An Introduction to Understanding 8bit vs 10bit Hardware, there have been questions about how to enable 10-bit DirectX output once you have confirmed you have the correct hardware. This is a quick guide to setting up 10-bit for full-screen DirectX programs - such as games - through your graphics card software, once you have both a 10-bit per channel capable graphics card (Nvidia Quadro / AMD Radeon Pro, and some Nvidia GeForce / AMD Radeon) and a 10-bit per channel monitor connected to that graphics card. Please note that at this time it is important to connect your 10-bit monitor to the graphics card output with a DisplayPort cable for the best possible image.
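For anyone wondering what "10-bit for DirectX programs" looks like on the application side: a full-screen DirectX program asks for 10-bit output by creating its swap chain with a 10-bit-per-channel back buffer format, and the driver setting below determines whether the display link actually runs at 10 bpc. Here is a minimal Direct3D 11 sketch of that request - the window handle "hwnd" is assumed to be created elsewhere, and this is only an illustration, not code from any particular application:

// Minimal sketch: requesting a 10-bit-per-channel (R10G10B10A2) back buffer in Direct3D 11.
// "hwnd" is assumed to be an existing window handle; link against d3d11.lib.
#include <d3d11.h>

bool CreateTenBitSwapChain(HWND hwnd,
                           IDXGISwapChain** swapChain,
                           ID3D11Device** device,
                           ID3D11DeviceContext** context)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width  = 0;                              // 0 = use the window's client size
    desc.BufferDesc.Height = 0;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;  // 10 bits each for R, G, B (2-bit alpha)
    desc.SampleDesc.Count  = 1;
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount       = 2;
    desc.OutputWindow      = hwnd;
    desc.Windowed          = TRUE;   // games then enter exclusive full screen with
                                     // swapChain->SetFullscreenState(TRUE, nullptr),
                                     // which is where the 10 bpc driver setting applies
    desc.SwapEffect        = DXGI_SWAP_EFFECT_FLIP_DISCARD;

    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapChain, device, nullptr, context);

    return SUCCEEDED(hr);
}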

Nvidia 10-bit Setup

For Nvidia Quadro graphics cards:
 
1. Right-click on an empty part of your desktop to open the context menu, then choose "NVIDIA Control Panel."

2. From the left column, choose "Display - Change resolution."

3. From the right column, under "3. Apply the following settings.", select the radio button for "Use NVIDIA color settings."

4. From the "Output color depth:" drop-down menu, select "10 bpc" (10 bits per channel RGB).

5. In the bottom right, click the "Apply" button to accept the changes. That's it!


AMD 10-bit Setup

For AMD Radeon Pro graphics cards:

1. Right-click on an empty part of your desktop to open the context menu, then choose "AMD Radeon Pro and AMD FirePro Settings."

2. From the "AMD RADEON PRO AND AMD FIREPRO SETTINGS" window, select the "Display" button.

3. From the following window, in the lower left, select the "Color Depth" dropdown, and choose "10 bpc". That's it!

 

It is worth noting that these specific changes enable 10-bit output for programs that use DirectX. The majority of programs that benefit from 10-bit DirectX appear to be games at this time, so these changes are also possible on some "consumer desktop class" Nvidia GeForce and AMD Radeon graphics cards, limited to those DirectX programs. See this article on enabling 10-bit per channel in Photoshop, which uses OpenGL rather than DirectX for 10-bit per channel color. 10-bit through OpenGL currently requires workstation-class graphics cards (AMD Radeon Pro / Nvidia Quadro) and will not work with consumer-class Nvidia GeForce and AMD Radeon graphics cards until the graphics card manufacturers decide to make the changes to enable it.
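To make the OpenGL side of that distinction concrete: a program such as Photoshop requests a 10-bit framebuffer through the WGL_ARB_pixel_format extension, and on a consumer GeForce or Radeon the driver simply never offers a matching format. The sketch below shows roughly what that request looks like - it assumes "hdc" is a device context for an existing window and that wglChoosePixelFormatARB has already been loaded via wglGetProcAddress (that setup is omitted), so treat it as an illustration rather than a drop-in snippet:

// Minimal sketch: asking the OpenGL driver for a 10-bit-per-channel pixel format
// through WGL_ARB_pixel_format. Requires wglext.h from the Khronos registry.
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

bool ChooseTenBitPixelFormat(HDC hdc,
                             PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB,
                             int* pixelFormat)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,   10,   // 10 bits per channel - a workstation driver can
        WGL_GREEN_BITS_ARB, 10,   // return a match here, a consumer driver
        WGL_BLUE_BITS_ARB,  10,   // typically cannot
        WGL_ALPHA_BITS_ARB,  2,
        0                         // terminator
    };

    UINT numFormats = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, pixelFormat, &numFormats))
        return false;
    return numFormats > 0;        // zero matches means no 10 bpc format was offered
}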

I hope this helps!

Tags: Graphics card, Monitor, 8-bit vs 10-bit, Settings, 10 bit, 30-bit
Larry Page

This also works with consumer Nvidia cards. I did this with my 1060 GTX!

Posted on 2018-08-27 04:34:32

Interesting - 10 bit support is not supposed to be available on GeForce cards so far as I am aware. I have a 10-bit capable monitor at home myself, and tried this on my GeForce GTX 1080, and the only option I could select was 8bpc. :/

Posted on 2018-08-29 21:50:45
Luca Pupulin

Hi All,
10-bit output should be possible on most consumer (gaming) cards using DirectX, as noted in the article, whilst you need a professional-grade graphics card to display 10 bit per channel using OpenGL, as far as I am aware....

Posted on 2018-09-07 16:03:40
Jeff Stubbers

Correct - for 10-bit output (other than DirectX 10-bit output) a professional-grade "workstation class" graphics card such as the AMD Radeon Pro, or Nvidia Quadro would be required, along with a 10-bit monitor. More details on that will follow in an upcoming article.

Posted on 2018-09-07 16:21:45
Luca Pupulin

Hi Jeff,
thank you for your confirmation and for your articles; you guys at Puget are really awesome!
Cheers,

Luca

Posted on 2018-09-07 16:37:49
Usman Dawood

I know this is an old post but I just tried this and I was able to select 10 bit for one of my monitors that connects via DP. Could this just be an error?

Posted on 2019-02-21 23:40:05
PCRAMK'

Consumer NVIDIA cards can do 12 bit if your panel supports it. I have done this with a Samsung MU8000 series 65" TV and NVIDIA 1050ti/1030 graphics cards. I have also been able to do 10 and 12 bit color on the above-referenced Samsung TV with AMD RX 560 4GB graphics cards. The problem for many people in getting higher bit depths is that their HDR TV very likely might be using an 8 bit panel that emulates 10 bit. If this is so, then the only option you will see in the card settings is 8 bit. The one limitation is that to run this high of a bit depth in 4:4:4 color you will be limited to 30 fps or lower. If you are willing to back off 4:4:4 then you can run 60 fps. Since I am only concerned with HTPC type of performance I don't mind the 30 fps limit. Also, I would recommend using an RX 5XX series card as it gives the option for both 10 and 12 bit while NVIDIA cards only give a 12 bit option. Many panels will do 10 bit but not 12 bit.

One thing I have found is that using the DP option to connect usually provides access to higher fps options at higher chroma subsampling rates.
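(For a rough sense of why that frame rate cap appears, assuming the usual HDMI 2.0 figures: a standard 4K 60 Hz signal uses a 594 MHz pixel clock, so RGB 4:4:4 at 10 bpc needs about 594 MHz x 30 bits, roughly 17.8 Gbit/s, which is more than the ~14.4 Gbit/s of video data HDMI 2.0 can carry. Dropping to 30 Hz or to 4:2:2 chroma subsampling brings the signal back under that limit, which matches the behavior described above.)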

Posted on 2019-04-13 18:22:43
Oliver Banasiak

You write "and some GeForce cards". If I'd like to use 10 bit in AE, PP and PS, is there a card (2080 Ti?) that can output 10-bit OpenGL in editing applications? So much confusion on this one! I hate the fact I need a $5000 card to use a feature that should be on the top-end RTX cards as well. I do 3D and GPU rendering so I need super fast cards, but Quadro is limited in CUDA cores, so two cards simultaneously is the best option!

Posted on 2018-10-04 12:23:29
Jeff Stubbers

Hi Oliver,
Unfortunately GeForce graphics cards only offer 10-bit for DirectX programs - such as games. As you note, professional programs like Adobe Photoshop access 10-bit per channel through OpenGL - which unfortunately currently requires a workstation class card like the Nvidia Quadro line or AMD Radeon Pro cards. One of the main differentiating features between the GeForce and Quadro graphics cards has been that Quadro graphics cards offer 10-bit per channel through OpenGL, where the GeForce only offers 10-bit for DirectX, and 8-bit per channel through OpenGL. I wish the graphics card manufacturers would just enable 10-bit per channel for both lines of graphics cards. That would make things much easier for all involved.
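For what it's worth, an application can check what it actually received at runtime. Here is a minimal sketch, assuming a legacy/compatibility OpenGL context is already current (these queries were removed from the core profile):

// Minimal sketch: query how many bits per channel the current OpenGL framebuffer has.
// On a GeForce card you would expect 8 here even with a 10-bit monitor attached.
#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

void PrintFramebufferDepth()
{
    GLint r = 0, g = 0, b = 0;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    std::printf("Framebuffer depth: R%d G%d B%d\n", r, g, b);
}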

Posted on 2018-10-04 14:22:04
Oliver Banasiak

Thanks for the reply. Then why does the article state "(and some GeForce cards)"? Thanks again, will research my needs a little further since an RTX Quadro is out of the question because of its insane price point - for my applications, I'll get roughly the same performance out of the RTX 5000 as I do from an RTX 2080 Ti. Damn nVidia.....

Posted on 2018-10-04 14:26:45
Jeff Stubbers

The article is about answering the questions on how to enable DirectX 10-bit through the manufacturer's software: "After our article on An Introduction to Understanding 8bit vs 10bit Hardware, there have been questions about how to enable DirectX 10-bit after you have ensured you have the correct hardware." I see your point though, and will update to clarify that this is a guide to enabling 10-bit *for DirectX programs* to reiterate that point. Thank you!

Posted on 2018-10-04 14:36:36
fco.gee

Thanks for this incredibly informative article!
I am looking to get a 10-bit monitor, like the NEC PA272W-BK-SV, for color-critical work in Nuke (and occasionally Resolve).

I have a PC w/ 2 x GTX 1080Ti’s that I use primarily for C4D+Redshift. I have PCIe lanes available for one or two more cards.
Can I get one of the lower-end NVIDIA Quadros, like a P620, to get a 10-bit signal path via DisplayPort to my monitor?
My other option was to get a Blackmagic Card or something similar, but I believe that converts RGB to YUV, or am I wrong?

Would software like Nuke or Resolve be throttled by having to send a 10-bit signal to a 2GB VRAM gpu? Or more precisely, am I limiting the use of the ‘beefier’ GTX cards by the software when adding another gpu to handle the video signal?

Posted on 2018-10-07 07:55:20
Jeff Stubbers

Yes, it is possible to add an Nvidia Quadro to a system with GeForce GTX graphics cards in order to get 10-bpc from the Quadro graphics card.

You should not see a performance hit in DaVinci Resolve so long as you set Resolve to only use the two GTX 1080 Ti graphics cards for compute in the Preferences. Otherwise the low amount of VRAM on the Quadro P620 graphics card would limit you.

Unfortunately, we have not tested Nuke so I am not quite sure how Nuke would react. I recommend contacting their support to find out how their program would respond.

The Blackmagic Decklink cards I am familiar with should allow you to set your choice of either RGB or YUV output. These types of cards are designed to provide a proper video signal, which is really what you want for critical color monitoring, and they are highly recommended in the professional color grading field. The Quadro has 10-bit support, but video I/O cards like the Decklink are designed specifically for passing an unaltered video signal for film and broadcast television to your monitor, and are a great choice if you don't want to mix graphics cards, which could potentially slow things down.

Posted on 2018-10-09 00:21:33
fco.gee

Thank you very much for the response, Jeff!
That clears things up. I think I will stick with going the Blackmagic Decklink route... as what I'm looking for is the unaltered video signal.

Posted on 2018-10-09 00:31:08
Loro Husk

Hi fco.gee

Which decklink are you going to get for color critical work in 10 bit?

Posted on 2018-10-15 00:33:13
fco.gee

Hi Loro Husk,

I was looking at the BlackMagic DeckLink Mini Monitor 4K. It's $195 and has a 6G SDI and HDMI 2.0a output. For the NEC PA monitor I would use the HDMI.

According to BlackMagic's site this supports up to a 2160p30 signal w/ HDR, Deep Color (up to 12-bit), and metadata via the HDMI.

It also lists support for Resolve, Premiere, Photoshop, and Nuke, among other software... which is what I'm looking for.

I do want to call and confirm whether this will cover all of my needs.

The color space output specified by the Mini Monitor is Rec 601 and 709, while the NEC PA monitor can cover about 98.6% of the AdobeRGB color space... which is about 15% wider than Rec 709. But I believe that is where the 14-bit 3D LUTs of the monitor come into play.

I don't know for sure though. Color science / theory is a complicated thing that I'm only now beginning to get some education on.

What I would like to figure out is whether the BlackMagic card limits the accuracy I would have if I was working in AdobeRGB space or P3, and whether the same limitations would exist via a Quadro GPU. I don't really have a need for these color spaces at the moment, but I am trying to get a better understanding of how these things work.

Hope that was of some help.

Posted on 2018-10-16 18:46:03
Marco Geração

Have you tried - by experience - using a Quadro and a GeForce card in the same PC? I'm not sure what you say is possible. And if it is, it will depend on the OS you're using.

Posted on 2018-11-09 14:54:02

Hey Marco, Jeff talked about that a bit a few posts up, but I can expand on it a bit. We have used mixed Quadro/GeForce at times, but unless you are only using software that you know plays well with mixed GPUs (like DaVinci Resolve), I would avoid it if you can. It technically shouldn't be a problem for most applications, but I wouldn't call it stable. When we have done it, it worked great for us before we shipped the system out, but little bugs and issues tend to pop up over time. It was usually caused by Windows or software updates getting confused about the drivers and resulted in software either giving an error when starting up or other random issues. I don't think we ever had an un-fixable issue, but it definitely was annoying each time and took some work to get fixed.

Posted on 2018-11-09 17:16:15
Marco Geração

Hi there, Matt ! Thanks for the insightful reply.

Posted on 2018-11-09 17:24:10
Fahim Khan.

I am using a Titan Z for an LG 34" monitor and a K5000 for an Eizo RadiForce MX315W monitor. I know the Eizo has 10 bit. No matter how many times I change the settings to 10 bit for the Eizo monitor, it goes back to 8 bit. Is there a conflict between the Titan Z and K5000, as I have them both in the same PC?

Posted on 2019-03-19 08:50:53
Jeff Stubbers

Hi Fahim, While it may be *possible* to run both a GeForce and Quadro graphics card in the same system, it is recommended to avoid doing so since it can result in instability. If you feel you must go this route, I would remove all graphics card drivers and only install the Quadro drivers if you are looking for 10-bit support from the Quadro card. If for some reason that does not allow the GeForce to work properly (it should not be an issue, but could), then I would recommend installing the GeForce drivers first, then the Quadro drivers second - to essentially overwrite common files, but leave the Quadro drivers available for the Quadro card. Again, this is not an ideal situation, and I would not recommend this kind of setup. Rather, I think you would be better off with just the GeForce, then add a 10-bit I/O card like a Blackmagic Decklink card for 10-bit support to your 10-bit monitor (keeping in mind these cards only display a photo or timeline when editing the file - otherwise the screen will be blank). If you are still experiencing issues, I recommend contacting the tech support team of your computer manufacturer. Wishing you the best!

Posted on 2019-03-21 00:50:32
Pixelsmack

I've often wondered if an eGPU box might be a possible work around.

Posted on 2019-04-06 11:55:16
Cuauhtemoc Vega Torres

gtx and rtx series are now able to output 10bit.

Posted on 2019-04-28 05:30:25

In OpenGL, or only in DirectX applications? I've not yet seen 10-bit support in OpenGL (which many professional / prosumer apps use) on Nvidia cards outside of the Quadro series.

Posted on 2019-04-28 05:33:13
Gianky Pitzalis

Good evening, and congratulations on all your interesting articles! If possible I would like some information. In my PC I have an RTX 2060. I am considering the purchase of a 10-bit, 4K monitor to be used with 10-bit enabled Adobe software, and I was thinking of buying a Blackmagic Decklink. What I would like to do is use the Decklink only to connect the monitor, while the RTX 2060 handles the processing. Can they be used together? Is there a procedure to follow, for what I would like to do, to use both at the same time? From what I understand the Decklink uses HDMI for 10 bits, so how would the connection between the Decklink and the monitor work - for example, from the Decklink's HDMI output to the monitor's DisplayPort input? Thank you in advance! Greetings.

Posted on 2019-05-02 21:47:47
Jeff Stubbers

While it is possible to connect both a GeForce and a Blackmagic Decklink card to the same monitor, I don't think it would be a usable or enjoyable experience. You would have to toggle between the inputs on the monitor to get to your OS desktop and the Blackmagic Decklink output. While possible, that would seem near unusable practically speaking. The Blackmagic Decklink cards only display the photo or timeline you are editing for color correction purposes. When you are not editing a photo or timeline, the output from that card will be blank - so you could not use it to drag a program over as an extended desktop. So it typically works best to either have a Quadro graphics card to a single monitor, or a GeForce graphics card to your main "work" monitor, then a separate color-accurate 10-bit monitor that you connect your I/O card (Blackmagic Decklink) to for color correction viewing. I hope this helps!

Posted on 2019-05-02 22:09:44
Gianky Pitzalis

Thanks for the reply! So if I understand correctly, for what I would like to do - use the Decklink to view 10 bit and the GeForce for data processing - I need two monitors. I also read in the "Hardware Recommendations" section that each time I would have to export the image to display it in 10 bit.
So your advice would be to buy a Quadro? Do the new RTX Quadros support 10 bit like the P series? Last question... can the RTX 2060 run an 8-bit 2K monitor?
Thanks again.

Posted on 2019-05-03 12:39:40
Jeff Stubbers

Correct. The GeForce graphics card would connect to a standard monitor for your programs. Then, the Decklink card would send the 10-bit per channel signal to a separate 10-bit monitor.
If you would prefer to use a single 10-bit monitor rather than a dual monitor setup, a Quadro graphics card would allow you to do that. Yes, the RTX Quadro offers 10-bit per channel (R,G,B), just like the Quadro P series. And yes, the RTX 2060 can support 8-bit 2K or 4K monitors. The specs for those cards should be listed on the individual manufacturers' websites.

Posted on 2019-05-09 19:37:48
Gianky Pitzalis

Thank you so much!

Posted on 2019-05-12 20:46:34
Zhen Jie Ho

Is there a conclusive list of current-gen Geforce cards that support 10 bpc output? I'm on a Zotac GTX 1070 Ti but following the Nvidia 10-bit setup instructions above, there is only the 8bpc option in the dropdown. Do I have to have a proper 10-bit/8-bit FRC monitor for that option to be available in the dropdown? I was assuming it would be available despite the hardware limitations since 10bpc is just DirectX compliant.

Also, many thanks for this article. It has been very helpful in deciding whether to go workstation-class (which IMO is a stupid proposition considering prices of high-end enthusiast-class cards), or 10 bpc enthusiast-class cards with a separate 10-bit I/O card (which I'm now going with).

Posted on 2019-05-21 03:24:21