Read this article at https://www.pugetsystems.com/guides/1243

How to enable 30-bit in Photoshop

Written on October 3, 2018 by Jeff Stubbers

30-bit setup in Photoshop

To set up a 30-bit workflow in Photoshop, choose Edit > Preferences > Performance. In the "Graphics Processor Settings" section, click the "Advanced Settings..." button. This will bring up the Advanced Graphics Processor Settings window. There, select the checkbox for "30 Bit Display", then click "OK".

You will also need a workstation-class graphics card (AMD Radeon Pro / Nvidia Quadro) in your system, connected to a 10-bit monitor via DisplayPort.




How to enable 30-bit in Lightroom

Unfortunately, Adobe Lightroom does not offer full 10-bit per channel workflow support at this time. However, you can vote on a feature request with Adobe at this link to have them include it in the future:

How to enable 30-bit in After Effects

Adobe After Effects can only support 10-bit per channel output through an I/O monitoring card (such as a Blackmagic Decklink or similar). Surprisingly, After Effects does not support 10-bit per channel through a workstation-class (AMD Radeon Pro / Nvidia Quadro) card.

How to enable 30-bit in Premiere Pro

The good news is that there is nothing to enable in Premiere Pro: 10-bit per channel output should always be on by default. Adobe Premiere Pro supports 10-bit per channel output using either an I/O card (such as a Blackmagic Decklink), which offers 10-bit per channel through HDMI / SDI, or a workstation-class graphics card (Nvidia Quadro / AMD Radeon Pro). With workstation-class graphics cards, 10-bit per channel is only available over DisplayPort, not HDMI.

How to enable 30-bit in Illustrator

Unfortunately, Adobe Illustrator does not offer full 10-bit per channel workflow support at this time.

Which graphics cards offer 30-bit color in Photoshop?

As noted in an earlier article about setting up graphics card software to display 10 bpc output, both workstation-class graphics cards (AMD Radeon Pro, Nvidia Quadro) and consumer-class graphics cards (AMD Radeon, Nvidia GeForce) let you set 10 bpc (10 bits per channel for R, G, B) in their driver software for full-screen DirectX programs, allowing a greater number of colors to be displayed in programs that utilize DirectX - again, provided you are connected to a 10-bit display.

However, professional programs like Adobe Photoshop tend to utilize OpenGL for 10-bit per channel color, and currently only workstation-class Nvidia Quadro or AMD Radeon Pro graphics cards offer 10-bit per channel color through OpenGL.

As Nvidia itself notes regarding 10 bpc output:
"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI."

10-bit per channel work-around

If you have other programs that would benefit from a consumer-class AMD Radeon or Nvidia GeForce graphics card, or you simply want the price-to-performance these cards offer in certain situations, there is a work-around to get 30-bit color in professional programs while using one of them: add a 10-bit I/O card, such as a Blackmagic Decklink. These cards provide a 10 bpc signal to a monitor. So it is technically possible to have an 8 bpc GeForce or Radeon graphics card in your system for general use, and also a 10-bit I/O card to supply 10 bpc to a 10-bit screen.

One thing to be aware of, though, is that the screen attached to the 10-bit I/O card will only show the photo or video while you are editing it. When you are not editing, that screen will not display anything else. So after factoring in the cost of the additional 10-bit I/O card, and the fact that it only outputs to its own screen while editing, it may or may not be beneficial to go this route over simply getting a workstation-class graphics card that can output all content to all screens in 10 bpc.

Is this a 10-bit or a 30-bit monitor?

First, it is worth noting that monitor manufacturers will list their 30-bit monitors as "10-bit". Yes, this is confusing! The "10-bit" nomenclature monitor manufacturers use really refers to 10 bits per channel: 10-bit Red, 10-bit Green, and 10-bit Blue, which adds up to 30 bits total (10+10+10 bits per channel for R, G, B); the resulting color counts can be seen in the chart on this page. Likewise, an "8-bit" monitor is really 24-bit total, as it refers to 8-bit Red, 8-bit Green, and 8-bit Blue channels, and 8+8+8 = "24-bit".
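As a quick sanity check, the color counts behind those labels can be computed directly. A minimal sketch in Python (the function name is mine, just for illustration):

```python
# Total displayable colors for a given bit depth per channel (R, G, B).
def total_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"8-bit panel:  {total_colors(8):,} colors")   # 16,777,216 (~16.7M)
print(f"10-bit panel: {total_colors(10):,} colors")  # 1,073,741,824 (~1.07B)
```

This is where the "16.7 million colors" figure for 8-bit monitors comes from, and why a true 10-bit panel can show over a billion.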

There are 10-bit monitors that have multiple inputs, and not all of those inputs will necessarily support 10-bit per channel. So please check with your individual monitor manufacturer specifications to see which input ports are 10-bit supported.

It is also worth mentioning that some monitors advertised as offering 10-bit color output are not true 10-bit, but rather 8-bit+FRC. 8-bit+FRC (Frame Rate Control) monitors are 8-bit monitors that essentially fake the output of a 10-bit monitor by flashing two colors quickly to give the illusion of the color a true 10-bit monitor would display. For example, if the color that should be displayed is number 101 in the Look Up Table (LUT), and the 8-bit panel is only capable of displaying color number 100 or 104, an 8-bit+FRC monitor would alternate between colors 100 and 104 quickly enough that, in theory, one should not notice the flashing. Its goal is to trick the human eye into thinking it is really seeing color number 101. To do this, the 8-bit+FRC monitor would show color number 100 for 75% of the time and color 104 for 25% of the time, giving the illusion of color number 101, similar to how still shots displayed in rapid succession give the illusion of motion. If color 102 needed to be displayed, an 8-bit+FRC monitor would show color number 100 for 50% of the time and color number 104 for 50% of the time to give the illusion of color 102, as opposed to a true 10-bit monitor, which could simply display color number 102 from the LUT.
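The duty cycles in that example fall out of a simple weighted average. A minimal sketch in Python (the function name is my own, not an FRC standard):

```python
def frc_duty_cycle(target: float, low: float, high: float) -> tuple:
    """Return (fraction of time on `low`, fraction of time on `high`)
    so the temporal average of the two panel colors equals `target`."""
    high_fraction = (target - low) / (high - low)
    return (1 - high_fraction, high_fraction)

# Faking LUT color 101 with displayable panel colors 100 and 104:
print(frc_duty_cycle(101, 100, 104))  # (0.75, 0.25)
# Faking color 102:
print(frc_duty_cycle(102, 100, 104))  # (0.5, 0.5)
```

The same weighted-average idea applies to any in-between value the 8-bit panel cannot show natively.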

I hope this helps!

Tags: Photoshop, Graphics card, Monitor, 8-bit vs 10-bit, Settings, 10 bit, 30-bit

I need some advice. What would be an inexpensive card to work with the Photoshop 10-bit color space? Do I need a Quadro such as the Quadro P1000, or can I use a GTX such as a GTX 1050 Ti? Thanks for your help.

Posted on 2018-10-07 15:56:41
Jeff Stubbers

Hi Luca, thank you for asking! Yes, you would need a Quadro card such as the Quadro P400 (or higher, of course) for 10-bit per channel in Photoshop, paired with a 10-bit per channel monitor. GeForce GTX graphics cards can offer 10-bit per channel for full-screen DirectX applications (games), but only offer 8-bit per channel in professional windowed programs like Photoshop, which use OpenGL for 10-bit per channel color.

Posted on 2018-10-08 13:24:51
Martin Hagen

What's up with the new Nvidia RTX 2080 / RTX 2070? Do they have 10-bit OpenGL support, maybe? No info found so far...

Posted on 2018-10-09 11:24:32
Jeff Stubbers

Hi Martin, no, unfortunately the Nvidia RTX cards do not support 10-bit through OpenGL. This is a differentiator Nvidia uses to separate consumer-class from workstation-class cards, so you will need to go with an Nvidia Quadro instead for 10-bit through OpenGL in professional programs like Photoshop.

Posted on 2018-10-09 20:02:38

What about Titan V? Thanks in advance.

Posted on 2018-10-10 10:42:04
Jeff Stubbers

Same thing, unfortunately. If the Nvidia line graphics card is not a Quadro, then it will not offer 10 bpc through OpenGL.

Posted on 2018-10-10 13:55:46

Thanks for the very helpful article breaking down 10-bit monitors in Photoshop. My remaining questions center around how to drive an 8-bit+FRC monitor in Photoshop. First, is 8-bit+FRC an acceptable middle ground between 8-bit and true 10-bit, or should it be avoided entirely? Second, do you drive an 8-bit+FRC monitor with an 8-bit or a 10-bit signal from the computer? For example, to drive an 8-bit+FRC monitor, do you use a GeForce or a Quadro?

Lastly, it would be very helpful if you published an article comparing best monitors across various price points.

Posted on 2018-11-07 18:22:21
Jeff Stubbers

Hi Greg, thank you for the compliments, and thank you for your questions and suggestion! I would recommend avoiding 8-bit+FRC if you can. However, due to the rather high cost of true 10-bit monitors, I can completely see why one would choose an 8-bit+FRC monitor. At least 8-bit+FRC is going to offer more colors (even if done through *tricks*) than a standard 16.7M-color 8-bit monitor.
If you do use an 8-bit+FRC monitor, yes, you would want to drive it with a 10-bit per channel signal (Nvidia Quadro / AMD Radeon Pro / 10-bit I/O card). Otherwise that monitor would be limited to showing only the 16.7M colors supplied by an 8-bit per channel source (Nvidia GeForce / AMD Radeon). I hope this helps!

Posted on 2018-11-07 19:25:51
Victor Josue Quintana R.

Been reading your articles since I saw some of your posts in different forums. I'm trying to build a new rig for myself, mainly for photo editing (Lightroom, Capture One, Photoshop), some video editing and color correction (learning Premiere and want to get into Resolve) and perhaps some coding for interactive applications (Processing comes to mind). I've posted on Reddit/buildmeapc and PCPartPicker and got two very different builds. One with AMD (Ryzen 7 2700) and another with Intel (i7 9700K). But since I already own a 10-bit monitor, and I do want to take advantage of it (Asus PB328Q, and an old Dell UltraSharp 24"), I wanted to ask about the Quadro/Radeon Pro cards in comparison to their Geforce/Radeon brethren.
I've not gamed in several years, but would I be losing out on a lot if I went with a Quadro/Radeon Pro over a Geforce/Radeon card? I think the Quadro P2000 5GB may be in the price range of an RTX 2070.
Any help appreciated, and keep up the fantastic work!

Posted on 2018-11-14 15:21:41

Hey Victor, for the software you listed, I would definitely use the Intel Core i7 9700K. It should be about 18% faster than the Ryzen 2700X in Photoshop and Lightroom, and almost identical in Premiere Pro or DaVinci Resolve. I know you've been reading our articles, but just in case you missed them, https://www.pugetsystems.co... should return all the articles we've recently done that included the i7 9700K and Ryzen 2700X.

As for GeForce/Radeon vs Quadro/Radeon Pro, it really comes down to how important 10-bit is for you. You need a workstation card (Quadro/Radeon Pro) to get 10-bit support, although be aware that Lightroom doesn't support 10-bit at all. I'm not sure about Capture One either. Performance-wise, a Quadro P2000 will be much slower than an RTX 2070, more in line with a GeForce GTX 1060 I think. However, that shouldn't matter too much in Photoshop, Lightroom, or even Premiere Pro if you don't work with R3D or 6K+ media, since those applications don't really need a high-end GPU. But if you ever get heavily into DaVinci Resolve, I would expect around 60% higher performance with the RTX 2070 over the P2000.

Posted on 2018-11-14 17:41:07

Hello, I use Lightroom and Photoshop daily, and although of the two only Photoshop offers 10 bpc support, as you correctly stated, it works only at magnification/zoom higher than 67%, so I'm wondering what's the point of having a Quadro (or a Radeon Pro) over a GTX/RTX. (My primary monitor is an Eizo CS2420.)

Any suggestions? Many thanks in advance.

Posted on 2019-01-05 02:35:29
Jeff Stubbers

Hi Gio, 3 things are required for a 10-bit workflow: a 10-bit graphics card, 10-bit software, and a 10-bit monitor. If any one of these does not support 10-bit, you will not see 10-bit color values on your screen; if all 3 are capable of 10-bit, you will have a 10-bit workflow. As noted, you need a Quadro (or AMD Radeon Pro) graphics card to fulfill the "graphics card" portion, as GeForce (or AMD Radeon) cards only offer 8-bit for programs like Photoshop. If all 3 are not going to be 10-bit, there is no benefit to having the other components be 10-bit, unless you are planning to update all 3 components (graphics card, software, monitor) to 10-bit at some point.

*Also important to note that 10-bit monitors may have some input connection types that do not support 10-bit (many have DVI as a connection type that does not support 10-bit, while supporting 10-bit with their DisplayPort input). So please be sure to also use the monitor input that is compatible with 10-bit on your 10-bit monitor as well. I hope this helps!

Posted on 2019-01-07 16:16:14
David Balažic

nVidia just enabled 10 bit on GeForce cards with driver version 431.70

Posted on 2019-08-20 17:19:26
Jeff Stubbers

Correct. Thank you for bringing this up. This is noted in blue at the top of this article, with a link to Nvidia's article on the topic for those interested.
You need to download the "Studio" driver for GeForce graphics cards.

Posted on 2019-08-20 17:20:58
Jeff Stubbers

As noted at the top of this article, highlighted in blue, Nvidia has updated the GeForce graphics card "Studio" driver to enable 10-bit per channel (30-bit). So an Nvidia Quadro or a 10-bit per channel I/O card should no longer be necessary to enable 30-bit (10-bit per channel) color.
Again, Nvidia's article link: https://www.nvidia.com/en-u...

Posted on 2019-08-20 17:31:44
Mark Harris

Has AMD done the same or only Nvidia?

Posted on 2020-10-28 23:52:46
Jeff Stubbers

It's my understanding AMD has done the same.

Posted on 2020-10-30 14:35:55
Mark Harris

Thank you!

Posted on 2020-10-31 01:53:38
Rahul S

Not True! What is the source of this assertion?

Posted on 2020-12-09 14:53:54