Read this article at https://www.pugetsystems.com/guides/1243

How to enable 30 bit in Photoshop

Written on October 3, 2018 by Jeff Stubbers

30-bit setup in Photoshop

To set up a 30-bit workflow in Photoshop, choose Edit > Preferences > Performance. In the "Graphics Processor Settings" section, click the "Advanced Settings..." button. This opens the Advanced Graphics Processor Settings window; there, select the checkbox for "30 Bit Display", then click the "OK" button.

You will also need a workstation-class graphics card (AMD Radeon Pro / Nvidia Quadro) in your system, connected to a 10-bit monitor via a DisplayPort cable.


Which graphics cards offer 30-bit color in Photoshop?

As noted in an earlier article about setting up graphics card software to display 10 bpc output, both workstation-class graphics cards (AMD Radeon Pro, Nvidia Quadro) and consumer-class graphics cards (AMD Radeon, Nvidia GeForce) can be set to 10 bpc (10 bits per channel for R, G, and B) through their driver software for full-screen DirectX programs, allowing a greater number of colors to be displayed - again, provided you are connected to a 10-bit display.

However, professional programs like Adobe Photoshop tend to use OpenGL for 10-bit per channel color, and currently only workstation-class Nvidia Quadro or AMD Radeon Pro graphics cards offer 10-bit per channel color through OpenGL.

As Nvidia itself notes regarding 10 bpc output:
"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI."

10-bit per channel work-around

If you have other programs that would benefit from a consumer-class AMD Radeon or Nvidia GeForce graphics card, or simply want the price-to-performance these cards offer in certain situations, there is a work-around to get 30-bit color in professional programs while using one of these cards: a 10-bit I/O add-on card, such as a Blackmagic DeckLink. These cards provide a 10 bpc signal to a monitor. So it is technically possible to have an 8 bpc GeForce or Radeon graphics card in your system for general use, and also have a 10-bit I/O card supplying 10 bpc to a 10-bit screen. One thing to be aware of, though, is that the screen attached to the 10-bit I/O card will only show the photo or video while you are editing it; when you are not editing, that screen displays nothing else. So after factoring in the cost of the additional 10-bit I/O card, and the fact that it only outputs to its own screen during editing, this route may or may not be better than simply getting a workstation-class graphics card that can output all content to all screens at 10 bpc.

Is this a 10-bit or 30-bit monitor?

First, it is worth noting that monitor manufacturers list their 30-bit monitors as "10-bit". Yes, this is confusing! The "10-bit" nomenclature the monitor manufacturers use really refers to 10 bits per channel. That means 10-bit Red, 10-bit Green, and 10-bit Blue channels, for a total of 30 bits (10+10+10 bits across the R, G, B channels) - the resulting color values can be seen in the chart on this page. Likewise, an "8-bit" monitor is really 24-bit in total, as it refers to 8-bit Red, 8-bit Green, and 8-bit Blue channels: 8+8+8 = "24-bit".
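The arithmetic behind these labels is simple, and the jump in displayable colors is large. A quick sketch of the math (plain arithmetic only; the function name is illustrative):

```python
# An N-bit channel has 2**N levels; three channels (R, G, B) multiply together.
def total_colors(bits_per_channel):
    levels = 2 ** bits_per_channel   # levels per channel
    return levels ** 3               # all R, G, B combinations

print(total_colors(8))    # 16777216   -> "8-bit" / 24-bit: ~16.7 million colors
print(total_colors(10))   # 1073741824 -> "10-bit" / 30-bit: ~1.07 billion colors
```

So a 30-bit pipeline can address roughly 64 times as many colors as a 24-bit one.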

There are also 10-bit monitors with multiple inputs, and not all of those inputs necessarily support 10-bit per channel. So please check your monitor manufacturer's specifications to see which input ports support 10-bit.

It is also worth mentioning that some monitors advertised as offering 10-bit color output are not true 10-bit, but rather 8-bit+FRC. 8-bit+FRC (Frame Rate Control) monitors are 8-bit panels that essentially fake 10-bit output by flashing two colors quickly to give the illusion of the color a true 10-bit monitor would display. For example, if the color that should be displayed is number 101 in the Look Up Table (LUT), and the 8-bit panel is only capable of displaying color number 100 or 104, an 8-bit+FRC monitor flashes between numbers 100 and 104 quickly enough that, in theory, you should not notice the flashing. Its goal is to trick the human eye into seeing color number 101. To do this, the 8-bit+FRC monitor shows color number 100 for 75% of the time and color 104 for 25% of the time, similar to how still shots displayed in rapid succession give the illusion of motion. If color 102 needed to be displayed, an 8-bit+FRC monitor would flash between color number 100 for 50% of the time and color number 104 for 50% of the time, whereas a true 10-bit monitor could simply display color number 102 from the LUT.
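The time-averaging trick above can be sketched numerically. This is a simplified model of FRC dithering (the function name and frame counts are illustrative, not from any monitor specification):

```python
def frc_perceived_level(low, high, frames_low, frames_high):
    """Time-averaged LUT level the eye perceives over one flashing cycle."""
    total_frames = frames_low + frames_high
    return (low * frames_low + high * frames_high) / total_frames

# 75% of frames at LUT level 100, 25% at level 104 -> perceived level 101
print(frc_perceived_level(100, 104, 3, 1))   # 101.0
# 50/50 split between 100 and 104 -> perceived level 102
print(frc_perceived_level(100, 104, 2, 2))   # 102.0
```

The averaged value matches the in-between LUT level, which is exactly the illusion FRC relies on.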

I hope this helps!

Tags: Photoshop, Graphics card, Monitor, 8-bit vs 10-bit, Settings, 10 bit, 30-bit
Luca

I need some advice. What would be an inexpensive card to work with Photoshop's 10-bit color? Do I need a Quadro such as the Quadro P1000, or can I use a GTX such as a GTX 1050 Ti? Thanks for your help.

Posted on 2018-10-07 15:56:41
Jeff Stubbers

Hi Luca, Thank you for asking! Yes, you would need a Quadro card such as the Quadro P400 (or higher, of course) for 10-bit per channel in Photoshop - paired with a 10-bit per channel monitor. The GeForce GTX graphics cards can offer 10-bit per channel for full-screen DirectX applications (games), but only offer 8-bit per channel in professional windowed programs like Photoshop, which use OpenGL for 10-bit per channel color.

Posted on 2018-10-08 13:24:51
Martin Hagen

What's up with the new Nvidia RTX 2080 / RTX 2070? Do they have 10-bit OpenGL support, maybe? No info found so far...

Posted on 2018-10-09 11:24:32
Jeff Stubbers

Hi Martin, No, unfortunately the Nvidia RTX cards do not support 10-bit through OpenGL. This is a differentiator that Nvidia uses to separate consumer-class from workstation-class cards, so you will need to go with an Nvidia Quadro instead for 10-bit through OpenGL in professional programs like Photoshop.

Posted on 2018-10-09 20:02:38
Ottoore

What about Titan V? Thanks in advance.

Posted on 2018-10-10 10:42:04
Jeff Stubbers

Same thing, unfortunately. If the Nvidia line graphics card is not a Quadro, then it will not offer 10 bpc through OpenGL.

Posted on 2018-10-10 13:55:46