How to enable 30-bit in Photoshop

Written on October 3, 2018 by Jeff Stubbers

30-bit setup in Photoshop

To set up a 30-bit workflow in Photoshop, choose Edit > Preferences > Performance. In the "Graphics Processor Settings" section, click the "Advanced Settings..." button. This will bring up the Advanced Graphics Processor Settings window, where you can check the box for "30 Bit Display" and then click "OK".

You will also need a workstation-class graphics card (AMD Radeon Pro / Nvidia Quadro) in your system, and to connect to a 10-bit monitor over a DisplayPort cable connection.
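
Once 30 Bit Display is enabled, a quick sanity check is to open a smooth 16-bit gradient in Photoshop at 100% zoom and look for banding: on a working 10 bpc pipeline the ramp should look noticeably smoother than on an 8 bpc one. As a rough sketch - assuming Python with the NumPy and Pillow packages installed, and an arbitrary output filename - you could generate such a test ramp like this:

# Sketch: write a 16-bit grayscale horizontal ramp as a TIFF to open in Photoshop.
# Assumes NumPy and a reasonably recent Pillow; "gradient_test.tif" is an arbitrary name.
import numpy as np
from PIL import Image

width, height = 4096, 512
# Ramp across the full 16-bit range; Photoshop will open the TIFF as a 16-bit document.
ramp = np.linspace(0, 65535, width, dtype=np.uint16)
ramp = np.tile(ramp, (height, 1))

# Pillow maps a uint16 array to its 16-bit grayscale ("I;16") mode.
Image.fromarray(ramp).save("gradient_test.tif")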

How to enable 30-bit in Lightroom

Unfortunately, Adobe Lightroom does not offer full 10-bit per channel workflow support at this time. However, you can vote on a feature request with Adobe at the link below to encourage them to include it in the future:
https://feedback.photoshop.com/photoshop_family/topics/add_10_bit_support_to_lightroom

How to enable 30-bit in After Effects

Adobe After Effects only supports 10-bit per channel output through an I/O monitoring card (such as a Blackmagic Decklink or similar). Surprisingly, After Effects does not support 10-bit per channel output through a workstation-class (AMD Radeon Pro / Nvidia Quadro) graphics card.

How to enable 30-bit in Premiere Pro

The good news is that there is nothing to enable in Premiere Pro; 10-bit per channel output should always be on by default. Adobe Premiere Pro supports 10-bit per channel output either through an I/O card (such as a Blackmagic Decklink), which offers 10-bit per channel over HDMI / SDI, or through a workstation-class graphics card (Nvidia Quadro / AMD Radeon Pro). With workstation-class graphics cards, 10-bit per channel is only available over DisplayPort, not HDMI.

Which graphics cards offer 30-bit color in Photoshop?

As noted in an earlier article about setting up graphics card software to display 10 bpc output, both workstation-class graphics cards (AMD Radeon Pro, Nvidia Quadro) and consumer-class graphics cards (AMD Radeon, Nvidia GeForce) let you set 10 bpc (10-bit per channel R, G, B) output through their driver software, allowing a greater number of colors to be displayed in full-screen DirectX programs - again, provided you are connected to a 10-bit display.

However, professional programs like Adobe Photoshop tend to use OpenGL for 10-bit per channel color, and currently only workstation-class Nvidia Quadro and AMD Radeon Pro graphics cards offer 10-bit per channel color through OpenGL.
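
If you are curious what your own card and driver actually hand to an OpenGL application, one rough way to check - purely a sketch, not anything from Nvidia or Adobe, and assuming Python with the glfw and PyOpenGL packages installed - is to request a 10 bpc default framebuffer and then ask the driver how many bits per channel it really granted:

# Sketch: request a 10 bpc OpenGL framebuffer and report what the driver allocated.
# Assumes the glfw and PyOpenGL packages; window title and size are arbitrary.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

# Ask for 10 bits per color channel (30-bit color) plus 2 bits of alpha.
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10 bpc check", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Could not create an OpenGL window")
glfw.make_context_current(window)

# On consumer GeForce/Radeon cards the driver typically falls back to 8 bits per channel here.
print("R/G/B bits:",
      glGetIntegerv(GL_RED_BITS),
      glGetIntegerv(GL_GREEN_BITS),
      glGetIntegerv(GL_BLUE_BITS))

glfw.terminate()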

As Nvidia itself notes regarding 10 bpc output:
"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI."

10-bit per channel work-around

If you have other programs that would benefit from a consumer-class AMD Radeon or Nvidia GeForce graphics card, or simply want the price-to-performance these cards offer in certain situations, there is a work-around to get 30-bit color in professional programs while using one of these cards: add a 10-bit I/O card, such as a Blackmagic Decklink, which will provide a 10 bpc signal to a monitor. So it is technically possible to have an 8 bpc GeForce or Radeon graphics card in your system for general use, and also have a 10-bit I/O card supplying 10 bpc to a 10-bit screen. One thing to be aware of, though, is that the screen attached to the 10-bit I/O card will only show the photo or video you are currently editing; when you are not editing, that screen will not display anything else. So after factoring in the cost of the additional 10-bit I/O card, and the fact that it only outputs to its own screen while editing a photo or video, it may or may not be better to go this route rather than simply getting a workstation-class graphics card that can output all content to all screens in 10 bpc.

Is this a 10-bit or 30-bit monitor?

First, it is worth noting that monitor manufacturers will list their 30-bit monitors as "10-bit". Yes, this is confusing! The "10-bit" nomenclature the monitor manufacturers use really refers to 10 bits per channel: a 10-bit Red, 10-bit Green, and 10-bit Blue channel, which adds up to 30 bits in total (10+10+10 bits for R, G, B) - the resulting number of color values can be seen in the chart on this page. Likewise, an "8-bit" monitor is really 24-bit in total, as it refers to 8-bit Red, 8-bit Green, and 8-bit Blue channels, and 8+8+8 = "24-bit".
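
To make that arithmetic concrete, a few lines of Python (purely illustrative) show the per-channel levels and total color counts behind those names:

# Simple arithmetic behind the "8-bit vs 10-bit" naming: bits are counted per channel.
for bits_per_channel in (8, 10):
    levels = 2 ** bits_per_channel          # shades per R, G, or B channel
    total_bits = bits_per_channel * 3       # what the "24-bit" / "30-bit" totals refer to
    total_colors = levels ** 3              # all R/G/B combinations
    print(f"{bits_per_channel}-bit per channel -> {total_bits}-bit total, "
          f"{levels} levels per channel, {total_colors:,} colors")

# 8-bit  per channel -> 24-bit total, 256 levels per channel, 16,777,216 (~16.7M) colors
# 10-bit per channel -> 30-bit total, 1024 levels per channel, 1,073,741,824 (~1.07B) colors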

There are 10-bit monitors with multiple inputs, and not all of those inputs will necessarily support 10-bit per channel. So please check your monitor manufacturer's specifications to see which input ports support 10-bit.

It is also worth mentioning that some monitors advertised as offering 10-bit color output are not true 10-bit, but rather 8-bit+FRC. 8-bit+FRC (Frame Rate Control) monitors are 8-bit monitors that essentially fake the output of a 10-bit monitor by flashing between two colors quickly to give the illusion of the color a true 10-bit monitor would display. For example, if the color that should be displayed is number 101 in the Look Up Table (LUT), and an 8-bit monitor is only capable of displaying color number 100 or number 104, an 8-bit+FRC monitor would alternate between color number 100 and number 104 quickly enough that, in theory, you should not notice the flashing. Its goal is to trick the human eye into seeing color number 101. To do this, the 8-bit+FRC monitor would show color number 100 for 75% of the time and color number 104 for 25% of the time, giving the illusion of color number 101 - similar to how still shots displayed in rapid succession give the illusion of motion. If color number 102 needed to be displayed, an 8-bit+FRC monitor would show color number 100 for 50% of the time and color number 104 for 50% of the time to give the illusion of color 102, whereas a true 10-bit monitor would simply display color number 102 from the LUT.
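
The duty-cycle idea is easier to see with a little arithmetic. Here is a small illustrative Python sketch (the 100/104 values simply mirror the LUT example above) that computes how long an 8-bit+FRC panel would dwell on each of the two nearest displayable colors to approximate a given 10-bit target:

# Illustrative sketch of the FRC idea described above: an 8-bit panel alternates between
# the two nearest values it can show, with a duty cycle whose time-average hits the target.
def frc_duty_cycle(target, low=100, high=104):
    """Return the fraction of frames to spend on `low` and on `high`."""
    high_fraction = (target - low) / (high - low)
    return 1.0 - high_fraction, high_fraction

for target in (101, 102, 103):
    low_frac, high_frac = frc_duty_cycle(target)
    average = 100 * low_frac + 104 * high_frac
    print(f"target {target}: show 100 for {low_frac:.0%}, 104 for {high_frac:.0%} "
          f"-> average {average:g}")

# target 101: show 100 for 75%, 104 for 25% -> average 101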

I hope this helps!

Tags: Photoshop, Graphics card, Monitor, 8-bit vs 10-bit, Settings, 10 bit, 30-bit
Luca

I need some advice. What would be an inexpensive card for working with a 10-bit color space in Photoshop? Do I need a Quadro such as the Quadro P1000, or can I use a GTX card such as a GTX 1050 Ti? Thanks for your help.

Posted on 2018-10-07 15:56:41
Jeff Stubbers

Hi Luca, thank you for asking! Yes, you would need a Quadro card such as the Quadro P400 (or higher, of course) for 10-bit per channel in Photoshop - paired with a 10-bit per channel monitor. GeForce GTX graphics cards can offer 10-bit per channel for full-screen DirectX applications (games), but only offer 8-bit per channel in professional windowed programs like Photoshop, which use 10-bit per channel color through OpenGL.

Posted on 2018-10-08 13:24:51
Martin Hagen

What's up with the new Nvidia RTX 2080 / RTX 2070? Do they have 10-bit OpenGL support, maybe? No info found so far...

Posted on 2018-10-09 11:24:32
Jeff Stubbers

Hi Martin, no, unfortunately the Nvidia RTX cards do not support 10-bit through OpenGL. This is a differentiator Nvidia uses to separate consumer-class from workstation-class cards, so you will need to go with an Nvidia Quadro instead for 10-bit through OpenGL in professional programs like Photoshop.

Posted on 2018-10-09 20:02:38
Ottoore

What about Titan V? Thanks in advance.

Posted on 2018-10-10 10:42:04
Jeff Stubbers

Same thing, unfortunately. If an Nvidia graphics card is not a Quadro, it will not offer 10 bpc through OpenGL.

Posted on 2018-10-10 13:55:46
Greg

Thanks for the very helpful article breaking down 10-bit monitors in Photoshop. My remaining questions center around how to drive an 8-bit+FRC monitor in Photoshop. First, is 8-bit+FRC an acceptable middle ground between 8-bit and true 10-bit, or should it be avoided entirely? Second, do you drive an 8-bit+FRC monitor with an 8-bit or 10-bit signal from the computer? For example, to drive an 8-bit+FRC monitor, do you use a GeForce or a Quadro?

Lastly, it would be very helpful if you published an article comparing the best monitors across various price points.

Posted on 2018-11-07 18:22:21
Jeff Stubbers

Hi Greg, thank you for the compliments, your questions, and the suggestion! I would recommend avoiding 8-bit+FRC if you can. However, due to the rather high cost of true 10-bit monitors, I can completely see why one would choose to go with an 8-bit+FRC monitor. At least 8-bit+FRC is going to offer more colors (even if done through *tricks*) than a standard 16.7M-color 8-bit monitor.
If you do use an 8-bit+FRC monitor, yes, you would want to drive it with a 10-bit per channel signal (Nvidia Quadro / AMD Radeon Pro / 10-bit I/O card). Otherwise that higher-than-8-bit monitor would be limited to showing only the 16.7M colors supplied by an 8-bit per channel source (Nvidia GeForce / AMD Radeon). I hope this helps!

Posted on 2018-11-07 19:25:51
Victor Josue Quintana R.

Been reading your articles since I saw some of your posts in different forums. I'm trying to build a new rig for myself, mainly for photo editing (Lightroom, Capture One, Photoshop), some video editing and color correction (learning Premiere and want to get into Resolve) and perhaps some coding for interactive applications (Processing comes to mind). I've posted on Reddit/buildmeapc and PCPartPicker and got two very different builds. One with AMD (Ryzen 7 2700) and another with Intel (i7 9700K). But since I already own a 10-bit monitor, and I do want to take advantage of it (Asus PB328Q, and an old Dell UltraSharp 24"), I wanted to ask about the Quadro/Radeon Pro cards in comparison to their Geforce/Radeon brethren.
I've not gamed in several years, but would I be losing out on a lot if I went with a Quadro/Radeon Pro over a Geforce/Radeon card? I think the Quadro P2000 5GB may be in the price range of an RTX 2070.
Any help appreciated, and keep up the fantastic work!

Posted on 2018-11-14 15:21:41

Hey Victor, for the software you listed, I would definitely use the Intel Core i7 9700K. It should be about 18% faster than the Ryzen 2700X in Photoshop and Lightroom, and almost identical in Premiere Pro or DaVinci Resolve. I know you've been reading our articles, but just in case you missed them, https://www.pugetsystems.co... should return all the articles we've recently done that included the i7 9700K and Ryzen 2700X.

As for GeForce/Radeon vs Quadro/Radeon Pro, it really comes down to how important 10-bit is for you. You need a workstation card (Quadro/Radeon Pro) to get 10-bit support, although be aware that Lightroom doesn't support 10-bit at all. I'm not sure about Capture One either. Performance-wise, a Quadro P2000 will be much slower than an RTX 2070 - more in line with a GeForce GTX 1060, I think. However, that shouldn't matter too much in Photoshop, Lightroom, or even Premiere Pro if you don't work with R3D or 6K+ media, since those applications don't really need a high-end GPU. But if you ever get heavily into DaVinci Resolve, I would expect around 60% higher performance with the RTX 2070 over the P2000.

Posted on 2018-11-14 17:41:07