
An Introduction to Understanding 8-bit vs. 10-bit Hardware

Written on July 17, 2017 by Jeff Stubbers


If you are a photo or video editor wondering if you should go with 8-bit or 10-bit hardware, this article is for you!

With respect to monitors and graphics cards, some of the hardware available today is 8-bit and some is 10-bit.

Before we get too far, let's define what a "bit" is. In software terms, a "bit" is the smallest container of information. A bit can hold one of two values, 1 or 0 (on or off, true or false). Bits can refer to values in programming languages, storage space, or color space. In this article we will be discussing color, and how bits relate to color space.

Bits can be counted in different ways when it comes to color. We will be discussing bits per channel/component (bpc).

The higher the bit depth you are dealing with, the larger the set of possible colors (the color palette) that can be assigned to each pixel in an image. The number of values for a given bit depth is calculated as 2 raised to the power of the number of bits. For example, a 4-bit value would give 2 x 2 x 2 x 2 = 16 values.
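If you want to check these numbers yourself, a few lines of Python will do it (the function name here is just for illustration):

    def values_per_channel(bits: int) -> int:
        """Number of distinct values a channel of the given bit depth can hold."""
        return 2 ** bits

    print(values_per_channel(4))   # 16
    print(values_per_channel(8))   # 256
    print(values_per_channel(10))  # 1024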

To illustrate, if you are working with 8 bits per channel in a photo editing program, there will be 256 color values per color channel (Red, Green, and Blue) to choose from for each pixel in that image. That makes 24 bits' worth of values in total (8-bit red, 8-bit green, 8-bit blue), or 16,777,216 values.

So as you can imagine, the higher the bit depth, the more colors are available in the color palette. The more colors available to display, the smoother the transitions from one color in a gradient to another.
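To make the gradient point concrete, here is a small Python sketch (a simplified model that ignores gamma and dithering) that quantizes a smooth black-to-white ramp at different bit depths and counts how many distinct steps survive; fewer steps means more visible banding:

    def quantize(value: float, bits: int) -> int:
        """Map a 0.0-1.0 value onto the nearest level at the given bit depth."""
        levels = 2 ** bits
        return round(value * (levels - 1))

    ramp = [i / 9999 for i in range(10000)]  # a smooth horizontal gradient
    for bits in (4, 8, 10):
        distinct = len({quantize(v, bits) for v in ramp})
        print(f"{bits}-bit gradient: {distinct} distinct steps")
    # 4-bit: 16 steps (obvious banding), 8-bit: 256, 10-bit: 1,024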

Color information is sent from your graphics card to the monitor as a number that represents the color that a pixel should be within the confines of a given color palette. The monitor then takes that number and reproduces the color that the number corresponds to for a given pixel of an image on screen.
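As a rough sketch of that idea (the pack/unpack helpers are hypothetical, but the bit layout is the common 24-bit RGB arrangement), an 8-bit-per-channel pixel can travel as a single 24-bit number:

    def pack_rgb(r: int, g: int, b: int) -> int:
        """Pack three 8-bit channel values into one 24-bit number."""
        return (r << 16) | (g << 8) | b

    def unpack_rgb(value: int) -> tuple[int, int, int]:
        """Recover the three 8-bit channel values from a 24-bit number."""
        return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF

    orange = pack_rgb(255, 165, 0)
    print(hex(orange))         # 0xffa500
    print(unpack_rgb(orange))  # (255, 165, 0)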


1-bit = 2 values
2-bit = 4
3-bit = 8
4-bit = 16
5-bit = 32
6-bit = 64
7-bit = 128
8-bit = 256
9-bit = 512
10-bit = 1,024
11-bit = 2,048
12-bit = 4,096
13-bit = 8,192
14-bit = 16,384
15-bit = 32,768
16-bit = 65,536
17-bit = 131,072
18-bit = 262,144
19-bit = 524,288
20-bit = 1,048,576
21-bit = 2,097,152
22-bit = 4,194,304
23-bit = 8,388,608
24-bit = 16,777,216
25-bit = 33,554,432
26-bit = 67,108,864
27-bit = 134,217,728
28-bit = 268,435,456
29-bit = 536,870,912
30-bit = 1,073,741,824

The output color depth for mainstream graphics cards, such as the NVIDIA GeForce or AMD Radeon lines, is listed as 8 bpc (bits per component). This refers to 8-bit color values for Red, 8-bit for Green, and 8-bit for Blue; essentially 8R + 8G + 8B. In other words, it is 24-bit color = 16,777,216 values, with Red, Green, and Blue each getting 8 bits' worth of values. When looking at monitors, you will often see them listed as having 16.7 million display colors.

Workstation-class graphics cards, such as the NVIDIA Quadro or AMD FirePro lines, supply 10 bpc. There is a larger pool of color options available: a 10-bit Red channel + 10-bit Green channel + 10-bit Blue channel, for a total of 30-bit RGB, or 1,073,741,824 values. When looking at monitors, you will often see 10-bit monitors listed as having 1.07 billion display colors.
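The arithmetic behind those "16.7 million" and "1.07 billion" figures is easy to verify in Python:

    # Each of the three RGB channels gets 2**bpc values, so the total
    # palette is (2**bpc) cubed.
    for bpc in (8, 10):
        per_channel = 2 ** bpc
        total = per_channel ** 3
        print(f"{bpc} bpc: {per_channel:,} per channel, {total:,} total colors")
    # 8 bpc: 256 per channel, 16,777,216 total colors
    # 10 bpc: 1,024 per channel, 1,073,741,824 total colors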

The higher your hardware's bit depth, the larger the set of colors available to you, and with it the potential for smoother gradients from one color to another, as in a sunset photo. With lower-bit hardware or settings, some colors may be substituted for the actual colors you captured in an image.

Conclusion

So what does all of this mean when choosing hardware? If you are working professionally with images and are going to have them professionally printed, you will be better off with a 10-bit graphics card and a 10-bit monitor, as professional print shops are able to print more colors. However, if you are editing photos for personal use, or to post on the web, an 8-bit graphics card and monitor would be sufficient, since the vast majority of people accessing the Internet have 8-bit hardware and would not be able to see the difference. Although most people will not be able to tell the difference between 8-bit and 10-bit, it is easier to decrease the quality of a 10-bit image for web use than it is to increase the quality of an 8-bit image for professional printing. The additional colors that 10-bit is capable of are therefore an advantage, giving you the flexibility to save for the web or to print professionally.

7/19/2017 Update: One reader mentioned that when checking the specs of new monitors, you should watch out for 8-bit+FRC monitors.
For those not aware, 8-bit+FRC (Frame Rate Control) monitors are 8-bit monitors that essentially fake the output of a 10-bit monitor by flashing two colors quickly to give the illusion of the color that should be displayed. For example, if the color that should be displayed on a 10-bit monitor is number 101 in the Look Up Table, and an 8-bit monitor is only capable of displaying color number 100 or 104, an 8-bit+FRC monitor would flash between color number 100 and number 104 quickly enough that one should not notice the flashing, tricking the human eye into thinking it is really color number 101.

To do this, the 8-bit+FRC monitor would display color number 100 for 75% of the time and color number 104 for 25% of the time, giving the illusion of color number 101, similar to how motion pictures give the illusion of motion. If color number 102 needed to be displayed, an 8-bit+FRC monitor would flash between color number 100 for 50% of the time and color number 104 for 50% of the time. To represent color number 103, as you can imagine by now, the monitor would display color number 100 for 25% of the time and color number 104 for 75% of the time, as opposed to a true 10-bit monitor, which would simply display color number 103.
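For the curious, here is a toy Python simulation of that frame-mixing trick, following the color numbering used above (real FRC panels use more elaborate temporal and spatial patterns, so treat this as a sketch of the idea only):

    def frc_frames(target_10bit: int) -> list[int]:
        """Four-frame cycle an 8-bit+FRC panel could use to fake a 10-bit level."""
        low = (target_10bit // 4) * 4    # nearest level below that 8-bit can show
        high = low + 4                   # next level above that 8-bit can show
        high_count = target_10bit - low  # frames (out of 4) spent on the high color
        return [high] * high_count + [low] * (4 - high_count)

    for target in (100, 101, 102, 103):
        frames = frc_frames(target)
        print(f"target {target}: frames {frames}, average {sum(frames) / 4}")
    # target 101: frames [104, 100, 100, 100], average 101.0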

I hope this helps!

Tags: 8-bit, 10-bit, Photoshop, Color
Daniel Fiser

hi Jeff,

This is brilliantly explained, I have to say. It was always a bit of a blur to me and is now crystal clear.

So to clarify, none of the GTX cards can provide 10-bit information? Only the Quadros?

I've just bought a 1080 Ti and am about to get the BenQ SW2700PT, which is a 10-bit screen. Am I wasting money on a screen that will not be provided a sufficient signal by my GPU?

thanks

D

Posted on 2018-02-06 11:11:09
Jeff Stubbers

Hi Daniel,

You're very welcome! I'm glad this was helpful! Correct, if you want 10-bit color depth output for programs like Photoshop, you will need to pair a 10-bit monitor with a Quadro graphics card, not a GTX GeForce graphics card. I see the BenQ SW2700PT has DVI, HDMI, and DisplayPort inputs. I recommend connecting with DisplayPort for the best connection, as it allows for the most information across its cable.

Posted on 2018-02-08 21:32:07
Andrey Khromov

Is there any slideshow application which supports 10-bit? Preferably for Mac.

Posted on 2018-07-17 07:35:14
Jeff Stubbers

This question is outside the scope of this article, and unfortunately I am not familiar with slideshow software, so I cannot make a recommendation there. However, my guess is that the software would have less of an impact on color than the files you are trying to display. For example, .jpg files will be 8-bit, so I would recommend avoiding those and using lossless TIFF files for higher color display purposes.

Posted on 2018-07-18 15:35:09
Cuauhtemoc Vega Torres

Yes, GTX 10 series can output 10-bit color with ease, right now even HDR standards like HDR10 or Dolby HDR. YES, GTX 10 SERIES CAN OUTPUT 10-BIT COLOR. You don't need a Quadro; it is a thing of the past.

Posted on 2019-04-25 14:40:51
Jeff Stubbers

Yes, you can enable 10-bit for programs that utilize *DirectX*, such as games or HDR10 content, with a GeForce 10 series graphics card. However, for professional programs such as Adobe Photoshop, which uses *OpenGL* rather than DirectX for 10-bit per channel color, a workstation-class (Quadro) card or a 10-bit I/O card such as a Blackmagic Design DeckLink is needed to provide 10-bit per channel / 30-bit output.

Posted on 2019-04-26 15:01:54
Geoff Murray

A really useful article, thanks. I am in the process of deciding which card to use, 8-bit or 10-bit, but the issue is complicated by the fact that Lightroom doesn't use 10-bit. My main software for images is Lightroom, so it has left me confused and undecided. The extra power of a 1060 would be nice over that of a P2000.

Posted on 2018-11-26 09:25:47