Read this article at https://www.pugetsystems.com/guides/987

An Introduction to Understanding 8-bit vs. 10-bit Hardware

Written on July 17, 2017 by Jeff Stubbers

If you are a photo or video editor wondering if you should go with 8-bit or 10-bit hardware, this article is for you!

With respect to monitors and graphics cards, some hardware available today is 8-bit and some is 10-bit.

Before we get too far, let's define what a "bit" is. In computing terms, a bit is the smallest container of information. A bit can hold one of two values, 1 or 0 (on or off, true or false). Bits can refer to values in programming languages, storage space, or color space. In this article we will be discussing color, and how bits relate to color space.

There are different ways that bits can be referred to when it comes to color. We will be discussing bits per channel/component (bpc).

The higher the bit depth you are working with, the larger the set of possible colors (the color palette) that can be assigned to each pixel in an image. The number of values a given bit depth can represent is 2 raised to the power of the bit count. For example, a 4-bit value can represent 2 x 2 x 2 x 2 = 16 values.

To illustrate, if you are working with 8 bits per channel in a photo editing program, there will be 256 color values per color channel (Red, Green, and Blue) to choose from for each pixel in that image. Combined, that is 24 bits' worth of values (8-bit red, 8-bit green, 8-bit blue), or 16,777,216 colors.

So as you can imagine, the higher the bit depth of color, the more colors available in the color palette. The more colors available to display, the smoother the transitions from one color in a gradient to another.

Color information is sent from your graphics card to the monitor as a number that represents the color that a pixel should be within the confines of a given color palette. The monitor then takes that number and reproduces the color that the number corresponds to for a given pixel of an image on screen.


1-bit = 2 values
2-bit = 4
3-bit = 8
4-bit = 16
5-bit = 32
6-bit = 64
7-bit = 128
8-bit = 256
9-bit = 512
10-bit = 1,024
11-bit = 2,048
12-bit = 4,096
13-bit = 8,192
14-bit = 16,384
15-bit = 32,768
16-bit = 65,536
17-bit = 131,072
18-bit = 262,144
19-bit = 524,288
20-bit = 1,048,576
21-bit = 2,097,152
22-bit = 4,194,304
23-bit = 8,388,608
24-bit = 16,777,216
25-bit = 33,554,432
26-bit = 67,108,864
27-bit = 134,217,728
28-bit = 268,435,456
29-bit = 536,870,912
30-bit = 1,073,741,824
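The table above follows directly from the rule that an n-bit value can hold 2 to the power of n distinct values. A quick sketch in Python (standard library only):

```python
# Each additional bit doubles the number of representable values:
# an n-bit value can hold 2**n distinct values.
for bits in (1, 4, 8, 10, 24, 30):
    print(f"{bits}-bit = {2**bits:,} values")
# 1-bit = 2 values
# 4-bit = 16 values
# 8-bit = 256 values
# 10-bit = 1,024 values
# 24-bit = 16,777,216 values
# 30-bit = 1,073,741,824 values
```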

The output color depth for mainstream graphics cards, such as the Nvidia GeForce or AMD Radeon lines, is listed as 8 bpc (bits per component). This refers to 8-bit color values for Red, 8-bit for Green, and 8-bit for Blue: essentially 8R + 8G + 8B. In other words, 24-bit color, or 16,777,216 values, with Red, Green, and Blue each getting 8 bits' worth of values. Monitors with this depth will often be listed as supporting 16.7 million display colors.

Workstation-class graphics cards, such as the Nvidia Quadro or AMD FirePro lines, and 10-bit I/O cards such as the Blackmagic Design DeckLink, supply 10 bpc. That gives a far larger pool of color options: a 10-bit Red channel + 10-bit Green channel + 10-bit Blue channel, for a total of 30-bit RGB, or 1,073,741,824 values. Monitors with this depth will often be listed as supporting '1.07 billion display colors'. (It is worth noting that 10-bit I/O cards like the Blackmagic DeckLink tend to display only a timeline, or only the photo being edited for color correction. When you are not editing a photo or video, the display is blank rather than showing the desktop. I/O cards are therefore best used with a separate, secondary 10-bit monitor dedicated to image color correction, not to running programs.)
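The 16.7 million and 1.07 billion figures above come from cubing the per-channel count, since Red, Green, and Blue each contribute 2 to the power of bpc levels. A small sketch:

```python
def total_colors(bpc: int) -> int:
    """Total displayable colors when each of the three RGB channels has 2**bpc levels."""
    return (2 ** bpc) ** 3

print(f"8 bpc:  {total_colors(8):,} colors")   # 16,777,216  ("16.7 million")
print(f"10 bpc: {total_colors(10):,} colors")  # 1,073,741,824  ("1.07 billion")
```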

The higher the bit rating of your hardware, the larger the group of colors available to you, allowing potentially smoother gradients from one color to another, as can be seen in a sunset photo. With lower-bit hardware or settings, some of the colors you actually captured in an image may be substituted with the nearest available color.
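That substitution effect can be illustrated by quantizing a gradient: mapping a 10-bit channel down to 8 bits collapses every run of four neighboring levels onto one value, which is what shows up visually as banding. A minimal sketch (the bit-shift mapping is one simple way to do the conversion, not how any particular driver does it):

```python
def to_8bit(v10: int) -> int:
    # Drop the two least-significant bits: 1024 input levels -> 256 outputs.
    return v10 >> 2

gradient = range(1024)                      # every level a 10-bit channel can hold
surviving = {to_8bit(v) for v in gradient}  # distinct 8-bit levels that remain
print(len(surviving))                       # 256: three of every four steps merge
```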


So what does all of this mean when choosing hardware? If you are working professionally with images and are going to have them professionally printed, you will be better off with a 10-bit graphics card (or a 10-bit I/O card) and a 10-bit monitor, as professional print shops are able to print more colors. However, if you are just editing photos for personal use or to post on the web, an 8-bit graphics card and monitor would be sufficient, since the vast majority of people accessing the Internet have 8-bit hardware and would not be able to see the difference. And although most people will not be able to tell 8-bit from 10-bit, it is easier to decrease the quality of a 10-bit image for web use than it is to increase the quality of an 8-bit image for professional printing. So the additional colors that 10-bit is capable of are an advantage, providing the flexibility to save for the web or to print professionally.

7/19/2017 Update: One reader mentioned that when you check the specs of new monitors, you should avoid 8-bit+FRC monitors.
For those not aware, 8-bit+FRC (Frame Rate Control) monitors are 8-bit monitors that essentially fake the output of a 10-bit monitor by flashing two colors quickly enough that the eye blends them into the color that should be displayed. For example, suppose the color that should be displayed on a 10-bit monitor is number 101 in the Look Up Table, and an 8-bit monitor is only capable of displaying colors number 100 and 104. An 8-bit+FRC monitor would alternate between the two too quickly for the flashing to be noticed: color 100 for 75% of the time and color 104 for 25% of the time, to give the illusion of color 101, much as moving pictures give the illusion of motion. To show color 102, it would split the time 50/50 between colors 100 and 104; for color 103, it would show color 100 for 25% of the time and color 104 for 75%. A true 10-bit monitor, by contrast, would simply display color 103.
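The mixing ratios above can be expressed as a duty cycle: the fraction of frames spent on the higher of the two displayable colors. A sketch using the hypothetical LUT entries 100 and 104 from the example:

```python
def frc_high_fraction(target: int, low: int = 100, high: int = 104) -> float:
    """Fraction of frames showing `high` so the time-average matches `target`."""
    return (target - low) / (high - low)

for target in (101, 102, 103):
    f = frc_high_fraction(target)
    print(f"color {target}: show 104 for {f:.0%} of frames, 100 for {1 - f:.0%}")
# color 101: show 104 for 25% of frames, 100 for 75%
# color 102: show 104 for 50% of frames, 100 for 50%
# color 103: show 104 for 75% of frames, 100 for 25%
```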

I hope this helps!


You may also be interested in these related articles:
Setting Graphics Card Software to Display 10-bit Output
How to enable 30-bit in Photoshop

Tags: 8-bit, 10-bit, Photoshop, Color
Daniel Fiser

hi Jeff,

this is brilliantly explained I have to say, it was always a bit of a blur to me and now crystal clear.

So to clarify, none of the GTX cards can provide 10-bit information? Only the Quadros?

I've just bought 1080ti and about to get Benq sw2700pt, which is 10-bit screen. Am I wasting money on screen that will not be provided sufficient signal from my GPU?



Posted on 2018-02-06 11:11:09
Jeff Stubbers

Hi Daniel,

You're very welcome! I'm glad this was helpful! Correct, if you want 10-bit color depth output for programs like Photoshop, you will need to pair a 10-bit monitor with a Quadro graphics card, not a GTX GeForce graphics card. I see the BenQ SW2700PT has DVI, HDMI, and DisplayPort inputs. I recommend connecting with DisplayPort for the best connection, as it allows for the most information across its cable.

Posted on 2018-02-08 21:32:07
Andrey Khromov

is there any slideshow application which support 10bit? Preferable for Mac.

Posted on 2018-07-17 07:35:14
Jeff Stubbers

This question is outside the scope of this article, and unfortunately I am not familiar with slideshow software, so I cannot make a recommendation there. However, my guess is that the software would have less of an impact on color than the files you are trying to display. For example, .jpg files will be 8-bit, so I would recommend avoiding those and using lossless TIFF files for higher color display purposes.

Posted on 2018-07-18 15:35:09
kaistcha

So should I configure a Quadro card and a 10-bit monitor (like the ASUS PA32UC) to edit 10-bit HDR color on the timeline in DaVinci Resolve?
If yes, I'll buy the card.
I have been hesitating to buy the Quadro card for a long time.

Posted on 2019-09-23 09:47:36
Jeff Stubbers

While you could, that should not be necessary. As noted on Nvidia's website (https://developer.nvidia.com/high-dynamic-range-display-development) "All NVIDIA GPUs from the 900 and 1000 series support HDR display output. The presence of HDMI 2.0 provides the bandwidth necessary for the higher quality signal desirable for HDR. The 1000 series adds support for HDR over future Display Port revisions." So you should be able to do that with a GeForce graphics card now.
Additionally, Nvidia recently released GeForce drivers that support 10 bits per channel for OpenGL programs, as noted in blue at the top of this article. I hope this helps!

Posted on 2019-09-23 14:51:36
kaistcha

Thank you very much!

Posted on 2019-09-23 17:23:21
Cuauhtemoc Vega Torres

Yes, GTX 10-series cards can output 10-bit color with ease, right now, even for HDR standards like HDR10 or Dolby HDR. You don't need a Quadro; that is a thing of the past.

Posted on 2019-04-25 14:40:51
Jeff Stubbers

Yes, you can enable 10-bit for programs that utilize *DirectX*, such as games or HDR10 content, with a GeForce 10-series graphics card. However, for professional programs such as Adobe Photoshop, which uses *OpenGL* rather than DirectX for 10-bit per channel color, a workstation-class (Quadro) card or a 10-bit I/O card such as a Blackmagic Design DeckLink is needed to provide 10-bit per channel / 30-bit output.

Posted on 2019-04-26 15:01:54
Norachat Leuntawin

Hi Jeff, may I ask a question: can a Blackmagic DeckLink work with Adobe Lightroom? I am planning a workstation for photo editing on 10-bit hardware. May I use a GTX-series card + DeckLink + 2x 10-bit monitors?

Posted on 2019-06-17 03:37:51

I can answer that for you - Lightroom (both versions) does not support 10-bit display output. It doesn't matter if you use a Quadro GPU, a DeckLink, or any other card that can send out a 10-bit video signal. There is a feature request that you can add your voice to at https://feedback.photoshop.... , but I don't anticipate it will be added anytime soon, since the original request for 10-bit display support was apparently made 8 years ago.

Posted on 2019-06-17 03:55:42
Norachat Leuntawin

Thanks for the reply. Then it doesn't matter what 10-bit hardware I have; Lightroom still runs at 8-bit, right?

Posted on 2019-06-17 05:50:12

It will only display 8-bit, no matter your hardware. You can still work with higher-bit images, of course; you are just limited to seeing 8-bit on your display.

Posted on 2019-06-17 05:57:37
Norachat Leuntawin

OK, thanks. I have to choose between the Quadro and GTX series then; since it is still 8-bit for now, I will use the GTX.

Posted on 2019-06-17 06:21:51
Geoff Murray

A really useful article, thanks. I am in the process of deciding which card to use, 8 or 10 bit but the issue is complicated by the fact that Lightroom doesn't use 10 bit. My main software for images is Lightroom so it has left me confused and undecided. The extra power of a 1060 would be nice over that of a P2000.

Posted on 2018-11-26 09:25:47
Will C

Hi, if my system used both a mainstream GPU (RTX 2080) AND a Blackmagic Design DeckLink Mini Monitor 4K PCIe, could I output 10 bits to a 10 bit monitor through the DeckLink? This is just for Resolve, I don't care about 10 bit output for any other software. Thanks!

Posted on 2019-07-23 18:26:43

Yep, that is exactly what the Decklinks are made for. Pretty much any GPU/CPU can process 10-bit (or higher) footage, it is just displaying it that is limited to the Quadro/Radeon Pro cards. The Decklinks allow you to send a full screen 10-bit video signal to a display regardless of what hardware is in the rest of the system.

Posted on 2019-07-23 18:34:51
Will C

I think I just got my answer digging through another of your threads: "Matt Bach Mod • 2 months ago:
If 10-bit is something you need, however, you really shouldn't be going through the GPU for that (even if you have a Radeon Pro or Quadro card that supports 10-bit out). What you should use is a video monitoring card like the Blackmagic Decklink series, since they are going to be more color accurate than a GPU is capable of. That does mean you need to have a dedicated monitor for full screen preview, but that is really the best way to get the highest color accuracy possible."

Posted on 2019-07-23 18:35:26
Will C

I think it'd be helpful in this article to mention devices like the DeckLink. The article gave me the impression that I absolutely needed a professional class graphics card to take advantage of a 10-bit monitor, which is partially true but has some exceptions such as using a DeckLink for full screen monitoring or 10-bit gaming through DirectX.

Posted on 2019-07-23 18:39:09
Jeff Stubbers

Thank you for the suggestion, Will. Reference to 10-bit I/O cards such as the Decklink added! You may find this useful to take a look at: How to enable 10 bit in Photoshop
I hope this helps!

Posted on 2019-07-24 20:22:59
Will C

That is useful! Thanks!

Posted on 2019-07-25 17:43:44
Koncz Péter

This is a very clear explanation. Thank you very much! I am about to buy a professional monitor (NEC PA271Q) that has 10-bit color depth. I am buying this because of the color accuracy and color calibration features. I would use it for photo editing in Lightroom. My understanding is that I cannot use the full capability of this monitor with Lightroom even if I had a Quadro card. I have a few questions though:
1. Can I use any 10-bit photo viewer (e.g. GIMP 2.0) if I edit and export the image from Lightroom as TIFF? Will it display the image in 10-bit? Of course, assuming I had e.g. a Quadro.
2. I have a Dell UP2716D 8-bit monitor too. Can I use that and the 10-bit monitor as an extended screen? Is this possible with any hardware at all? I guess Windows 10 can work either in 10-bit or 8-bit mode.
3. Is it recommended to use the 10-bit monitor only for full screen display and not for editing? I saw you wrote about a "10-bit per channel work-around". I would like to check and edit color in the Lightroom Develop module and not on a secondary display. Not to mention I would like to enjoy the high-end display in other software. So should I go for the workstation card instead of a DeckLink? And to return to my previous question: how can an 8-bit monitor handle a 10-bit signal?
4. Is it worth it to purchase a 10-bit monitor and not use it with a 10-bit capable graphics card? I have a GTX 1080 and a GT 640 used for display at the moment.
5. Would you recommend buying a cheaper Quadro for 10-bit display and keeping the 1080 for GPU-demanding software?
6. I have an X-Rite i1 Pro. Can I use it for 10-bit monitor calibration?

I am sorry for so many questions. I just started to learn color editing. I would really appreciate it if I could get answers to at least some of them.

Posted on 2019-07-31 02:38:40
Jeff Stubbers

Hi Koncz Péter,
1. Yes, you could use any 10-bit photo viewer to view a saved photo in 10-bit (provided you have a 10-bit video output and monitor too). If you have Lightroom, perhaps you also have Photoshop? If so, Photoshop can display 10-bit per channel.
2. Yes, you can use an 8-bit monitor, and a 10-bit monitor in an extended desktop. Showing 8-bit on 8-bit monitor, and 10-bit on 10-bit monitor (provided you have 10-bit per channel video out, and 10-bit software).
3. There is no problem using a 10-bit per channel monitor for editing and normal Windows desktop use, as well as for color-accurate viewing work; a secondary monitor is not necessary. However, Lightroom will not display in 10-bit per channel (at least currently, until Adobe changes that). If you only want to use one monitor, I would recommend the Quadro graphics card rather than an I/O card like a DeckLink, since I/O cards only display the photo you are editing and are blank when you are not editing. An 8-bit monitor handles a 10-bit signal just fine - it just won't display beyond 8-bit colors.
4. Unless you have a 10-bit per channel video output, there is no benefit to a 10-bit monitor, as the video output would be the color bottleneck, only sending 8 bits' worth of colors to the 10-bit monitor. The 10-bit monitor would only display the 8-bit colors that it is told to display by the 8-bit video output.
5. I recommend against mixing graphics card types. While it may work at first, it tends to inevitably end up causing stability issues as driver updates occur.
6. We have not tested the xrite i1 Display Pro device, but I presume that was made to color calibrate any type of monitor. It's probably best to ask that company that question. I hope this helps!

Posted on 2019-07-31 16:11:02
Koncz Péter

Hi Jeff,

Thank you for your answer. Actually, 10-bit may be overkill for me, but NEC and EIZO monitors, which have high color fidelity, are usually 10-bit as well, so since it happens to be 10-bit, I would like to use this feature.
5. What Quadro video card would you recommend for the 2 QHD monitors (one 8-bit and one 10-bit)? Is the P, or the older M or K architecture, suitable too? 4GB or 8GB of video memory?
6. FYI, NEC provides SpectraView II software with a calibration device that looks identical to the X-Rite i1, so I guess that will work with 10-bit too.

Posted on 2019-08-04 20:48:11
Jeff Stubbers

5. A single Quadro P2000 5GB (or higher) should be able to support (2) QHD (2560x1440) monitors (8-bit or 10-bit) just fine with its DisplayPort outputs.

Posted on 2019-08-05 13:06:46
Koncz Péter

Thank you Jeff!
I just bought the P2000. However, Jon H below has a good point. Nvidia started to support 10-bit on GeForce cards too from July 2019; the Studio driver is available for the Titan, 20, 16, and 10 series. So what is the advantage of the Quadro now? Does it have a better video signal or something like that? I heard the error rate is better, but does that really count when I am not doing CGI animation?

Posted on 2019-08-11 03:34:43
Jeff Stubbers

At this point, the main benefit I can see of the Quadro graphics cards is higher quality. Due to additional testing they receive at the factory, we have seen they tend to last longer before failure, which can more than pay for themselves in the long run. Additionally, the higher-end Quadro cards have error correcting ECC VRAM which can be beneficial in calculations that have to be as accurate as possible.

Posted on 2019-08-12 13:05:51
Jon H

Perhaps pertinent, broader NVidia support: https://www.anandtech.com/s...

Posted on 2019-08-04 20:36:54
Jeff Stubbers

As noted at the top of this article highlighted in blue, Nvidia has updated the GeForce graphics card "Studio" driver to enable 10-bit per channel (30-bit) color. So an Nvidia Quadro or 10-bit per channel I/O card should no longer be necessary to enable 30-bit (10-bit per channel) color.
Again, Nvidia's article link: https://www.nvidia.com/en-u...

Posted on 2019-08-20 17:27:57
Arnab Chatterjee

I have an old laptop with an NVIDIA GeForce 920M 2GB card. (Not looking to upgrade as yet)
I wanted to add either the BenQ sw240 or sw2700pt as I am an avid hobbyist as of now, but intend to edit properly for landscape prints.
I agree mostly I'll be posting online, so a 100% sRGB monitor might suffice. But I do want to print and learn further on the workflow so I can take my skills to the next level.
Hence the inclination towards AdobeRGB monitors.

I read about NVIDIA's update on the Studio drivers across GeForce product line as well (the 30 bit support thingy).. but I think they skipped the 9xxM lineup.
It doesn't show any drivers for any 9xxM series card on the download pages.
I see however, my card has support for OpenGL 4.6. Does this mean anything?
Will it support these BenQs?
By the way, I am using a Fuji X-T3 and Capture One, so any specific advice on how this setup might work out is welcome.

Any help appreciated.
Cheers and very helpful article.

Posted on 2020-01-04 16:13:39
Jeff Stubbers

Hi Arnab,
Yes, I believe Nvidia only intends to update the Studio drivers for their 10-series and newer GeForce graphics cards, and not updating their older card drivers to include 10-bit per channel. So with the older generation graphics card, I don't believe you will be able to utilize a 30-bit workflow with Photoshop. I'm sorry. :/

Posted on 2020-01-06 16:16:13
Arnab Chatterjee

Thanks a ton for the prompt reply. Happy New Year!! Cheers!!

Posted on 2020-01-06 16:24:14
Levente Csillag

Hello. I am looking to buy my first photo monitor. I use a Canon 7D with kit lenses. Software I use: Lightroom and DxO Optics Pro 3. I plan to buy a GTX 1660 Super video card for casual gaming, but my main focus is photo editing. I am not a pro at all, just an enthusiast. I am looking at the BenQ SW series (SW2700PT, SW270C) and the PD2795Q, and maybe a few Dell UltraSharps. I don't want overkill. I just slightly edit my photos to bring up the contrast, add more vibrancy, etc. Only the basics. Do I still need to buy a pro monitor if I want to buy a Spyder calibration tool? I have been looking for months to get the best for my needs. Any help would be appreciated. Thank you. Levente

Posted on 2020-09-11 12:50:29
Jeff Stubbers

Unfortunately, Adobe Lightroom does not offer a full 10-bit per channel workflow support at this time. However, you can make a feature request vote with Adobe at this link to have them include it in the future:

Posted on 2020-09-11 17:48:42