Many of the monitors that we sell are capable of displaying in 10-bit. This means that instead of each colour channel being described by 8 bits of information, it is described by 10 bits. With 8 bits you have 256 possible values in each colour channel (red, green and blue), and 256 cubed is about 16.7 million possible colour combinations. In 10-bit there are 1024 values for each channel, which gives you over 1 billion colours. Of course, going from 8-bit to 10-bit does not expand the colour gamut of the monitor; you just get more possible colours within that gamut.
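If you want to check those figures for yourself, the arithmetic is only a couple of lines. Here is a quick sketch in Python (the loop and the print format are just for illustration):

```python
# Bits per colour channel determine how many shades each channel can show,
# and the three channels multiply together to give the total colour count.
for bits in (8, 10):
    shades = 2 ** bits       # values per channel
    colours = shades ** 3    # combinations across red, green and blue
    print(f"{bits}-bit: {shades} shades per channel, {colours:,} colours")
```

Running it prints 16,777,216 colours for 8-bit and 1,073,741,824 for 10-bit, matching the "16.7 million" and "over 1 billion" figures above.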
Is More Better?
More colours sounds like a good idea. It can mean that colours blend more smoothly, since there are more data points, and it is certainly useful for 10-bit video and medical workflows, but I've always been a little sceptical of the benefit for photographic editing. Some customers who have achieved 10-bit have told me the improvement in the on-screen image is negligible, while others have said it is very much worth having.
What you Need
To achieve 10-bit, the whole chain from computer to monitor has to be 10-bit compatible: the operating system, the application, the graphics card, the signal cable and the monitor. If you are a Mac user you can stop reading here, because Mac OS doesn't allow 10-bit video. Apple made some steps towards it in the 10.9 update, but basically it is still 8-bit only. As a Mac user, if I was going to get 10-bit working and see the difference for myself, I was going to have to use our test PC, which runs Windows 7. You will also need to switch to one of the basic Windows themes, because using an Aero theme drops the output back to 8-bit.
Trial and Error
The first major hurdle was the graphics card. The PC had a standard Dell card in it that was only 8-bit, so I looked around and went for an ATI FirePro V3900. I had exchanged some emails with one of our customers who had got 10-bit working (thanks Zoltan), so I knew it might not be as simple as just installing the card, and it wasn't. Putting the card in was easy enough, as was installing the drivers and setting the correct option in the ATI control panel, but when I opened Photoshop CC the option for 30-bit display was greyed out and a test image showed I was still in 8-bit.
So I went through making sure that my previous graphics card driver was deleted (you can find utilities on the net to do this) and that any reference to the old card was removed from the registry. I also installed a beta driver for the card and tried a few other things, but after a very frustrating day I gave up and sent the ATI card back. I'm sure the problem was with my system rather than the card or the driver, but I had simply run out of things to try.
I don't like to be defeated, though, so after a few days I tried again, this time with an nVidia Quadro K600. I again made sure I deleted the old drivers, and worked out how to stop the Windows New Hardware Wizard jumping in and installing generic drivers before I had time to install the nVidia ones. The nVidia manual said the card would automatically use 10-bit when connected to a 10-bit monitor. After the problems with the ATI card I wasn't hopeful, but this time it worked: the test ramp was smooth, with no steps.
Did it Look Better?
Next I looked at some 16-bit test images (if the images had been 8-bit I would not have seen any difference). I looked at about ten different images, then switched off the 30-bit mode in Photoshop CC and looked at the same images again. If there was an improvement it was very, very subtle. I would show you some screenshots, but obviously you wouldn't see the difference in an 8-bit JPEG displayed through a non-10-bit application such as a web browser.
I think if you have a system that is capable of 10-bit, then give it a try, but I don't think I'd ever advise a customer to go out and buy a new graphics card to make it happen unless they had a specific need for 10-bit.
Below is a checklist of the things you'll need if you do want to give it a try: