Site after site is claiming that OS X El Capitan supports 10-bit graphics drivers (my favored NEC displays have supported 10-bit for years, but without 10-bit graphics drivers they have received only 8-bit signals on Macs all these years, even though PCs have driven them at 10-bit). My contacts at NEC are deeply skeptical of 10-bit support in OS X El Capitan. So if 10-bit support is there, Apple at best has done a terrible job of communicating with hardware vendors.
The “proof” all these web sites quote goes beyond ridiculous and speaks to the slovenly state of journalism in the internet age: they all quote one (1) German web site. That site’s “proof” has no legitimacy as far as I can tell. OS X “About” has long displayed erroneous graphics info in some cases (as one example).
Some users claim to see no banding with a gradient on an iMac 5K with Preview. When *I* make a 16-bit gradient TIF and have Preview display it on my NEC PA302W, I get a nasty posterized mangled mess (the PA302W is a 10-bit panel). So much for El Capitan and Preview, at least in general: Preview cannot be trusted to display images correctly (and I say this based on long experience). So a test with Preview carries zero merit in my view.
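To make the banding argument concrete, here is a minimal sketch (not my actual test file) of why a smooth 16-bit gradient posterizes on an 8-bit pipeline but not on a 10-bit one. The panel width of 2560 pixels is an assumption for illustration; the quantization is simple bit truncation.

```python
# Sketch: why a smooth gradient bands on an 8-bit pipeline.
# A 16-bit ramp across a 2560-pixel-wide display (assumed width),
# quantized to 8 bits, yields only 256 distinct levels -> visible steps.

WIDTH = 2560  # assumed panel width in pixels

# 16-bit source gradient: one value per pixel column, 0..65535
ramp16 = [round(x * 65535 / (WIDTH - 1)) for x in range(WIDTH)]

def quantize(values, bits):
    """Keep only the top `bits` bits of each 16-bit value (truncation)."""
    return [v >> (16 - bits) for v in values]

levels8 = len(set(quantize(ramp16, 8)))    # distinct output levels at 8 bits
levels10 = len(set(quantize(ramp16, 10)))  # distinct output levels at 10 bits

print(levels8, WIDTH / levels8)    # 256 levels -> 10-pixel-wide bands
print(levels10, WIDTH / levels10)  # 1024 levels -> 2.5-pixel bands
```

At 8 bits each band is 10 pixels wide, which the eye picks up easily on a smooth ramp; at 10 bits the bands shrink to 2.5 pixels and largely disappear. That is exactly the difference a true 10-bit pipeline is supposed to deliver.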
Here’s what would sway me as to real 10-bit support in OS X El Capitan:
- Formal public APIs documented by Apple (e.g., ones that developers may use). I see no mention in Apple’s OS X El Capitan API change notes.
- I see no mention anywhere in Apple’s OS X summary release notes of 10-bit support.
- Where is the statement by Adobe (the big and relevant player) that 10-bit graphics are there and will be supported (e.g., that there are public APIs making it possible)?
On the first two points, I could have missed an API thing, but I don’t think I could have missed a note in OS X El Capitan developer change summary. Could there be a “backdoor” (private) API that Apple Photos and Apple Preview can use in some cases? Sure, but that is not the same as saying that OS X El Capitan supports 10-bit color. It has to be a public API that all vendors can implement to.
Could MPG be wrong? Of course! Being proven wrong about 10-bit is the best possible outcome for everyone. But facts as per above are what count, not screen shots or quotes of quotes of dubious claims.
Hooray, I was wrong! I like dealing with facts from a reputable source. And here that is, from my contact at Adobe:
Apple added 30-bit support for 10.11. It only works on certain displays and it works better on their 5K displays (even better on the latest gen iMac).
The next update for PS will support 30-bit color on Mac.
Update 02 Nov: I still don’t have full clarity from Adobe on which displays are supported, or even whether 10-bit-capable displays like the NEC PA series will get 30-bit output. Something about dithering when 10-bit is enabled, which would be a huge disappointment.
With Apple displays, it’s not clear whether there is any API for true calibration. So calibration solutions on the market may do a lot better with 10-bit video for faux calibration, but that still would not be true calibration.
* For 8-bit, the bandwidth needed for 4K UltraHD is 3840 × 2160 × 3 × (8/8) × 60 Hz = 1.493 GB/sec. At 10-bit it becomes 3840 × 2160 × 3 × (10/8) × 60 Hz = 1.866 GB/sec = 14.93 Gb/sec. My understanding is that the Mini DisplayPort connector can deliver 20 gigabit/sec (the display bus is separate from the data bus on a Thunderbolt cable), so 10-bit should be just fine on a 3840 × 2160 display.
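The arithmetic in the footnote can be checked directly. This sketch computes raw pixel bandwidth only (it ignores blanking intervals, which add real-world overhead on top of these figures):

```python
# Verify the display-bandwidth arithmetic for 4K UltraHD at 60 Hz.
# Raw pixel data only; blanking intervals are ignored.

width, height, refresh = 3840, 2160, 60  # 4K UltraHD at 60 Hz
channels = 3                             # R, G, B

def bandwidth_gbps(bits_per_channel):
    """Raw pixel bandwidth in gigabits/sec (1 G = 1e9)."""
    bits = width * height * channels * bits_per_channel * refresh
    return bits / 1e9

print(bandwidth_gbps(8))   # ~11.94 Gb/s  (= ~1.493 GB/s)
print(bandwidth_gbps(10))  # ~14.93 Gb/s  (= ~1.866 GB/s)
```

Both results sit comfortably under the ~20 Gb/s figure cited for the Mini DisplayPort connector, which is the point: bandwidth is not the obstacle to 10-bit at 3840 × 2160.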