For the current television standards, there are limits on which colors can and can't be displayed. The standards limit how saturated/pure colors can be, and those limits are determined by what each standard defines red, green, and blue to be. The standards were never intended to handle reds more saturated than the standardized red.
To get a feel for super-saturated colors, pick up any CD, DVD, or other optical disc. Reflect light off it and look at the rainbow patterns in the disc. The disc's surface acts as a diffraction grating that breaks the reflected light up into pure wavelengths; you cannot get more saturated colors than these. The pure colors stand out even more when the object reflected in the disc is black.
Gamut refers to the subset of all possible colors that a system can display. Wide gamut refers to systems that can display colors more saturated than what typical/older systems can handle.
The designers of the original NTSC standard chose a set of primaries that were much more saturated than those used today. Unfortunately, TV manufacturers sacrificed the wider gamut to make the picture brighter, as TVs back then were fairly dim. So, practice diverged from what the standard called for. Eventually, a new set of primaries was standardized based on the Conrac monitors that were popular in post houses at the time; these are commonly referred to as SMPTE C. Later, the EBU made its own standard, which reflected the improved phosphors available at that time.
When HD came about, yet another set of standard primaries was created in the Rec. 709 standard. The end result was a political compromise: the EBU red and blue were adopted, with a green halfway between the EBU and SMPTE C greens.
To recap, there are currently three standard sets of primaries: SMPTE C, EBU, and Rec. 709. In practice, I would argue that it is practical to pretend there are no differences between the three (see the article on "HD" versus "SD" color space).
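For reference, here are the (x, y) chromaticity coordinates of the three sets as published in SMPTE RP 145, EBU Tech 3213, and ITU-R BT.709. A quick look at the numbers shows how small the differences are:

```python
# (x, y) chromaticity coordinates of the three standard primary sets.
# All three share the D65 white point (0.3127, 0.3290).
PRIMARIES = {
    "SMPTE C":  {"R": (0.630, 0.340), "G": (0.310, 0.595), "B": (0.155, 0.070)},
    "EBU":      {"R": (0.640, 0.330), "G": (0.290, 0.600), "B": (0.150, 0.060)},
    "Rec. 709": {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)},
}

# Rec. 709 red and blue match EBU exactly; its green sits between
# EBU's (0.290, 0.600) and SMPTE C's (0.310, 0.595).
for name, prims in PRIMARIES.items():
    print(f"{name:9s} {prims}")
```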
It is interesting to note that wide gamut is an old idea that was abandoned. It could make a comeback.
Recall that in typical television standards, the color gamut is defined by what the standard primaries (red, green, and blue) are. In digital cinema, a different set of primaries is used: XYZ. These primaries are designed such that any visible color can be represented. They are so saturated that they can even describe colors that don't exist (displaying them would require negative light).
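As a rough illustration, here is the standard matrix that takes linear-light Rec. 709 RGB into CIE XYZ (the actual digital cinema encoding, X'Y'Z', adds a gamma step that this sketch omits):

```python
import numpy as np

# Standard matrix: linear-light Rec. 709 RGB -> CIE XYZ (D65 white point).
M_709_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

red_709 = np.array([1.0, 0.0, 0.0])   # pure Rec. 709 red, linear light
print(M_709_TO_XYZ @ red_709)          # -> [0.4124 0.2126 0.0193]

# Every visible color has non-negative XYZ values, but not every
# non-negative XYZ triple is a visible color. That's why XYZ can cover
# anything a projector (or the eye) could ever show.
```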
In digital cinema, the projector limits which colors can actually be displayed. The digital cinema standards allow different projectors to be used, and projectors vary in the range of colors they can reproduce. The colors in each digital cinema master are converted (fudged) into colors that the specific projector can actually reproduce, so the color the audience sees varies with the projector model and the algorithm used to convert color. The obvious downside is that audiences watching the same film may see different colors. The advantage of this setup is that the system is not confined by current projection technology: any advance in projection technology will let theatres move to a much larger color palette.
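To make the "fudging" concrete, here is a deliberately naive sketch, assuming a hypothetical projector whose native primaries happen to match Rec. 709. Real devices use more sophisticated gamut-mapping algorithms, which is exactly why audiences can see different results:

```python
import numpy as np

# XYZ -> linear RGB for a hypothetical projector whose native primaries
# happen to match Rec. 709 (the inverse of the matrix above).
XYZ_TO_RGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def map_to_projector(xyz):
    rgb = XYZ_TO_RGB @ xyz
    # Components below 0 or above 1 are colors this projector can't
    # physically produce. A hard clip is the crudest possible "fudge";
    # real gamut-mapping algorithms are smarter, and differences between
    # them are one reason different audiences may see different colors.
    return np.clip(rgb, 0.0, 1.0)

# An out-of-gamut XYZ color: red clips to 1.0, green clips to 0.0.
print(map_to_projector(np.array([0.5, 0.2, 0.05])))
```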
Nowadays, some of the newer LCD technologies make it easier to implement wide gamut. Different backlighting and color filters can be used to make more saturated colors. LCDs can also be pretty bright so it's not as crucial to sacrifice wide gamut for higher brightness. Some manufacturers like Sony are working towards implementing wide gamut in the home.
I must admit I'm not 100% sure how these systems will work. The media players and TVs will need to be capable of wide gamut, and they will need some degree of backwards compatibility with equipment that does not support wide gamut. xvYCC color encoding will likely be used. It is like Rec. 709 encoding, except that negative values are used to represent colors that lie outside the standard color gamut.
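Here is a minimal sketch of the idea, using the Rec. 709 luma coefficients (the exact xvYCC quantization details are more involved than this):

```python
# Rec. 709 luma coefficients. xvYCC reuses the same equations but lets
# R'G'B' fall outside [0, 1] to reach colors outside the Rec. 709 gamut.
KR, KB = 0.2126, 0.0722

def rgb_to_ycbcr(r, g, b):
    y  = KR * r + (1 - KR - KB) * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

# A red more saturated than Rec. 709 red: small negative green and blue.
# The resulting Cr exceeds the nominal +/-0.5 range; xvYCC encodes such
# values using code words that plain Rec. 709 leaves unused.
print(rgb_to_ycbcr(1.0, -0.05, -0.02))
```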
Right now, there is virtually no wide gamut-encoded content available, yet Sony and others are shipping TVs that are capable of wide gamut. To show off these sets, they take normal material and intentionally oversaturate the colors to demonstrate the TV's ability to display very saturated colors. The signal processing avoids oversaturating 'memory' colors (especially flesh tones), where we would easily notice that the colors are off.
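I don't know what processing the manufacturers actually use, but the idea might look something like this hypothetical sketch: boost saturation everywhere except near skin-tone hues. The skin_hue and hue_width values here are made-up placeholders.

```python
import colorsys

def showroom_boost(r, g, b, gain=1.3, skin_hue=0.07, hue_width=0.08):
    """Boost saturation, rolling the boost off near skin-tone hues.
    skin_hue and hue_width are made-up placeholder values."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # protect is 1.0 at the skin-tone hue, falling to 0.0 outside the
    # protected band (hue wraparound is ignored to keep the sketch short).
    protect = max(0.0, 1.0 - abs(h - skin_hue) / hue_width)
    boosted = s * (gain - (gain - 1.0) * protect)
    return colorsys.hsv_to_rgb(h, min(1.0, boosted), v)
```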
I don't think wide gamut will be a huge improvement in image quality. Real world scenes rarely have super saturated colors. Statistically speaking, most of the colors in our world are closer to being achromatic (no saturation / 'colorless') than being extremely saturated. It is rare to run into situations where the limited gamut of current systems rears its head.
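You can check this claim yourself. Here is a rough sketch, assuming the Pillow imaging library, that tallies a saturation histogram for an image; typical photos pile up in the low-saturation bins:

```python
import colorsys
from PIL import Image  # assumes the Pillow library is installed

def saturation_histogram(path, bins=10):
    """Tally HSV saturation across an image's pixels."""
    img = Image.open(path).convert("RGB").resize((200, 200))
    counts = [0] * bins
    for r, g, b in img.getdata():
        _, s, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        counts[min(int(s * bins), bins - 1)] += 1
    return counts  # index 0 = nearly achromatic, last index = very saturated
```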
If you take things the other way, we can almost get away with limiting the color gamut even further. The generation of MacBook Pros with LED backlighting has an extremely limited color gamut (a wider gamut was sacrificed for brightness/battery life) that is noticeably smaller than typical gamuts. I have one, and in normal use you usually don't notice that the gamut is deficient.
I'm not that excited about wide gamut, and on the downside, implementing it will cause some chaos.