
"HD" versus "SD" color space

This is a technical primer on the differences between commonly-used color spaces and practical issues surrounding these differences.

There is no single HD or SD color space; there are actually many different HD and SD color spaces. We'll begin by looking at the two most common video standards in use today- ITU-R BT.601 (Rec. 601) and ITU-R BT.709 (Rec. 709)- as the most common HD and SD formats derive from them. The two main differences between the Rec. 601 and Rec. 709 standards are [A] the luma coefficients (and corresponding scale factors) and [B] the primaries.

Luma coefficients

In both Rec. 601 and Rec. 709, R'G'B' values are converted into Y'CbCr values. The Y' component in Y'CbCr approximates the brightness information while the Cb and Cr chroma components approximate the color information. The formula for forming Y' is as follows:

Rec. 601 Y' = 0.299 R' + 0.587 G' + 0.114 B'

Rec. 709 Y' = 0.2126 R' + 0.7152 G' + 0.0722 B'

Luma coefficients refer to the numbers in front of R', G', and B'. Notice that they are different for Rec. 601 and 709. This means that the same input R'G'B' values will lead to different Y'CbCr values depending on whether Rec. 601 or 709 numbers are used. The choice of luma coefficients also affects the scale factors used (not shown).
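
To make the difference concrete, here is a minimal sketch in Python (the function names are mine, not from any standard) computing Y' from the same R'G'B' triplet with both sets of coefficients:

    def luma_rec601(r, g, b):
        """Y' from non-linear R'G'B' (0.0-1.0) using Rec. 601 coefficients."""
        return 0.299 * r + 0.587 * g + 0.114 * b

    def luma_rec709(r, g, b):
        """Y' from non-linear R'G'B' (0.0-1.0) using Rec. 709 coefficients."""
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    # The same saturated color produces noticeably different luma values:
    print(luma_rec601(1.0, 0.0, 0.0))  # 0.299
    print(luma_rec709(1.0, 0.0, 0.0))  # 0.2126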

When converting between Rec. 601 and 709 (or sending 601/709-encoded values down a 709/601 signal path), a color matrix should be applied so that the resulting R'G'B' values are (practically) the same. Color inaccuracy occurs when such a conversion is not applied where it should have been. When this happens, achromatic colors like grey and white stay the same, but saturated colors shift in hue and saturation. Certain saturated colors may also be pushed outside R'G'B' gamut and clipped. Numerically speaking, the difference is very large. In practice, the difference is subtle and usually goes unnoticed, partly because only saturated colors are affected and typical scenes contain few strongly saturated colors.
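
For illustration, here is a sketch of the correct re-matrixing operation: decode Y'CbCr to R'G'B' with the coefficients it was encoded with, then re-encode with the target set. The function names and normalized value ranges (Y' in 0..1, Cb/Cr in -0.5..+0.5) are my own choices, and 8-bit quantization is omitted:

    def encode(r, g, b, kr, kb):
        """R'G'B' -> Y'CbCr using luma coefficients kr, kb (kg = 1 - kr - kb)."""
        y = kr * r + (1.0 - kr - kb) * g + kb * b
        cb = 0.5 * (b - y) / (1.0 - kb)  # scale factor follows from kb
        cr = 0.5 * (r - y) / (1.0 - kr)  # scale factor follows from kr
        return y, cb, cr

    def decode(y, cb, cr, kr, kb):
        """Y'CbCr -> R'G'B', the inverse of encode()."""
        r = y + 2.0 * (1.0 - kr) * cr
        b = y + 2.0 * (1.0 - kb) * cb
        g = (y - kr * r - kb * b) / (1.0 - kr - kb)
        return r, g, b

    REC601 = (0.299, 0.114)    # (kr, kb)
    REC709 = (0.2126, 0.0722)

    # Re-matrix a 601-encoded pixel for a 709 signal path:
    rgb = decode(0.5, 0.2, -0.1, *REC601)
    y709 = encode(*rgb, *REC709)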

This subtlety explains why consumer TV manufacturers can get away with handling the luma coefficients incorrectly. By omitting the appropriate color matrix, they can cut corners and reduce cost. But this practice is definitely wrong and noticeably impairs image quality (most noticeably where highly saturated colors are being clipped).

In professional post production, this is usually not an issue, as most hardware and software handle it correctly. Nonetheless, it is worth checking: some software contains bugs, and some cameras can be set up to encode with different luma coefficients (a dangerous setting in my opinion!). Send color bars through the signal path to check that the signal comes out correctly.

Controversy

At the time Rec. 709 was being formed, some opposed the idea of changing the luma coefficients. Had the coefficients been kept the same, we would have avoided the mess we have now whenever a signal is decoded with the wrong coefficients. The benefits of changing the luma coefficients, on the other hand, are negligible in practice. The change reduces the extent of some chroma subsampling errors for particular color combinations; for other color combinations, it makes those errors worse. Many video engineers don't even think there is a problem with chroma subsampling in the first place and believe that it is visually lossless (it is not; my article on chroma subsampling explains and shows why). My opinion is that the change in luma coefficients has hurt video quality far more than it has improved it.

In any case, we will have to live with the different sets of luma coefficients between Rec. 601 and 709.

Primaries

Primaries refer to the exact "color" or "shade" of red, green, and blue used by a system. Color is specified objectively in CIE 1931 x and y co-ordinates. There is an obvious need to specify color objectively, as there are many different shades of red, green, and blue.

The three sets of primaries in common use today are usually referred to as EBU, SMPTE C, and Rec. 709. All modern HD formats use Rec. 709 primaries (some obsolete HD formats do not), while the standard for SD is either EBU or SMPTE C, depending on the country. There are also the original NTSC primaries, which are obsolete. Why aren't the NTSC primaries in use today?

When the NTSC standards were developed, the designers envisioned a wide gamut system where the primaries are much more saturated/pure than what we have now. The more saturated primaries allow a greater range of highly saturated colors to be reproduced. It is theoretically ideal to make the primaries as saturated/pure as possible so that the widest range of colors can be reproduced.

One downside to wide-gamut systems is that the overall luminance of the display is lower. Luminance can be increased by making the primaries less saturated. For this reason, consumer TV manufacturers ignored the NTSC standard in order to make displays brighter. Early consumer TVs were fairly dim, so this might have been a reasonable compromise.

Later on, realizing that a production standard was needed, SMPTE created the "SMPTE C" standard. These primaries were derived from the Conrac CRT monitors commonly used for reference monitoring at the time. After that, the EBU created their own standard to reflect changes in CRT phosphors. The EBU primaries are the standard for PAL countries*.

The Rec. 709 set of primaries is a (silly) political compromise between the EBU and SMPTE C primaries: EBU red and blue were adopted, along with a green halfway between the EBU and SMPTE C greens.
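
For reference, here are the CIE 1931 xy chromaticity coordinates of these primary sets (the published values from the respective standards, quoted here from memory, so double-check against the standards before relying on them). All except NTSC 1953 (Illuminant C) use a D65 white point:

                 Red              Green            Blue
    NTSC (1953)  (0.67, 0.33)     (0.21, 0.71)     (0.14, 0.08)
    EBU          (0.64, 0.33)     (0.29, 0.60)     (0.15, 0.06)
    SMPTE C      (0.630, 0.340)   (0.310, 0.595)   (0.155, 0.070)
    Rec. 709     (0.64, 0.33)     (0.30, 0.60)     (0.15, 0.06)

Note how Rec. 709 shares red and blue with EBU, while its green sits between the EBU and SMPTE C greens.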

HD <--> SD Conversions in practice

Ideally, conversions between HD and SD formats would take into account the different primaries of the systems involved. But if such a conversion were performed, the color bars on a master tape would no longer be correct in the destination format. For post production facilities, the color bars would need to be relaid onto the new tape so that it passes quality control checks. Performing this color space conversion would therefore be a liability for post houses: if they forget to redo the color bars (user error happens), the tape will not pass quality control (the first thing checked is that the color bars line up) and they could even lose a client over the screwup. And there is little benefit to doing these color space conversions correctly- honestly, nobody will notice- so post houses have little incentive to perform them (if they even know about the issue).
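
For the curious, here is a sketch of how such a primaries-aware conversion is typically derived: build the linear-RGB-to-XYZ matrix for each set of primaries, then concatenate source and destination matrices. Note that this math applies to linear-light RGB; real R'G'B' signals would first need the transfer function (gamma) removed. The helper names are mine:

    import numpy as np

    def rgb_to_xyz_matrix(primaries, white):
        """Build a 3x3 linear RGB -> XYZ matrix from xy chromaticities."""
        # Columns of P are the XYZ of each primary at unit luminance (Y = 1).
        P = np.array([[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]).T
        xw, yw = white
        W = np.array([xw / yw, 1.0, (1.0 - xw - yw) / yw])
        # Scale each primary so that R = G = B = 1 reproduces the white point.
        S = np.linalg.solve(P, W)
        return P * S

    D65 = (0.3127, 0.3290)
    SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
    REC709 = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

    # Linear SMPTE C RGB -> XYZ -> linear Rec. 709 RGB:
    M_smptec = rgb_to_xyz_matrix(SMPTE_C, D65)
    M_709 = rgb_to_xyz_matrix(REC709, D65)
    smptec_to_709 = np.linalg.inv(M_709) @ M_smptec
    print(np.round(smptec_to_709, 4))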

Critical monitoring in practice

The most popular monitors for high-end reference monitoring are the Sony BVM series CRT monitors**, which use SMPTE C phosphors (they conform to SMPTE C primaries). These monitors are not ideal for HD monitoring as modern HD formats (in other words, not the obsolete 1035i formats) call for Rec. 709 primaries. There is sometimes a disconnect between what the standards call for and what actually happens in practice.


*I believe (but am not certain) that the de facto standard in Japan is the EBU primaries. I have never read the standards for Japan, so I don't know the correct answer.
**This article is dated. It was written before December 2008.


