This article explains the color spaces relevant to Vegas, and the steps to take to make sure the correct color space conversions are occurring. Three different color spaces are relevant for Vegas:
Y'CbCr is the format used by DV, HDV, SDI, and MPEG-2.
In Y'CbCr, Y' is the luma component, which represents/approximates brightness. Cb and Cr are the color difference components.
The legal range for Y' (assuming 8-bit) is 16-235. Values outside this range provide headroom for illegal values (with 0 and 255 reserved for synchronization purposes). This headroom avoids clipping illegal values, which can be useful if you play around with them in post (i.e. bring them back into legal range).
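As a rough sketch of how luma and the color-difference components relate to R'G'B', here is the 8-bit Rec. 601 variant common to DV and MPEG-2 (illustrative only; see the footnotes at the end for other variants):

```python
def rgb_to_ycbcr_601(r, g, b):
    """Convert 8-bit studio-range R'G'B' (16-235) to Y'CbCr, Rec. 601 variant.

    Illustrative sketch only; other Y'CbCr variants use different coefficients.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: weighted sum approximating brightness
    cb = 128 + 0.564 * (b - y)             # blue color-difference component
    cr = 128 + 0.713 * (r - y)             # red color-difference component
    return y, cb, cr
```

White (235, 235, 235) yields Y' = 235 with both color-difference components at their neutral value of 128, matching the legal range described above.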
Note that Vegas does not work with Y'CbCr values directly. It first needs to convert the values into an R'G'B' format; the two relevant ones are described below. Unfortunately, there is no scheme for indicating which color space is in use, so programs cannot tell which color space they are being given.
Computer R'G'B' is an R'G'B' format with no headroom for illegal/out-of-range values. Black level is at 0, and white level is at 255. This color space is suitable for display.
Studio R'G'B' is also an R'G'B' format. Black level is at 16, and white level is at 235. This color space can be useful, since it retains many of the useful illegal values from Y'CbCr signals.
Studio RGB is not a good color space for display; the colors will look incorrect, washed out, and/or lacking in contrast.
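The two R'G'B' levels conventions differ only by a scale and an offset. A minimal sketch of the 8-bit mapping (the helper names are mine, not a Vegas API):

```python
def studio_to_computer(v):
    """Map a studio R'G'B' code value (black 16, white 235) to computer R'G'B' (black 0, white 255)."""
    return (v - 16) * 255.0 / 219.0

def computer_to_studio(v):
    """Map a computer R'G'B' code value (0-255) to studio R'G'B' (16-235)."""
    return v * 219.0 / 255.0 + 16.0
```

Black and white levels map onto each other exactly (studio 16 to computer 0, studio 235 to computer 255), and the two functions are inverses of each other.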
Y'CbCr can decode to either computer R'G'B' or studio R'G'B', depending on which decompressor is being used. The Vegas 6+ default codecs decode Y'CbCr signals to studio R'G'B'. This is good if you want to keep more of the information in the original signal; converting to computer R'G'B' will clip the illegal values.
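To see why the studio R'G'B' path is less lossy, consider a superwhite luma sample Y' = 240, above the white level of 235 but within the signal's headroom. A simplified, grayscale-only sketch (real decoders also handle Cb/Cr; the function names are mine):

```python
def decode_y_to_studio(y):
    """Decode an 8-bit luma value straight across to studio R'G'B' levels (no rescaling)."""
    return float(y)

def decode_y_to_computer(y):
    """Decode an 8-bit luma value to computer R'G'B' levels: rescale 16-235 onto 0-255, then clip."""
    v = (y - 16) * 255.0 / 219.0
    return min(max(v, 0.0), 255.0)
```

Decoding to studio levels keeps the sample at 240, where a color corrector can still pull it back into legal range; decoding to computer levels maps it to about 260.8 and clips it to 255, destroying that detail.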
When encoding R'G'B' to Y'CbCr, the encoder will expect either computer R'G'B' or studio R'G'B' levels, depending on the encoder. Some codecs, such as Vegas' DV codec, expect studio R'G'B' levels. If this is the case, all elements should be converted to studio R'G'B' levels. If you bring in still images, convert them via the "computer RGB to studio RGB" color corrector preset.
When encoding to most web streaming formats, the encoder will want to see computer R'G'B' levels. If you are working with studio R'G'B' levels (e.g. working with DV video), then you should convert your levels to computer R'G'B'. One way of doing this is by nesting your .veg and applying the "studio RGB to computer RGB" color corrector preset.
As you can see, you may need to manually wrangle all your color space conversions in Vegas, depending on the situation. Vegas does not do this for you. (It should but it doesn't.)
The following table lists some of the common codecs and what levels they expect to work in:
|  | In 8-bit project | In 32-bit project (Vegas 8+) |
| --- | --- | --- |
| Codecs that decode to and expect studio R'G'B' levels | DV, HDV | |
| Codecs that decode to and expect computer R'G'B' levels | Still images (e.g. JPEG) | |

In Vegas 9, you can use the 32-bit floating point (video levels) mode, in which case you don't have to worry about differences between 8-bit and 32-bit (video levels) projects.
Suppose you have DV, HDV, and JPEG images in your timeline. For 8-bit projects, I recommend converting everything to studio RGB levels.
Looking at the table above, we see that DV and HDV both decode to studio RGB levels in 8-bit projects. We do not do anything to these media. We see that JPEG images decode to computer RGB levels. Since we want studio RGB levels, we will apply the "computer RGB to studio RGB" Color Corrector (or Levels) FX preset to all our JPEG images. One way to do this is to select JPEG images in the timeline, have the Video FX window open (Alt + 8), and drag the preset from the Video FX window onto selected clips/events on the timeline.
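The arithmetic behind that preset can be sanity-checked outside Vegas. A sketch with NumPy, assuming the preset is a plain scale-and-offset on 8-bit values (the function name is mine, not part of Vegas):

```python
import numpy as np

def computer_rgb_to_studio_rgb(img):
    """Remap an 8-bit computer R'G'B' image (black 0, white 255) to studio levels (16-235).

    Emulates the "computer RGB to studio RGB" preset as a plain scale-and-offset;
    Vegas's actual implementation may differ in rounding details.
    """
    out = img.astype(np.float32) * 219.0 / 255.0 + 16.0  # white 255 -> 235, black 0 -> 16
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

Applied to a JPEG decoded as a NumPy array, this lifts black to 16 and pulls white down to 235, so the still image sits at the same levels as the DV and HDV clips.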
For previewing, the Video Preview window will be inaccurate. The Video Preview window expects/wants to see computer RGB levels. However, it is receiving studio RGB levels. So, the image it displays will be incorrect. For accurate monitoring, preview through a DV/firewire device to an external monitor. To send the video to the camera, the material has to be compressed into DV. The default DV codec (the Sony Vegas DV codec) expects studio RGB levels. We are feeding studio RGB levels, so we don't have to do anything.
When rendering the final project, we have to check to see what codec we're rendering to and what levels it expects.
This is like the example above, except we convert our events into computer RGB levels in the timeline. We leave still image formats like JPEG alone, and we apply "studio RGB to computer RGB" Color Corrector FX presets to DV and HDV clips.
For previewing, we use the Video Preview window and not the external monitor.
When rendering to particular formats, we may need to nest the project and apply a "computer RGB to studio RGB" Color Corrector FX preset.
* There are variations upon the Y'CbCr format. For the purposes of this article, you do not need to know about them. Y'CbCr formats can differ in their luma coefficients and in whether they use full-scale components/levels. In an NLE environment, the appropriate conversions for these differences are performed automatically; you do not need to worry about them.
* Y'CbCr formats also differ in their intended primary chromaticities (SMPTE C, EBU, or Rec. 709). What this means is that they assume that the viewer's display uses a particular shade/color of red, green, and blue. These differences are in many cases ignored in practice. The real world is color inaccurate to begin with (one reason is metamerism), so humans tend not to notice color inaccuracies.