
Custom ACES IDT for Resolve

I was recently involved in a discussion about creating IDTs for color grading in ACEScct in Resolve 14 when there is no built-in IDT, or when the existing IDT isn't ideal.

Here's the process that I worked out: 

When no IDT is specified for the project or the clip, Resolve expects the data to be in linear gamma and the AP0 gamut. An input LUT can be applied before Resolve translates the clip into the ACEScct working space, so a LUT that translates the camera data into linear / AP0 is a proper replacement for an IDT.

This can easily be achieved in LUTCalc (https://cameramanben.github.io/LUTCalc/). To test this theory I took a clip, set the default SLog3 / S-Gamut3.Cine IDT, and took a screen grab for reference. I then created a custom LUT with those settings in LUTCalc, set the clip's IDT to None, and added the new LUT as the clip's input LUT.

As seen below, the resulting image matches the reference image from the original IDT, suggesting that the two operations are equivalent.

This now opens the opportunity to make additional customizations to this input LUT to taste, or to create a LUT that matches the specifics of a unique camera.
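
For those curious what such a LUT boils down to numerically, here is a minimal C sketch that writes an equivalent .cube file using the same SLog3 decoding curve and S-Gamut3.Cine to AP0 matrix as the DCTL script further down. This is not LUTCalc's actual output, and the 33-point cube size is just an assumption; a real grading LUT may use a denser cube.

/* Minimal sketch, not LUTCalc's output: write a .cube LUT mapping
 * SLog3 / S-Gamut3.Cine code values to linear / AP0. The math is the
 * same as in the DCTL script below; the 33-point size is an assumption. */
#include <math.h>
#include <stdio.h>

static const float xform[9] = {
     0.6387886672f,  0.2723514337f,  0.0888598991f,
    -0.0039159060f,  1.0880732309f, -0.0841573249f,
    -0.0299072021f, -0.0264325799f,  1.0563397820f
};

/* Sony SLog3 decoding: normalized code value -> scene-linear value */
static float slog3_to_linear(float v)
{
    if (v >= 171.2102946929f / 1023.0f)
        return powf(10.0f, (v * 1023.0f - 420.0f) / 261.5f) * (0.18f + 0.01f) - 0.01f;
    return (v * 1023.0f - 95.0f) * 0.01125000f / (171.2102946929f - 95.0f);
}

int main(void)
{
    const int N = 33; /* assumed cube size */
    printf("TITLE \"SLog3 S-Gamut3.Cine to linear AP0\"\n");
    printf("LUT_3D_SIZE %d\n", N);

    /* .cube ordering: red index varies fastest, then green, then blue */
    for (int b = 0; b < N; b++)
        for (int g = 0; g < N; g++)
            for (int r = 0; r < N; r++) {
                float lin[3] = {
                    slog3_to_linear((float)r / (N - 1)),
                    slog3_to_linear((float)g / (N - 1)),
                    slog3_to_linear((float)b / (N - 1))
                };
                float out[3];
                for (int i = 0; i < 3; i++)
                    out[i] = xform[3 * i + 0] * lin[0]
                           + xform[3 * i + 1] * lin[1]
                           + xform[3 * i + 2] * lin[2];
                printf("%.6f %.6f %.6f\n", out[0], out[1], out[2]);
            }
    return 0;
}

The resulting .cube file can be loaded as the clip's input LUT in the same way as the one exported from LUTCalc.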

Based on this article, the other option is to create a DCTL script: http://acescentral.com/t/adding-idts-to-resolve/161/2. A DCTL script has the advantage of being precise math rather than an interpolated lookup table; the code in a DCTL script matches the precision of the math the built-in IDTs use.

Converting the Sony SLog3 IDT to a DCTL file, placing it in the LUT folder, and using it in place of the IDT or input LUT also creates an equivalent image. In fact it creates an exact match when using a reference wipe, whereas the input LUT yields minor variations, presumably due to the less precise interpolation or LUTCalc using slightly different input values.

Note that DCTL scripts require the studio version of Resolve.

// SLog3 / S-Gamut3.Cine DCTL for ACES IDT replacement
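// S-Gamut3.Cine to ACES AP0 (ACES2065-1) primaries matrix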
__CONSTANT__ float color_xform[9] =
{
   0.6387886672f,  0.2723514337f,  0.0888598991f,
  -0.0039159060f,  1.0880732309f, -0.0841573249f,
  -0.0299072021f, -0.0264325799f,  1.0563397820f
};

__DEVICE__ float slog3_to_linear(float v) {
  float result;

  if(v >= 171.2102946929f / 1023.0f)
  {
    result = _powf(10.0f,(v*1023.0f-420.0f)/261.5f)*(0.18f+0.01f)-0.01f;
  }
  else
  {
    result = (v*1023.0f-95.0f)*0.01125000f/(171.2102946929f-95.0f);
  }

  return result;
}

__DEVICE__ float3 transform(int p_Width, int p_Height, int p_X, int p_Y, float p_R, float p_G, float p_B)
{
  // Convert from SLog3 to Linear
  float3 linear;

  linear.x = slog3_to_linear(p_R);
  linear.y = slog3_to_linear(p_G);
  linear.z = slog3_to_linear(p_B);

  // Convert from S-Gamut3.Cine to AP0
  float3 aces;

  aces.x = color_xform[0]*linear.x + color_xform[1]*linear.y + color_xform[2]*linear.z;
  aces.y = color_xform[3]*linear.x + color_xform[4]*linear.y + color_xform[5]*linear.z;
  aces.z = color_xform[6]*linear.x + color_xform[7]*linear.y + color_xform[8]*linear.z;

  return aces;
}

Here is the same frame with all three different methods: the built-in IDT, the input LUT, and the DCTL script.

Resolve and Color Checker Video Passport

A recent conversation on cinematography.net inspired me to work out a better technique for calibrating footage with the Color Checker Video Passport. I hadn't taken the time to fully understand the arrangement of the individual color chips until Adam Wilt's explanation made it click.

Here's a quick and dirty clip recorded on my Sony F3 in S-Log under mixed lighting conditions:

The top has a shiny black, a 40% IRE gray, and a bright white target. On the bottom, the top row of chips aligns with the vectorscope targets (the big aha moment) and the second row holds different skin tone targets.

This is how the clip looks in Resolve when imported as is:

For this experiment I set up a few quick nodes: a couple of garbage mattes that let us isolate individual parts of the target on the scopes for an easier workflow, and a last node that carries all of the adjustments:

 

For step 1 we use a curves adjustment to offset the S-Log, bringing the white and black chips into their legal ranges and setting the middle gray around 40% IRE. Once the ranges are sitting properly, the individual RGB curves are decoupled to dial in the white balance on the RGB parade:
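
Numerically, this first step boils down to a levels-style remap plus a gamma pivot. The sketch below is not how Resolve's curves work internally, and the chip readings in it are placeholders you would read off your own waveform, but it shows the idea of pinning black and white and then bending the middle until the gray chip lands near 40 IRE.

/* Hypothetical sketch of step 1: pin the black and white chips to their
 * target levels (0 and 1 here for simplicity) and solve a gamma that puts
 * the gray chip near 40 IRE. Not Resolve's internals; the chip readings
 * below are placeholders. */
#include <math.h>
#include <stdio.h>

static float levels(float v, float in_black, float in_white, float gamma)
{
    float n = (v - in_black) / (in_white - in_black);   /* normalize */
    if (n < 0.0f) n = 0.0f;
    if (n > 1.0f) n = 1.0f;
    return powf(n, gamma);                               /* bend the middle */
}

int main(void)
{
    /* Placeholder waveform readings of the chart chips (0..1 = 0..100 IRE) */
    float black_chip = 0.12f, gray_chip = 0.38f, white_chip = 0.65f;

    /* Solve the gamma that lands the gray chip at 0.40 after the remap */
    float n_gray = (gray_chip - black_chip) / (white_chip - black_chip);
    float gamma = logf(0.40f) / logf(n_gray);

    printf("gray chip after remap: %.3f\n",
           levels(gray_chip, black_chip, white_chip, gamma));
    return 0;
}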

 

Step 2 moves on to the color calibration. Switching the garbage matte to the top row of chroma chips brings up the star pattern on the vectorscope nicely. On the right is the RGB parade, which is impossible to interpret for this purpose...

Because the white balance was already dialed in with the curves, the color vectors are almost spot on. A small hue rotation of 3 degrees and some extra saturation refine the settings:
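
For reference, here is a hedged sketch of what a small hue rotation plus a saturation boost amount to mathematically, expressed as a rotation and scaling of the chroma plane in Rec.709 Y'CbCr. Resolve's hue and saturation controls are not necessarily implemented this way; the 3 degrees comes from the adjustment above, while the 1.1 saturation gain and the sample color are made up.

/* Sketch of a hue rotation and saturation gain as a rotation/scale of the
 * chroma plane in Rec.709 Y'CbCr. Not Resolve's implementation. */
#include <math.h>
#include <stdio.h>

static void hue_sat(float r, float g, float b,
                    float hue_deg, float sat, float out[3])
{
    /* Rec.709 RGB -> Y'CbCr */
    float y  =  0.2126f * r + 0.7152f * g + 0.0722f * b;
    float cb = (b - y) / 1.8556f;
    float cr = (r - y) / 1.5748f;

    /* Rotate the chroma vector by hue_deg and scale it by sat */
    float a   = hue_deg * 3.14159265f / 180.0f;
    float cb2 = sat * (cb * cosf(a) - cr * sinf(a));
    float cr2 = sat * (cb * sinf(a) + cr * cosf(a));

    /* Back to RGB */
    out[0] = y + 1.5748f * cr2;
    out[2] = y + 1.8556f * cb2;
    out[1] = (y - 0.2126f * out[0] - 0.0722f * out[2]) / 0.7152f;
}

int main(void)
{
    float out[3];
    hue_sat(0.6f, 0.3f, 0.2f, 3.0f, 1.1f, out);   /* made-up sample color */
    printf("%.4f %.4f %.4f\n", out[0], out[1], out[2]);
    return 0;
}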

 

Lastly, switching to the last garbage matte, which isolates just the skin tone chips, and turning on the skin tone indicator on the vectorscope confirms that the skin tones are sitting perfectly:

 

Here is the final color checker with all adjustments:

From this clip we could now export a 3D LUT to apply to the project or to selected clips, or the correction could be copied onto a group pre-clip node to apply it to all clips shot under the same lighting conditions and camera settings.

Recreating Sky

On a recent grade I was faced with a sizeable number of clips that had a blown-out sky and needed to be made to look good. If the sky is just peeking through in a few places, bringing down the exposure and adding some color may be enough. But if the sky is prominent in the shot, the lack of any texture will be glaring.

For one clip I went down a more complicated path, and it was worth it: it was the one clip the client called out as beautiful during review.

This is the final clip, nicely highlighting the parrot in full color:

This is what the original footage looked like:

 

This type of work is beyond what can easily be done with Resolve and its effects, so I used Fusion Connect to bring the clip into VFX software where it's easier to layer different parts together. The first step was to put a luma keyer on it to isolate the blown-out sky:
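
Conceptually the luma keyer builds a soft matte that is 1 where the frame is blown out and 0 elsewhere. Here is a minimal sketch of that idea; the threshold values are placeholders rather than the settings used in Fusion.

/* Minimal sketch of a luma key: soft matte that is 1 where the image is
 * blown out and 0 elsewhere. The 0.85/0.95 thresholds are placeholders. */
#include <stdio.h>

static float luma_key(float r, float g, float b, float low, float high)
{
    float y = 0.2126f * r + 0.7152f * g + 0.0722f * b;   /* Rec.709 luma */
    float t = (y - low) / (high - low);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return t * t * (3.0f - 2.0f * t);                     /* smoothstep edge */
}

int main(void)
{
    printf("sky matte for a blown-out pixel: %.3f\n",
           luma_key(1.0f, 0.98f, 0.97f, 0.85f, 0.95f));
    return 0;
}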

 

Then I used the DaySky tool, which can create a natural-looking sky from a date and latitude/longitude. But on its own that is just a blue sky with a color gradient toward the horizon. For a bit more realism I threw in some fast noise to create moving clouds, did some color tweaking, and merged it with the keyed clip:
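
The merge at the end is just a matte-weighted mix: the generated sky replaces the frame where the luma key matte is 1, and the original pixels stay where it is 0. A minimal sketch of that mix (not Fusion's Merge tool itself, and the pixel values are made up):

/* Sketch of the final merge: sky where the luma key matte is 1, original
 * foreground where it is 0. */
#include <stdio.h>

static void merge_over(const float fg[3], const float sky[3],
                       float matte, float out[3])
{
    for (int i = 0; i < 3; i++)
        out[i] = fg[i] * (1.0f - matte) + sky[i] * matte;
}

int main(void)
{
    float fg[3]  = {1.00f, 0.98f, 0.97f};   /* blown-out original pixel */
    float sky[3] = {0.35f, 0.55f, 0.85f};   /* generated sky pixel */
    float out[3];
    merge_over(fg, sky, 0.9f, out);
    printf("%.3f %.3f %.3f\n", out[0], out[1], out[2]);
    return 0;
}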

 

A little color and exposure matching in Resolve, a tracked vignette on the main bird, and things look a lot better...