
Texture Color Space and Lighting

Started by
2 comments, last by silikone 4 years, 5 months ago

I'm struggling to find information on the correct way to encode textures with the appropriate color space. While literature suggests using a full gamut-agnostic pipeline, or more specifically, ACEScg, I seek a less sophisticated solution that directly works with sRGB, since it is adequate for my needs.

The confusion stems from the notion of sRGB textures. While it makes intuitive sense to encode color values non-linearly with an opto-electronic transfer function, the same cannot be said about the gamut. Suppose a texture representing the albedo of a surface is a perfectly white reflector: all lighting interacting with the surface should appear unchanged after bouncing off it. Such a surface would have a flat spectral response, and indeed, an all-white 255,255,255 texture behaves this way in a standard renderer. However, this implies that the texture isn't actually sRGB encoded, since the equal-energy white point sits shifted towards red within the sRGB color space, which instead uses the spectrally non-equal white point of D65. Conversely, if we try to represent a perfectly saturated red light in sRGB, the best we can get is a desaturated red with an orange tint, as the CIE diagram shows. Yet the rendered behavior suggests that a correspondingly red texture does represent a color beyond the sRGB triangle: a desaturated red light bouncing off a desaturated red surface should produce an even more desaturated color, but clearly we can encode a fully saturated texture that purely reflects red light.
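As a quick sanity check on the white-point mismatch described above, here is a minimal Python sketch. It uses the standard IEC 61966-2-1 linear-sRGB-to-XYZ matrix (rounded to four decimals) to show that linear sRGB white (1,1,1) lands on the D65 chromaticity, not on equal-energy E:

```python
# Standard linear-sRGB -> CIE XYZ matrix (D65 white, IEC 61966-2-1).
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def srgb_linear_to_xyz(rgb):
    """Multiply a linear RGB triple by the sRGB->XYZ matrix."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_XYZ]

X, Y, Z = srgb_linear_to_xyz([1.0, 1.0, 1.0])
s = X + Y + Z
x, y = X / s, Y / s
print(f"chromaticity of sRGB white: x={x:.4f}, y={y:.4f}")
# D65 is (0.3127, 0.3290); equal-energy E would be (0.3333, 0.3333).
```

So a flat (1,1,1) albedo, interpreted as sRGB, nominally describes a D65-tinted reflector, even though renderers treat it as spectrally flat.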

Where this could pose a problem is texture authoring. If one were to procedurally generate a texture using the sRGB color matrix values, any color outside the gamut would produce negative components and get clamped. That alone is not necessarily a big deal, but as mentioned previously, these clamped values behave like fully saturated pigments during lighting calculations, which is obviously incorrect under all circumstances.

TL;DR: A naïve interpretation of sRGB textures does not seem to make sense. Is there a color space for accurately encoding/capturing/generating texture data when using an sRGB workflow in-engine?


I'm one of the people who just shrugs and calls it a day with the sRGB primaries (after decoding the sRGB colors to linear)... But sounds like you want to convert from those linear RGB values to XYZ color space maybe, and then perform spectral rendering? Or something like this https://radiance-online.org:447/radiance-workshop1/cd/Ward/PicturePerfect.pdf

Hodgman said:

I'm one of the people who just shrugs and calls it a day with the sRGB primaries (after decoding the sRGB colors to linear)... But sounds like you want to convert from those linear RGB values to XYZ color space maybe, and then perform spectral rendering? Or something like this https://radiance-online.org:447/radiance-workshop1/cd/Ward/PicturePerfect.pdf

Spectral rendering sounds like overkill, but working with the XYZ (or CIE RGB?) color space seems to make a lot of sense. After all, its white point is illuminant E, which is exactly what I have come to expect of texture encoding. I've so far just assumed that everything was specified in sRGB terms.
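One caveat worth noting: componentwise "light times albedo" reflection is basis-dependent, so switching the working space changes the result. A small sketch, with arbitrary illustrative colors and the standard sRGB-to-XYZ matrix, makes the disagreement concrete (neither basis is spectrally exact, which is Hodgman's point about spectral rendering):

```python
# Componentwise reflection in linear sRGB vs. CIE XYZ gives different answers.
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def mat_mul(matrix, vec):
    return [sum(m * c for m, c in zip(row, vec)) for row in matrix]

light_srgb = [1.0, 0.0, 0.0]   # pure sRGB red light (illustrative pick)
albedo_srgb = [0.0, 1.0, 0.0]  # pure sRGB green surface (illustrative pick)

# In linear sRGB, the green surface reflects none of the red light.
product_srgb = [l * a for l, a in zip(light_srgb, albedo_srgb)]

# In XYZ, the same multiplication yields a non-black result.
product_xyz = [l * a for l, a in zip(mat_mul(SRGB_TO_XYZ, light_srgb),
                                     mat_mul(SRGB_TO_XYZ, albedo_srgb))]

print(product_srgb)  # all zeros
print(product_xyz)   # non-zero components: the two bases disagree
```

Only a spectral product is unambiguous; any tristimulus space is an approximation of it.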

Several charts of common materials and their associated RGB values for use in PBR can be found roaming around the internet. I'm curious as to how these were derived.

This topic is closed to new replies.
