Unity - HDR Color Intensity simulation in shader lab

Memo


HDR Color in the Standard Shader



Simulating the Operation in a Custom Shader

Checking the algorithm against Unity's reference source, the script in question is the following:

    public float exposureValue
    {
        get { return m_ExposureValue; }
        set
        {
            if (m_ExposureValue == value)
                return;
            m_ExposureValue = value;
            var newRgbFloat = (Color)color * Mathf.Pow(2f, m_ExposureValue);
            m_ColorHdr[(int)RgbaChannel.R] = newRgbFloat.r;
            m_ColorHdr[(int)RgbaChannel.G] = newRgbFloat.g;
            m_ColorHdr[(int)RgbaChannel.B] = newRgbFloat.b;
        }
    }
    [SerializeField] private float m_ExposureValue;
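The core of the setter is a single operation: scale the LDR color linearly by 2^exposure. A minimal standalone sketch of that computation (Python; the function name is my own, not Unity's):

```python
import math

def hdr_from_ldr(color_ldr, exposure):
    """Scale an LDR RGB tuple by 2**exposure, mirroring the line
    newRgbFloat = (Color)color * Mathf.Pow(2f, m_ExposureValue)."""
    factor = math.pow(2.0, exposure)
    return tuple(c * factor for c in color_ldr)

# An exposure of 1 doubles every channel; 0 leaves the color unchanged.
print(hdr_from_ldr((0.5, 0.25, 1.0), 1.0))  # (1.0, 0.5, 2.0)
print(hdr_from_ldr((0.5, 0.25, 1.0), 0.0))  # (0.5, 0.25, 1.0)
```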

Then I implemented the same thing in shader lab:

    // jave.lin :
    // Ports the handling from
    // https://github.com/Unity-Technologies/UnityCsReference/blob/61f92bd79ae862c4465d35270f9d1d57befd1761/Editor/Mono/GUI/ColorMutator.cs#L162
    // into shader lab, for learning purposes
    void GetHDRColor(fixed3 colorLDR, half exposure, out float3 colorHDR)
    {
        colorHDR = colorLDR * pow(2.0, exposure);
    }

The results still differed from Unity's. Empirically, multiplying the exposure by a coefficient of 1.90 makes the output almost identical:

    colorHDR = colorLDR * pow(2.0, exposure * 1.90);
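To get a feel for how much that empirical 1.90 coefficient changes the result, here is a quick side-by-side comparison (Python; the 1.90 value is the article's eyeballed fudge factor, not anything taken from Unity's source):

```python
import math

def hdr(ldr, exposure, k=1.0):
    # k = 1.0 is the straight port of ColorMutator's formula;
    # k = 1.90 is the empirical coefficient found by visual comparison.
    return ldr * math.pow(2.0, exposure * k)

ldr = 0.5
for e in (0.5, 1.0, 2.0):
    print(f"exposure={e}: ported={hdr(ldr, e):.4f}  with k=1.90: {hdr(ldr, e, 1.90):.4f}")
```

The gap grows quickly with exposure, which is why a single multiplicative tweak inside the pow() exponent was enough to bring the two renderings close.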




Original article: blog.csdn.net/linjf520/article/details/122223197