
HDR Scattering and Tone Mapping

https://catlikecoding.com/unity/tutorials/custom-srp/hdr/
https://64.github.io/tonemapping/

1 High Dynamic Range
Up to this point when rendering a camera we've done so in low dynamic range—LDR for short—which is the default. This means that each color channel is represented with a value that's clamped to 0–1. In this mode (0,0,0) represents black and (1,1,1) represents white. Although our shaders could produce results outside this range, the GPU clamps the colors while storing them, as if we used saturate at the end of every fragment function.

You can use the frame debugger to check the type of the render target of each draw call.
The target of a normal camera is described as B8G8R8A8_SRGB.
This means that it is an RGBA buffer with eight bits per channel, so 32 bits per pixel.
Also, the RGB channels are stored in sRGB color space.
As we are working in linear color space the GPU automatically converts between both spaces when reading from and writing to the buffer.
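As an aside, the two sRGB transfer functions involved can be sketched in plain Python (illustrative only, not part of the tutorial's project):

```python
def srgb_to_linear(s):
    # sRGB decoding (IEC 61966-2-1): what the GPU does when reading
    # an sRGB buffer while shading in linear space.
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    # sRGB encoding: what the GPU does when writing to the buffer.
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1.0 / 2.4) - 0.055

# Linear mid-gray is stored as a much larger sRGB value; conversely,
# displaying raw linear data as if it were sRGB looks too dark.
print(round(linear_to_srgb(0.5), 3))  # ≈ 0.735
```

This asymmetry is also why intermediate linear results viewed as-is appear darker than the final image.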

Once rendering is finished the buffer is sent to the display, which interprets it as sRGB color data.

The maximum of 1 per color channel works fine as long as light intensities do not exceed it.
But the intensity of incoming light has no inherent upper bound.
The sun is an example of an extremely bright light source, which is why you shouldn't look at it directly. Its intensity is far greater than we can perceive before our eyes get damaged. But many regular light sources also produce light with an intensity that can exceed the limits of the observer, especially when observed up close. To work correctly with such intensities we have to render to high-dynamic-range—HDR—buffers, which support values greater than 1.

1.1 HDR reflection probes
HDR rendering requires HDR render targets. This doesn't apply only to regular cameras; it is also true for reflection probes. Whether a reflection probe contains HDR or LDR data can be controlled via its HDR toggle option, which is enabled by default.
Reflection probe with HDR enabled.

When a reflection probe uses HDR it can contain high-intensity colors, which are mostly specular reflections that it captured. You can observe them indirectly via the reflections they cause in a scene. Imperfect reflections weaken the probe’s colors, which makes HDR values stand out.


1.2 HDR cameras
Cameras also have an HDR configuration option, but it doesn’t do anything on its own. It can be set to either Off or Use Graphics Settings.

Camera HDR depending on graphics settings.

The Use Graphics Settings mode only indicates that the camera allows HDR rendering. Whether this happens is up to the RP. We’ll control that by adding a toggle to CustomRenderPipelineAsset to allow HDR, passing it to the pipeline constructor.

[SerializeField]
	bool allowHDR = true;
	
	…

	protected override RenderPipeline CreatePipeline () {
		return new CustomRenderPipeline(
			allowHDR, useDynamicBatching, useGPUInstancing, useSRPBatcher,
			useLightsPerObject, shadows, postFXSettings
		);
	}

Let CustomRenderPipeline keep track of it and pass it to the camera renderer along with the other options.

bool allowHDR;

	…

	public CustomRenderPipeline (
		bool allowHDR, …
	) {
		this.allowHDR = allowHDR;
		…
	}

	protected override void Render (
		ScriptableRenderContext context, Camera[] cameras
	) {
		foreach (Camera camera in cameras) {
			renderer.Render(
				context, camera, allowHDR,
				useDynamicBatching, useGPUInstancing, useLightsPerObject,
				shadowSettings, postFXSettings
			);
		}
	}

CameraRenderer then keeps track of whether HDR should be used, which is when both the camera and the RP allow it.

bool useHDR;

	public void Render (
		ScriptableRenderContext context, Camera camera, bool allowHDR,
		bool useDynamicBatching, bool useGPUInstancing, bool useLightsPerObject,
		ShadowSettings shadowSettings, PostFXSettings postFXSettings
	) {
		…
		if (!Cull(shadowSettings.maxDistance)) {
			return;
		}
		useHDR = allowHDR && camera.allowHDR;
		…
	}

HDR allowed.

1.3 HDR render textures
HDR rendering only makes sense in combination with post processing, because we cannot change the final frame buffer format. So when we create our own intermediate frame buffer in CameraRenderer.Setup we'll use the default HDR format when appropriate, instead of the regular default, which is LDR.

buffer.GetTemporaryRT(
				frameBufferId, camera.pixelWidth, camera.pixelHeight,
				32, FilterMode.Bilinear, useHDR ?
					RenderTextureFormat.DefaultHDR : RenderTextureFormat.Default
			);

The frame debugger will show that the default HDR format is R16G16B16A16_SFloat, which means it's an RGBA buffer with 16 bits per channel, so 64 bits per pixel, double the size of the LDR buffer. In this case each value is a signed 16-bit float in linear space, not clamped to 0–1.
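To get a feel for the per-channel precision of this format, here is a small illustrative Python sketch (not part of the project) that round-trips values through a 16-bit IEEE half float, the per-channel type of R16G16B16A16_SFloat:

```python
import struct

def to_half(x):
    # Pack a float into 16-bit IEEE half precision and unpack it again.
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_half(0.5))      # small values survive exactly
print(to_half(2049.0))   # rounds to 2048.0: roughly 3 decimal digits of precision
print(to_half(65504.0))  # the largest finite half-float value
```

The limited precision at large magnitudes is one reason to be careful with very bright values, as the tone mapping section later does by clamping colors.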

Can we use different render texture formats?
Yes, but you have to make sure that your target platform supports it. For this tutorial we stick with the default HDR format, which will always work.

When stepping through the draw calls you will notice that the scene appears darker than the final result.
This happens because those intermediate results are stored in the HDR texture. It looks dark because the linear color data gets displayed as-is, thus incorrectly interpreted as sRGB.
1.4 HDR post processing
At this point the result doesn’t look any different than before, because we’re not doing anything with the expanded range and it gets clamped once we render to an LDR target. Bloom might appear a bit brighter, but not much because colors get clamped after the pre-filtering pass. We also have to perform post processing in HDR to take full advantage of it. So let’s pass along whether HDR is used when invoking PostFXStack.Setup in CameraRenderer.Render.

postFXStack.Setup(context, camera, postFXSettings, useHDR);

Now PostFXStack can also keep track of whether it should use HDR.

bool useHDR;

	…

	public void Setup (
		ScriptableRenderContext context, Camera camera, PostFXSettings settings,
		bool useHDR
	) {
		this.useHDR = useHDR;
		…
	}

And we can use the appropriate texture format in DoBloom.

RenderTextureFormat format = useHDR ?
			RenderTextureFormat.DefaultHDR : RenderTextureFormat.Default;

The difference between HDR and LDR bloom can be dramatic or unnoticeable, depending on how bright the scene is.
Often the bloom threshold is set to 1 so only HDR colors contribute to it.
This way the glow indicates colors that are too bright for the display.
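To see what the threshold with its soft knee actually does to a color, here is a small Python port of the shader's ApplyBloomThreshold for a single brightness value, assuming threshold 1 and knee 0.5 (illustrative only):

```python
def apply_bloom_threshold(brightness, threshold=1.0, knee=0.5):
    # Mirrors ApplyBloomThreshold: a quadratic soft knee around the
    # threshold, full contribution far above it, nothing far below.
    t, tk = threshold, threshold * knee
    soft = brightness + tk - t
    soft = min(max(soft, 0.0), 2.0 * tk)
    soft = soft * soft * 0.25 / (tk + 1e-5)
    contribution = max(soft, brightness - t)
    contribution /= max(brightness, 1e-5)
    return brightness * contribution

for b in (0.5, 1.0, 2.0, 4.0):
    print(b, round(apply_bloom_threshold(b), 3))
# 0.5 → 0.0, 1.0 → 0.125, 2.0 → 1.0, 4.0 → 3.0
```

So LDR colors contribute little or nothing, while values well above 1 pass through almost entirely, which is exactly why the glow marks over-bright colors.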

Because bloom averages colors, even a single very bright pixel can end up visually affecting a very large region. You can see this by comparing the pre-filter step with the final result.
Even a single pixel can produce a big circular glow.

HDR bloom pre-filtering step.

1.5 Fighting Fireflies
A downside of HDR is that it can produce small image regions that are much brighter than their surroundings. When these regions are about the size of a pixel or smaller they can drastically change relative size and pop in and out of existence due to movement, which causes flickering. These regions are known as fireflies. When bloom gets applied to them the effect can become stroboscopic.

https://thumbs.gfycat.com/LimpingDesertedGoose-mobile.mp4

Eliminating this problem entirely would require infinite resolution, which isn't possible.
The next best thing we can do is blur the image more aggressively during pre-filtering, to fade out the fireflies. Let's add a toggle option to PostFXSettings.BloomSettings for this.

public bool fadeFireflies;

Add a new pre-filter fireflies pass for this purpose. Once again I won't show adding the pass to the PostFXStack shader and the PostFXStack.Pass enum. Select the appropriate pass for pre-filtering in DoBloom.

Draw(
			sourceId, bloomPrefilterId, bloom.fadeFireflies ?
				Pass.BloomPrefilterFireflies : Pass.BloomPrefilter
		);

The most straightforward way to fade the fireflies is to grow our 2×2 downsample filter of the pre-filtering pass into a large 6×6 box filter. We can do that with nine samples, applying the bloom threshold to each sample individually before averaging. Add the required BloomPrefilterFirefliesPassFragment function for that to PostFXStackPasses.

float4 BloomPrefilterFirefliesPassFragment (Varyings input) : SV_TARGET {
	float3 color = 0.0;
	float2 offsets[] = {
		float2(0.0, 0.0),
		float2(-1.0, -1.0), float2(-1.0, 1.0), float2(1.0, -1.0), float2(1.0, 1.0),
		float2(-1.0, 0.0), float2(1.0, 0.0), float2(0.0, -1.0), float2(0.0, 1.0)
	};
	for (int i = 0; i < 9; i++) {
		float3 c =
			GetSource(input.fxUV + offsets[i] * GetSourceTexelSize().xy * 2.0).rgb;
		c = ApplyBloomThreshold(c);
		color += c;
	}
	color *= 1.0 / 9.0;
	return float4(color, 1.0);
}

But this isn't enough to solve the problem, as the very bright pixels just get spread out over a larger area. To fade the fireflies we'll use a weighted average instead, based on the color's luminance. A color's luminance is its perceived brightness. We'll use the Luminance function for this, defined in the Color HLSL file of the Core Library.

#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Color.hlsl"
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Filtering.hlsl"


float4 BloomPrefilterFirefliesPassFragment (Varyings input) : SV_TARGET {
	float3 color = 0.0;
	float weightSum = 0.0;
	…
	for (int i = 0; i < 9; i++) {
		float3 c =
			GetSource(input.fxUV + offsets[i] * GetSourceTexelSize().xy * 2.0).rgb;
		c = ApplyBloomThreshold(c);
		float w = 1.0 / (Luminance(c) + 1.0);
		color += c * w;
		weightSum += w;
	}
	color /= weightSum;
	return float4(color, 1.0);
}
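To see why the luminance-based weights tame fireflies, consider one very bright sample among eight dark ones. A minimal Python sketch of the weighted average, with the threshold step omitted for simplicity (illustrative only):

```python
def luminance(r, g, b):
    # Rec. 709 weights, matching the Core Library's Luminance function.
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def firefly_average(pixels):
    # Weighted average in which bright outliers receive tiny weights.
    total = [0.0, 0.0, 0.0]
    weight_sum = 0.0
    for c in pixels:
        w = 1.0 / (luminance(*c) + 1.0)
        for i in range(3):
            total[i] += c[i] * w
        weight_sum += w
    return [t / weight_sum for t in total]

# One firefly of intensity 100 among eight black samples:
samples = [(100.0, 100.0, 100.0)] + [(0.0, 0.0, 0.0)] * 8
print(round(firefly_average(samples)[0], 3))  # ≈ 0.124; a plain average gives ≈ 11.1
```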

Because we perform a Gaussian blur after the initial pre-filtering step we can get away with skipping the four samples directly adjacent to the center, reducing the amount of samples from nine to five.

float2 offsets[] = {
		float2(0.0, 0.0),
		float2(-1.0, -1.0), float2(-1.0, 1.0), float2(1.0, -1.0), float2(1.0, 1.0)//,
		//float2(-1.0, 0.0), float2(1.0, 0.0), float2(0.0, -1.0), float2(0.0, 1.0)
	};
	for (int i = 0; i < 5; i++) { … }

This will turn single-pixel fireflies into ×-shape patterns and split single-pixel horizontal or vertical lines into two separate lines in the pre-filtering step, but after the first blur step those patterns are gone.

This doesn’t completely eliminate the fireflies but reduces their strength so much that they are no longer glaringly obvious, unless the bloom intensity is set much higher than 1.

2 Scattering Bloom
Now that we have HDR bloom, let's consider a more realistic application of it. The idea is that cameras aren't perfect. Their lenses don't focus all light correctly. A portion of the light gets scattered over a larger area, somewhat like our bloom effect does currently. The better a camera the less it scatters. The big difference with our additive bloom effect is that scattering doesn't add light, it only diffuses it. Scattering can visually vary from a slight glow to a light haze that veils the entire image.

Eyes also aren’t perfect and light gets scattered inside them in a complex way. It happens with all incoming light, but is only really noticeable when it is bright. For example, it is obvious when looking at a small bright light source against a dark background, like a lantern at night, or the sun’s reflection on a bright day.

Instead of a uniform circular blurry glow we'd see many-pointed asymmetrical star-like patterns with hue shifts, unique to our own eyes. But our bloom effect will represent a featureless camera with uniform scattering.

Bloom caused by scattering in camera.

2.1 Bloom Mode
We’re going to support both classical additive and energy-conserving scattering bloom. Add an enum option for these modes to PostFXSettings.BloomSettings. Also add a 0–1 slider to control how much the light gets scattered.

public enum Mode { Additive, Scattering }

		public Mode mode;

		[Range(0f, 1f)]
		public float scatter;

Scattering mode chosen and set to 0.5.

Rename the existing BloomCombine pass to BloomAdd and introduce a new BloomScatter pass. Make sure that the enum and pass order remain alphabetical. Then use the appropriate pass in DoBloom during the combine phase. In the case of scattering we’ll use the scatter amount for intensity instead of 1. We still use the configured intensity for the final draw.

Pass combinePass;
		if (bloom.mode == PostFXSettings.BloomSettings.Mode.Additive) {
			combinePass = Pass.BloomAdd;
			buffer.SetGlobalFloat(bloomIntensityId, 1f);
		}
		else {
			combinePass = Pass.BloomScatter;
			buffer.SetGlobalFloat(bloomIntensityId, bloom.scatter);
		}
		
		if (i > 1) {
			buffer.ReleaseTemporaryRT(fromId - 1);
			toId -= 5;
			for (i -= 1; i > 0; i--) {
				buffer.SetGlobalTexture(fxSource2Id, toId + 1);
				Draw(fromId, toId, combinePass);
				buffer.ReleaseTemporaryRT(fromId);
				buffer.ReleaseTemporaryRT(toId + 1);
				fromId = toId;
				toId -= 2;
			}
		}
		else {
			buffer.ReleaseTemporaryRT(bloomPyramidId);
		}
		buffer.SetGlobalFloat(bloomIntensityId, bloom.intensity);
		buffer.SetGlobalTexture(fxSource2Id, sourceId);
		Draw(fromId, BuiltinRenderTextureType.CameraTarget, combinePass);

The function for the BloomScatter pass is the same as the one for BloomAdd, except that it interpolates between the high-resolution and low-resolution sources based on intensity, instead of adding them. Thus a scatter amount of zero means that only the lowest (highest-resolution) bloom pyramid level survives the combine chain, while scatter 1 means that only the highest (lowest-resolution) level does. At 0.5 the contributions of successive levels end up as 0.5, 0.25, 0.125, and 0.125 in the case of four levels.
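The contribution of each pyramid level falls out of the cascade of lerps. A small Python sketch (illustrative only) runs the combine chain symbolically, representing each level as a basis vector:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def level_contributions(levels, scatter):
    # Walk the combine chain bottom-up, as DoBloom does, and read off
    # how much each pyramid level contributes to the final result.
    basis = [[1.0 if i == j else 0.0 for j in range(levels)]
             for i in range(levels)]
    result = basis[-1]  # start from the lowest-resolution level
    for i in range(levels - 2, -1, -1):
        result = [lerp(h, l, scatter) for h, l in zip(basis[i], result)]
    return result

print(level_contributions(4, 0.5))  # [0.5, 0.25, 0.125, 0.125]
print(level_contributions(4, 0.0))  # [1.0, 0.0, 0.0, 0.0]: only the base level
```

The weights always sum to 1, which is what makes this variant energy-conserving.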

float4 BloomScatterPassFragment (Varyings input) : SV_TARGET {
	float3 lowRes;
	if (_BloomBicubicUpsampling) {
		lowRes = GetSourceBicubic(input.fxUV).rgb;
	}
	else {
		lowRes = GetSource(input.fxUV).rgb;
	}
	float3 highRes = GetSource2(input.fxUV).rgb;
	return float4(lerp(highRes, lowRes, _BloomIntensity), 1.0);
}

Scattering bloom doesn’t brighten the image. It might appear to darken the above example, but that’s because it only shows a cropped portion of the original. The energy conservation isn’t perfect however, because the Gaussian filter gets clamped to the edge of the image, which means that the contribution of the edge pixels is magnified. We could compensate for that, but won’t because it’s usually not obvious.

3 Tone Mapping
Although we can render in HDR, the final frame buffer is always LDR for regular cameras, so color channels get cut off at 1.
Effectively the white point of the final image is at 1.
Extremely bright colors end up looking no different from those that are merely fully saturated.
For example, I made a scene with multiple light levels and objects with various amounts of emission, far exceeding 1.
The strongest emission is 8 and the brightest light has intensity 200.

Scene without post FX; only realtime lighting.

Without applying any post FX it is hard or even impossible to tell which objects and lights are the very bright ones. We can use bloom to make this obvious. For example, I used threshold 1, knee 0.5, intensity 0.2, and scatter 0.7 with maximum iterations.
The glowing objects are clearly supposed to be bright, but we still don't get a sense of how bright they are relative to the rest of the scene.
To do so we would need to adjust the brightness of the image, increasing its white point, so the brightest colors no longer exceed 1.
We could do that by uniformly darkening the entire image, but that would make most of it so dark that we wouldn't be able to see it clearly. Ideally we adjust the very bright colors a lot while adjusting dark colors only a little. Thus we need a nonuniform color adjustment. This color adjustment doesn't represent a physical change of the light itself, but how it is observed. For example, our eyes are more sensitive to darker tones than lighter ones.

Conversion from HDR to LDR is known as tone mapping, which comes from photography and film development. Traditional photos and film also have a limited range and nonuniform light sensitivity, so many techniques have been developed to perform the conversion. There is no single correct way to perform tone mapping. Different approaches can be used to set the mood of the final result, like a classical filmic look.

3.1 Extra Post FX Step
We perform tone mapping in a new post FX step after bloom. Add a DoToneMapping method to PostFXStack for this purpose, which initially just copies a source to the camera target.

void DoToneMapping(int sourceId) {
		Draw(sourceId, BuiltinRenderTextureType.CameraTarget, Pass.Copy);
	}

We need to adjust the result of bloom, so get a new full-resolution temporary render texture and use it as the final destination in DoBloom. Also make it return whether it drew anything, instead of directly drawing to the camera target when the effect is skipped.

int
		bloomBucibicUpsamplingId = Shader.PropertyToID("_BloomBicubicUpsampling"),
		bloomIntensityId = Shader.PropertyToID("_BloomIntensity"),
		bloomPrefilterId = Shader.PropertyToID("_BloomPrefilter"),
		bloomResultId = Shader.PropertyToID("_BloomResult"),

	…
	
	bool DoBloom (int sourceId) {
		//buffer.BeginSample("Bloom");
		PostFXSettings.BloomSettings bloom = settings.Bloom;
		int width = camera.pixelWidth / 2, height = camera.pixelHeight / 2;
		
		if (
			bloom.maxIterations == 0 || bloom.intensity <= 0f ||
			height < bloom.downscaleLimit * 2 || width < bloom.downscaleLimit * 2
		) {
			//Draw(sourceId, BuiltinRenderTextureType.CameraTarget, Pass.Copy);
			//buffer.EndSample("Bloom");
			return false;
		}
		
		buffer.BeginSample("Bloom");
		…
		buffer.SetGlobalFloat(bloomIntensityId, finalIntensity);
		buffer.SetGlobalTexture(fxSource2Id, sourceId);
		buffer.GetTemporaryRT(
			bloomResultId, camera.pixelWidth, camera.pixelHeight, 0,
			FilterMode.Bilinear, format
		);
		Draw(fromId, bloomResultId, finalPass);
		buffer.ReleaseTemporaryRT(fromId);
		buffer.EndSample("Bloom");
		return true;
	}

3.2 Reinhard

The goal of our tone mapping is to reduce the brightness of the image so that otherwise uniform white regions show a variety of colors, revealing the detail that was lost.
It's like when your eyes adjust to a suddenly bright environment until you can see clearly again.
But we don't want to scale down the entire image uniformly, because that would make darker colors indistinguishable, trading over-brightness for underexposure.

So we need a nonlinear conversion that doesn't reduce dark values much but reduces high values a lot.
At the extremes, zero remains zero and a value that approaches infinity is reduced to 1.
A simple function that accomplishes this is c/(1+c), where c is a color channel. This function is known as the Reinhard tone mapping operation in its simplest form, initially proposed by Erik Reinhard,
except that he applied it to luminance while we'll apply it to each color channel in isolation.

Reinhard tone mapping.

This works, but could go wrong for very large values due to precision limitations.
For the same reason very large values end up at 1 much earlier than infinity:
in c/(1+c), the larger c gets, the faster the result approaches 1.
So let's clamp the color before performing tone mapping.
A limit of 60 avoids any potential issues for all modes that we will support.

color.rgb = min(color.rgb, 60.0);
	color.rgb /= color.rgb + 1.0;
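The resulting curve is easy to tabulate. A quick Python sketch of the clamped per-channel Reinhard operator (illustrative only):

```python
def reinhard(c, limit=60.0):
    # Clamp first, then apply the simple Reinhard curve c / (1 + c).
    c = min(c, limit)
    return c / (1.0 + c)

for c in (0.0, 0.5, 1.0, 4.0, 60.0):
    print(c, round(reinhard(c), 3))
# 0.0 → 0.0, 0.5 → 0.333, 1.0 → 0.5, 4.0 → 0.8, 60.0 → 0.984
```

Note how a color at full LDR white ends up at only 0.5, which is exactly the shortcoming discussed next.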

The article at https://64.github.io/tonemapping/ mentions some disadvantages:
the problem with 'simple' Reinhard tone mapping is that it does not necessarily make good use of the full low dynamic range.
For example, if our maximum scene radiance (which you can think of as a color here) happened to be (1.0, 1.0, 1.0) then the resulting maximum brightness would only be (0.5, 0.5, 0.5), only half of the available range.
How can this be improved? The article proposes an extended Reinhard operator:

C_out = C_in * (1 + C_in / C_white^2) / (1 + C_in)

where C_white is the biggest radiance value in the scene.
Now our biggest radiance value will get mapped to (1.0, 1.0, 1.0), using the full LDR range.

The article also mentions a term that the Catlike Coding tutorial doesn't explain: the white point.
(Note that you can also set C_white to a value lower than the maximum radiance, which ensures that anything brighter gets mapped to (1.0, 1.0, 1.0); for this reason it is sometimes referred to as the 'white point'.)

The article also covers the real Reinhard operator, quoted here:
Reinhard's formulas actually operate on a quantity called luminance rather than on RGB triples.
Luminance is a single scalar value that measures how bright we perceive something. It may not be obvious, but we perceive green as much brighter than blue; in other words, (0.0, 0.7, 0.0) appears much brighter than (0.0, 0.0, 0.7).
Converting a linear RGB triple to a luminance value is easy:

L = 0.2126 R + 0.7152 G + 0.0722 B

What Reinhard's formula entails is converting our linear RGB radiance to luminance, applying tone mapping to the luminance, then scaling our RGB value by the new luminance. The simplest way of doing that final scaling is:

C_out = C_in * (L_out / L_in)

float luminance(vec3 v)
{
    return dot(v, vec3(0.2126f, 0.7152f, 0.0722f));
}

vec3 change_luminance(vec3 c_in, float l_out)
{
    float l_in = luminance(c_in);
    return c_in * (l_out / l_in);
}

vec3 reinhard_extended_luminance(vec3 v, float max_white_l)
{
    float l_old = luminance(v);
    float numerator = l_old * (1.0f + (l_old / (max_white_l * max_white_l)));
    float l_new = numerator / (1.0f + l_old);
    return change_luminance(v, l_new);
}
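A direct Python port of the GLSL above (illustrative only) makes it easy to check the white-point property: a color whose luminance equals max_white_l maps to luminance 1.

```python
def luminance(r, g, b):
    # Rec. 709 luminance weights, as in the GLSL above.
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def reinhard_extended_luminance(v, max_white_l):
    # Tone map the luminance, then rescale the RGB triple to match it.
    l_old = luminance(*v)
    l_new = l_old * (1.0 + l_old / (max_white_l * max_white_l)) / (1.0 + l_old)
    return tuple(c * (l_new / l_old) for c in v)

out = reinhard_extended_luminance((4.0, 4.0, 4.0), 4.0)
print(round(luminance(*out), 6))  # 1.0: the white point maps to full white
```

Because all three channels are scaled by the same factor, hue is preserved, unlike the per-channel variant.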

3.3 Neutral
The white point of Reinhard tone mapping is theoretically infinite, so it also significantly dims colors that are merely somewhat bright. The Neutral mode instead uses the Core Library's NeutralTonemap function; see the ToneMappingNeutral pass in the complete PostFXStackPasses.hlsl listing below.

#ifndef CUSTOM_POST_FX_PASSES_INCLUDED
#define CUSTOM_POST_FX_PASSES_INCLUDED

#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Color.hlsl"
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Filtering.hlsl"

TEXTURE2D(_PostFXSource);
TEXTURE2D(_PostFXSource2);
SAMPLER(sampler_linear_clamp);

float4 _PostFXSource_TexelSize;

float4 GetSourceTexelSize () {
	return _PostFXSource_TexelSize;
}

float4 GetSource(float2 fxUV) {
	return SAMPLE_TEXTURE2D(_PostFXSource, sampler_linear_clamp, fxUV);
}

float4 GetSourceBicubic (float2 fxUV) {
	return SampleTexture2DBicubic(
		TEXTURE2D_ARGS(_PostFXSource, sampler_linear_clamp), fxUV,
		_PostFXSource_TexelSize.zwxy, 1.0, 0.0
	);
}

float4 GetSource2(float2 fxUV) {
	return SAMPLE_TEXTURE2D(_PostFXSource2, sampler_linear_clamp, fxUV);
}

struct Varyings {
	float4 positionCS : SV_POSITION;
	float2 fxUV : VAR_FX_UV;
};

Varyings DefaultPassVertex (uint vertexID : SV_VertexID) {
	Varyings output;
	output.positionCS = float4(
		vertexID <= 1 ? -1.0 : 3.0,
		vertexID == 1 ? 3.0 : -1.0,
		0.0, 1.0
	);
	output.fxUV = float2(
		vertexID <= 1 ? 0.0 : 2.0,
		vertexID == 1 ? 2.0 : 0.0
	);
	if (_ProjectionParams.x < 0.0) {
		output.fxUV.y = 1.0 - output.fxUV.y;
	}
	return output;
}

bool _BloomBicubicUpsampling;
float _BloomIntensity;

float4 BloomAddPassFragment (Varyings input) : SV_TARGET {
	float3 lowRes;
	if (_BloomBicubicUpsampling) {
		lowRes = GetSourceBicubic(input.fxUV).rgb;
	}
	else {
		lowRes = GetSource(input.fxUV).rgb;
	}
	float3 highRes = GetSource2(input.fxUV).rgb;
	return float4(lowRes * _BloomIntensity + highRes, 1.0);
}

float4 BloomHorizontalPassFragment (Varyings input) : SV_TARGET {
	float3 color = 0.0;
	float offsets[] = {
		-4.0, -3.0, -2.0, -1.0, 0.0, 1.0, 2.0, 3.0, 4.0
	};
	float weights[] = {
		0.01621622, 0.05405405, 0.12162162, 0.19459459, 0.22702703,
		0.19459459, 0.12162162, 0.05405405, 0.01621622
	};
	for (int i = 0; i < 9; i++) {
		float offset = offsets[i] * 2.0 * GetSourceTexelSize().x;
		color += GetSource(input.fxUV + float2(offset, 0.0)).rgb * weights[i];
	}
	return float4(color, 1.0);
}

float4 _BloomThreshold;

float3 ApplyBloomThreshold (float3 color) {
	float brightness = Max3(color.r, color.g, color.b);
	float soft = brightness + _BloomThreshold.y;
	soft = clamp(soft, 0.0, _BloomThreshold.z);
	soft = soft * soft * _BloomThreshold.w;
	float contribution = max(soft, brightness - _BloomThreshold.x);
	contribution /= max(brightness, 0.00001);
	return color * contribution;
}

float4 BloomPrefilterPassFragment (Varyings input) : SV_TARGET {
	float3 color = ApplyBloomThreshold(GetSource(input.fxUV).rgb);
	return float4(color, 1.0);
}

float4 BloomPrefilterFirefliesPassFragment (Varyings input) : SV_TARGET {
	float3 color = 0.0;
	float weightSum = 0.0;
	float2 offsets[] = {
		float2(0.0, 0.0),
		float2(-1.0, -1.0), float2(-1.0, 1.0), float2(1.0, -1.0), float2(1.0, 1.0)
	};
	for (int i = 0; i < 5; i++) {
		float3 c =
			GetSource(input.fxUV + offsets[i] * GetSourceTexelSize().xy * 2.0).rgb;
		c = ApplyBloomThreshold(c);
		float w = 1.0 / (Luminance(c) + 1.0);
		color += c * w;
		weightSum += w;
	}
	color /= weightSum;
	return float4(color, 1.0);
}

float4 BloomScatterPassFragment (Varyings input) : SV_TARGET {
	float3 lowRes;
	if (_BloomBicubicUpsampling) {
		lowRes = GetSourceBicubic(input.fxUV).rgb;
	}
	else {
		lowRes = GetSource(input.fxUV).rgb;
	}
	float3 highRes = GetSource2(input.fxUV).rgb;
	return float4(lerp(highRes, lowRes, _BloomIntensity), 1.0);
}

float4 BloomScatterFinalPassFragment (Varyings input) : SV_TARGET {
	float3 lowRes;
	if (_BloomBicubicUpsampling) {
		lowRes = GetSourceBicubic(input.fxUV).rgb;
	}
	else {
		lowRes = GetSource(input.fxUV).rgb;
	}
	float3 highRes = GetSource2(input.fxUV).rgb;
	lowRes += highRes - ApplyBloomThreshold(highRes);
	return float4(lerp(highRes, lowRes, _BloomIntensity), 1.0);
}

float4 BloomVerticalPassFragment (Varyings input) : SV_TARGET {
	float3 color = 0.0;
	float offsets[] = {
		-3.23076923, -1.38461538, 0.0, 1.38461538, 3.23076923
	};
	float weights[] = {
		0.07027027, 0.31621622, 0.22702703, 0.31621622, 0.07027027
	};
	for (int i = 0; i < 5; i++) {
		float offset = offsets[i] * GetSourceTexelSize().y;
		color += GetSource(input.fxUV + float2(0.0, offset)).rgb * weights[i];
	}
	return float4(color, 1.0);
}

float4 CopyPassFragment (Varyings input) : SV_TARGET {
	return GetSource(input.fxUV);
}

float4 ToneMappingACESPassFragment (Varyings input) : SV_TARGET {
	float4 color = GetSource(input.fxUV);
	color.rgb = min(color.rgb, 60.0);
	color.rgb = AcesTonemap(unity_to_ACES(color.rgb));
	return color;
}

float4 ToneMappingNeutralPassFragment (Varyings input) : SV_TARGET {
	float4 color = GetSource(input.fxUV);
	color.rgb = min(color.rgb, 60.0);
	color.rgb = NeutralTonemap(color.rgb);
	return color;
}

float4 ToneMappingReinhardPassFragment (Varyings input) : SV_TARGET {
	float4 color = GetSource(input.fxUV);
	color.rgb = min(color.rgb, 60.0);
	color.rgb /= color.rgb + 1.0;
	return color;
}

#endif
The complete PostFXStack.cs file, for reference:

using UnityEngine;
using UnityEngine.Rendering;

public partial class PostFXStack {

	enum Pass {
		BloomAdd,
		BloomHorizontal,
		BloomPrefilter,
		BloomPrefilterFireflies,
		BloomScatter,
		BloomScatterFinal,
		BloomVertical,
		Copy,
		ToneMappingACES,
		ToneMappingNeutral,
		ToneMappingReinhard
	}

	const string bufferName = "Post FX";

	const int maxBloomPyramidLevels = 16;

	int
		bloomBucibicUpsamplingId = Shader.PropertyToID("_BloomBicubicUpsampling"),
		bloomIntensityId = Shader.PropertyToID("_BloomIntensity"),
		bloomPrefilterId = Shader.PropertyToID("_BloomPrefilter"),
		bloomResultId = Shader.PropertyToID("_BloomResult"),
		bloomThresholdId = Shader.PropertyToID("_BloomThreshold"),
		fxSourceId = Shader.PropertyToID("_PostFXSource"),
		fxSource2Id = Shader.PropertyToID("_PostFXSource2");

	CommandBuffer buffer = new CommandBuffer {
		name = bufferName
	};

	ScriptableRenderContext context;

	Camera camera;

	PostFXSettings settings;

	int bloomPyramidId;

	bool useHDR;

	public bool IsActive => settings != null;

	public PostFXStack () {
		bloomPyramidId = Shader.PropertyToID("_BloomPyramid0");
		for (int i = 1; i < maxBloomPyramidLevels * 2; i++) {
			Shader.PropertyToID("_BloomPyramid" + i);
		}
	}

	public void Setup (
		ScriptableRenderContext context, Camera camera, PostFXSettings settings,
		bool useHDR
	) {
		this.useHDR = useHDR;
		this.context = context;
		this.camera = camera;
		this.settings =
			camera.cameraType <= CameraType.SceneView ? settings : null;
		ApplySceneViewState();
	}

	public void Render (int sourceId) {
		if (DoBloom(sourceId)) {
			DoToneMapping(bloomResultId);
			buffer.ReleaseTemporaryRT(bloomResultId);
		}
		else {
			DoToneMapping(sourceId);
		}
		context.ExecuteCommandBuffer(buffer);
		buffer.Clear();
	}

	bool DoBloom (int sourceId) {
		PostFXSettings.BloomSettings bloom = settings.Bloom;
		int width = camera.pixelWidth / 2, height = camera.pixelHeight / 2;
		
		if (
			bloom.maxIterations == 0 || bloom.intensity <= 0f ||
			height < bloom.downscaleLimit * 2 || width < bloom.downscaleLimit * 2
		) {
			return false;
		}

		buffer.BeginSample("Bloom");
		Vector4 threshold;
		threshold.x = Mathf.GammaToLinearSpace(bloom.threshold);
		threshold.y = threshold.x * bloom.thresholdKnee;
		threshold.z = 2f * threshold.y;
		threshold.w = 0.25f / (threshold.y + 0.00001f);
		threshold.y -= threshold.x;
		buffer.SetGlobalVector(bloomThresholdId, threshold);

		RenderTextureFormat format = useHDR ?
			RenderTextureFormat.DefaultHDR : RenderTextureFormat.Default;
		buffer.GetTemporaryRT(
			bloomPrefilterId, width, height, 0, FilterMode.Bilinear, format
		);
		Draw(
			sourceId, bloomPrefilterId, bloom.fadeFireflies ?
				Pass.BloomPrefilterFireflies : Pass.BloomPrefilter
		);
		width /= 2;
		height /= 2;

		int fromId = bloomPrefilterId, toId = bloomPyramidId + 1;
		int i;
		for (i = 0; i < bloom.maxIterations; i++) {
			if (height < bloom.downscaleLimit || width < bloom.downscaleLimit) {
				break;
			}
			int midId = toId - 1;
			buffer.GetTemporaryRT(
				midId, width, height, 0, FilterMode.Bilinear, format
			);
			buffer.GetTemporaryRT(
				toId, width, height, 0, FilterMode.Bilinear, format
			);
			Draw(fromId, midId, Pass.BloomHorizontal);
			Draw(midId, toId, Pass.BloomVertical);
			fromId = toId;
			toId += 2;
			width /= 2;
			height /= 2;
		}

		buffer.ReleaseTemporaryRT(bloomPrefilterId);
		buffer.SetGlobalFloat(
			bloomBucibicUpsamplingId, bloom.bicubicUpsampling ? 1f : 0f
		);

		Pass combinePass, finalPass;
		float finalIntensity;
		if (bloom.mode == PostFXSettings.BloomSettings.Mode.Additive) {
			combinePass = finalPass = Pass.BloomAdd;
			buffer.SetGlobalFloat(bloomIntensityId, 1f);
			finalIntensity = bloom.intensity;
		}
		else {
			combinePass = Pass.BloomScatter;
			finalPass = Pass.BloomScatterFinal;
			buffer.SetGlobalFloat(bloomIntensityId, bloom.scatter);
			finalIntensity = Mathf.Min(bloom.intensity, 1f);
		}

		if (i > 1) {
			buffer.ReleaseTemporaryRT(fromId - 1);
			toId -= 5;
			for (i -= 1; i > 0; i--) {
				buffer.SetGlobalTexture(fxSource2Id, toId + 1);
				Draw(fromId, toId, combinePass);
				buffer.ReleaseTemporaryRT(fromId);
				buffer.ReleaseTemporaryRT(toId + 1);
				fromId = toId;
				toId -= 2;
			}
		}
		else {
			buffer.ReleaseTemporaryRT(bloomPyramidId);
		}
		buffer.SetGlobalFloat(bloomIntensityId, finalIntensity);
		buffer.SetGlobalTexture(fxSource2Id, sourceId);
		buffer.GetTemporaryRT(
			bloomResultId, camera.pixelWidth, camera.pixelHeight, 0,
			FilterMode.Bilinear, format
		);
		Draw(fromId, bloomResultId, finalPass);
		buffer.ReleaseTemporaryRT(fromId);
		buffer.EndSample("Bloom");
		return true;
	}

	void DoToneMapping(int sourceId) {
		PostFXSettings.ToneMappingSettings.Mode mode = settings.ToneMapping.mode;
		Pass pass = mode < 0 ? Pass.Copy : Pass.ToneMappingACES + (int)mode;
		Draw(sourceId, BuiltinRenderTextureType.CameraTarget, pass);
	}

	void Draw (
		RenderTargetIdentifier from, RenderTargetIdentifier to, Pass pass
	) {
		buffer.SetGlobalTexture(fxSourceId, from);
		buffer.SetRenderTarget(
			to, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store
		);
		buffer.DrawProcedural(
			Matrix4x4.identity, settings.Material, (int)pass,
			MeshTopology.Triangles, 3
		);
	}
}
