#include <optix_types.h>
Public Attributes

| Type                   | Member                        |
|------------------------|-------------------------------|
| OptixDenoiserAlphaMode | denoiseAlpha                  |
| CUdeviceptr            | hdrIntensity                  |
| float                  | blendFactor                   |
| CUdeviceptr            | hdrAverageColor               |
| unsigned int           | temporalModeUsePreviousLayers |
float OptixDenoiserParams::blendFactor

Blend factor. If set to 0, the output is 100% of the denoised input; if set to 1, the output is 100% of the unmodified input. Values between 0 and 1 linearly interpolate between the denoised and unmodified input.
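The blending rule amounts to a per-pixel linear interpolation. A minimal sketch of that rule as described above (illustrative only; the function and variable names are not part of the OptiX API):

```cpp
// Illustrative per-pixel effect of blendFactor:
//   output = (1 - blendFactor) * denoised + blendFactor * unmodified input
float blendPixel(float denoised, float noisyInput, float blendFactor)
{
    return (1.0f - blendFactor) * denoised + blendFactor * noisyInput;
}
```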
OptixDenoiserAlphaMode OptixDenoiserParams::denoiseAlpha

Alpha denoise mode. Selects how the alpha channel of the input image is handled during denoising (see OptixDenoiserAlphaMode).
CUdeviceptr OptixDenoiserParams::hdrAverageColor

This parameter is used when the OPTIX_DENOISER_MODEL_KIND_AOV model kind is set. It points to three floats holding the average log color of the input image, one value per RGB channel (default: null pointer). With the default (null pointer), denoised results will not be optimal.
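A minimal sketch of how this pointer is typically filled, assuming a denoiser, stream, noisy color image, and scratch buffer have already been created (error handling omitted; all names other than the OptiX/CUDA calls are illustrative, and the optixDenoiserComputeAverageColor signature is assumed to be denoiser, stream, input image, output, scratch, scratch size):

```cpp
#include <optix.h>
#include <cuda_runtime.h>

// Sketch: compute the average log color of the noisy input image on the device
// and reference the three-float result from OptixDenoiserParams::hdrAverageColor.
OptixDenoiserParams makeParamsWithAverageColor(OptixDenoiser       denoiser,
                                               CUstream            stream,
                                               const OptixImage2D& noisyColor,
                                               CUdeviceptr         scratch,
                                               size_t              scratchSize)
{
    CUdeviceptr avgColor = 0;
    cudaMalloc(reinterpret_cast<void**>(&avgColor), 3 * sizeof(float));

    // Writes three floats (one per RGB channel) to 'avgColor'.
    optixDenoiserComputeAverageColor(denoiser, stream, &noisyColor,
                                     avgColor, scratch, scratchSize);

    OptixDenoiserParams params = {};
    params.hdrAverageColor = avgColor;  // only relevant for the AOV model kind
    return params;
}
```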
CUdeviceptr OptixDenoiserParams::hdrIntensity

Points to a single float holding the average log intensity of the input image (default: null pointer). With the default (null pointer), denoised results will not be optimal for very dark or very bright input images.
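A minimal sketch of how this pointer is typically filled each frame, under the same assumptions as the previous example (error handling omitted; names other than the OptiX/CUDA calls are illustrative, and the optixDenoiserComputeIntensity signature is assumed to mirror optixDenoiserComputeAverageColor):

```cpp
#include <optix.h>
#include <cuda_runtime.h>

// Sketch: compute the average log intensity of the noisy input image on the
// device and reference the single-float result from OptixDenoiserParams::hdrIntensity.
OptixDenoiserParams makeParamsWithIntensity(OptixDenoiser       denoiser,
                                            CUstream            stream,
                                            const OptixImage2D& noisyColor,
                                            CUdeviceptr         scratch,
                                            size_t              scratchSize)
{
    CUdeviceptr intensity = 0;
    cudaMalloc(reinterpret_cast<void**>(&intensity), sizeof(float));

    // Writes a single float to 'intensity' on the device.
    optixDenoiserComputeIntensity(denoiser, stream, &noisyColor,
                                  intensity, scratch, scratchSize);

    OptixDenoiserParams params = {};
    params.hdrIntensity = intensity;  // recompute/update this each frame
    return params;
}
```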
unsigned int OptixDenoiserParams::temporalModeUsePreviousLayers

In temporal modes this parameter must be set to 1 if the previous layers (e.g. previousOutputInternalGuideLayer) contain valid data. This is the case in the second and subsequent frames of a sequence. In the first frame of such a sequence (for example, after a change of camera angle) this parameter must be set to 0.
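A minimal sketch of how this flag is commonly driven by a frame counter that restarts whenever a new sequence begins (illustrative only; the function name and frame counter are not part of the OptiX API):

```cpp
#include <optix_types.h>

// Sketch: in a temporal denoising loop, 'frameIndex' restarts at 0 whenever a
// new sequence begins (e.g. after a camera cut), so the previous layers are
// only marked valid from the second frame of the sequence onward.
OptixDenoiserParams makeTemporalParams(unsigned int frameIndex)
{
    OptixDenoiserParams params = {};
    params.temporalModeUsePreviousLayers = (frameIndex == 0) ? 0u : 1u;
    return params;
}
```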