NVIDIA OptiX 7.7
OptixDenoiserParams Struct Reference

#include <optix_types.h>

Public Attributes

OptixDenoiserAlphaMode denoiseAlpha
 
CUdeviceptr hdrIntensity
 
float blendFactor
 
CUdeviceptr hdrAverageColor
 
unsigned int temporalModeUsePreviousLayers
 

Member Data Documentation

 blendFactor

float OptixDenoiserParams::blendFactor

Blend factor. If set to 0, the output is 100% of the denoised input. If set to 1, the output is 100% of the unmodified input. Values between 0 and 1 linearly interpolate between the denoised and unmodified input.
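The interpolation described above can be sketched as a small per-channel helper. This is illustrative only; `blend` is not part of the OptiX API (the denoiser applies this mix internally when `blendFactor` is nonzero):

```cpp
#include <cassert>

// Illustrative sketch of how blendFactor mixes the two images per channel:
// 0.0 -> fully denoised output, 1.0 -> fully unmodified input.
inline float blend(float denoised, float unmodified, float blendFactor)
{
    return (1.0f - blendFactor) * denoised + blendFactor * unmodified;
}
```

A `blendFactor` between 0 and 1 is typically used to retain some of the original noise, e.g. to avoid an over-smoothed look during interactive preview.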

 denoiseAlpha

OptixDenoiserAlphaMode OptixDenoiserParams::denoiseAlpha

Alpha denoise mode.

 hdrAverageColor

CUdeviceptr OptixDenoiserParams::hdrAverageColor

This parameter is used when the OPTIX_DENOISER_MODEL_KIND_AOV model kind is set. Average log color of the input image, stored separately for the R, G, and B channels (default: null pointer). Points to three floats. With the default (null pointer), denoised results will not be optimal.

 hdrIntensity

CUdeviceptr OptixDenoiserParams::hdrIntensity

Average log intensity of the input image (default: null pointer). Points to a single float. With the default (null pointer), denoised results will not be optimal for very dark or very bright input images.
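In practice this value is computed on the device with optixDenoiserComputeIntensity and the resulting device pointer is assigned to hdrIntensity. The host-side sketch below only illustrates the kind of quantity stored there, a mean of log luminances clamped away from zero; the exact formula used by the SDK may differ:

```cpp
#include <cmath>
#include <cstddef>

// Hypothetical host-side illustration of an "average log intensity":
// the mean of log(luminance), with tiny values clamped to avoid log(0).
// The SDK computes the real value on the GPU (optixDenoiserComputeIntensity).
float averageLogIntensity(const float* luminance, std::size_t count)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < count; ++i) {
        const double l = luminance[i] > 1e-8 ? luminance[i] : 1e-8;
        sum += std::log(l);
    }
    return static_cast<float>(sum / static_cast<double>(count));
}
```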

 temporalModeUsePreviousLayers

unsigned int OptixDenoiserParams::temporalModeUsePreviousLayers

In temporal modes this parameter must be set to 1 if previous layers (e.g. previousOutputInternalGuideLayer) contain valid data. This is the case in the second and subsequent frames of a sequence. In the first frame of such a sequence (for example, after a change of camera angle), this parameter must be set to 0.
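A minimal sketch of how a render loop might derive this flag. The frame counter and its reset-on-discontinuity logic are assumptions for illustration, not part of the struct:

```cpp
// Hypothetical helper: previous layers are only valid after the first
// frame of a sequence. The caller is assumed to reset
// frameIndexInSequence to 0 on discontinuities such as camera cuts.
unsigned int usePreviousLayers(unsigned int frameIndexInSequence)
{
    return frameIndexInSequence == 0 ? 0u : 1u;
}
```

In a render loop one would then assign `params.temporalModeUsePreviousLayers = usePreviousLayers(frameIndex);` before invoking the denoiser.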