NVIDIA OptiX 8.0
OptixDenoiserParams Struct Reference

#include <optix_types.h>

Public Attributes

CUdeviceptr hdrIntensity
float blendFactor
CUdeviceptr hdrAverageColor
unsigned int temporalModeUsePreviousLayers

Detailed Description

Member Data Documentation

 blendFactor

float OptixDenoiserParams::blendFactor

Blend factor. If set to 0, the output is 100% of the denoised input. If set to 1, the output is 100% of the unmodified input. Values between 0 and 1 linearly interpolate between the denoised and unmodified input.
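
For illustration, a minimal sketch of setting blendFactor before invoking the denoiser; the denoiser handle, stream, layer, guide layer, and state/scratch buffers are placeholders assumed to have been created elsewhere, and error checking is omitted.

    OptixDenoiserParams params = {};
    params.blendFactor  = 0.25f;  // output = 75% denoised + 25% unmodified input
    params.hdrIntensity = 0;      // null: autoexposure is computed automatically

    // Placeholder handles/buffers from earlier setup.
    optixDenoiserInvoke( denoiser, stream, &params,
                         stateBuffer, stateSizeInBytes,
                         &guideLayer, &layer, 1 /*numLayers*/,
                         0 /*inputOffsetX*/, 0 /*inputOffsetY*/,
                         scratchBuffer, scratchSizeInBytes );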

 hdrAverageColor

CUdeviceptr OptixDenoiserParams::hdrAverageColor

This parameter is used when the OPTIX_DENOISER_MODEL_KIND_AOV model kind is set. Average log color of the input image, separate for each RGB channel (default: null pointer). Points to three floats. If set to null, the average log color is calculated automatically. The tiling considerations described for hdrIntensity also apply here.
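
As a sketch, the average log color could also be computed explicitly with optixDenoiserComputeAverageColor and passed in here; the denoiser handle, stream, layer, and scratch buffer names below are placeholders from earlier setup.

    CUdeviceptr avgColor = 0;
    cuMemAlloc( &avgColor, 3 * sizeof( float ) );   // three floats (RGB) on the device

    optixDenoiserComputeAverageColor( denoiser, stream, &layer.input,
                                      avgColor, scratchBuffer, scratchSizeInBytes );

    OptixDenoiserParams params = {};
    params.hdrAverageColor = avgColor;  // used with OPTIX_DENOISER_MODEL_KIND_AOV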

 hdrIntensity

CUdeviceptr OptixDenoiserParams::hdrIntensity

Average log intensity of the input image (default: null pointer). Points to a single float. If set to null, autoexposure is calculated automatically for the input image. If tiling is used, this should be set to the average log intensity of at least the entire image so that all tiles receive consistent autoexposure.
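
A sketch of computing the intensity explicitly with optixDenoiserComputeIntensity, which is the usual approach when tiling; the handle, layer, and buffer names are placeholders.

    CUdeviceptr intensity = 0;
    cuMemAlloc( &intensity, sizeof( float ) );      // a single float on the device

    // When tiling, pass the full-resolution input image so that every tile
    // is denoised with the same exposure value.
    optixDenoiserComputeIntensity( denoiser, stream, &layer.input,
                                   intensity, scratchBuffer, scratchSizeInBytes );

    OptixDenoiserParams params = {};
    params.hdrIntensity = intensity;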

 temporalModeUsePreviousLayers

unsigned int OptixDenoiserParams::temporalModeUsePreviousLayers

In temporal modes this parameter must be set to 1 if the previous layers (e.g. previousOutputInternalGuideLayer) contain valid data. This is the case for the second and subsequent frames of a sequence (for example, after a change of camera angle). In the first frame of such a sequence this parameter must be set to 0.
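
A sketch of how this flag might be driven from a frame loop when using a temporal model kind; the handles, buffers, and frame count are placeholders, and the per-frame swapping of buffers is only indicated by a comment.

    for( unsigned int frame = 0; frame < frameCount; ++frame )
    {
        OptixDenoiserParams params = {};
        // First frame of the sequence: previous layers do not hold valid data yet.
        params.temporalModeUsePreviousLayers = ( frame == 0 ) ? 0u : 1u;

        optixDenoiserInvoke( denoiser, stream, &params,
                             stateBuffer, stateSizeInBytes,
                             &guideLayer, &layer, 1,
                             0, 0, scratchBuffer, scratchSizeInBytes );

        // Swap previous/current internal guide layers and previous/current
        // outputs before denoising the next frame.
    }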