NVIDIA IndeX API
nv::index::IApplication_depth_buffer_gl Class Reference [abstract]

Depth buffer in OpenGL format provided by an application, enabling NVIDIA IndeX to do depth-correct compositing with a rendering performed by the application.

#include <iapplication_depth_buffer.h>

Inherits mi::base::Interface_declare< 0x4d85444e, ... >.

Public Member Functions

virtual mi::Uint32 * get_z_buffer_ptr ()=0
 Returns a pointer to the depth buffer.
 
virtual mi::Uint32 get_z_buffer_precision () const =0
 Returns the precision of the depth buffer values.
 

Detailed Description

Depth buffer in OpenGL format provided by an application, enabling NVIDIA IndeX to do depth-correct compositing with a rendering performed by the application.

The depth buffer must be in normalized unsigned integer format. It can be retrieved with an OpenGL call such as:

glReadPixels(0, 0, width, height, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, buffer);

Member Function Documentation

 get_z_buffer_precision()

virtual mi::Uint32 nv::index::IApplication_depth_buffer_gl::get_z_buffer_precision ( ) const [pure virtual]

Returns the precision of the depth buffer values.

Returns
Precision of the depth buffer in bits (commonly 24 bits)

 get_z_buffer_ptr()

virtual mi::Uint32 * nv::index::IApplication_depth_buffer_gl::get_z_buffer_ptr ( ) [pure virtual]

Returns a pointer to the depth buffer.

Returns
Pointer to the buffer, or 0 if the buffer is not initialized.

The documentation for this class was generated from the following file: iapplication_depth_buffer.h