Image Reconstruction from Spatially Non-Uniform Samples

Adam Siekawa

Supervisor(s): Radosław Mantiuk

West Pomeranian University of Technology


Abstract: We evaluate methods for image reconstruction from spatially non-uniform samples. Such a distribution is characteristic of the arrangement of cones on the human retina. Transforming spatially non-uniform samples into a raster image in Cartesian coordinates is an obligatory step in retinal rendering techniques. We present two methods in our work: the first uses a triangle mesh renderer, and the second an image post-processing operation called push-pull. In the first method, vertices are placed at the corresponding sample positions in screen space, which allows us to perform fast triangular interpolation of values on the GPU. The second method is based on image pyramid processing, which filters out blank pixels during downsampling and fills them in during upsampling. We evaluate the performance and quality of reconstruction using sample data generated by a GPU-accelerated ray tracer. As this is work in progress, the map of the non-uniform sample distribution and the map of triangles are generated off-line during preprocessing.
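The abstract does not specify the exact filters used in the push-pull pass, so the following is only a minimal single-channel CPU sketch of the general idea: during the pull (downsampling) phase each coarser pixel averages only the valid samples in its 2x2 block, and during the push (upsampling) phase holes at the finer level are filled from the coarser level. The 2x2 box filter, nearest-neighbour upsampling, and all function and variable names below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def push_pull(image, mask):
    """Fill the blank pixels of `image` (HxW float array) via push-pull.
    `mask` is an HxW boolean array, True where a sample exists."""
    h, w = image.shape
    # Base case: nothing coarser to pull from; unfilled pixels stay zero.
    if h <= 1 and w <= 1:
        return np.where(mask, image, 0.0)

    # Pull phase: pad to even size, then average only the valid samples
    # inside each 2x2 block to build the next (coarser) pyramid level.
    ph, pw = (h + 1) // 2 * 2, (w + 1) // 2 * 2
    img_p = np.zeros((ph, pw)); img_p[:h, :w] = np.where(mask, image, 0.0)
    msk_p = np.zeros((ph, pw)); msk_p[:h, :w] = mask.astype(float)
    block_sum = img_p.reshape(ph // 2, 2, pw // 2, 2).sum(axis=(1, 3))
    block_cnt = msk_p.reshape(ph // 2, 2, pw // 2, 2).sum(axis=(1, 3))
    coarse = np.divide(block_sum, block_cnt,
                       out=np.zeros_like(block_sum), where=block_cnt > 0)

    # Recurse so that holes remaining at the coarser level get filled too.
    coarse_filled = push_pull(coarse, block_cnt > 0)

    # Push phase: upsample the coarse level (nearest neighbour here) and
    # use it only where the fine level had no sample of its own.
    up = np.repeat(np.repeat(coarse_filled, 2, axis=0), 2, axis=1)[:h, :w]
    return np.where(mask, image, up)

# Hypothetical usage: reconstruct a gradient from 20% random samples,
# loosely mimicking a sparse, non-uniform sampling pattern.
h, w = 64, 64
truth = np.linspace(0.0, 1.0, h * w).reshape(h, w)
mask = np.random.rand(h, w) < 0.2
reconstruction = push_pull(truth, mask)
```

In practice the paper targets real-time GPU execution, so this recursive NumPy version should be read only as a statement of the algorithm's structure, not of its performance characteristics.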
Keywords: Image Processing, Physically-based Rendering, Real-time Graphics, Rendering, VR
Year: 2017