
Perceived Aliasing Thresholds in High-Fidelity Rendering

Veronica Sundstedt, Kurt Debattista, Alan Chalmers. Perceived Aliasing Thresholds in High-Fidelity Rendering. APGV 2005: Second Symposium on Applied Perception in Graphics and Visualization (poster), August 2005. PDF, 1867 Kbytes.

Abstract

High-fidelity rendering is very computationally expensive, making it difficult to achieve interactive rates except for simple scenes. Recent selective rendering techniques, which alter the number of rays cast per pixel, have been explored to achieve this goal. These approaches have shown that rendering times can be significantly reduced without perceptual degradation. In traditional ray-traced images, aliasing is removed by supersampling the image plane. In this sketch we identify the threshold at which decreasing the number of rays shot per pixel results in no perceptual degradation. We conduct psychophysical experiments using four realistic environments and one test environment as a comparison. The test scene was designed to exhibit high spatial frequencies and thus represents a worst case for aliasing. We determine the computational bound by varying the number of rays shot per pixel in both still images and animations. The lighting simulation system Radiance is adapted for use in these experiments. The results can be used in the design of more effective perceptual selective rendering algorithms, with the computational bound serving as an indication of where to threshold the rendering process. Selective rendering will alter this threshold based on the perceptual importance of pixels within the image, reducing computation time while maintaining a perceptually high-quality result for realistic scenes.
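
As a rough illustration of the kind of selective supersampling the abstract describes, the sketch below varies the number of jittered rays traced per pixel according to a per-pixel importance map. The callback name trace_ray, the importance map, and the ray budgets are illustrative assumptions for this sketch only; they are not the thresholds measured in the experiments or the Radiance configuration used by the authors.

    import random

    def render_selective(width, height, importance, trace_ray,
                         min_rays=1, max_rays=16):
        """Illustrative selective supersampling.

        importance: 2D list of values in [0, 1]; higher means aliasing is
        more likely to be perceived there. trace_ray(x, y) is a hypothetical
        callback returning an (r, g, b) tuple for a jittered sample inside
        pixel (x, y)."""
        image = [[(0.0, 0.0, 0.0)] * width for _ in range(height)]
        for y in range(height):
            for x in range(width):
                # Shoot more rays where degradation would be visible,
                # fewer where it is unlikely to be perceived.
                n = max(min_rays, round(importance[y][x] * max_rays))
                acc = [0.0, 0.0, 0.0]
                for _ in range(n):
                    # Jittered sub-pixel positions approximate
                    # supersampling of the image plane.
                    r, g, b = trace_ray(x + random.random(),
                                        y + random.random())
                    acc[0] += r; acc[1] += g; acc[2] += b
                image[y][x] = tuple(c / n for c in acc)
        return image

In practice, the per-pixel ray budget would be clamped at a computational bound of the kind the experiments aim to identify, so that rays are spent only where dropping below the threshold would produce visible aliasing.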

