Virtual Reality and Perception Laboratory

Our lab performs basic and applied research on stereoscopic depth perception and virtual reality. We study how the brain, or a machine, can reconstruct a three-dimensional percept of the world from the two-dimensional images on the retinas, and how we use this information to move about and interact with our environment. We are particularly interested in the role of vertical disparities in stereopsis, the integration of and interactions among depth cues, and the role of eye movements in depth perception. We have excellent facilities for generating and presenting three-dimensional displays and for recording the movements of the head, eyes, and body in response to these displays.
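On the machine side of this reconstruction problem, the simplest recoverable depth cue is horizontal disparity: in a rectified stereo pair, depth follows from triangulation as Z = f * B / d, where f is the focal length, B the baseline, and d the disparity. The sketch below illustrates only that relation; the focal length, interocular baseline, and disparity values are made-up illustrative numbers, not parameters of our displays or experiments.

    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        """Triangulate depth (metres) from horizontal disparity in a
        rectified stereo pair: Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a finite depth")
        return focal_px * baseline_m / disparity_px

    # Illustrative numbers: 800 px focal length, 6.3 cm interocular baseline.
    for d in (40.0, 20.0, 10.0):  # disparity in pixels
        z = depth_from_disparity(800, 0.063, d)
        print(f"disparity {d:5.1f} px -> depth {z:.2f} m")

Note how halving the disparity doubles the estimated depth, which is why small disparity errors translate into large depth errors for distant points.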

We have active research projects that apply this basic research to the study of perception in virtual and augmented reality. We are pursuing a number of issues in simulation and in immersive virtual-reality (VR) and augmented-reality (AR) systems, including the effects of distortions in the stereoscopic display, cue conflicts in synthetic displays, the effects of time delay, and the development of novel predictive head trackers. We also use virtual-reality technology as a tool to study depth perception and the perception of motion through a three-dimensional environment. In virtual worlds we are not constrained by the laws of physics or the natural world; with carefully designed experiments, we can manipulate the sensory inputs to the user more freely to investigate how sensory cues are used and integrated.
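One of these issues, compensating for end-to-end time delay with a predictive head tracker, can be illustrated with a very simple extrapolation scheme. The sketch below is a minimal, hypothetical example (a constant-velocity position predictor in Python with invented sample times and latency), not a description of the trackers developed in the lab.

    import numpy as np

    def predict_head_position(positions, timestamps, latency_s):
        """Extrapolate the latest head position forward by the display latency.

        positions: (N, 3) array of recent head positions in metres
        timestamps: (N,) array of sample times in seconds
        latency_s: motion-to-photon delay to compensate, in seconds
        """
        # Estimate instantaneous velocity from the two most recent samples.
        dt = timestamps[-1] - timestamps[-2]
        velocity = (positions[-1] - positions[-2]) / dt
        # Constant-velocity extrapolation to the time the frame will appear.
        return positions[-1] + velocity * latency_s

    # Illustrative values only: ~60 Hz tracker samples, 40 ms system latency.
    ts = np.array([0.0000, 0.0167, 0.0333])
    ps = np.array([[0.000, 1.600, 0.000],
                   [0.005, 1.600, 0.000],
                   [0.010, 1.600, 0.000]])
    print(predict_head_position(ps, ts, 0.040))

A practical predictor also has to handle head orientation and sensor noise, so filtered motion models (for example, Kalman-style predictors) are the more common choice; the point of the sketch is only the latency-compensation idea.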
