| Time | Presentation | Speaker | Slides |
|------|--------------|---------|--------|
| 2:00 PM | Introduction | Anjul Patney, NVIDIA Research | Slides (PPTX) |
| 2:10 PM | A Framework for Perception-driven Advancement of MR Systems | Femke van Beek, Facebook Reality Labs | Slides (PPTX) |
| 2:40 PM | Eye and Eye Movements | Joohwan Kim, NVIDIA | Slides (PPTX) |
| 3:10 PM | Case study: Redirected Walking in VR/AR | Prof. Dr. Frank Steinicke, Universität Hamburg | Slides (PDF) |
| 3:40 PM | Break | | |
| 3:50 PM | Case study: ChromaBlur: Rendering Chromatic Eye Aberration Improves Accommodation and Realism | Prof. Martin S. Banks, University of California at Berkeley | Slides (PPTX) |
| 4:20 PM | Case study: Computational Near-Eye Displays with Focus Cues | Robert Konrad, Stanford University | Slides (PDF) |
| 4:50 PM | Summary / Q&A | Moderated by Anjul Patney, NVIDIA Research | Slides (PPTX) |
Over the past few years, mixed reality (MR, encompassing both virtual and augmented reality) has transitioned from expensive research prototypes and military installations to widely available consumer devices. Driving the increasing pixel counts and frame rates of modern MR head-mounted displays (HMDs) remains a challenging computing workload, and contemporary MR experiences still have considerable room for improvement in terms of visual perception, performance, and comfort. Insights from vision science have repeatedly been shown to improve both the immersion and the performance of mixed-reality experiences. An understanding of this rapidly evolving field is therefore vital for HMD designers, application developers, and content creators.
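To make the scale of that workload concrete, here is a back-of-the-envelope sketch in Python. The panel resolution and refresh rate below are illustrative assumptions for a 2018-era consumer HMD, not the specification of any particular device:

```python
# Back-of-the-envelope pixel throughput for a representative HMD.
# Resolution and refresh rate are assumptions for illustration only.
width_px, height_px = 1440, 1600   # per-eye panel resolution
num_eyes = 2
refresh_hz = 90                    # frames per second per eye

pixels_per_second = width_px * height_px * num_eyes * refresh_hz
print(f"{pixels_per_second / 1e6:.0f} Mpixels/s")  # -> 415 Mpixels/s
```

In practice the shading workload is higher still, since lens-distortion correction typically requires rendering to an internal target larger than the physical panels.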
Our course begins with an overview of the importance of human perception in modern MR. We accompany this overview with a dive into the key characteristics of the human visual system and the psychophysical methods used to study its properties (a minimal example of one such method appears below). After laying the vision-science groundwork, we present three case studies showing how insights into human vision can improve the performance, quality, and applicability of MR graphics. Finally, we conclude with a Q&A session for more in-depth audience interaction.
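As a flavor of those psychophysical methods, the following is a minimal sketch of a classic 1-up/2-down adaptive staircase, which converges near the ~70.7%-correct point of an observer's psychometric function. The simulated-observer model and all parameter values are illustrative assumptions, not code from the course:

```python
import random

def staircase_threshold(true_threshold, start=1.0, step=0.1, n_reversals=8):
    """Minimal 1-up/2-down adaptive staircase.

    Converges near the ~70.7%-correct point of the psychometric
    function. `true_threshold` drives a simulated observer here; in a
    real experiment the response would come from a participant.
    """
    level = start
    correct_streak = 0
    last_direction = None
    reversals = []

    while len(reversals) < n_reversals:
        # Simulated observer: correct when the stimulus level exceeds
        # a noisy internal threshold (illustrative noise model).
        correct = level > true_threshold + random.gauss(0.0, 0.05)

        if correct:
            correct_streak += 1
            if correct_streak < 2:
                continue                 # need two in a row to descend
            correct_streak = 0
            direction = "down"           # two correct -> harder stimulus
            level -= step
        else:
            correct_streak = 0
            direction = "up"             # one wrong -> easier stimulus
            level += step

        if last_direction is not None and direction != last_direction:
            reversals.append(level)      # staircase changed direction
        last_direction = direction

    # Estimate the threshold as the mean of the last few reversals.
    tail = reversals[-6:]
    return sum(tail) / len(tail)

if __name__ == "__main__":
    print(f"Estimated threshold: {staircase_threshold(0.5):.2f}")
```

Adaptive procedures like this concentrate trials near the threshold, which is why they are a staple of the vision-science experiments discussed in the course.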