Augmented-reality (AR) systems present information about a user's surrounding environment by overlaying it on the user's real-world view. However, such overlaid information tends to obscure the user's field of view and thus impedes real-world activities. This problem is especially critical when the user is wearing a head-mounted display. In this paper, we propose an information presentation mechanism for mobile AR systems that exploits the user's gaze information and peripheral visual field. The gaze information is used to control the positions and the level-of-detail of the information overlaid on the user's field of view. We also propose a method for switching the displayed information based on the difference in human visual perception between the peripheral and central visual fields. To evaluate the proposed method, we develop a mobile AR system consisting of a gaze-tracking system and a retinal imaging display. The gaze-tracking system estimates whether the user's visual focus is on the information display area, and switches the presented information from simple to detailed accordingly.
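The gaze-contingent switching described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the rectangle geometry, function names, and text placeholders are all hypothetical, and a real system would also debounce gaze samples and apply dwell-time thresholds.

```python
# Hypothetical sketch of gaze-contingent level-of-detail switching.
# All names and geometry here are illustrative, not from the paper.

from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned information display area in screen coordinates."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        # True when the gaze point (gx, gy) lies inside this area.
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


def select_annotation(gaze_x: float, gaze_y: float, display_area: Rect,
                      simple_text: str, detailed_text: str) -> str:
    """Return the detailed annotation when the user's central vision is on
    the display area; otherwise return a simple cue suited to peripheral
    vision, which resolves less detail than central vision."""
    if display_area.contains(gaze_x, gaze_y):
        return detailed_text
    return simple_text
```

For example, with a display area at the top-right of the view, a gaze point inside it yields the detailed text, while a gaze point elsewhere yields only the simple cue.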
Yoshio Ishiguro and Jun Rekimoto, "Peripheral Vision Annotation: Noninterference Information Presentation Method for Mobile Augmented Reality," Proceedings of the 2nd Augmented Human International Conference (AH 2011), Odaiba, Tokyo, Japan, March 12-14, 2011. ACM, New York, NY, USA. [Best Paper Award]