Peripheral Vision Annotation

ABSTRACT
Augmented-reality (AR) systems present information about a user's surrounding environment by overlaying it on the user's real-world view. However, such overlaid information tends to obscure the field of view and thus impedes the user's real-world activities. This problem is especially critical when the user is wearing a head-mounted display. In this paper, we propose an information presentation mechanism for mobile AR systems that exploits the user's gaze information and peripheral visual field. The gaze information is used to control the positions and the level of detail of the information overlaid on the user's field of view. We also propose a method for switching the displayed information based on the difference in human visual perception between the peripheral and central visual fields. To test the proposed method, we develop a mobile AR system consisting of a gaze-tracking system and a retinal imaging display. The eye-tracking system estimates whether the user's visual focus is on the information display area and switches the presented information from simple to detailed accordingly.
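The gaze-driven switching described above can be sketched as a simple region test: when the estimated gaze point falls inside the peripheral display area, the detailed annotation is shown; otherwise the simple one is kept. This is a minimal illustration, not the authors' implementation; all names, coordinates, and thresholds below are assumptions.

```python
# Hypothetical sketch of gaze-based level-of-detail switching.
# The Rect area and example coordinates are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        """True if point (px, py) lies inside this rectangle."""
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def select_annotation(gaze_xy, display_area, simple_text, detailed_text):
    """Return the detailed annotation when the gaze is on the display area,
    otherwise the simple annotation shown in the peripheral field."""
    px, py = gaze_xy
    return detailed_text if display_area.contains(px, py) else simple_text


# Example: information display area in the right periphery of a 640x480 view.
area = Rect(480, 0, 160, 480)
print(select_annotation((520, 200), area, "Cafe", "Cafe: open 9-18, free Wi-Fi"))
```

In practice an eye tracker would feed gaze estimates continuously, and some hysteresis or dwell-time threshold would be needed to avoid flickering between the two detail levels.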

References

Yoshio Ishiguro and Jun Rekimoto, "Peripheral Vision Annotation: Noninterference Information Presentation Method for Mobile Augmented Reality," in Proceedings of the 2nd Augmented Human International Conference (AH), March 12-14, Odaiba, Tokyo, Japan. ACM, New York, NY, USA. [Best Paper Award!]
