Towards Privacy-Aware AR: A Look into Mitigating Privacy Concerns in a Pervasive AR Future

Fig 1: An overview of our approach to creating an interactive privacy-aware AI assistant for everyday AR.

In this work, we present a practical design framework for a privacy-aware virtual assistant for everyday AR, focusing on educating users who lack technical and/or privacy literacy. Our approach keeps a human in the loop to learn privacy context, provides transparency into the state of the system's privacy detectors, and gives the user both control and the ability to provide feedback to the system. The...

Context-Aware Inference and Adaptation: A Framework for Designing Intelligent AR Interfaces

We have developed a holistic framework for the design of intelligent AR interfaces. Using our proposed taxonomy of everyday AR contexts, we developed a framework for designing an intelligent AR interface that infers users' wants and needs and predicts the desired adaptations to the AR interface's design dimensions, virtual content, and interaction techniques. Depending on the context, the intelligent interface may make general adaptations to the whole system or to individual apps. This work is in preparation for...

Exploring the Benefits and Challenges of Working with AR Virtual Displays In-The-Wild

Mobile workers have limited access to display infrastructure while working in remote settings. While permanent office settings can support one or more monitors with large screen space, workers in remote settings often rely on the portable devices available to them at the time, such as laptops, tablets, and smartphones. Existing research has shown that virtual displays are feasible with current technology, although they do not perform as well as a physical multi-monitor setup would. They have been shown to...

EyeST: Predicting Information Relevance from Eye Gaze Data in Immersive Space to Think

Eye gaze patterns vary based on reading purpose and complexity, and can provide insights into a reader's perception of the content. We hypothesize that, during a complex sensemaking task with many text-based documents, we can use eye-tracking data to predict the importance of documents and words, which could be the basis for intelligent suggestions made by the system to an analyst. We introduce a novel eye-gaze metric called "GazeScore" that predicts an analyst's perception of the relevance...
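The GazeScore metric itself is defined in the paper; as a rough illustration of the general idea of predicting relevance from gaze, the sketch below aggregates per-document dwell time from hypothetical gaze samples and normalizes it into a relevance estimate. The sample format and function names are assumptions for illustration, not the actual EyeST implementation.

```python
from collections import defaultdict

def relevance_scores(gaze_samples):
    """Estimate document relevance from gaze dwell time.

    gaze_samples: iterable of (document_id, dwell_ms) pairs, one per
    fixation. Returns a dict mapping document_id to a normalized score
    in [0, 1] that sums to 1 across documents. This is an illustrative
    dwell-time proxy, not the GazeScore metric from the paper.
    """
    totals = defaultdict(float)
    for doc_id, dwell_ms in gaze_samples:
        totals[doc_id] += dwell_ms
    grand_total = sum(totals.values()) or 1.0  # avoid division by zero
    return {doc: t / grand_total for doc, t in totals.items()}

# A document fixated on longer receives a higher score.
samples = [("doc_a", 1200), ("doc_b", 300), ("doc_a", 500)]
print(relevance_scores(samples))
```

Scores like these could feed a threshold or ranking step that surfaces suggested documents to the analyst.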

Goldilocks Zoning: Evaluating a Gaze-Aware Approach to Task-Agnostic VR Notification Placement

While virtual reality (VR) offers immersive experiences, users need to remain aware of notifications from outside VR. However, inserting notifications into a VR experience can result in distraction or breaks in presence, since existing VR notification systems use static placement and lack task and environment awareness. We address this challenge by introducing a novel notification placement technique, Goldilocks Zoning, which leverages a 360-degree heatmap generated from eye-tracking data to place notifications near salient areas of the environment...
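To make the heatmap-driven idea concrete, here is a minimal sketch of selecting a placement cell adjacent to the most-attended region of a gaze heatmap, so a notification lands near, but not on top of, where the user is looking. The grid representation, function name, and neighbor search are illustrative assumptions, not the actual Goldilocks Zoning algorithm.

```python
import math

def place_notification(heatmap, margin=1):
    """Pick a low-salience cell next to the gaze peak.

    heatmap: 2D list of gaze dwell counts over a 360-degree view,
    where columns wrap around horizontally. Returns (row, col) of the
    least-attended neighboring cell of the peak. Illustrative only.
    """
    rows, cols = len(heatmap), len(heatmap[0])
    # Locate the most-attended (peak) cell.
    peak = max(((r, c) for r in range(rows) for c in range(cols)),
               key=lambda rc: heatmap[rc[0]][rc[1]])
    best, best_val = None, math.inf
    for dr in (-margin, 0, margin):
        for dc in (-margin, 0, margin):
            if dr == 0 and dc == 0:
                continue  # never place on top of the gaze peak
            r = peak[0] + dr
            c = (peak[1] + dc) % cols  # wrap horizontally (360 degrees)
            if 0 <= r < rows and heatmap[r][c] < best_val:
                best, best_val = (r, c), heatmap[r][c]
    return best
```

In a real system the heatmap would decay over time and the candidate zones would also account for task context and occlusion, which is where the evaluation in this work comes in.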

Gestures vs. Emojis: Comparing Non-Verbal Reaction Visualizations for Immersive Collaboration

Collaborative virtual environments afford new capabilities in telepresence applications, allowing participants to co-inhabit an environment and interact while embodied via avatars. However, shared content within these environments often draws collaborators' attention away from the non-verbal cues conveyed by their peers, resulting in less effective communication. Exaggerated gestures, abstract visuals, and a combination of the two have the potential to improve the effectiveness of communication within these environments in comparison to familiar, natural non-verbal visualizations....

Welcome to the 3D Interaction Group

The 3D Interaction (3DI) Group performs research on 3D user interfaces (3D UIs) and 3D interaction techniques for a wide range of tasks and applications. Interaction in three dimensions is crucial to highly interactive virtual reality (VR) and augmented reality (AR) applications in education, training, gaming, visualization, and design. We also conduct empirical studies to understand the effects of immersion in VR and AR, the impact of natural and magic 3D interaction techniques, and usability and user experience in 3D UIs.

Our Team

Our Lab