TTC Labs - [XR] Visualizing People’s Emotional States During an Experience

4th Jun 2023

This pattern was a co-created output of a Design Jam held at Seoul National University on the subject of body-based data in XR. During the Jam, four design teams created simple prototypes that helped deliver better transparency and control over data use in imagined XR services.

The speculative scenario of Team Travel centered on the experience of a person visiting a digital twin of Japan to plan an upcoming trip with his wife, who relies on a wheelchair for mobility. He is not especially savvy about technology or privacy.

Participating in the Body-Based Data Privacy in XR Design Jam in South Korea, Team Travel was made up of students and academics from Seoul National University together with external privacy experts.

Problem & Opportunity

During the Design Jam, Team Travel identified an opportunity to design accessible VR features that help their persona easily understand and control recommendations using gestures and voice commands.

How might we... help people easily understand recommendations based on their body-based data, with controls optimized for accessibility?

Data being used
  • Neural
  • Motion
  • Vitals
  • Voice
Key Features of Prototype

Team Travel’s solution explored how to leverage body-based data to provide relevant recommendations, along with gesture-based controls for easily navigating the experience.

Accessible XR Interfaces
Accessible interfaces that allow people to interact using their body movements or voice commands, instead of relying solely on buttons or traditional 2D interfaces.

Showing When Someone Likes Something
Throughout the experience, people are presented with visual notifications showing what was inferred from reading their body-based data. At any point, users can access a summary of the things they enjoyed during the experience.

Adjusting Recommendations
The person has the option to agree or disagree with the recommendations they receive. This feedback is used to improve the accuracy and relevance of future recommendations.
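The agree/disagree loop described above can be sketched in code. This is a minimal illustration, not the team's actual implementation; the `Recommendation` and `Recommender` names, the example items, and the fixed ±0.1 weight adjustment are all assumptions made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Recommendation:
    """A hypothetical recommendation inferred from body-based signals."""
    item: str     # e.g. "temple garden tour"
    score: float  # initial inferred interest, 0.0-1.0

@dataclass
class Recommender:
    """Sketch of the agree/disagree feedback loop."""
    weights: dict = field(default_factory=dict)

    def feedback(self, rec: Recommendation, agrees: bool) -> None:
        # Explicit feedback nudges this item's weight up or down, so
        # future recommendations better reflect stated preferences.
        delta = 0.1 if agrees else -0.1
        self.weights[rec.item] = self.weights.get(rec.item, rec.score) + delta

    def ranked(self) -> list:
        # Items the person agreed with rise to the top of future lists.
        return sorted(self.weights, key=self.weights.get, reverse=True)
```

In this sketch, each agree/disagree action immediately reweights the item, so the effect of the person's feedback on future recommendations is direct and inspectable.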

Design Pattern: Live Emoticon

Team Travel used XR prototyping techniques to realize their solution, making use of props and sketches to demonstrate the spatial and physical dimensions of their prototype. The prototypes that were developed by Team Travel and the other participating teams were then used to synthesize learnings and insights which were distilled into UX design patterns for privacy interactions.

Team Travel’s solution was used as the basis for the Live Emoticon design pattern.

The Live Emoticon envisages a feature that could provide a visual representation of a person’s emotional response during an experience. This is designed to provide transparency around interest-based data inputs, showing people how their body-based data is being interpreted in context.

In this example, the system reads a person’s facial data to infer that they like a bicycle. Registering the person’s interest, the system uses Live Emoticons to represent their reaction, and icons to show the data types used to determine their response.
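The notification structure the pattern describes, pairing an inferred reaction with icons for the data types behind it, can be sketched as a simple data structure. This is an illustrative assumption, not part of the published pattern: the emoticon and icon mappings, the `LiveEmoticon` name, and the `render` format are all invented for the sketch, and the data types are taken from the "Data being used" list above.

```python
from dataclasses import dataclass

# Hypothetical mappings from inferred states to emoticons, and from
# body-based data types to the icons shown alongside the reaction.
EMOTICONS = {"likes": "😍", "neutral": "😐", "dislikes": "🙁"}
DATA_ICONS = {"neural": "🧠", "motion": "🏃", "vitals": "❤️", "voice": "🎤"}

@dataclass
class LiveEmoticon:
    """One transparency notification: what was inferred, and from which data."""
    inferred_state: str  # e.g. "likes"
    data_types: list     # body-based data types behind the inference
    subject: str         # what the reaction is about, e.g. "bicycle"

    def render(self) -> str:
        # Show the reaction first, then the data types that produced it,
        # so the person can see both the inference and its inputs.
        icons = " ".join(DATA_ICONS[d] for d in self.data_types)
        return f"{EMOTICONS[self.inferred_state]} {self.subject} (from: {icons})"
```

Keeping the data types attached to each inference, rather than showing the emoticon alone, is what makes the notification a transparency mechanism instead of just a reaction badge.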