TTC Labs - [XR] Adjusting How Your Body-Based Data is Interpreted


6th Jun 2023

This pattern was a co-created output of a Design Jam held at Seoul National University on the subject of body-based data in XR. During the Jam, four design teams created simple prototypes that delivered better transparency and control over data use in imagined XR services.

The speculative scenario of Team Travel centered on a person visiting a digital twin of Japan to plan an upcoming trip with his wife, who relies on a wheelchair for mobility. He has limited familiarity with technology and privacy practices.

Team Travel, which participated in the Body-Based Data Privacy in XR Design Jam in South Korea, was made up of students and academics from Seoul National University together with external privacy experts.

Problem & Opportunity

During the Design Jam, Team Travel identified an opportunity to design accessible VR features that help their persona easily understand and control recommendations using gestures and voice commands.

How might we...

...help people easily understand recommendations based on their body-based data, with controls optimized for accessibility?

Data being used
  • Neural
  • Motion
  • Vitals
  • Voice
Key Features of Prototype

Team Travel’s solution explored how to leverage body-based data to provide relevant recommendations, with gesture-based controls for navigating the experience easily.

Accessible XR interfaces
Accessible interfaces that allow people to interact using their body movements or voice commands, instead of relying solely on buttons or traditional 2D interfaces.
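An interface like this might route either a gesture or a voice phrase to the same underlying action, so no single modality is required. A minimal sketch, with purely illustrative gesture and phrase names (none of these identifiers come from a real XR SDK):

```python
# Hypothetical sketch: either modality resolves to the same shared action,
# so people can choose whichever input suits them best.
ACTIONS = {
    "confirm": lambda: "recommendation accepted",
    "dismiss": lambda: "recommendation dismissed",
}

GESTURES = {"thumbs_up": "confirm", "wave_away": "dismiss"}
VOICE = {"yes please": "confirm", "no thanks": "dismiss"}

def handle_input(gesture=None, phrase=None):
    """Resolve whichever modality was used to a shared action."""
    action = GESTURES.get(gesture) or VOICE.get(phrase)
    if action is None:
        return "unrecognized input"
    return ACTIONS[action]()
```

The point of the single dispatch table is that adding a new action automatically makes it reachable by every supported modality.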

Showing When Someone Likes Something
Throughout the experience, people are presented with visual notifications showing what was inferred from reading their body-based data. At any point, users can access a summary of the things they enjoyed during the experience.
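The flow above could be modeled as a log of inferences: each notification records what was inferred and from which data source, and the summary view filters that log for the things the person enjoyed. A minimal sketch, assuming hypothetical names (`InferredPreference`, `PreferenceLog`) that are not part of any real system:

```python
from dataclasses import dataclass, field

@dataclass
class InferredPreference:
    item: str     # what the inference is about, e.g. "bicycle"
    signal: str   # body-based data source, e.g. "vitals"
    liked: bool   # the system's inference

@dataclass
class PreferenceLog:
    entries: list = field(default_factory=list)

    def record(self, pref: InferredPreference) -> str:
        """Store the inference and return the notification text shown in-headset."""
        self.entries.append(pref)
        verb = "liked" if pref.liked else "did not seem to enjoy"
        return f"We noticed you {verb} the {pref.item} (based on {pref.signal} data)."

    def summary(self) -> list:
        """The on-demand summary of things the person enjoyed."""
        return [p.item for p in self.entries if p.liked]
```

Naming the data source in every notification is what makes the inference transparent rather than opaque.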

Adjusting Recommendations
The person has the option to agree or disagree with the recommendations they receive. This feedback is used to improve the accuracy and relevance of future recommendations.

Design Pattern: Fine Tuner

Team Travel used XR prototyping techniques to realize their solution, making use of props and sketches to demonstrate the spatial and physical dimensions of their prototype. The prototypes that were developed by Team Travel and the other participating teams were then used to synthesize learnings and insights which were distilled into UX design patterns for privacy interactions.

Team Travel’s solution was used as the basis for the Fine Tuner design pattern.

The Fine Tuner is designed for someone who wants to teach a system how to accurately read their data, ensuring interpretations are more precise in the future.

In this example, the system shows a visual representation of a person’s emotional response during an experience. Inferring that the person likes the bicycle, the system shows a heart emoticon. The person disagrees with this interpretation and chooses another option that better represents their feeling, in this case a doubting face.
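The correction loop described above can be sketched as a small state machine: the system proposes an inferred label, the person may override it, and the override takes precedence the next time the same item is interpreted. This is a hypothetical illustration; the class and method names are invented for this sketch:

```python
# Hypothetical sketch of the Fine Tuner loop: corrections chosen by the
# person override the system's future inferences for the same item.
class FineTuner:
    def __init__(self):
        self.corrections = {}  # item -> label the person chose instead

    def propose(self, item: str, inferred_label: str) -> str:
        """Prefer any label the person previously confirmed for this item."""
        return self.corrections.get(item, inferred_label)

    def correct(self, item: str, chosen_label: str) -> None:
        """The person disagrees and picks an option that fits better."""
        self.corrections[item] = chosen_label

tuner = FineTuner()
tuner.propose("bicycle", "heart")     # system infers a heart emoticon
tuner.correct("bicycle", "doubting")  # the person picks a doubting face
tuner.propose("bicycle", "heart")     # future interpretation uses the correction
```

Keeping corrections per item is the simplest version; a real system would feed them back into the inference model rather than a lookup table.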