Context
In 2023, I joined a UC Berkeley team selected for NASA’s SUITS challenge. We were one of the 11 top groups chosen by NASA to design AR software on HoloLens to aid lunar missions. The project is still in development; more animated graphics will be added in May 2024.
Challenges
How can we design a hands-free AR system for astronauts that supports lunar mission tasks without interrupting their ongoing activities?
Final Design
LLM-Powered Responsive Voice Assistant
Ursa's voice assistant is powered by Large Language Models (LLMs), offering astronauts tailored instructions and feedback (see the sketch after this section)
Non-intrusive Interfaces
Ursa features non-intrusive interfaces strategically placed in the user's peripheral vision, ensuring essential information is accessible without obstructing the lunar environment
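As an illustration only, the sketch below shows one way a tailored instruction could be generated by combining the current task step with suit telemetry before querying an LLM. The function and field names (`build_prompt`, `query_llm`, `suit_state`) are hypothetical placeholders, not the project's actual backend.

```python
# Hypothetical sketch: generating a tailored EVA instruction with an LLM.
# `query_llm` stands in for whichever LLM backend the system wires in.
from typing import Callable, Dict

def build_prompt(task: str, step: str, suit_state: Dict[str, float]) -> str:
    """Combine the current task step with live suit telemetry into a prompt."""
    telemetry = ", ".join(f"{k}={v}" for k, v in suit_state.items())
    return (
        "You are a voice assistant guiding an astronaut during a lunar EVA.\n"
        f"Current task: {task}. Current step: {step}.\n"
        f"Suit telemetry: {telemetry}.\n"
        "Give one short, spoken-style instruction for the next action."
    )

def next_instruction(task: str, step: str, suit_state: Dict[str, float],
                     query_llm: Callable[[str], str]) -> str:
    """Ask the LLM for the next instruction, tailored to the astronaut's context."""
    return query_llm(build_prompt(task, step, suit_state))

# Example usage with a stubbed LLM call:
if __name__ == "__main__":
    stub = lambda prompt: "Check your O2 pressure gauge, then open the airlock valve."
    print(next_instruction("Egress", "Pressure check",
                           {"o2_pct": 98.0, "suit_pressure_psi": 4.3}, stub))
```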
The Collaboration
Our team held regular meetings with NASA to better understand astronaut needs, guiding our interface design. I collaborated with front-end and back-end engineers focusing on areas such as image tracking, rover control, head-mounted device development, and LLM function integration.
Discover
Who is our user?
Astronaut
Hands occupied by heavy equipment
Astronauts often need to carry heavy equipment, which limits their ability to operate a system with their hands.
Harsh lunar environments
The extreme conditions on the moon require astronauts to continuously monitor their suits' signals and surrounding environment. Consequently, any system used must minimize cognitive load to prevent overburdening the astronauts and ensure their focus remains on critical tasks.
What are NASA's tasks?
Task 1 - Navigation
Task 2 - Egress (Prepare suits for outdoor tasks)
Task 3 - Geo Sampling
Task 4 - Control Rover
Task 5 - Communication between LMCC and Astronaut
Define
Design Goals
Non-intrusive
Interfaces are subtly integrated, preserving clear environmental visibility for astronauts
Hands-Free
The system operates completely hands-free, facilitating unencumbered task execution.
Smooth Communication
Enhanced commands and real-time updates ensure uninterrupted and clear communication.
Challenge 1
Optimal Interaction Methods for Astronauts
In traditional UX design, finger touch is the predominant input method. However, given that astronauts cannot use their hands freely, we explored alternatives such as gaze control and voice control.
Gaze Control
Advantages: Precise location targeting.
Disadvantages: Requires looking around, potentially causing accidental inputs.
Voice Control
Advantages: Minimizes errors and doesn't require manual input.
Disadvantages: Less effective for precise location-based tasks like setting a pin.
Final Decision: Voice Control
After analyzing all tasks, we found minimal need for precise location inputs, limiting the scenarios where gaze control would be beneficial. We chose voice control because it significantly reduces the potential for errors and adapts to the wide range of tasks astronauts perform, providing a more efficient and less error-prone user experience.
Challenge 2
Designing a Voice Assistant for NASA's Multiple Tasks on the Moon
We were tasked with designing a voice assistant to support astronauts in completing NASA's seven main EVA (Extravehicular Activity) scenarios on the moon. The challenge involved creating a user-friendly voice assistant that avoids operational confusion. After receiving NASA’s detailed requirements, we analyzed and categorized the tasks into five distinct groups to ensure clarity and ease of use in command execution.
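To make the grouping concrete, here is a minimal, hypothetical sketch of how a transcribed voice command could be routed to one of the five task groups. The keyword lists and group names simply mirror the tasks listed above; they are not the team's actual implementation.

```python
# Hypothetical keyword-based router mapping a transcribed command
# to one of the five task groups described above.
TASK_GROUPS = {
    "navigation": ["navigate", "waypoint", "route", "pin"],
    "egress": ["egress", "suit check", "airlock", "pressure"],
    "geo_sampling": ["sample", "rock", "geology", "scan"],
    "rover": ["rover", "drive", "recall"],
    "lmcc_comms": ["message", "report", "lmcc", "status"],
}

def route_command(transcript: str) -> str:
    """Return the task group whose keywords best match the transcript."""
    text = transcript.lower()
    scores = {group: sum(kw in text for kw in keywords)
              for group, keywords in TASK_GROUPS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(route_command("Ursa, start a geology scan of this rock"))  # -> geo_sampling
```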
Final Design
To enhance developers' understanding of system interactions, we designed a voice flow prototype that simulates real-time communication with the voice assistant.
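As one way to picture such a prototype (the dialogue content below is invented for illustration, not the team's actual script), a scripted turn-by-turn flow might look like this:

```python
import time

# Hypothetical scripted voice flow: each tuple is (astronaut utterance, assistant reply).
VOICE_FLOW = [
    ("Ursa, start egress procedure", "Starting egress. Step 1: verify O2 pressure."),
    ("O2 pressure verified", "Step 2: close and lock the airlock hatch."),
    ("Hatch locked", "Egress checklist complete. You are go for EVA."),
]

def simulate_voice_flow(turns, delay_s: float = 1.0) -> None:
    """Play back the scripted dialogue so developers can walk through the interaction."""
    for astronaut, assistant in turns:
        print(f"ASTRONAUT: {astronaut}")
        time.sleep(delay_s)  # stand-in for speech-recognition / LLM latency
        print(f"URSA:      {assistant}")

simulate_voice_flow(VOICE_FLOW, delay_s=0.2)
```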
Ongoing…
We plan to finish it by the end of May and present it to NASA. More details are coming within a month.