Intro

A focus of NASA’s research plans is preparing for the exploration of Near-Earth Asteroids (NEAs). As part of this work, resources are being devoted to improving Extravehicular Activity (EVA) mobility solutions. While working on an NEA, an astronaut’s hands must be free to perform tasks at all times. Members of NASA’s Johnson Space Center tasked us with creating a hands-free jetpack control system to address this need.

Previous Solutions

Hands-free mobility solutions have already been heavily researched. Approaches based on eye, chin-and-head, and even tongue tracking already exist. Unfortunately, these are ill-suited to controlling an EVA jetpack: they all demand physical actions from body parts the astronaut is already using for the mission. A less physically demanding interaction was required.

The Prototype

We landed on voice control as our primary input source, with the jetpack’s six degrees of freedom as the basis for its commands. Though astronauts are constantly communicating with their teammates, a command language can be designed to distinguish commands from ordinary conversation. We developed the command language shown below:

Command Language
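The grammar itself isn’t reproduced here, but the sketch below illustrates one way such a language could separate commands from crew chatter. The wake word (“jet”) and the direction names are assumptions for illustration, not the actual vocabulary we shipped.

```cpp
#include <iostream>
#include <optional>
#include <set>
#include <sstream>
#include <string>

// Hypothetical command shape: a wake word ("jet") followed by a
// direction of travel. Both are assumptions for illustration.
std::optional<std::string> parseCommand(const std::string& utterance) {
    static const std::set<std::string> directions = {
        "forward", "back", "up", "down", "left", "right"};

    std::istringstream words(utterance);
    std::string token;

    // Utterances that don't start with the wake word are treated as
    // ordinary crew conversation and ignored.
    if (!(words >> token) || token != "jet") return std::nullopt;

    // The next word must name a direction of travel.
    if (!(words >> token) || directions.count(token) == 0)
        return std::nullopt;

    return token;
}

int main() {
    for (const std::string phrase : {"jet forward", "copy that, Houston"}) {
        if (auto dir = parseCommand(phrase))
            std::cout << '"' << phrase << "\" -> move " << *dir << '\n';
        else
            std::cout << '"' << phrase << "\" -> ignored\n";
    }
}
```

Requiring a wake word keeps the recognizer from firing on routine mission communication, which was the core design concern.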

Arduino-Powered LED Box

Surprisingly, NASA wouldn’t let us borrow one of their jetpacks for testing, so we built a white box with LEDs corresponding to the six degrees of freedom as a stand-in. As commands are issued, the box lights up the corresponding LED.
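A minimal Arduino sketch in this spirit might look like the following. The pin assignments and the one-character serial protocol are assumptions, since the box’s actual wiring and interface aren’t documented here.

```cpp
// Hypothetical LED-box sketch: six LEDs, one per degree of freedom.
// Pin numbers and the single-character protocol ('0'-'5') are
// assumptions for illustration.
const int ledPins[6] = {2, 3, 4, 5, 6, 7};

void setup() {
    Serial.begin(9600);
    for (int pin : ledPins) pinMode(pin, OUTPUT);
}

void loop() {
    if (Serial.available() > 0) {
        int c = Serial.read();
        if (c >= '0' && c <= '5') {
            // Light only the LED for the most recent command.
            for (int i = 0; i < 6; i++)
                digitalWrite(ledPins[i], (i == c - '0') ? HIGH : LOW);
        }
    }
}
```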

Google Glass Heads-Up Display

We needed to provide a visual representation of commands and other critical feedback without requiring users to take their eyes off their current task. User feedback indicated the importance of showing the time remaining for the EVA task at hand and of providing a central location for suit status warnings.

Android-Powered Forearm Display

The forearm display presents detailed suit status information, augmenting what is shown on the heads-up display. The information presented on the forearm display includes:

  1. Jetpack command queue
  2. Houston time
  3. Detailed forearm battery status
  4. Detailed heads-up display battery status
  5. Propellant tank status

The microphone on the forearm display was used as the command language input source for this prototype.

Future Work

To expand on this initial foundation, we would add audio feedback for command input, a “Stop” command to interrupt jet movement, and a “Hold” command to keep the astronaut in place. It would also be important to preview upcoming commands in the heads-up display and to refine and prioritize the forearm display’s content.

Conclusion

Using a hands-free approach for EVA control opens up a large number of possibilities, from improved astronaut safety to more advanced missions. Our system is designed to address the problems identified in previous hands-free approaches. Although work remains, an increasingly computer-automated solution improves both efficiency and ease of use.