
Colloquium - Dowding and Clark

Spoken Dialogue Systems Supporting Surface Exploration

Dowding
NASA Ames Research Center
Kelley Clark
NASA Ames Research Center

Astronauts in space suits on EVA have few options for interacting with their computer hardware and software systems. Pressurized suits and gloves restrict movement, which makes the use of traditional keyboards and mice problematic. During a series of NASA-sponsored field tests, we have explored the use of spoken dialogue systems for all computer interactions, effectively using mouth and ears instead of hands and eyes. The three most recent field tests took place at the Mars Society's Mars Desert Research Station, during Rotations 9 (2003), 16 (2004), and 37 (2005). This work is part of the Mobile Agents project, led by Bill Clancey at NASA Ames Research Center.

The computer hardware and software developed under the Mobile Agents project implement an automated EVA assistant, which helps an astronaut with logging collected samples, logging and uploading images and voice annotations, and tracking the astronaut's location, health, and progress. The end product of a successful EVA is a database of samples, images, and voice annotations, indexed by time and location, with explicit connections between correlated items. The system also supports commanding robotic assistants to provide a variety of support functions, including taking pictures, towing gear to specific locations, and managing wireless network connectivity.
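The kind of EVA log described above can be sketched as a small data model: every record carries a timestamp and a location, and correlated items (say, a voice note about a collected sample) are linked explicitly. This is a minimal illustrative sketch, not the Mobile Agents project's actual schema or API; all class and field names here are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Tuple

@dataclass
class EvaRecord:
    # Hypothetical record type: one logged item from an EVA.
    record_id: str
    kind: str                      # e.g. "sample", "image", "voice_annotation"
    timestamp: datetime
    location: Tuple[float, float]  # illustrative planar coordinates, metres
    links: List[str] = field(default_factory=list)  # ids of correlated records

class EvaLog:
    """Toy index of EVA records by id, queryable by location."""

    def __init__(self) -> None:
        self.records: Dict[str, EvaRecord] = {}

    def add(self, rec: EvaRecord) -> None:
        self.records[rec.record_id] = rec

    def correlate(self, id_a: str, id_b: str) -> None:
        # Record an explicit two-way link between correlated items.
        self.records[id_a].links.append(id_b)
        self.records[id_b].links.append(id_a)

    def near(self, point: Tuple[float, float], radius: float) -> List[EvaRecord]:
        # All records whose location lies within `radius` of `point`.
        px, py = point
        return [
            r for r in self.records.values()
            if (r.location[0] - px) ** 2 + (r.location[1] - py) ** 2 <= radius ** 2
        ]

# Usage: log a sample, then a voice annotation about it, and link the two.
log = EvaLog()
log.add(EvaRecord("s1", "sample", datetime(2005, 5, 1, 10, 15), (512.0, 88.0)))
log.add(EvaRecord("v1", "voice_annotation", datetime(2005, 5, 1, 10, 16), (512.5, 88.2)))
log.correlate("s1", "v1")
nearby = log.near((512.0, 88.0), 5.0)  # both records fall within 5 m
```

The two-way links capture the "explicit connections between correlated items" mentioned above, while the time and location fields support the indexing the abstract describes.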

During these field tests we have collected more than 25,000 utterances of speech data, spanning mixed human-human and human-computer interaction. We will present descriptive statistics for this data, speech recognition performance results, and details of how this uniquely valuable data resource is driving our speech recognition research.

Department of Computer Science
University of Colorado Boulder
Boulder, CO 80309-0430 USA