Responsive Virtual Human Museum Guides

September 1, 2008 - November 30, 2012 | PROJECT

The University of Southern California's Institute for Creative Technologies and the Museum of Science, Boston will create life-sized, 3-D Virtual Humans that interact with visitors as interpretive guides and learning facilitators at science exhibits. Using advanced artificial intelligence and intelligent tutoring techniques, the Virtual Humans will interpret visitors' natural-language dialogue responsively and engage them in substantive interaction about the STEM content of each exhibit. The project exemplifies how the confluence of science, technology, engineering, mathematics, and education can creatively and collaboratively advance new tools and learning processes. It will also present visitors with a compelling, real-life, interactive example of the future and of the convergence of technologies such as natural language voice recognition, mixed reality environments, para-holographic displays, visitor recognition and prior-activity recall, and artificial intelligence.

The 3-D, life-sized Virtual Humans will serve as museum educators in four capacities: 1) as natural language, dialogue-based interactive guides that can suggest exhibits to explore in specific galleries and answer questions about particular STEM content areas, such as computer science; 2) as coaches that help visitors understand and use particular interactive exhibits; 3) as the core focus of the Science Behind the Virtual Humans exhibit; and 4) as the subject of an ongoing research effort to improve human and virtual human interaction at increasingly sophisticated levels of complexity.

The deliverables will be designed to build upon visitor experiences and stimulate inquiry. A living lab will enable visitors to become part of the research and development process. The project website will introduce visitors to the technologies used to build virtual humans and the research behind their implementation; the site will be augmented with videos and simulations and will feature user-created content on virtual human characters. Project evaluation and research will collect language and behavioral data from visitors to inform improvement of the virtual guide throughout the grant period and to develop a database that directly supports other intelligent systems and new interface design and development, with broad impact across multiple fields.

Project Website(s)

(no project website provided)

Project Products

Responsive Virtual Human Museum Guides: Summative Evaluation

Team Members

William Swartout, Principal Investigator, University of Southern California
David Traum, Co-Principal Investigator, University of Southern California
Jacquelyn Morie, Co-Principal Investigator, University of Southern California
Diane Piepol, Co-Principal Investigator, University of Southern California
H. Chad Lane, Co-Principal Investigator, University of Southern California

Funders

Funding Source: NSF
Funding Program: ISE/AISL
Award Number: 0813541
Funding Amount: $2,062,116

Tags

Audience: Evaluators | General Public | Museum/ISE Professionals
Discipline: Education and learning science | Engineering | General STEM | Mathematics | Technology
Resource Type: Project Descriptions
Environment Type: Exhibitions | Games, Simulations, and Interactives | Media and Technology | Museum and Science Center Exhibits | Websites, Mobile Apps, and Online Media