M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 372

Appears in Applied Artificial Intelligence, June 1997, Vol. 11, No. 4, pp. 267-284.

Perceptive Spaces for Performance and Entertainment:
Untethered Interaction using Computer Vision and Audition

Christopher R. Wren, Flavia Sparacino, Ali J. Azarbayejani, Trevor J. Darrell, Thad E. Starner, Akira Kotani, Chloe M. Chao, Michal Hlavac, Kenneth B. Russell, Alex P. Pentland

Perceptual Computing Section, The MIT Media Laboratory; 20 Ames St., Cambridge, MA 02139 USA
{cwren,flavia,ali,trevor,thad,akira,cchao,hlavac,kbrussel,sandy}@media.mit.edu
http://vismod.media.mit.edu/groups/vismod/

Abstract:

Bulky head-mounted displays, data gloves, and severely limited movement have become synonymous with virtual environments. This is unfortunate, since virtual environments have great potential in applications such as entertainment, animation by example, design interfaces, information browsing, and even expressive performance. In this paper we describe an approach to unencumbered, natural interfaces called Perceptive Spaces. The spaces are unencumbered because they rely on passive sensors that do not require special clothing, and on large-format displays that do not isolate users from their environment. The spaces are natural because the open environment facilitates active participation. Several applications illustrate the expressive power of this approach, as well as the challenges associated with designing these interfaces.

