M.I.T Media Laboratory Perceptual Computing Section Technical Report No. 498

Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science
Proposal for Thesis Research in Partial Fulfillment
of the Requirements for the Degree of
Doctor of Philosophy

Title: Understanding Expressive Action

 Submitted by:  Christopher R. Wren
    MIT Media Lab
    20 Ames St. E15-384
    Cambridge, MA 02139
    USA

Date of submission: June 15, 1999

Expected Date of Completion: December 19, 1999

Laboratory where thesis will be done: MIT Media Lab
  Vision and Modeling Group
  Alex P. Pentland, supervisor

Brief Statement of the Problem:

User interfaces make measurements of the user and use those measurements to give the user control over some abstract domain. The sophistication of these measurements ranges from the trivial key click to the most advanced perceptual interface system. Once the measurements are acquired, the system usually attempts to extract a set of features as the first step in a pattern recognition system that converts those measurements into whatever domain of control the application provides. Those features are usually chosen for mathematical convenience or to satisfy an ad hoc notion of invariance. The expressivity of any such interface is therefore limited by the user's ability to overcome the reality of their body and perform in this arbitrary feature space.
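A toy illustration of the pipeline described above (not any particular system from this proposal): raw measurements are reduced to a hand-picked feature, which a recognizer then maps into the application's control domain. The feature here, the mean direction of motion, is deliberately ad hoc in exactly the sense criticized in the text: it is chosen for mathematical convenience, not because it reflects how people actually move. The function names and gesture labels are invented for this sketch.

```python
import math

def extract_feature(positions):
    """Ad hoc feature: mean direction of motion over the trajectory, in radians."""
    dxs = [b[0] - a[0] for a, b in zip(positions, positions[1:])]
    dys = [b[1] - a[1] for a, b in zip(positions, positions[1:])]
    return math.atan2(sum(dys), sum(dxs))

def recognize(angle):
    """Pattern recognition: nearest of two canned command prototypes."""
    prototypes = {"swipe_right": 0.0, "swipe_up": math.pi / 2}
    return min(prototypes, key=lambda k: abs(angle - prototypes[k]))

# Measurements: a tracked hand moving mostly rightward across four frames.
trajectory = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.1), (3.0, 0.2)]
command = recognize(extract_feature(trajectory))   # -> "swipe_right"
```

The burden in such a design falls on the user: to be understood, they must perform in the feature space the designer happened to choose.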

The fact that people are embodied places powerful constraints on their motion. An appropriate model of this embodiment allows a perceptual system to separate the necessary aspects of motion from the purposeful aspects. The necessary aspects are a consequence of physics and are therefore predictable. The purposeful aspects are the direct result of a person attempting to express themselves through the motion of their body. By taking this one thoughtful step closer to the original intentions of the user, we open the door to better interfaces. Understanding embodiment is the key to perceiving expressive motion.
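One minimal way to make this separation concrete (a sketch under assumed parameters, not the system proposed here) is a constant-velocity Kalman filter tracking one coordinate of body motion. The filter's prediction captures the necessary, physics-driven part of the motion; the innovation, the difference between what is measured and what physics predicts, isolates the part the model cannot explain, a crude stand-in for the purposeful component. The frame rate and noise levels below are assumptions for illustration.

```python
import numpy as np

dt = 1.0 / 30.0                     # assumed 30 Hz measurement rate
F = np.array([[1.0, dt],            # state transition: position, velocity
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])          # we observe position only
Q = 1e-4 * np.eye(2)                # assumed process noise
R = np.array([[1e-2]])              # assumed measurement noise

def filter_step(x, P, z):
    """One predict/update cycle; returns new state, covariance, innovation."""
    # Predict: the "necessary" (physics-driven) part of the motion.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Innovation: the part of the measurement physics failed to predict.
    nu = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ nu
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new, float(nu[0])

# Coasting motion yields near-zero innovations; a sudden purposeful
# change of velocity at frame 6 yields a large one.
x = np.array([0.0, 1.0])            # start at position 0, velocity 1 unit/s
P = np.eye(2)
innovations = []
for t in range(1, 11):
    z = np.array([t * dt if t <= 5 else 5 * dt + 3.0 * (t - 5) * dt])
    x, P, nu = filter_step(x, P, z)
    innovations.append(abs(nu))
```

The design point of the sketch is that the predictor encodes embodiment: the better the physical model, the more cleanly the residual signal reflects intent rather than mechanics.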



