Perceptual Computing TR#466, MIT Media Laboratory. To appear in IEEE PAMI '98; submitted 4/26/96.
Real-Time American Sign Language Recognition Using Desk and Wearable Computer Based Video
Thad Starner, Joshua Weaver, and Alex Pentland
Room E15-383, The Media Laboratory
Massachusetts Institute of Technology
20 Ames Street, Cambridge MA 02139
thad,joshw,sandy@media.mit.edu
Abstract:
We present two real-time hidden Markov model-based systems for
recognizing sentence-level continuous American Sign Language (ASL)
using a single camera to track the user's unadorned hands. The first
system observes the user from a desk-mounted camera and achieves 92%
word accuracy. The second system mounts the camera in a cap worn by
the user and achieves 98% accuracy (97% with an unrestricted
grammar). Both experiments use a 40-word lexicon.
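
As a point of reference for the accuracy figures above, the sketch below shows how word accuracy is commonly computed for continuous recognition: accuracy = (N - D - S - I) / N, where N is the number of reference words and D, S, and I are the deletions, substitutions, and insertions found by a minimum-edit-distance alignment of the recognized sentence against the reference. The function names are illustrative and are not taken from the paper's implementation.

```python
def align_counts(reference, hypothesis):
    """Return (deletions, substitutions, insertions) from a Levenshtein
    alignment of two word sequences (illustrative helper)."""
    n, m = len(reference), len(hypothesis)
    # cost[i][j] holds (edit_distance, D, S, I) for reference[:i] vs hypothesis[:j]
    cost = [[None] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = (0, 0, 0, 0)
    for i in range(1, n + 1):
        cost[i][0] = (i, i, 0, 0)          # all reference words deleted
    for j in range(1, m + 1):
        cost[0][j] = (j, 0, 0, j)          # all hypothesis words inserted
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d, dd, ds, di = cost[i - 1][j]
            s, sd, ss, si = cost[i - 1][j - 1]
            a, ad, as_, ai = cost[i][j - 1]
            sub = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            cost[i][j] = min(
                (d + 1, dd + 1, ds, di),            # deletion
                (s + sub, sd, ss + sub, si),        # match or substitution
                (a + 1, ad, as_, ai + 1),           # insertion
            )
    _, D, S, I = cost[n][m]
    return D, S, I

def word_accuracy(reference, hypothesis):
    """Word accuracy = (N - D - S - I) / N for one reference sentence."""
    D, S, I = align_counts(reference, hypothesis)
    N = len(reference)
    return (N - D - S - I) / N

# Example: one spurious insertion against a 5-word reference gives 0.8.
ref = "I want to eat now".split()
hyp = "I want go to eat now".split()
print(word_accuracy(ref, hyp))   # -> 0.8
```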