Affective Social Quest (ASQ)
    Teaching Emotion Recognition with Interactive Media & Wireless Expressive Toys

    CHAPTER TWO: BACKGROUND

Affective Social Quest builds on prior technology and therapeutic principles that serve as context for this work. A technological motivation section describes how research in the Media Lab and in industry relates to ASQ. Following is a section devoted to the therapeutic motivation behind this work. Theories of human development and autism are presented, along with current methods of therapy for autistic children, particularly those that relate to teaching affect recognition and expression.

Technological Motivation

The Media Lab has several research areas from which this research draws. This thesis addresses the interests of three distinct groups: Digital Life, which researches how bits, people, and things exist in an interconnected world; Toys of Tomorrow, which researches tech-toys; and Affective Computing, which researches ways for a computer to recognize and respond appropriately to emotion. The following sections offer a brief background on each area.

Digital Life

With the innovation of embedded technology, objects have become more sophisticated, advancing ways technology can offer convenience and personalization to people through devices used every day. Digital Life is the extension of technology into these objects.

Ted Selker projects that some computer "experiences will be interactive, some will be vicarious, and others will be voyeuristic" (Selker 97, p. 89). Increasingly, one object serves multiple uses. For instance, a car with a mobile phone and an onboard global positioning system (GPS) that can store personalized keywords may soon be able to contact the driver when information is most relevant and timely; when approaching a local grocery, the car can remind the driver that the groceries the kitchen ordered are ready for pick up. Roberto Tuchman, director of the Dan Marino Center, a center focused on PDD intervention programs, recognized that this type of technology could also provide therapeutic remediation for attention deficit disorder (ADD), serving as a reminding and sequencing aid to manage domestic responsibilities (Tuchman 99).

Digital life in everyday objects provides the ability to select, arrange, modify, and communicate with data, similar to the way we communicate with people. Data once processed serially can now be randomly accessed, dynamically processed, and wirelessly accessible. Object functionality becomes customizable, flexible, efficient, personalized, and, for some, therapeutic. When a design incorporates visual and natural mapping between humans and things, interaction appears natural because it builds on relationships that have meaning to people (Norman 90).

Toys of Tomorrow

From animatronics in theme parks to battery-powered barking toy dogs, embedded technology seemingly makes people feel that these dolls act with emotion. Mattel’s Tickle Me Elmo moved beyond dolls with moving parts to dolls that sense, using the innovative idea of embedding a voice system in a toy: when the doll is tickled, it giggles. Around the same time, the Media Lab and Apple Computer collaborated on the development of a toy called Noobie. The goal was to build an environment that allows computer animals to be played with and explored in the same way real animals are: by touching (Druin 87, p. 12). Noobie entertained and educated children by displaying educational media footage about animals on its tummy monitor. Children touched different parts of this oversized soft interface to see how Noobie, short for "New Beast," would react (Druin 87).

The toys available a decade later demonstrate how much embedded technology had shrunk. Japan’s Bandai created the virtual pet, the Tamagotchi. The little pet inside this hand-held toy relied on its owner to feed, bathe, educate, and entertain it to keep it alive. Neglect resulted in a wailing, incessant beep to communicate the need for basic care, and when ignored over time the virtual pet would die. Initially available only in Japan, these rudimentary artificially intelligent pets became popular, and Americans were buying them on the black market until they were sold in the United States. Later versions became so sophisticated that two Tamagotchi-type systems could interact with each other and perform fighting games. This infrared communication worked with a personal computer too: new features could be downloaded from the manufacturer’s web site and then uploaded into the hand-held companion (Saunders 97).

The next craze in tech toys for children was released in Christmas 1998. Tiger Electronics, Ltd. developed a ‘special plush toy’ (toy-biz lingo for a stuffed animal with embedded technology) that became the next high-demand purchase, known as Furby. Furby is a small, Gremlin-esque stuffed toy with the pet-like features of the Tamagotchi. A toy ad-agency executive responded to the question "what was the one thing that made them distinctive and unique and desirable?" by answering, "it was a friend that a child could nurture" (Kirsner 98, p. 197). Furby has 300 different combinations of eye, ear, and mouth movements. Two microprocessors control the 8000-rpm motor cams that drive Furby’s mechanical animated motion. Sensors, referred to as attitude sensors, detect whether the plush toy is standing or held upside down. Furby has a microphone to respond to sound, light sensors to switch into sleep mode, and pressure sensors to detect petting. Furby moves, blinks, and speaks two languages, English and Furbish -- the Furby language (Kirsner 98). Americans paid above market price during the 1998 holiday season to own a Furby, often buying not just one but two, because two Furbys could interact with each other. With infrared transceivers between their eyes, Furbys appeared to communicate by talking to each other. Additionally, Furbys speak in their emotion-riddled native Furbish until they seemingly learn English from their caretakers; these dolls were cleverly programmed to produce certain familiar English words over time. The magic of the Furby and similar interactive toys is their invisible mechanisms and their initially unpredictable human-like behavior. Sherry Turkle is researching children interacting with Furby toys and shared her observation that these children perceive these special plush toys to be ‘sort-of-alive’ objects (Turkle 99).

Another toy with affect is interactive Barney, developed by Microsoft for its ActiMates line of dolls. Barney is a special plush toy designed to complement a popular children’s television character and has a TV/PC pack accessory for interacting with the television or the computer. ActiMates Barney is an upgraded version of the earlier Playskool Talking Barney. Like the Playskool version, Barney includes pressure sensors in the hands, feet, and tummy that activate an internal voice-storage unit, but it adds a wireless feature that lets the doll communicate with a computer system or video program. The doll’s voice recordings play when certain scenarios appear on the screen. This separately sold package offers two-way interactive communication via radio frequency (RF) signals between the doll and a transmitter box that talks to CD-ROMs on the personal computer or VHS tapes on the television. Barney stores approximately 2,000 word responses and forms random word subsets as phrases that appropriately match events triggered by interaction with the child or the system. Its large vocabulary, along with its light-sensitive eyes for playing peek-a-boo, contributes to the doll’s magic and its seeming aliveness.

Researchers at the Media Lab have been creating their own family of plush computer toys as well. A research group known as Synthetic Characters, led by Bruce Blumberg, developed an interactive game called "Swamped" (Johnson 99). The behavior of a chicken was modeled in software, and certain behaviors could be activated by the way a plush chicken doll was handled. Although the application design focuses on behavior, the tangible interface of the external physical doll relates to the physical doll interface of the dwarves used in ASQ.

‘Swamped’ includes a soft plush chicken, a sensor-embedded doll used to control a matching onscreen character via radio frequency. The system interprets the chicken’s movement and models that behavior in order to interact with an animated virtual raccoon. The interactive game resembles the Coyote and Roadrunner cartoons, with a conniving raccoon trying to steal eggs from the chicken, who tries to protect them by distracting the raccoon. Raw sensor data is used by the game’s gesture recognition system to control the behavior of the onscreen chicken based on the way the plush doll is handled. Thirteen sensors are embedded in the chicken for interpreting the user’s intent: pitch and roll sensors for the doll’s attitude (position), a gyroscope sensing roll velocity, magnetometers for orientation with respect to magnetic north, flexion sensors for wing position, squeeze sensors embedded in the body and beak, and a potentiometer to sense rotation (Johnson 99, p. 4). The interactive, tetherless, sympathetic interface, controlling a synthetic animated character in a 3D virtual environment, entertains with its musical and animated representation of user control (Johnson 99). For example, when the chicken’s beak is squeezed the virtual character squawks; when the doll is moved right-to-left for walking, a walking melody plays; and when this motion is accelerated, a faster tempo plays and the chicken runs. Embedded technology from Toys of Tomorrow relates to the ASQ project: as shown in Figure 1, the dwarves use a similar embedded technology of wireless communication in plush toys.
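To make the sensor-to-behavior mapping concrete, the sketch below shows one plausible way raw plush-doll sensor readings could be translated into character actions. It is an illustrative sketch only; the sensor fields, thresholds, and action names are invented for this example and are not taken from the Swamped implementation.

# Illustrative sketch only: maps hypothetical plush-doll sensor readings to
# onscreen character actions, loosely in the spirit of a sympathetic interface.
# Sensor names, thresholds, and actions are invented for this example.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    beak_squeeze: float   # 0.0 (released) to 1.0 (fully squeezed)
    roll_velocity: float  # degrees per second from a gyroscope
    pitch: float          # degrees from a tilt (attitude) sensor

def interpret(frame: SensorFrame) -> str:
    """Return a character action inferred from one frame of sensor data."""
    if frame.beak_squeeze > 0.5:
        return "squawk"
    if abs(frame.roll_velocity) > 180:   # fast side-to-side shaking
        return "run"                     # faster tempo, running gait
    if abs(frame.roll_velocity) > 45:    # gentle side-to-side rocking
        return "walk"                    # walking melody plays
    if frame.pitch < -60:                # doll held upside down
        return "flap"
    return "idle"

# Example: a gentle rocking motion produces a walking animation.
print(interpret(SensorFrame(beak_squeeze=0.1, roll_velocity=60.0, pitch=5.0)))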

Another related soft interface for therapeutic aid, developed by Marina Umaschi Bers, encouraged children to communicate emotional narratives while in the hospital. Plush dolls were the interface to a computer screen and helped children create narrative stories to entertain and counsel them as they recovered from cancer treatment at Boston’s Children’s Hospital (Umaschi Bers 97, Umaschi Bers 98). Research like Umaschi Bers’s SAGE and the author’s ASQ explores ways technology may promote health. Both are therapeutic systems focused on enhancing children’s mental well-being: the SAGE environment encourages terminally ill children to express their feelings, while the ASQ environment helps children with empathic deficits learn to recognize emotion. Another related Media Lab project that uses plush toys and embedded systems is Dana Kirsch’s ‘Affective Tigger,’ an affectively responsive plush toy. Like ASQ, its goal concerns communicating emotion with a computer.

Affective Computing

Affect in humans plays a critical role in rational decision-making, perception, human interaction, and human intelligence (Picard 97). Affective technology may be embedded into systems in a way that resembles empathetic communication. An early example is the 1966 program ‘Eliza,’ a text-based application that imitates a psychotherapist interacting with a user (Picard 97, p. 116). It was sometimes effective with its users because they attributed a shared understanding to it based on Eliza’s responses. Jonathan Klein researched the effect of empathy on human subjects who had been frustrated while using a computer game (Klein 98). His system, CASPER (Computer-aided Active Support for Personal Emotion Regulation), takes the form of an online agent practicing social-emotional feedback strategies to reduce user frustration. Subjects who received narrative empathetic listening responses showed more interest and continued their interaction with the system longer than those who received no response. The research illustrated the benefit of a computer system’s empathetic understanding of and response to user frustration (Klein 98). Another project for detecting frustration was developed by Jocelyn Scheirer. Working with Raul Fernandez and Rosalind Picard, she developed frustration-detection glasses -- ‘Expression Glasses’ -- that sense a user’s facial expression from the forehead and eyebrows, mapping the muscle movement up or down to a level of "interest" or "confusion" (Scheirer 99). Later she developed ‘Touch Phone,’ a pressure sensor and software display for detecting the amount of pressure applied to the handset of a telephone (Scheirer 99). These technologies may facilitate communication of signals that carry certain kinds of affective information.

Dana Kirsch developed one of the first affective systems embedded in a toy. "Affective Tigger" ascertains the way the toy is being treated by its user and responds with an affective gesture. Tigger, the bouncing tiger from the ‘Winnie the Pooh’ stories, uses embedded sensors to detect the way it is being handled. The doll expresses its affective state through its ears and mouth, driven by two other sensors that detect the amount of pressure when it is petted and the velocity of movement when it is bounced. The plush doll conveys that it has been handled inappropriately by moving its ears downward to display sadness when it is petted roughly or bounced too hard; when happy, Tigger says positive Tigger phrases and its ears perk up (Kirsch 99). Embedded affect in toys adds a layer of perception from the doll’s perspective and would be a nice addition to an upgraded version of the dwarf technology.

Affective computing strives to teach the emotionless computer to recognize emotion in people. Several methods have been explored as first stages toward system recognition of a person’s emotional state. Some preliminary ways of gathering user data that may later help detect user affective states have been mentioned: ‘CASPER’ identifies directly reported frustration, ‘Expression Glasses’ and ‘Touch Phone’ analyze expressive movements, and Tigger’s embedded sensors identify haptic handling.

Other research explores physiological data from the user. One project by Elias Vyzas, Jennifer Healey, and Rosalind Picard, the ‘emotion recognition of an actor’ project, studies physiological data from humans as they intentionally express basic emotions. The computer collects the user’s blood volume pulse (BVP) for heart rate, skin conductivity via galvanic skin response (GSR), an electromyogram (EMG) for facial tension in the jaw, and respiration. These physiological data are extracted and analyzed while the actor deliberately tries to feel each emotion. The data were used to develop algorithms to discriminate between eight emotional states (Healey & Picard 98, Vyzas & Picard 99).
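To make this pipeline concrete, the sketch below shows one plausible way such physiological signals could be reduced to simple features and matched to an emotion category. The statistical features and nearest-centroid classifier are assumptions made for illustration; they are not the algorithms reported by Healey, Vyzas, and Picard.

# Minimal sketch of a physiological-emotion pipeline: extract simple statistical
# features from each signal, then label a new sample by its nearest class centroid.
# The features and classifier here are illustrative assumptions only.
import numpy as np

def extract_features(bvp, gsr, emg, resp):
    """Concatenate the mean and standard deviation of each raw signal."""
    signals = [np.asarray(s, dtype=float) for s in (bvp, gsr, emg, resp)]
    return np.array([stat for s in signals for stat in (s.mean(), s.std())])

def train_centroids(samples):
    """samples: dict mapping emotion label -> list of feature vectors."""
    return {label: np.mean(vectors, axis=0) for label, vectors in samples.items()}

def classify(features, centroids):
    """Return the emotion label whose centroid is closest to the feature vector."""
    return min(centroids, key=lambda label: np.linalg.norm(features - centroids[label]))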

Alex Pentland and the Media Lab’s Perceptual Computing group explore visual recognition systems, some of which focus on face recognition. Pattern-matching algorithms in these systems locate facial features on the face and code close matches to static images using the Facial Action Coding System (FACS) (Pentland 98). Irfan Essa and Alex Pentland extended Paul Ekman’s FACS work by building a system that characterizes the way an expression moves across the face rather than focusing on detecting a static expression (Essa & Pentland 95). Research to extend these systems to recognize human emotional expression is ongoing.
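A simplified way to picture FACS-style coding is to treat each expression as a combination of facial action units and to match a detected set of units against those combinations. The sketch below is an illustrative toy example; the action-unit table is abbreviated and should not be read as an authoritative FACS specification.

# Toy illustration of FACS-style matching: an expression is coded as a set of
# facial action units (AUs); a detected set of AUs is compared against known codes.
# The AU combinations below are abbreviated and illustrative only.
EXPRESSION_CODES = {
    frozenset({6, 12}): "happiness",     # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",    # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({4, 5, 7, 23}): "anger",
    frozenset({1, 2, 5, 26}): "surprise",
}

def match_expression(detected_aus):
    """Return the expression whose AU code best overlaps the detected AU set."""
    best = max(EXPRESSION_CODES, key=lambda code: len(code & detected_aus) / len(code))
    return EXPRESSION_CODES[best]

print(match_expression({6, 12, 25}))  # -> "happiness"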

A computer can process information and search through images quickly to match features by searching for patterns. What a computer cannot discern is that an up-turned mouth might mean a person is happy; that correlation between a feature and a human expression must be programmed. Affective visual cues include more than the formation of the lips: they include other features of the face, body posture, and gesture, and a single emotion can be expressed through various combinations of these modes. Computers have difficulty generalizing. A computer cannot infer the many ways happiness can be expressed, nor can it presently work out whether an up-turned mouth (suggesting happy) combined with a hands-on-hips posture (suggesting angry) means happy in some cases but carries another meaning, like angry, in others. One could consider a computer to be autistic for these reasons.

Affective Social Quest is designed to teach emotion recognition to people. Although this work is about teaching affect recognition to children with autism, it has a broad range of applications in helping all children learn how to access emotions. A similar approach may also be possible for teaching computers emotion recognition. ASQ stores over 200 video segments displaying basic emotion expressions and could later be re-used for coding the different ways an expression, say for happy, could be understood by a vision system. The goal is to store many samples of emotion images, or frames in a video clip, that correlate with a particular emotion category. Paul Ekman identified six discrete emotions that are treated as distinct from each other: happiness, sadness, anger, fear, surprise, and disgust (Ekman 92). ASQ selected four of these emotions for its pilot: happiness (happy), sadness (sad), anger, and surprise. ASQ pairs the emotion labels with other labeled representations -- word, icon, and doll face -- and displays them on the screen for the autistic child. This helps the child scaffold different representations of an emotion so that they might learn, through cognitive inference, to generalize the many ways one emotion may be displayed.
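One way to imagine how such labeled clips and their companion representations might be organized is sketched below. The record fields and lookup function are hypothetical placeholders for illustration, not the actual ASQ data model.

# Hypothetical sketch of how emotion-labeled video clips and their alternate
# representations (word, icon, doll face) might be organized; not the actual
# ASQ data model.
from dataclasses import dataclass
import random

@dataclass
class EmotionClip:
    emotion: str      # "happy", "sad", "angry", or "surprise"
    video_file: str   # path to the video segment
    word: str         # text label shown on screen
    icon_file: str    # icon image shown on screen
    doll_face: str    # which dwarf doll face matches this emotion

LIBRARY = [
    EmotionClip("happy", "clips/birthday.mov", "happy", "icons/happy.gif", "happy_dwarf"),
    EmotionClip("sad", "clips/lost_toy.mov", "sad", "icons/sad.gif", "sad_dwarf"),
    # ... in the real system, over 200 clips spanning the four pilot emotions
]

def next_clip(emotion):
    """Pick a random clip for the requested emotion so repeated trials vary."""
    return random.choice([c for c in LIBRARY if c.emotion == emotion])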

Therapeutic Motivation

Improving the social well-being of autistic individuals also motivates this research. Children with autism have a neurological disorder that affects their interpersonal perception and communication (Hobson 98, Sigman & Capps 97). Generally, children learn social cues visually by associating the interactions around them and mimicking that behavior. Autistic children process information differently, yet how they internally process communication is still a mystery. Research and clinical work point to intervention methods that help these children learn and possibly adapt to mainstream educational programs. The goal has been to focus on their strengths to overcome their deficits. This section briefly describes how typical children develop socially and contrasts that development with that of autistic children. Following is information about autism specifically, and about how the technological motivation addresses the therapeutic motivation.

Development Theories

Early work on cognition is credited to Jean Piaget, who studied human development by extending his background in biology and epistemology. He is known for genetic epistemology and most famous for his intellectual development theory, referred to as ‘constructivism,’ which proposes that learning happens by doing (Piaget 70, Craig 76, Hala 97, Papert 99). Piaget believed activities in life are assimilated into our understanding of the world and that novel actions either ‘go over one’s head’ or are accommodated into one’s model of the world. According to Piaget, this process happens by means of interaction with an object, not a person (Piaget 70, Hala 97).

B.F. Skinner extended Pavlovian animal conditioning to human behavior, using antecedent reinforcements to create secondary associations. For example, the sound of a bell could be made to cause a dog to salivate by pairing the bell with the presentation of edible treats and then slowly thinning the reinforcement, the treat, to extinction. Skinnerian behavior analysis resembles Piaget’s constructivism in being grounded in the act of doing something. However, Skinner’s interest focused more on how to modify behavior, referred to as operant behavior.

The approach in operant conditioning is to change behavior with reinforcement (Pierce 99). Both Piaget and Skinner attributed human development to action without addressing people or their affect. Lev Semenovich Vygotsky, on the other hand, attributed knowledge to the interaction between humans via communication and called this interaction ‘interpsychological’ (Hala 97). He believed that without interaction between people, understanding would be irrelevant. Peter Hobson refers to Vygotsky’s ‘interpsychological category’ of mental functioning as the area on which to focus attention. He argues that prior study of "cognition focused on development through configuration of behavior-cum-experience (objects) ignores the nature of the transactions that occur between people" (Baron-Cohen 93, p. 208). Additionally, Peter Salovey and John Mayer, and later Dan Goleman, add that understanding emotions contributes to higher cognitive development by increasing what they call ‘emotional intelligence’ (Salovey & Mayer 90, Goleman 96, Salovey 97). On the continuum of emotional intelligence, the better people are able to discriminate and recognize emotion, the better they may be able to modulate their behavior to succeed in situations. According to this theory, interpersonal communication is the method by which individuals differentiate self and other.

Social cognition studies social communication as the method for learning, in contrast to traditional cognition’s focus on development through interaction with objects. Social cognition focuses on the shared understanding of two people comprehending an intended message exchanged between one another. In the autism literature this is referred to as theory-of-mind. Simon Baron-Cohen popularized theory-of-mind in Mindblindness (MIT Press 95) by describing individuals with autism as unable to infer other people’s mental states. The social-interactive and communication problems caused by their cognitive disorder limit their ability to form abstract associations. This ‘metarepresentational’ association of how people socially relate to each other builds on joint-attention, the ability to share affect; how people share affective communication relates to this research. According to Tuchman, interaction with others requires recognition of affect first and then the desire to share that affect (Tuchman 99). ASQ attempts to help children with autism go through this process. The building blocks for achieving metarepresentation include mastering emotion recognition, then sharing that affect through joint-attention, and lastly seeing another’s point of view with theory-of-mind. An overview of communication development may offer an explanation for how affect may be important in helping children understand social communication.

Communication

Communication normally implies that some message is transferred. Language uses words (strings of phonetic aggregations) and grammar (syntactic and semantic constraints, or the combinatorial rules of a language) to articulate a message (Hala 97). Other modes of communication enrich the meaning of a message: eye contact, gesture, and melody in language (prosody), along with posture, act like adjectives to the spoken word. Children with language delays and distorted developmental disorders, such as autism, are characterized by a lack of relatedness to people (Hala 97, p. 95). According to theory-of-mind thinking, language communicates thoughts and feelings to others; there is no language if there is no one to communicate with or no shared understanding. Peter Hobson and Roberto Tuchman describe the development of social cognition as a precursor to theory-of-mind. Hobson believes that the sharing of communication is not innate but nurtured during infant development (Baron-Cohen 93, p. 209). Development in very early child communication suggests that children develop modes of communication other than language to indicate relatedness.

Prior to communication, a more fundamental aspect of development occurs: the awareness of self and of the existence of objects and other beings. The first forms of communication occur with infant cries, vocalizing the need for a caretaker. Sending that signal and receiving a response leads to the first awareness that the cry is a tool for communication. Whether or not an infant recognizes this internally, it sets the stage for further development of communication based on the recognition of signals transferred, i.e., social communication.

Other modes of registering communicative information in the infant years are messages received through the senses: sight, hearing, olfaction, taste, touch, and proprioception (the feeling of pressure). The senses carry content concerning an object or person in an infant’s surroundings. Communication occurs through sight via common modes like body gesture and facial expression, and through hearing via prosody, pitch, and tone. Each of these channels of communication orients the infant to others. An infant’s ability to differentiate between people is easy to detect: when the infant signals a request for the primary caregiver and others try to soothe it, the crying wail continues until the infant senses the appropriate person (Hala 97).

A parent’s communication with a child often begins with the child requesting something. In this case caretaker-to-child sharing occurs, which includes affective content. An infant has her own feelings and, when attended to by others, observes the feelings expressed by that person and can modulate the initial request into a sharing experience between self and other. Recognizing an affective expression in interpersonal relating is the basic level of communication and sharing (Baron-Cohen 93, p. 209). The process of recognition helps the child differentiate affect and helps her conceptualize others as separate people with their own psychological attitudes and experiences.

These early stages of shared communication foster the future development of communication. Children learn effective methods for conveying an intended message; if they need something, a cry is often the first attempt. As children explore other methods and mimic the ways they see others interact, communication begins to include interpretations of the learned signals and becomes part of the child’s relatedness to others. Autistic children interact differently.

Hobson hypothesizes that the autistic deficit "arose through the child’s lack of basic perceptual-affective abilities and propensities required for a person to engage in personal relatedness" (Baron-Cohen 93, p. 205). Requests for objects are usually direct and expressed sensorially, often without sharing information in a social way. Newly walking infants will often point to things that they want to share with someone, whereas autistic children will approach an object of interest and ‘grab assistance’ if it is out of reach, without an interest in sharing information about that object. The child tends to regard people as a means of meeting physical needs, not emotional needs. Why they use this mode of interaction is unclear.

Another theory, based on genetic epistemology, suggests an inborn pre-wired ability to experience reciprocal, affective personal relations with others in order to become engaged. According to this theory, expressions and actions with others ‘naturally’ involve sharing feelings (Baron-Cohen 93, p 204). Piaget attributed affect to genetic epistemology, which may be why he did not focus on emotion as part of cognitive development.

Affect, or emotion, received limited research attention until late in the twentieth century. Recently, two neurologists, Joseph LeDoux and Antonio Damasio, proposed neuroanatomical structures and processes that suggest significant roles for emotion in cognition, such as in perception and rational decision making (LeDoux 94, Damasio 94). The exact way that the brain processes emotion has been an open question and is now starting to be illuminated with the help of developments in neuro-technology.

Emotion theorists are in dispute over the nature of emotions. Peter Lang believes that emotions are continuous along at least two dimensions -- arousal and valence -- and he suggests that one feels an emotion to a certain degree. Valence is whether the emotion experienced is positive or negative, and arousal is the degree to which that emotion is felt (Lang 90, p. 382). Paul Ekman, on the other hand, believes that emotions are primary discrete entities and that more complex emotions form from combinations of these primary ones (Ekman 92). There are also many other theories of emotion, but no general agreement on precisely how to describe emotions.

What many are beginning to realize is that emotion plays an important role in human development and problem solving. "Early in development, care givers assign meaning to a range of vocal as well as non-vocal behaviors, and the means of communication become increasingly conventionalized through reciprocal social exchanges" (Cohen & Volkmar 97, p. 515). Children with autism adopt unconventional methods of communication by focusing on their own immediate needs and the properties of the physical world rather than on social interactions and socioemotional concerns (Cohen & Volkmar 97, p. 514). They adopt idiosyncratic ways of expressing themselves, which may result from their limited ability to imitate others in social situations.

Autism

Background

Autism affects roughly four to ten per ten thousand children worldwide in its more severe form, and between one and two per thousand when less severe cases along the spectrum are included; most of those affected are male (Cohen & Volkmar 97, p. 847, Sigman 98). Leo Kanner, once head of the Johns Hopkins clinic, published ‘Autistic Disturbances of Affective Contact’ in 1943, describing children with a rare and interesting childhood affliction that he labeled early infantile autism, a disorder not yet clinically identified. Around the time of Kanner’s publication, Hans Asperger in Austria independently described a similar childhood disturbance, also using the word ‘autistic’ to describe the disorder (Frith 91).

In An Anthropologist on Mars, Oliver Sacks describes Kanner’s initial thought that autism might be a dysfunction in early development between caretaker and child, a view that blamed bad parenting by the ‘refrigerator mother’ (Sacks 93, p. 107). Bruno Bettelheim furthered this hypothesis in 1967 in his published work The Empty Fortress: Infantile Autism and the Birth of the Self (Free Press, NY). In the past twenty years, however, autism has been found to be neurologically based; its symptomatology is expressed as the atypical function of an affected individual’s brain (Cohen & Volkmar 97, section 7). Autism is a neurologically determined behavioral disorder, yet no prenatal or perinatal factors attributable to the disorder have been clearly identified (Tuchman 93). One observation is that the brain size of autistic children is larger than that of typical children. Also, autistic children sometimes suffer from insomnia and may not prune neural connections in their sleep as typical children do (Tuchman 93).

Even with what is known about autism, the disorder often goes undetected until a child is expected to speak. A pediatrician might not suspect autism until the child shows restricted use of, or a delay in, verbal abilities, even though a parent may have noticed something peculiar about their child that they cannot quite articulate.

Detection Markers

Early psychological markers for diagnosing autism include difficulties with communication, socialization, and symbolism (Prizant 98). Early signs most easily observed in autistic children include perseveration, stubbornness, inattention, inflexibility, and tantrums. In 1992, Baron-Cohen and colleagues developed what may be the simplest diagnostic approach to detecting autism, the CHAT (Checklist for Autism in Toddlers) (Cohen & Volkmar 97). This questionnaire, administered in pediatric offices, asks parents to evaluate their child’s development. CHAT was designed to assess a child’s joint-attention as observed by family members, and the results identify how many of the expected behaviors are absent. This mini index for early detection was targeted at children at least eighteen months old.

One marker for autism is a child’s limited display of proto-declarative pointing, a joint-attention act for indicating a desire for an object or a shared event. Pointing imparts communication; it conveys need or desire. Another observation for early detection is gaze monitoring: limited shifting of eye gaze between other people, or between people and objects, may represent delays in social development. Eye gaze signals the intent to share interest or meaning and strengthens social interaction. The third identifying trait involves a child’s play. Idiosyncratic behavior in pretend play, such as playing with a toy in a way that does not mimic the family’s behavior in everyday life (for example, playing with a doll’s eyes rather than combing a doll’s hair), indicates a symbolic aberration. Children who display a lack of all three behavioral characteristics have later been found to be developmentally delayed.

In the 1996 Baron-Cohen study, which included 16,000 questionnaire responses, 93% of the children whose parents noticed developmental delays in all three areas -- pointing, eye gaze, and play -- were later diagnosed with autism (Prizant 98).

Not included in the Baron-Cohen study, but commonly found in autistic children, is a peculiar way of comprehending. Generalization is the ability to take what is learned in one setting and demonstrate that skill in a different environment, and autistic children usually have difficulty with it. A child may learn to tie his shoe in the house, yet if asked to tie his shoe in the car he may not be able to do it; what he learned in the house may not connect to something he can do in the car, so he is unable to generalize how to tie his shoe.

Some believe that autistic children have a sensory integration disorder, in which their senses are either hyper- or hypo-stimulated, which seems to explain their reactions to certain sensory communication. Touch and proprioception affect some children with autism: these children dislike being held or touched, yet they often love to roll their entire body on the floor. Some shriek at sounds that can only be distinguished by their frequency and are not necessarily audible to everyone. Others are aroused by light reflections or fixate on a single object for long periods of time. Some children dislike how certain things look, particularly if they differ from what they expect, and may appear obsessive about appearances to the degree of exactness. These distorted communication channels may affect the way autistic children learn early communication.

Terminology

Autism is the best-recognized and most frequently occurring subset in a group of disorders collectively known as the pervasive developmental disorders (PDD) (Siegel 96). Children are categorized as having one or more of these descriptive disorders along the PDD spectrum: autistic disorder, Rett syndrome, childhood disintegrative disorder, Asperger’s syndrome, or pervasive developmental disorder not otherwise specified (PDD-NOS). Tuchman describes a broader continuum of disorders that closely relate to autism, including attention deficit disorder; deficits in attention, motor coordination, and perception; semantic-pragmatic language disorder; Asperger’s syndrome; high-functioning autism; fragile X syndrome; PDD-NOS; autism; and autism with mental retardation (Tuchman 98).

Behavioral Modification

Many different approaches to behavioral intervention are available to parents of autistic children; two styles are popular. One is a one-on-one behavioral approach known as applied behavior analysis (ABA), a type of intervention rooted in B.F. Skinner’s work and popularized for autistic children by Ivar Lovaas. The other is a station-based independent approach known as T.E.A.C.C.H. (Treatment and Education of Autistic and related Communication Handicapped Children), developed by Gary Mesibov in North Carolina. A common behavior-analytic treatment is one-to-one discrete trial teaching, which focuses on a child’s development of one specific behavioral skill until some measure of success has been achieved. The process includes repeated trials at set intervals for learning one task, like the proto-declarative pointing mentioned earlier, and uses a positive stimulus to reward success. One task may take as long as a month to master, with four to six hours of training per day. In contrast, T.E.A.C.C.H. focuses on self-care skills and managing disruptive behavior, and has the child work independently at quiet stations on individually customized programs. According to this method, each child has different strengths and weaknesses, so each development strategy is customized to the child.

Behavioral interventions for some young children with autism may succeed in preparing them to assimilate into mainstream educational programs. High-functioning and Asperger children are more likely to succeed in the transition to special education programs in public schools because they have few language deficits, which helps them manage difficult situations, though these skills do not necessarily help them socially. Behavior analysis is the process of breaking down behavior that most people master without thinking into discrete fragments that can be memorized (Volkmar 99). Identifying and encouraging a child’s strengths may help their weaker areas develop (Prizant 98). Creating a specifically tailored intervention program for each individual offers him or her the most potential for success and the possibility of blending in. The best results are said to be achieved by children who receive early intervention before the age of five.

Children diagnosed with autism or any disorder along the pervasive developmental disorder (PDD) spectrum are advised to start a specialized program as soon as they are diagnosed, some as early as twelve months old (Prizant 98), though generally at eighteen months. These interventions, which usually apply to all levels of PDD, seek to enhance the development and well-being of the child by addressing particular sets of difficulties (Sigman & Capps 98). The programs differ enormously, but most include strategies to help children develop social and communication skills, such as how to behave appropriately with other children and with groups of children, along with verbal language skills. Practitioners agree that teaching these children to recognize emotion in others is one of the most problematic areas. Because autistic children lack perceptual-affective abilities, complex communication such as humor and sarcasm often goes unrecognized or misunderstood.

Many programs instruct children using emotion words and icon representations, yet systematic observations or experimental investigations of specific social behaviors are few (Fein 87, Lubin 99, Tuchman 99, Sigman & Capps 97). One procedure for teaching emotion is to show children photographs of people exhibiting emotional expressions. Figure 2 contains three learning development aid (LDA) emotion cards; here they show the emotions angry, happy, and sad. The child is seated facing a practitioner, who holds up an LDA photo emotion card, such as one of those shown in Figure 2, and provides a verbal prompt, e.g., "touch happy" or a similar expression. If the child touches the picture, the practitioner immediately provides enthusiastic social praise and occasionally provides access to a preferred snack, toy, or activity. The practitioner then displays another picture card depicting the same emotional content.
 


Figure 2: Photo Emotion Cards illustrating ‘angry,’ ‘happy’ and ‘sad’

If the child does not touch the picture, the practitioner repeats the verbal prompt and provides a physical prompt, e.g., pointing to the picture. If the child still does not touch the picture after this prompt, physical assistance is given to ensure that the child touches it.
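Read as a whole, the prompting sequence forms a small escalation loop: verbal prompt, then verbal plus physical prompt, then physical assistance, with praise following a correct touch at any level. The sketch below lays that loop out in code; the helper functions are invented placeholders standing in for the practitioner's actions.

# Hedged sketch of the discrete-trial prompting sequence described above.
# The prompt ladder mirrors the text (verbal -> verbal + physical prompt ->
# physical assistance); the callables are invented placeholders.
def run_trial(card, child_touches_card, give_praise, give_prompt, guide_hand):
    """Run one discrete trial with the given emotion card.

    child_touches_card() -> bool reports whether the child responded;
    give_prompt(text, point_to_card=False) delivers the prompt;
    give_praise() and guide_hand(card) perform the practitioner's actions.
    """
    give_prompt(f"touch {card.emotion}")          # verbal prompt
    if child_touches_card():
        give_praise()                             # immediate social praise / reward
        return "independent"

    give_prompt(f"touch {card.emotion}", point_to_card=True)  # repeat with physical prompt
    if child_touches_card():
        give_praise()
        return "prompted"

    guide_hand(card)                              # physical assistance ensures a touch
    give_praise()
    return "assisted"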

In addition, software tools that augment the child’s training are available. Most applications focus on verbal development, object matching, or event sequencing. Laureate is the most popular developer of software specifically designed for autistic children. Laureate software is designed to help autistic children solve ‘what if’ scenarios and decide what the next action in a sequence could be; sequencing is difficult for PDD children. While these programs may include scenarios involving emotion, their focus is not on developing the child’s emotion recognition.

Knowledge Adventure and Broderbund are two companies that develop educational software for all children. Their screen interfaces use animation and bold colors to engage children, and some autistic children might enjoy this kind of software because of its stimulating quality.

Mayer-Johnson offers ‘Boardmaker,’ a software tool that combines words with its standardized icons, Picture Communication Symbols (PCS), to help children communicate through pictures. The PCS can be arranged on the computer to tell a story and then printed to share with others, and new pictures can be added.

Other computer-aided software designed for handicapped children shows success in enabling them to acquire directly targeted skills. Most of these applications require some language understanding and use a pictorial interface. However, in reviewing literature describing some of the available systems, the author noticed that the screen captures included in the literature are not visually captivating.

Though these software applications are designed to help autistic children learn certain behavioral tasks, based on the research reviewed for this thesis their focus has not been on teaching autistic children emotion recognition.

Strengths of the autistic mind

Autistic children may be gifted musically or visually, or both. They will often remember the melody of a song and show perfect pitch or rhythm when listening to or demonstrating what they heard. In some cases it is easier for them to understand a message if it is put to music (Selker 99). Temple Grandin, an autistic adult and successful author and scientist, attributes much of her success to her visually based system of cognition, or photographic visual memory (Grandin 95). Grandin writes about her understanding of the world through a set of images she has recorded photographically in her memory. Every time she experiences a new event she searches her memory for a similar situation in order to map the current situation onto that scenario. She calls her memory a video database or CD memory. This helps her comprehend new situations that would otherwise be confusing, particularly regarding social behavior.

Why ASQ

Affective Social Quest builds on autistic children’s visual strengths with video clips and dolls. Different displays of emotional expression presented in moving pictures may be helpful to autistic children because they can see the frame-by-frame development of an emotional expression. Recognition of and appropriate reaction to emotional affect may be useful for everyone, but, as stated earlier, for autistic individuals it is a barrier to participating in society. The potential for using computing and physical interfaces in therapy is the heart of this work.

The intervention programs with the most success are usually developed from behavior-analytic approaches (Maurice 96). By breaking learning down into small, separate parts of a task, success in learning that behavior is achieved over time. The time demands on these untiring practitioners are intensive; they often spend many hours with one child, patiently showing the child sets of pictures and instructing him or her in highly repetitive tasks.

A visually based therapeutic system that entertains and educates may be used to augment the practitioner’s training. Such a system can emphasize interaction and facilitate this intervention style of teaching; it can provide multiple views of the content inexhaustibly. While a child is enjoying the system, the practitioner is free to spend time on other aspects of the child’s development, such as behavioral procedures for future sessions.

Because people with autism are visual thinkers, providing them with contextual scenarios that correlate with an emotional behavior, as well as depictions of various emotional expressions, may contribute to building their own memory databases, as Temple Grandin describes doing.

Cognitively, actors are able to mimic affect when they portray a character. They learn how to communicate and express themselves in a social way, so as to be believed by their audience. A more advanced actor will use a personal memory to evoke an emotion on demand (Hagan 73). Autistic children can adopt this same approach to acting even if they do not internally feel the emotion, as long as they can simulate the correct expression. This process can be compared to learning to communicate in a second language. We are not proposing to modify the cognitive processing an autistic child uses to display emotional expressions, but offer for consideration an alternative approach to help him or her integrate socially at will.

