ebook img

DTIC ADA434716: How to Build Robots that Make Friends and Influence People PDF

6 Pages·3.7 MB·English
Save to my drive
Quick download
Download
Most books are stored in the elastic cloud where traffic is expensive. For this reason, we have a limit on daily download.

Preview DTIC ADA434716: How to Build Robots that Make Friends and Influence People

How to build robots that make friends and influence people Cynthia Breazeal Brian Scassellati cynthia'ai .mit. edu scaz~ai .mit. edu MIT Artificial Intelligence Lab 545 Technology Square Cambridge, MA 02139 Abstract Using the development of human infants as a guide- line, we have been building a robot that can interact In order to interact socially with a human, a robot socially with people. must convey intentionality, that is, the human must From birth, an infant responds with various innate believe that the robot has beliefs, desires, and inten- proto-social responses that allow him to convey sub- tions. We have constructed a robot which exploits jective states to his caregiver. Acts that make internal natural human social tendencies to convey intention- processes overt include focusing attention on objects, ality through motor actions and facial expressions. We orienting to external events, and handling or explor- present results on the integration of perception, atten- ing objects with interest [14]. These responses can be tion, motivation, behavior, and motor systems which divided into four categories. Affective responses al- allow the robot to engage in infant-like interactions low the caregiver to attribute feelings to the infant. with a human caregiver. Exploratory responses allow the caregiver to attribute curiosity, interest, and desires to the infant, and can be used to direct the interaction to objects and events in 1 Introduction the world. Protective responses keep the infant away from damaging stimuli and elicit concerned and car- Other researchers have suggested that in order to ing responses from the caregiver. Regulatory responses interact socially with humans, a software agent must maintain a suitable environment for the infant that is be believable and life-like, must have behavioral con- neither overwhelming nor under-stimulating. sistency, and must have ways of expressing its internal These proto-social responses enable the adult to in- states [2, 3]. A social robot must also be extremely ro- terpret the infant's actions as intentional. For exam- bust to changes in environmental conditions, flexible ple, Trevarthen found that during face-to-face interac- in dealing with unexpected events, and quick enough tions, mothers rarely talk about what needs to be done to respond to situations in an appropriate manner [6]. to tend to their infant's needs. Instead, nearly all the If a robot is to interact socially with a human, the mothers' utterances concerned how the baby felt, what robot must convey intentionality, that is, the robot the baby said, and what the baby thought. The adult must make the human believe that it has beliefs, de- interprets the infant's behavior as communicative and sires, and intentions [8]. To evoke these kinds of be- meaningful to the situation at hand. Trevarthen con- liefs, the robot must display human-like social cues cludes that whether or not these young infants are and exploit our natural human tendencies to respond aware of their consequences of their behavior, that is, socially to these cues. whether or not they have intent, their actions acquire Humans convey intent through their gaze direction, meaning because they are interpreted by the caregiver posture, gestures, vocal prosody, and facial displays. in a reliable and consistent way. Human children gradually develop the skills necessary The resulting dynamics of interaction between care- to recognize and respond to these critical social cues, giver an infant is surprisingly natural and intuitive which eventually form the basis of a theory of mind very much like a dialog, but without the use of natu- [1]. These skills allow the child to attribute beliefs, ral language (sometimes these interactions have been goals, and desires to other individuals and to use this called proto-dialogs). Tronick, Als, and Adamson [15] knowledge to predict behavior, respond appropriately identify five phases that characterize social exchanges to social overtures, and engage in communicative acts. between three-month-old infants and their caregivers: DISTRIBUTION STATEMENT A Approved for Public Release Distribution Unlimited U, " Happiness Sadness Surprise Anger Calm Displeasure Figure 1: Overview of the software architecture. Percep- tion, attention, internal drives, emotions, and motor skills are integrated to provide rich social interactions. Fear Interest Boredom initiation, mutual-orientation, greeting, play-dialog, and disengagement. Each phase represents a collec- Figure 2: Kismet, a robot capable of conveying intention- tion of behaviors which mark the state of the comn- ality through facial expressions and behavior. munication. The exchanges are flexible and robust; a particular sequence of phases may appear multiple Kismet has fifteen degrees of freedom in facial features, times within a given exchange, and only the initiation including eyebrows, cars, eyehds, lips, and a mouth. and mutual orientation phases must always be present. i ne yebrows, eas eyeds lp and a th. The proto-sociTahl e rreeosstp-posooncssaeels s ooff hhummaann iinnffaanttss pllaayy Tvihseio np lasytfsotermm ; aelsaoc hh aesy ef ouhra sd eagnr eeins doefp efnredeednotm veinrt itchael a critical role in their social development. These re- axis of rotation (pan), the eyes share a joint horizon- sponses enable the infant to convey intentionality to tal axis of rotation (tilt), and a one degreae of freedom the caregiver, which encourages the caregiver to en- neck (pan). Each eyeball has an embedded color CCD gage htim as a social being and to establish natural and ck (pan). E mf ocal h . Kismed colorhCCD flexible dialog-ke exchanges. For a robot, the abil-length. Kismet is attached fibltoconveyin tentional exchanitygtherosgg.Fh-loikte, - to a parallel network of eight 50MHz digital signal pro- ity to convey intentionaclty through infant-like proto- cessors (Texas Instruments TMS320C40) which han- social responses could be very useful in establishing dle image processing and two Motorola 68332-based natural, intuitive, flexible, and robust social exchanges microcontrollers which process thMe otivational sys- with a human. To explore this question, we have con- twm. structed a robot called Kismet that performs a variety of proto-social responses (covering all four categories) 2.1 Perception and Attention Systems by means of several natural social cues (including gaze direction, posture, and facial displays). These consid- erations have influenced the design of our robot, from humain sow a preferen Fsotrim u its physical appearance to its control architecture (see exhibit certain low-level feature properties. For exam- Figure 1). We present the design and evaluation of pie, a four-nionth-old infant is more likely to look at a thgues ss1 )We emsin trhete derisn ander.luatn o moving object than a static one, or a face-like object than one that has similar, but jumbled, features [10]. To mimic the preferences of human infants, Kismet's perceptual system combines three basic feature detec- 2 A Robot that Conveys Intentionality tors: face finding, motion detection, and color saliency analysis. The face finding system recognizes frontal Kismet is a stereo active vision system augmented views of faces within approximately six feet of the with facial features that can show expressions analo- robot under a variety of lighting conditions [121. The gous to happiness, sadness, surprise, boredom, anger, motion detection module uses temporal differencing calm, displeasure, fear, and interest (see Figure 2). and region growing to obtain bounding boxes of mov- Affect Space A&r sotaunac,- e "(cid:127);; _ :: Figure 4: Kismet's affective state can be represented as a point along three dimensions: arousal, valence, and stance. Figure 3: Kismet's attention and perception systems. Tcehnitse rasf faercet sshpoawcen ihse rdei.vided into emotion regions whose Low-level feature detectors for face finding, motion detec- tion, and color saliency analysis are combined with top- down motivational influences and habituation effects by 2.2 The Motivation System the attentional system to direct eye and neck movements. In these images, Kismet has identified two salient objects: The motivation system consists of drives and a face and a colorful toy block, emotions. The robot's drives represent the basic "n eeds" of the robot: a need to interact with people ing objects [5]. Color content is computed using an (the social drive), a need to be stimulated by toys opponent-process model that identifies saturated ar- and other objects (the stimulation drive), and a need eas of red, green, blue, and yellow [4]. All of these for rest (the fatigue drive). For each drive, there is a systems operate at speeds that, are amenable to social desired operation point, and an acceptable bounds of interaction (20-30Hz). operation around that point (the homeostatic regime). Low-level perceptual inputs are combined with Unattended, drives drift toward an under-stimulated high-level influences from motivations and habitua- regime. Excessive stimulation (too many stimuli or tion effects by the attention system (see Figure 3). stimuli moving too quickly) push a drive toward an This system is based upon models of adult human vi- over-stimulated regime. When the intensity level of sual search and attention [16], and has been reported the drive leaves the homeostatic regime, the robot be- previously [4]. The attention process constructs a lin- comes motivated to act in ways that will restore the ear combination of the input feature detectors and a drives to the homeostatic regime. time-decayed Gaussian field which represents habitua- The robot's emotions are a result of its affective tion effects. High areas of activation in this composite state. The affective state of the robot is represented generate a saccade to that location and compensatory as a point along three dimensions: arousal (i.e. high, neck movement. The weights of the feature detectors neutral, or low), valence (i.e. positive, neutral, or neg- can be influenced by the motivational and emotional ative), and stance (i.e. open, neutral, or closed) [9]. state of the robot to preferentially bias certain stimuli. The affective state is computed by summing contri- For example, if the robot is searching for a playmate, butions from the drives and behaviors. Percepts may the weight of the face detector can be increased to also indirectly contribute to the affective state through cause the robot to show a preference for attending to the releasing mechanisms. Each releasing mechanism faces. has an associated somatic marker processes, which as- Perceptual stimuli that are selected by the attention signs arousal, valence and stance tags to each releasing process axe classified into social stimuli (i.e. people, mechanism (a technique inspired by Damasio [7]). which move and have faces) which satisfy a drive to To influence behavior and evoke an appropriate fa- be social and non-social stimuli (i.e. toys, which move cial expression, the affect-space is divided into a set of and are colorful) which satisfy a drive to be stimulated emotion regions (see Figure 4). Each region is char- by other things in the environment. This distinction acteristic of a particular emotions in humans. For ex- can be observed in infants through a preferential look- ample, happiness is characterized by positive valence ing paradigm [14]. The percepts for a given classifica- and neutral arousal. The region whose center is closest tion are then combined into a set of releasing mecha- to the current affect state is considered to be active. nisms which describe the minimal percepts necessary The motivational system influences the behavior se- for a behavior to become active [11, 131. lection process and the attentional selection process heterogeneous hierarchy as shown in Figure 5. At each level, behaviors are grouped into cross erchSion w egroups (CEGs) which represent competing strategies Sssfor satisfying the goal of the parent [3]. Within a CEG, a winner-take-all competition based on the cur- rent state of the emotions, drives, and percepts is held. The winning behavior may pass activation to its chil- P r o y Toy dren (level 0 and 1 behaviors) or activate motor skill behaviors (level 2 behaviors). Winning behaviors may S7also influence the current affective state, biasing to- wards a positive valence when the behavior is being ooapplied successfully and towards a negative valence when the behavior is unsuccessful. Competition between behaviors at the top level Figure 5: Kismet's behavior hierarchy consists of three (level 0) represents selection at the global task level. levels of behaviors. Top level behaviors connect directly Level 0 behaviors receive activation based on the to drives, and bottom-level behaviors produce motor re- strength of their associated drive. Because the sa- sponses. Cross exclusion groups (CEG) conduct winner- s . take-all competitions to allow only one behavior in the tiating stimuli for eac drive are mutually exclusive group to be active at a given time. and require different behaviors, all level 0 behaviors are members of a single CEG. This ensures that the robot can only act to restore one drive at a time. based upon the current active emotion. The active Competition between behaviors within the active emotion also provides activation to an affiliated ex- level 1 CEG represents strategy decisions. Each level pressive motor response for the facial features. The 1 behavior has its own distinct winning conditions intensity of the facial expression is proportional to based on the current state of the percepts, drives, and the distance from the current point in affect space to emotions. For example, the avoid person behavior the center of the active emotion region. For example, is the most relevant when the social drive is in the when in the sadness region, the motivational system overwhelmed regime and a person is stimulating the applies a positive bias to behaviors that seek out peo- robot too vigorously. Similarly, seek person is rele- ple while the robot displays an expression of sadness. vant when the social drive is in the under-stimulated regime and no face percept is present. The engage 2.3 The Behavior System person behavior is relevant when the social drive is already in the homeostatic regime and the robot is re- We have previously presented the application of ceiving a good quality stimulus. To preferentially bias Kismet's motivation and behavior systems to regulat- the robot's attention to behaviorally relevant stimuli, ing the intensity of social interaction via expressive the active level 1 behavior can adjust the feature gains displays [5]. We have extended this work with an of the attention system. elaborated behavior system so that Kismet exhibits Competition between level 2 behaviors represents key infant-like responses that most strongly encour- sub-task divisions. For example, when the seek age the human to attribute intentionality to it. The person behavior is active at level 1, if the robot can robot's internal state (emotions, drives, concurrently see a face then the orient to face behavior is ac- active behaviors, and the persistence of a behavior) tivated. Otherwise, the look around behavior is ac- combines with the perceived environment (as inter- tive. Once the robot orients to a face, bringing it into preted thorough the releasing mechanisms) to deter- mutual regard, the engage person behavior at level mine which behaviors become active. Once active, a 1 becomes active. The engage person behavior acti- behavior can influence both how the robot moves (by vates its child CEG at level 2. The greet behavior be- influencing motor acts) and the current facial expres- comes immediately active since the robot and human sion (by influencing the arousal and valence aspects are in mutual regard. After the greeting is delivered, of the emotion system). Behaviors can also influence the internal persistence of the greet behavior decays perception by biasing the robot to attend to stimuli and allows the play behavior to become active. Once relevant to the task at hand. the satiatory stimulus (in this case a face in mutual Behaviors are organized into a loosely layered, regard) has been obtained, the appropriate drive is adjusted according to the quality of the stimulus. Avoidance Behavior I 2.4 The M otor System ,_0_..... ... - - - Srimation DOw The motor system receives input from both the .oo E Do, - " emotion system and the behavior system. The emo- -2oo _ Sek Toy___ ",- tion system evokes facial expressions corresponding to 5 , 2 T _ (O W.)34 the currently active emotion (anger, boredom, dis- pleasure, feraor,. happiness, interest, sadness, surprise, or calm). Level 2 behaviors evoke motor skills includ-. ... iog lm). around which moves the eyes to obtain a AoA new visual scene, look away which moves the eyes Do- ue and neck to avoid a noxious stimulus, greet which - - wiggles the ears while fixating on a persons face, and o , s0 Is 2 25 36 45 5W orient which produces a neck movement with com- j pensatory eye movement to place an object in mutual .. regard. 3 Mechanics of Social Exchange The software architecture described above has al- wed usotoairme pl itemetaure dsclses abofe hasoalre- Figure 6: Kismet's response to excessive stimulation. Be- lowed u-, to implement all four classes of social re- haviors and drives (top), emotions (middle), and motor sponses on Kismet. The robot displays affective re- output (bottom) are plotted for a single trial of approxi- sponses by changing facial expressions in response to mately 50 seconds. See text for description. stimulus quality and internal state. A second class of affective response results when the robot expresses preference for one stimulus type. Exploratory re- becomes first displeased and then fearful as the stim- sponses include visual search for desired stimuli and ulation drive moves into the overwhelmed regime. Af- maintenance of mutual regard. Kismet currently has ter extreme over-stimulation, a protective avoidance a single protective response, which is to turn its head response produces a large neck movement (t = 44) and look away from noxious or overwhelming stimuli. which removes the toy from the field of view. Once Finally, the robot has a variety of regulatory responses the stimulus has been removed, the stimulation drive including: biasing the caregiver to provide the ap- begins to drift back to the homeostatic regime (one of propriate level of interaction through expressive feed- the many regulatory responses in this example). back; the cyclic waxing and waning of affective, atten- To evaluate the effectiveness of conveying intention- tive, and behavioral states; habituation to unchanging ality (via the robot's proto-social responses) in estab- stimuli; and generating behaviors in response to inter- lishing intuitive and flexible social exchanges with a nal motivational requirements. person, we ran a variant. of a social interaction ex- Figure 6 plots Kismet's responses while interact- periment. Figure 7 plots Kismet's dynamic responses ing with a toy. All four response types can be ob- during face-to-face interaction with a caregiver in one served in this interaction. The robot begins the trial trial. This architecture successfully produces interac- looking for a toy and displaying sadness (an affec- tion dynamics that are similar to the five phases of tive response). The robot immediately begins to move infant social interactions described in [151. Kismet is its eyes searching for a colorful toy stimulus (an ex- initially looking for a person and displaying sadness ploratory response) (t < 10). When the caregiver (the initiation phase). The robot begins moving its presents a toy (t - 13), the robot engages in a play eyes looking for a face stimulus (t < 8). When it behavior and the stimulation drive becomes satiated finds the caregiver's face, it makes a large eye move- (t ;-, 20). As the caregiver moves the toy back and ment to enter into mutual regard (t - 10). Once the forth (20 < t < 35), the robot moves its eyes and neck face is foveated, the robot displays a greeting behav- to maintain the toy within its field of view. When ior by wiggling its ears (t ; 11), and begins a play- the stimulation becomes excessive (t P 35). the robot dialog phase of interaction with the caregiver (t > 12). Phases of Social Interaction This reliance on the external world produces dynamic o --, -- behavior that is both flexible and robust. Our future loop ,work will focus on measuring the quality of the inter- - .. " - .actions as perceived by the human caregiver and on Greet People Behenabling the robot to learn new behaviors and skills En~(cid:127) people e ~ o- 2 ToyB h- which facilitate interaction. 0 20 40 60 so 100 120 ' References 2 ! ....... ___[,," 1Ss. Baron-Cohen. Mindblindness. MIT Press, 1995. -100 HSpalm5.S6 12] J. Bates. The role of emotion in believable characters. HE- Ire-n Communications of the ACM, 1994. 0 20 (cid:127)40 60 60 , 2 [3] B. Blumberg. Old Tricks, New Dogs: Ethology and _ __ _ _ _Interactive Creatures. PhD thesis, MIT, 1996. o[4 C. Breazeal and B. Scassellati. A context-dependent attention system for a social robot. In 1999 Inter- 0 national Joint Conference on Artificial Intelligence, SR 1999. Submitted. 15] C. Breazeal and B. Scassellati. Infant-like social in- 0 20 ,0 Go So 100 120 teractions between a robot and a human caretaker. To. 0oo Adaptive Behavior, 8(1), 2000. To appear. Figure 7: Cyclic responses during social interaction. Be- tR6.1 Brooks. Challenges for complete creature archi- haviors and drives (top), emotions (middle), and motor output (bottom) are plotted for a single trial of approxi- Behavior (B90), 1990. mately 130 seconds. See text for description. 171 A. R. Damasio. Descartes' Error. G.P. Putnam's Sons, New York, 1994. [8] K. Dautenhahn. Ants don't have friends -thoughts Kismet continues to engage the caregiver until the on socially intelligent agents. Technical report, AAAI caregiver moves outside the field of view (t : 28). Technical Report FS 97-02, 1997. Kismet quickly becomes sad, and begins to search for 19] P. Ekman and R. Davidson. The Nature of Emotion: a face, which it re-acquires when the caretaker returns Fundamental Questions. Oxford University Press, (t ;i 42). Eventually, the robot habituates to the in- New York, 1994. teraction with the caregiver and begins to attend to [10] J. F. Fagan. Infants' recognition of invariant features a toy that the caregiver has provided (60 < t < 75). of faces. Child Development, 47:627-638, 1976. While interacting with the toy, the robot displays in- 111] K. Lorenz. Foundations of Ethology. Springer-Verlag, terest and moves its eyes to follow the moving toy. New York, NY, 1973. Kismet soon habituates to this stimulus, and returns [12] B. Scassellati. Finding eyes and faces with a foveated to its play-dialog with the caregiver (75 < t < 100). vision system. In Proceedings of the American Asso- A final disengagement phase occurs (t ; 100) as the ciation of Artificial Intelligence (AAAI-98), 1998. robot's attention shifts back to the toy. [13] N. Tinbergen. The Study of Instinct. Oxford Univer- In conclusion, we have constructed an architecture sity Press, New York, 1951. for an expressive robot which enables four types of so- [14] C. Trevarthen. Communication and cooperation in cial responses (affective, exploratory, protective, and early infancy: a description of primary intersubjec- regulatory). The system dynamics are similar to the tivity. In M. Bullowa, editor, Before Speech, pages phases of infant-caregiver interaction [15]. These dy- 321-348. Cambridge University Press, 1979. naniic phases are not explicitly represented in the soft- [151 E. Tronick, H. Als, and L. Adamson. Structure ware architecture, but instead are emergent properties of early face-to-face communicative interactions. In of the interaction of the control systems with the envi- M. Bullowa, editor, Before Speech, pages 349-370. ronment. By producing behaviors that convey inten- Cambridge University Press, 1979. tionality, we exploit the caregiver's natural tendencies [16] J. M. Wolfe. Guided search 2.0: A revised model to treat the robot as a social agent, and thus to re- of visual search. Psychonomic Bulletn H Review, spond in characteristic ways to the robot's overtures. 1(2):202-238, 1994.

See more

The list of books you might like

Most books are stored in the elastic cloud where traffic is expensive. For this reason, we have a limit on daily download.