Advances in Human-Computer Interaction
Volume 2014 (2014), Article ID 630808, 14 pages
http://dx.doi.org/10.1155/2014/630808
Research Article

Designing of a Personality Based Emotional Decision Model for Generating Various Emotional Behavior of Social Robots

Ho Seok Ahn

Robotics Research Group, ECE, Science Building 301, University of Auckland, 38 Princess Street, Auckland 1010, New Zealand

Received 21 March 2013; Revised 4 October 2013; Accepted 21 October 2013; Published 5 January 2014

Academic Editor: Armando Bennet Barreto

Copyright © 2014 Ho Seok Ahn. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

All humans feel emotions, but individuals express them differently because each has a different personality. We design an emotional decision model that focuses on the personality of individuals. The personality-based emotional decision model is built from four linear dynamics: the reactive dynamic system, the internal dynamic system, the emotional dynamic system, and the behavior dynamic system. Each dynamic system calculates output values that reflect the personality, which is encoded in the system matrices, input matrices, and output matrices. These responses are reflected in the final emotional behavior through the behavior dynamic system, as in humans. The final emotional behavior contains multiple emotional values, so a social robot can show various emotional expressions. We perform experiments using a cyber robot system to verify that the personality-based emotional decision model generates various emotions according to the personality.

1. Introduction

Social robots communicate with various humans in our daily life environment. They play with humans as toys and friends [1–9], provide information as guides [10, 11] or teachers [12], and help humans [13, 14]. In this process, emotional communication is the key to making humans regard robots as friends. For this reason, researchers have designed artificial emotional systems and social robots. Breazeal designed a three-dimensional emotional space model that consists of arousal, valence, and stance, and developed Kismet and Leonardo [15, 16]. Miwa et al. proposed a three-dimensional emotional space model that consists of activation, pleasantness, and certainty [17]; the emotional vector, calculated by a quadratic differential equation, decides the final emotion of the humanoid robot WE-4RII. Lee et al. proposed a linear affect-expression space model that consists of surprise, anger, and sadness for Doldori [18]. Karg et al. designed an affect model based on a piecewise linear system to model transitions of affect [19]. Kanoh et al. proposed an emotional model using a three-dimensional emotional space for ifbot [20]. Becker-Asano and Wachsmuth designed the “WASABI” affect simulation architecture, which uses a three-dimensional emotion space called PAD (Pleasure-Arousal-Dominance) space [21]. In these studies, social robots generate and express their emotions in Human-Robot Interaction.

Emotions are known to depend on contextual and cultural background. The emotional systems of human beings produce different emotions and behaviors under the same external stimuli. One of the reasons for these differences is that humans have different personalities; personality influences most emotional processes. As social robots communicate with unspecified persons, they should generate and express their emotions in accordance with their own personalities. For this reason, researchers have studied emotional models focused on personality. Wilson introduced a three-layer model having momentary emotions, mood, and personality, each with different properties, such as priority and duration [22]. Kshirsagar and Magnenat-Thalmann suggested a system that gives an emotional virtual human a personality using a five-factor model [23]. Ushida et al. compared three characters having different personalities when given the same stimuli, and analyzed the influence of personality on stimuli and behavior decision [24]. Bates and Reilly introduced a calculated emotional model having sociality [25, 26]; their emotional models calculate the rules for making behaviors according to the robot's personality, using external inputs. These personality-focused studies show that the personality of an artificial emotion system is an essential element for generating various emotions.

In this paper, we design a personality-based emotional decision model that generates different emotional behavior in accordance with personality. Humans feel emotions, understand situations, and express their emotions differently, according to their cultural background, age, and gender [27, 28]. Self-disciplined people can control their emotions more easily than undisciplined people. For example, children and males often find it harder to control their emotions, although this depends on their personal propensities. Focusing on this aspect, we design an emotional system with a personality for generating various emotions. In the designed emotional system, emotional behavior is generated by linear dynamic systems with personality matrices, and the final emotional behavior changes with the personality matrices.

The main purpose of the designed personality-based emotional decision model is an artificial emotional model that is intuitive to use, simply by changing personality parameters, and that generates various emotional statuses according to personality. To this end, we designed an emotional model with dynamics in which the user sets only the personality parameters, independently of each other; the user can set the nine personality matrices of our model separately. Also, each property of personality can easily be added or removed without changing the other personality parameters. One more unique contribution of our research is that our model generates all emotional statuses, rather than selecting only one emotion, which leads to various emotional statuses and expressions.

Age and gender are good examples for observing differences in personality. Therefore, we obtained the personality parameters of four groups (young males, young females, senior males, and senior females) to see how the generated emotional statuses and expressions differ according to age and gender.

This paper is organized as follows. In Section 2, we explain the personality-based emotional decision model. In Section 3, we introduce a cyber robot system used for experiments, and experimental settings. In Section 4, we present experimental results and discussions. Finally, we conclude this paper in Section 5.

2. Personality-Based Emotional Decision Model

Most research on artificial emotional models uses affect dimensional models or space models for defining emotions. For using such models, we need to consider two issues: one is defining the position of emotions in the affect space, and the other is transitioning the emotional status from the external/internal situation into the space model. However, previous models use different space models, because an objectively qualified theory for identifying the concept of basic emotions and a dimensional model of personality does not yet exist. Some models use dynamics for calculating the emotional status or transitioning the affect status, for example, the research by Miwa et al. [17], Lee et al. [18], and Karg et al. [19].

We design a personality-based emotional decision model that generates various emotions and behaviors even when the external situation is the same. The designed model generates various emotions by changing the personality, which is the character of the emotional model. The personality-based emotional decision model is designed based on our previous model and includes linear dynamics to be general-purpose, for application to various robots having different purposes, without requiring redesign of the emotional decision model [29–32]. The designed model generates multiple emotions as the final emotions and shows various emotional behaviors. All emotional statuses, such as internal status, emotion, and behavior, have interrelationships in the designed emotional model.

2.1. Personalities in Emotional Decision Model

Figure 1 shows the designed personality-based emotional decision model, which consists of five parts: reactive dynamic system, internal dynamic system, emotional dynamic system, behavior dynamic system, and personality. Personality is related to all calculations of the dynamic systems and tunes emotional characters, such as the durability and sensitivity of emotional elements. There are various external stimuli, for example, facial information, gestures, speech, and various sensor data. When external stimuli come in, the personality-based emotional decision model decides whether the external stimuli influence the reactive dynamics, the internal dynamics, or both. The reactive dynamics causes unconscious reactions, such as a sigh or yelp. It is not related to the emotional dynamics but influences the behavior dynamics directly. Some personality components, such as patience and composure, are used in the reactive dynamics. The internal dynamics, as opposed to the reactive dynamics, causes conscious reactions. It uses the memory of the internal status, such as friendliness, love, and discomfort index, and influences the emotional status. Sensitivity is used in the internal dynamics for the personalities.

Figure 1: The personality-based emotional decision model, which consists of four dynamics and nine personality matrices.

The emotional dynamics is related to the internal dynamics and influences the behavior dynamics. It uses the results of the internal dynamics, together with the previous emotional status, as its input data. Discipline is used for the personalities in the emotional dynamics. The results of the emotional dynamics are not the final emotion of the model but its internal emotion. The final emotion of the model is decided by the behavior dynamics, which uses personality components such as perseverance and the power of discipline, which means the controllability of behavior expression. The personality is the most important element in generating different emotions and behaviors for various kinds of robots, even when the external stimuli and internal dynamics remain unchanged. It is related to all calculations of the personality-based emotional decision model and can be set by modifying the values of the column vectors.

2.2. Five Emotional Parts

The emotions and external sensors of robots differ according to the purposes of the robots. To apply an emotional decision model easily to various purposes, we design the personality-based emotional decision model based on linear dynamic systems that can easily change the number of emotional elements. Each emotional element is formed by a column vector, and each process of the personality-based emotional decision model is defined by a state dynamic equation. The personality-based emotional decision model comprises the following four dynamics: the reactive dynamics, the internal dynamics, the emotional dynamics, and the behavior dynamics.
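As a minimal sketch of this state-equation idea (our illustration, not the authors' implementation; all dimensions and numeric values are assumed), one update of any of the dynamics can be written in a few lines of Python:

import numpy as np

def linear_step(x, u, A, B):
    # One update of a linear dynamics: x(k+1) = A x(k) + B u(k).
    # x: state vector (previous result of this dynamics)
    # u: input vector (external or internal stimuli)
    # A: system matrix (durability of the previous status)
    # B: input matrix (sensitivity to the stimuli)
    return A @ x + B @ u

# Illustrative reactive dynamics: 2 reaction elements, 3 external stimuli.
A_R = np.diag([0.3, 0.5])                 # diagonal, entries in [0, 1]
B_R = np.array([[0.8, 0.0, 0.1],
                [0.0, 0.6, 0.2]])         # entries in [0, 1]
x_R = np.zeros(2)                         # no reaction yet
u_R = np.array([1.0, 0.0, 0.0])           # the first stimulus fires
x_R = linear_step(x_R, u_R, A_R, B_R)     # -> [0.8, 0.0]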

2.2.1. Reactive Dynamics

The reactive dynamics is the relation between the external stimuli and the reaction. It is described as follows:

x_R(k+1) = A_R x_R(k) + B_R u_R(k),  (1)

where x_R(k) is the state vector determined by the previous result of the reactive dynamics, u_R(k) is the external stimuli related to unconsciousness, A_R is the system matrix determined by the durability of the unconscious reaction, and B_R is the input matrix determined by the sensitivity of the unconscious reaction, described as follows:

B_R = [b_R(i,j)],  i = 1, ..., n,  j = 1, ..., m.  (2)

In (1), A_R is expressed as a diagonal matrix whose diagonal elements have values between 0 and 1. In (2), the elements of B_R have values between 0 and 1. When the external stimuli have no effect on the reactive dynamics, B_R becomes the zero matrix. If A_R and B_R become zero matrices, the robot does not have any unconscious reaction. If an element of B_R has a negative value, the correlated external stimulus reduces its effect on the correlated reactive dynamics.

2.2.2. Internal Dynamics

The internal dynamics is the relation between the external stimuli and the internal status. It is described as follows:

x_I(k+1) = A_I x_I(k) + B_I u_I(k),  (3)

where x_I(k) is the state vector determined by the previous result of the internal dynamics, u_I(k) is the external stimuli related to consciousness, A_I is the system matrix determined by the durability of the conscious reaction, and B_I is the input matrix determined by the sensitivity of the conscious reaction, described as follows:

B_I = [b_I(i,j)],  i = 1, ..., n,  j = 1, ..., m.  (4)

In (3), A_I is expressed as a diagonal matrix whose diagonal elements have values between 0 and 1. In (4), the elements of B_I have values between 0 and 1. When the external stimuli have no effect on the internal dynamics, B_I becomes the zero matrix. If A_I and B_I become zero matrices, the robot does not have any conscious reaction. If an element of B_I has a negative value, the correlated external stimulus reduces its effect on the correlated internal dynamics.

2.2.3. Emotional Dynamics

The emotional dynamics is the relation between the internal stimuli and the emotional status. It is described as follows:

x_E(k+1) = A_E x_E(k) + B_E x_I(k),  (5)

where x_E(k) is the state vector determined by the previous result of the emotional dynamics, x_I(k) is the internal stimuli, A_E is the system matrix determined by the durability of the emotional status, and B_E is the input matrix determined by the sensitivity to the internal stimuli, described as follows:

B_E = [b_E(i,j)],  i = 1, ..., n,  j = 1, ..., m.  (6)

In (5), A_E is expressed as a diagonal matrix whose diagonal elements have values between 0 and 1. In (6), the elements of B_E have values between 0 and 1. When the internal stimuli have no effect on the emotional dynamics, B_E becomes the zero matrix. If A_E and B_E become zero matrices, the robot does not have any emotion.

2.2.4. Behavior Dynamics

The behavior dynamics is the relation between the behavior and the statuses of the reactive and the emotional dynamics. It is described as follows:

y_B(k+1) = A_B y_B(k) + C_R x_R(k) + C_E x_E(k),  (7)

where y_B(k) is the output vector determined by the previous result of the behavior dynamics, A_B is the system matrix determined by the durability of emotional behaviors, x_R(k) is the reactive stimuli, which are related to the unconscious process, and C_R is the output matrix determined by the sensitivity of the reactive stimuli, described as follows:

C_R = [c_R(i,j)].  (8)

Here, x_E(k) is the emotional status, which is related to the conscious process, and C_E is the output matrix determined by the sensitivity of the emotional status, described as follows:

C_E = [c_E(i,j)].  (9)

In (7), A_B is expressed as a diagonal matrix whose diagonal elements have values between 0 and 1. The elements of C_R in (8) and the elements of C_E in (9) have values between 0 and 1. When the emotional and the reactive dynamics have no effect on the behavior dynamics, C_R and C_E become zero matrices. If A_B, C_R, and C_E become zero matrices, the robot does not have any behavior.

2.2.5. Personality Matrices

A_R and B_R in (1), A_I and B_I in (3), A_E and B_E in (5), and A_B, C_R, and C_E in (7) are the personality matrices. Consider the following:

A = {A_R, A_I, A_E, A_B},  (10)

B = {B_R, B_I, B_E},  (11)

C = {C_R, C_E}.  (12)

The set of system matrices A, as shown in (10), is determined by the personality about the durability of the previous status, that is, the feedback data. For example, in the case of the emotional dynamics, the system matrix A_E indicates the influence of the emotional dynamics at time k on the emotional dynamics at time k+1. The set of input matrices B, as shown in (11), and the set of output matrices C, as shown in (12), are determined by the personality about the sensitivities to the previous stimuli, that is, the results of the previous dynamics. For example, in the case of the behavior dynamics, the output matrix C_R indicates the influence of the reactive dynamics on the behavior dynamics, which is the unconscious influence, while the output matrix C_E indicates the influence of the emotional dynamics on the behavior dynamics, which is the conscious influence.
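Putting (1)–(12) together, the whole update loop can be sketched as follows; the class layout, symbol names, and routing of the stimuli are our assumptions for illustration, not the paper's code:

import numpy as np

class PersonalityEmotionModel:
    # Four coupled linear dynamics driven by the nine personality matrices:
    # A_R, A_I, A_E, A_B (durability), B_R, B_I, B_E (input sensitivity),
    # and C_R, C_E (output sensitivity); one reading of (1)-(12).

    def __init__(self, A_R, B_R, A_I, B_I, A_E, B_E, A_B, C_R, C_E):
        self.A_R, self.B_R = A_R, B_R     # reactive dynamics
        self.A_I, self.B_I = A_I, B_I     # internal dynamics
        self.A_E, self.B_E = A_E, B_E     # emotional dynamics
        self.A_B = A_B                    # behavior dynamics
        self.C_R, self.C_E = C_R, C_E     # unconscious / conscious outputs
        self.x_R = np.zeros(A_R.shape[0])
        self.x_I = np.zeros(A_I.shape[0])
        self.x_E = np.zeros(A_E.shape[0])
        self.y_B = np.zeros(A_B.shape[0])

    def step(self, u_R, u_I):
        # u_R, u_I: external stimuli routed to the reactive and the
        # internal dynamics, respectively (either may be all zeros).
        self.x_R = self.A_R @ self.x_R + self.B_R @ u_R              # (1)
        self.x_I = self.A_I @ self.x_I + self.B_I @ u_I              # (3)
        self.x_E = self.A_E @ self.x_E + self.B_E @ self.x_I         # (5)
        self.y_B = (self.A_B @ self.y_B + self.C_R @ self.x_R
                    + self.C_E @ self.x_E)                           # (7)
        return self.y_B   # multiple emotional behavior values at once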

2.3. Behavior Generation Process

For emotional expression, we should generate behaviors appropriate to the final emotions. However, the final emotions are a mixture of every emotional behavior value, and emotional expression differs with the specifications of robot systems. Therefore, we use a behavior generation system that consists of three parts: behavior training sets, an emotional matrix generator, and a behavior combination generator [33]. The behavior training sets indicate the unit behaviors related to each emotion. The emotional matrix generator then calculates the emotional expression values of the unit behaviors. Finally, the behavior combination generator finds the best unit behavior combination, according to the multiple emotions calculated by the personality-based emotional decision model.
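A deliberately naive stand-in for that last step might look as follows (the actual combination generator is described in [33]; the equal-share matching rule here is purely our assumption):

import numpy as np

def choose_behaviors(target, unit_sets):
    # target:    desired multi-emotion vector, shape (n_emotions,)
    # unit_sets: one array per expressible part, each of shape
    #            (n_units, n_emotions), holding the emotional expression
    #            values of that part's unit behaviors
    share = target / len(unit_sets)       # each part covers an equal share
    chosen = []
    for units in unit_sets:
        errors = np.linalg.norm(units - share, axis=1)
        chosen.append(int(np.argmin(errors)))  # best-matching unit behavior
    return chosen                         # one unit-behavior index per part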

3. Experimental Environments

3.1. Cyber Robot System

We develop a cyber robot system to apply the personality-based emotional decision model. The cyber robot system, as shown in Figure 2, is a simulation system that uses various intelligent engines and cyber robot clients. The top left part of Figure 2 is the selected emotion model, which can be changed to other emotion models. The bottom right part of Figure 2 is the selected cyber robot system, which can also be changed to other cyber robot systems. We can choose each intelligent engine, such as the emotion model, the recognition module, and the cyber robot system, according to the purpose of the system. Real robot systems can also be used as the robot system of the cyber robot system. The cyber robot has five expressible parts: eyebrows, eyes, cheeks, mouth, and arms. Each expressible part has unit behaviors that can be shown through it, and one unit behavior from each expressible part is selected by the behavior generation system. If each expressible part has ten unit behaviors, the cyber robot can show 10^5 expressions.

Figure 2: The cyber robot simulator using the designed personality-based emotional decision model and a cyber robot system. Users can change the intelligent engine modules according to the purpose of the system [32].
3.2. Experimental Settings

The designed model uses nine personality matrices, which can be used to create people with various characteristics by changing the matrix values. We divide people into four groups: young males, senior males, young females, and senior females. The young groups include people under twenty, and the senior groups include people over forty. We set the personalities of the four groups as follows.

3.2.1. Dynamics Settings

We use the sensing results as the external stimuli, such as "meets a good or bad person," "listens to some words," and "a human hits the robot." The external stimuli, the reactive vector, the internal vector, the emotional vector, and the behavior vector are denoted u(k), x_R(k), x_I(k), x_E(k), and y_B(k), respectively.

3.2.2. Personality Matrices

We obtained human data from 44 people (15 young males, 8 senior males, 11 young females, and 10 senior females) to set the personality matrices, using a personality parameter collecting program. To obtain the human data, the program shows some situations to the participants, who then mark factors that reflect their personality. The values of the system matrices mean how much the value of the previous time is reflected in the value of the next time. The values of the input matrices and output matrices mean how much the input values are reflected in the output values.

Young Males. We set the personality matrices of young males based on the human data obtained from 15 people. Equations (18) to (21) show the system matrices A_R, A_I, A_E, and A_B, which are the durabilities of each dynamics. Tables 1, 2, 3, 4, and 5 show the input and output matrices, which are the sensitivities of each dynamics.

Table 1: The input matrix B_R of the sensitivity of unconscious reaction.
Table 2: The input matrix B_I of the sensitivity of conscious reaction.
Table 3: The input matrix B_E of the sensitivity of internal stimuli.
Table 4: The output matrix C_R of the sensitivity of reactive stimuli.
Table 5: The output matrix C_E of the sensitivity of emotional status.


Senior Males. We set the personality matrices of senior males based on the human data obtained from 8 people. Equations (22) to (25) show the system matrices A_R, A_I, A_E, and A_B, which are the durabilities of each dynamics. Tables 6, 7, 8, 9, and 10 show the input and output matrices, which are the sensitivities of each dynamics.

Table 6: The input matrix B_R of the sensitivity of unconscious reaction.
Table 7: The input matrix B_I of the sensitivity of conscious reaction.
Table 8: The input matrix B_E of the sensitivity of internal stimuli.
Table 9: The output matrix C_R of the sensitivity of reactive stimuli.
Table 10: The output matrix C_E of the sensitivity of emotional status.


Young Females. We set the personality matrices of young females based on the human data obtained from 11 people. Equations (26) to (29) show the system matrices A_R, A_I, A_E, and A_B, which are the durabilities of each dynamics. Tables 11, 12, 13, 14, and 15 show the input and output matrices, which are the sensitivities of each dynamics.

Table 11: The input matrix B_R of the sensitivity of unconscious reaction.
Table 12: The input matrix B_I of the sensitivity of conscious reaction.
Table 13: The input matrix B_E of the sensitivity of internal stimuli.
Table 14: The output matrix C_R of the sensitivity of reactive stimuli.
Table 15: The output matrix C_E of the sensitivity of emotional status.


Senior Females. We set the personality matrices of senior females based on the human data obtained from 10 people. Equations (30) to (33) show the system matrices A_R, A_I, A_E, and A_B, which are the durabilities of each dynamics. Tables 16, 17, 18, 19, and 20 show the input and output matrices, which are the sensitivities of each dynamics.

Table 16: The input matrix B_R of the sensitivity of unconscious reaction.
Table 17: The input matrix B_I of the sensitivity of conscious reaction.
Table 18: The input matrix B_E of the sensitivity of internal stimuli.
Table 19: The output matrix C_R of the sensitivity of reactive stimuli.
Table 20: The output matrix C_E of the sensitivity of emotional status.


3.2.3. Scenario

We create a scenario for an external situation and compare the outputs of the emotional models. Our scenario is as follows:
(1) at 2 sec., good music is played for 3 seconds;
(2) at 5 sec., a good person is recognized;
(3) at 7 sec., the person tells the robot "I love you";
(4) at 11 sec., obstacles are detected;
(5) at 13 sec., the robot bumps against something;
(6) at 15 sec., a bad person is recognized;
(7) at 16 sec., the person hits the robot;
(8) at 17 sec., the person tells the robot "Away with you";
(9) at 20 sec., a good person is recognized;
(10) at 22 sec., the person tells the robot "Don't worry";
(11) at 24 sec., the robot is given some food.
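For illustration, such a scenario could be encoded as timed stimulus events and replayed into the model every 0.1 seconds (the update interval used in Section 4); the event names and their routing are hypothetical, not the paper's exact stimulus vectors:

# Hypothetical encoding of the scenario as timed stimulus events.
SCENARIO = [
    (2.0,  "good_music"),    (5.0,  "good_person"),
    (7.0,  "words_love"),    (11.0, "obstacle"),
    (13.0, "bump"),          (15.0, "bad_person"),
    (16.0, "hit"),           (17.0, "words_away"),
    (20.0, "good_person"),   (22.0, "words_dont_worry"),
    (24.0, "food"),
]

def stimuli_at(t, dt=0.1):
    # Return the names of events firing in the tick starting at time t;
    # each name would then set a component of u_R and/or u_I for that tick.
    return [name for (ts, name) in SCENARIO if ts <= t < ts + dt]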

4. Experimental Results and Discussions

We perform some experiments to verify that the personality-based emotional decision model generates various emotions according to age and gender. We generate the values of emotional behavior every 0.1 seconds.

4.1. Experimental Results

Figures 3, 4, 5, and 6 show the graphs of emotions generated by young males, senior males, young females, and senior females, respectively.

Figure 3: Graph of emotions generated by young males.
Figure 4: Graph of emotions generated by senior males.
Figure 5: Graph of emotions generated by young females.
Figure 6: Graph of emotions generated by senior females.

Figures 7, 8, 9, and 10 show the emotional expressions based on the emotions generated by young males, senior males, young females, and senior females, respectively.

Figure 7: The emotional expressions based on emotions generated by young males.
Figure 8: The emotional expressions based on emotions generated by senior males.
Figure 9: The emotional expressions based on emotions generated by young females.
Figure 10: The emotional expressions based on emotions generated by senior females.
4.2. Discussions

In the graph of young males' emotions, happiness and love were generated until 15 seconds. Senior males generated happiness and love similarly, but the generated emotions persisted longer than for young males. Both groups generated anger and other emotions at 20 seconds, but the duration of the emotions was longer in the graph of senior males. The same tendency appeared in the generated emotions of the females: happiness and love were generated first in both groups, but the duration of the emotions was longer in the graph of senior females, and likewise for the other emotions. We think the reason the durations of the emotions differ is that the personality matrices were set differently.

The durability of emotions is related to the system matrices A. The system matrix A_R, the durability of unconscious reaction, of young males has larger values than that of senior males; likewise, A_R of young females has larger values than that of senior females. On the other hand, the tendency is reversed for the other system matrices: A_I, the durability of conscious reaction; A_E, the durability of emotional status; and A_B, the durability of emotional behaviors. These system matrices of senior people have larger values than those of young people. The output matrices C are related to the reflection of unconsciousness and consciousness. The output matrix C_R, the sensitivity of unconsciousness, of young people has larger values than that of senior people. On the other hand, the output matrix C_E, the sensitivity of consciousness, of senior people has larger values than that of young people. This means that young people receive more new stimuli than senior people, but senior people are not easily swayed by new stimuli. From these experimental results, we can conclude that emotional behavior can be generated differently by the system matrices, which are a kind of personality.
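As an illustrative check of this durability effect (with made-up scalar numbers, not the measured matrices), consider a single emotion x_E with x_E(k+1) = a x_E(k) after one stimulus at k = 0:

x_E(k) = a^k x_E(0), \qquad a^k < 0.1 \iff k > \frac{\ln 0.1}{\ln a},

so a diagonal entry a = 0.5 lets the emotion decay below 10% of its initial value after about 3.3 steps, whereas a = 0.9 sustains it for about 21.9 steps; larger diagonal entries of the system matrices therefore directly lengthen the duration of emotions.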

We can also see differences in the generated emotions between males and females. Young and senior males generated anger more than the other emotions, whereas young and senior females generated fear more than the other emotions. Young people generated sadness more than senior people, and young people generated a greater variety of emotions than senior people. We think the reason the generated emotions differ is, again, that the personality matrices were set differently. The sensitivity of emotions is related to the input matrices B and the output matrices C. In particular, the values related to anger in the input matrix B_E, the sensitivity of internal stimuli, are larger for males than for females. On the other hand, the values related to fear in B_E are larger for females than for males. This means that the personality matrices are strongly involved in the emotional decision process, and the effects of personality are revealed in the designed personality-based emotional decision model.

We performed these experiments to verify the efficiency of the designed emotion system. Although we show that it is possible to generate various emotional behaviors according to personality based on human data, these experiments do not show the characteristics of humans according to age and gender. In our experiments, the robot with the personality of young people generated more dynamic emotional behaviors than the robot with the personality of senior people, the robot with the personality of males generated aggressive emotions such as anger, and the robot with the personality of females generated passive emotions such as fear. However, we obtained the personality matrix data from just a few people, and the measurement method of the human data for the personality matrices is not objectively proven. In this paper, we merely used the obtained human data to set the different personalities of the four groups and showed the different emotional expressions of the four groups. In the future, we will obtain human data from many people using objectively proven measurement methods and perform experiments to show the characteristics of humans according to age and gender.

5. Conclusions

Humans generate various emotions by complex emotional mechanisms, which are not yet clearly identified. Researchers have studied the emotional mechanisms of humans in psychology as well as in neuroscience. To provide social robots with friendly communication skills, robot researchers have designed artificial emotion systems for social robots based on these studies. In this paper, we focused on the personality of humans. Humans feel emotions, understand situations, and express emotions differently, depending upon their cultural background, educational background, family background, age, and gender. Humans process external stimuli into internal statuses differently, and have different abilities to control their feelings and express their behaviors. As the personalities of humans differ, humans feel and express their emotions differently. Therefore, we have designed the personality-based emotional decision model, which generates various emotions by changing the personality, which is the character of the emotional model.

The personality-based emotional decision model consists of five parts: reactive dynamics, internal dynamics, emotional dynamics, behavior dynamics, and personality. It is designed with four linear dynamics to be general-purpose, applicable to various robots having different purposes without redesign of the emotional decision model. In each dynamics, system matrices, input matrices, and output matrices are used for the characteristics of the dynamics that represent personalities. The designed model uses nine personality matrices and generates unconscious responses as well as conscious responses based on neuroscientific theories. These responses are reflected in the final emotional behavior, which consists of multiple emotions, as in humans. We performed some experiments using the cyber robot system to verify the efficiency of the designed model. We set up four experimental systems having different personalities using human data. As an experimental result, we confirmed that the personality-based emotional decision model generates various emotions by changing the personality. In the future, we will perform more experiments to show human characteristics according to age and gender using more plentiful human data. We will also design a learning mechanism for personality. This will provide a more useful and smart emotional system for social robots.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

References

  1. T. Shibata, K. Ohkawa, and K. Tanie, “Spontaneous behavior of robots for cooperation—emotionally intelligent robot system,” in Proceedings of the 13th IEEE International Conference on Robotics and Automation, vol. 3, pp. 2426–2431, Minneapolis, Minn, USA, April 1996.
  2. M. Fujita and H. Kitano, “Development of an autonomous quadruped robot for robot entertainment,” Autonomous Robots, vol. 5, no. 1, pp. 7–18, 1998.
  3. H. S. Ahn, I.-K. Sa, D.-W. Lee, and D. Choi, “A playmate robot system for playing the rock-paper-scissors game with humans,” Artificial Life and Robotics, vol. 16, no. 2, pp. 142–146, 2011.
  4. M. Fujita, “On activating human communications with pet-type robot AIBO,” Proceedings of the IEEE, vol. 92, no. 11, pp. 1804–1813, 2004.
  5. M. Haring, N. Bee, and E. Andre, “Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots,” in Proceedings of the 20th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN ’11), pp. 204–209, Atlanta, Ga, USA, August 2011.
  6. B. Gonsior, S. Sosnowski, C. Mayer et al., “Improving aspects of empathy and subjective performance for HRI through mirroring facial expressions,” in Proceedings of the 20th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN ’11), pp. 350–356, Atlanta, Ga, USA, August 2011.
  7. H. S. Ahn, D. W. Lee, D. Choi et al., “Development of an android for singing with motion capturing,” in Proceedings of the 37th Annual Conference of the IEEE Industrial Electronics Society (IECON ’11), pp. 63–68, 2011.
  8. H. S. Ahn, D. W. Lee, D. Choi, D. Y. Lee, H. Lee, and M. H. Baeg, “Development of an incarnate announcing robot system using emotional interaction with humans,” International Journal of Humanoid Robotics, vol. 10, no. 2, pp. 1–24, 2013.
  9. M. Scheeff, J. Pinto, K. Rahardja, S. Snibbe, and R. Tow, “Experiences with Sparky, a social robot,” in Socially Intelligent Agents, K. Dautenhahn, A. Bond, L. Cañamero, and B. Edmonds, Eds., vol. 3 of Multiagent Systems, Artificial Societies, and Simulated Organizations, pp. 173–180, Springer, New York, NY, USA, 2002.
  10. W. Burgard, A. B. Cremers, D. Fox et al., “The interactive museum tour-guide robot,” in Proceedings of the 15th National Conference on Artificial Intelligence, pp. 11–18, 1998.
  11. M. Bennewitz, F. Faber, D. Joho, M. Schreiber, and S. Behnke, “Towards a humanoid museum guide robot that interacts with multiple persons,” in Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots (ICHR ’05), pp. 418–423, Tsukuba, Japan, December 2005.
  12. J. Solis, M. Bergamasco, K. Chida, S. Isoda, and A. Takanishi, “The anthropomorphic flutist robot WF-4 teaching flute playing to beginner students,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA ’04), vol. 1, pp. 146–151, May 2004.
  13. H. S. Ahn, I. K. Sa, and J. Y. Choi, “PDA-based mobile robot system with remote monitoring for home environment,” IEEE Transactions on Consumer Electronics, vol. 55, no. 3, pp. 1487–1495, 2009.
  14. H. S. Ahn, Y. M. Beak, I. K. Sa, W. S. Kang, J. H. Na, and J. Y. Choi, “Design of reconfigurable heterogeneous modular architecture for service robots,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’08), pp. 1313–1318, Nice, France, September 2008.
  15. C. Breazeal, Designing Sociable Robots, MIT Press, Cambridge, Mass, USA, 2002.
  16. W. Dan Stiehl, L. Lalla, and C. Breazeal, “A ‘somatic alphabet’ approach to ‘sensitive skin’,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA ’04), vol. 3, pp. 2865–2870, May 2004.
  17. H. Miwa, K. Itoh, M. Matsumoto et al., “Effective emotional expressions with emotion expression humanoid robot WE-4RII: integration of humanoid robot hand RCH-1,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’04), vol. 3, pp. 2203–2208, October 2004.
  18. H. S. Lee, J. W. Park, and M. J. Chung, “A linear affect-expression space model and control points for mascot-type facial robots,” IEEE Transactions on Robotics, vol. 23, no. 5, pp. 863–873, 2007.
  19. M. Karg, S. Haug, K. Kühnlenz, and M. Buss, “A dynamic model and system-theoretic analysis of affect based on a piecewise linear system,” in Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN ’09), pp. 238–244, Toyama, Japan, October 2009.
  20. M. Kanoh, S. Kato, and H. Itoh, “Facial expressions using emotional space in sensitivity communication robot ‘ifbot’,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’04), vol. 2, pp. 1586–1591, October 2004.
  21. C. Becker-Asano and I. Wachsmuth, “Affect simulation with primary and secondary emotions,” in Intelligent Virtual Agents, H. Prendinger, J. Lester, and M. Ishizuka, Eds., vol. 5208 of Lecture Notes in Computer Science, pp. 15–28, Springer, Berlin, Germany, 2008.
  22. I. Wilson, “The artificial emotion engine, driving emotional behavior,” in Proceedings of the AAAI Spring Symposium on Artificial Intelligence and Interactive Environment, pp. 76–80, 2000.
  23. S. Kshirsagar and N. Magnenat-Thalmann, “A multilayer personality model,” in Proceedings of the 2nd International Symposium on Smart Graphics (SMARTGRAPH ’02), pp. 107–115, ACM, Hawthorne, NY, USA, 2002.
  24. H. Ushida, Y. Hirayama, and H. Nakajima, “Emotion model for life-like agent and its evaluation,” in Proceedings of the 15th National/10th Conference on Artificial Intelligence/Innovative Applications of Artificial Intelligence (AAAI ’98/IAAI ’98), pp. 62–69, Menlo Park, Calif, USA, July 1998.
  25. J. Bates, “The role of emotion in believable agents,” Communications of the ACM, vol. 37, no. 7, pp. 122–125, 1994.
  26. S. Reilly, Believable Social and Emotional Agents [Ph.D. thesis], Department of Computer Science, Carnegie Mellon University, Pittsburgh, Pa, USA, 1996.
  27. K. R. Scherer, “The role of culture in emotion-antecedent appraisal,” Journal of Personality and Social Psychology, vol. 73, no. 5, pp. 902–922, 1997.
  28. J. L. Tsai, Y. Chentsova-Dutton, L. Freire-Bebeau, and D. E. Przymus, “Emotional expression and physiology in European Americans and Hmong Americans,” Emotion, vol. 2, no. 4, pp. 380–397, 2002.
  29. H. S. Ahn and J. Y. Choi, “Emotional behavior decision model based on linear dynamic systems for intelligent service robots,” in Proceedings of the 16th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN ’07), pp. 786–791, Jeju, Republic of Korea, August 2007.
  30. H. S. Ahn, P. J. Kim, J. H. Choi et al., “Emotional head robot with behavior decision model and face recognition,” in Proceedings of the International Conference on Control, Automation and Systems (ICCAS ’07), pp. 2719–2724, Seoul, Republic of Korea, October 2007.
  31. H. S. Ahn, Y. M. Baek, J. H. Na, and J. Y. Choi, “Multi-dimensional emotional engine with personality using intelligent service robot for children,” in Proceedings of the International Conference on Control, Automation and Systems (ICCAS ’08), pp. 2020–2025, Seoul, Republic of Korea, October 2008.
  32. H. S. Ahn, J. Y. Choi, and D. W. Lee, “Universal emotional behavior decision system for social robots,” in Discrete Event Robots, iConceptPress, Hong Kong, 2012.
  33. D. Lee, H. S. Ahn, and J. Y. Choi, “A general behavior generation module for emotional robots using unit behavior combination method,” in Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN ’09), pp. 375–380, Toyama, Japan, October 2009.