Design and implementation of GEmA: A generic emotional agent

https://doi.org/10.1016/j.eswa.2010.08.054

Abstract

Several studies have concluded that factors other than rationality, notably emotions and morality, shape human behavior. These factors must be taken into account if intelligent agents are to behave in a human-like way. In this article, we concentrate on the emotional aspects of decision making and introduce a new computational model that maps environmental events and agent actions to emotional states. The main characteristics of the model are that it is adaptable to different domains, it can be embedded as an individual module in software agents, and it has been implemented as a library in the JAVA and C# programming languages.

Research highlights

  • A new generic emotional model (GEmA) is introduced.
  • GEmA involves emotions as a module in decision making by agents.
  • GEmA can evaluate events and actions with a computational method.
  • GEmA is flexible and adaptable to different environments.
  • GEmA is implemented in the Virtual Tutor domain.

Introduction

Many researchers in artificial intelligence, as well as in operations research, have focused mainly on rational decision making. However, emotions, such as joy, fear, hope, and anger, play an important role in human behavior (Zinn, 2006). Hence, both the social and computational sciences have seen an explosion of interest in emotions in the last decade (Gratch, Marsella, & Petta, 2009).

Historically, a conflict has been perceived between emotions and reason, and it was long advised that emotions be kept out of sound decision making. More recently, researchers have begun to explore the role of emotions in human intelligence. Gardner (1983) proposed the concept of “multiple intelligences” and divided intelligence into seven types; in this theory, a single IQ test cannot determine a person’s intelligence. Further studies have revealed that emotions, in fact, play an important role in human reasoning and decision making (Damasio, 1994, LeDoux, 1996). Later, Goleman (1995) popularized the phrase “Emotional Intelligence Quotient” (EQ), which describes the ability, capacity, or skill to identify, assess, and manage emotions.

“Inspired by the psychological models of emotions, intelligent agents researchers have begun to recognize the utility of computational models of emotions for improving complex, interactive programs” (El-Nasr, Yen, & Ioerger, 2000). Minsky (1986) concluded that “the question is not whether intelligent machines can have any emotions, but whether machines can be intelligent without any emotions”. Following this view, several models have been proposed to make an agent’s decision making and behavior appear more believable. “Emotionless agents are viewed as merely machines… If humans identify with and accept an agent as human instead of machine-like, they may be more able to trust the agent and better able to communicate with it… the emotional aspect distinguishes a dead machine from an agent who is believable, alive, and trustworthy” (Perry, 1996).

Agents are autonomous software modules with perception and social ability that perform goal-directed knowledge processing, over time, on behalf of humans or other agents in software and physical environments. The knowledge-processing abilities of agents include reasoning, motivation, planning, and decision making. Additional abilities are needed to make agents intelligent and trustworthy. Abilities that make agents intelligent include anticipation, understanding, learning, and communication in natural language. Abilities that make agents more trustworthy, and that help assure the sustainability of agent societies, include being rational, responsible, and accountable. These lead to rationality, skillfulness, and morality (e.g., ethical agents, moral agents) (Ghasem-Aghaee & Ören, 2003). Intelligent agents require emotional factors in their decision-making processes to display behavior analogous to that of humans. For example, agents can model a user’s emotions and adapt interfaces to the user’s needs. An agent with an anger filter evaluates the degree of anger that the agent may feel when it encounters a provoking event (Ghasem-Aghaee, PoorMohamadBagher, Kaedi, & Ören, 2007). Adamatti, Lucia, and Bazzan (2001) present a framework for the simulation of agents with emotions. Chown, Jones, and Henninger (2002) describe a cognitive architecture for an interactive decision-making agent with emotions. André, Klesen, Gebhard, Allen, and Rist (2000) focus on models of personality and emotions to control the behavior of animated interactive agents. Synthetic characters effectively enhance their believability by using a model of emotion to simulate and express emotional responses (Bates, 1992, Bates et al., 1992, El-Nasr et al., 2000, Kazemifard et al., 2006). Custódio, Ventura, and Pinto-Ferrira (1999) present an emotion-based system architecture together with an application for control and supervision purposes.

Emotion has a broad impact across a variety of disciplines such as:

  • (1)

    Integration of emotions in the agent-based tutoring system (Gratch, 2000a, Poel et al., 2004, Vicente, 2003).

  • (2)

    Consideration of emotions in project management and team configuration (Gareis, 2004, Miranda and Aldea, 2005, Miranda et al., 2003, Nair et al., 2003, Whatley, 2004).

  • (3)

    Response of military units to stress (Gratch & Marsella, 2003) and fear (Nair et al., 2003).

  • (4)

    The capability of forces to balance a large number of competing goals (Johns & Silverman, 2001).

  • (5)

    Management of mental resources (Zadeh, Shouraki, & Halavati, 2006).

  • (6)

    Estimation of emotions from text and annotation of emotions (Aman and Szpakowicz, 2007, Matsumoto et al., 2007, Shaikh et al., 2007, Shaikh et al., 2009).

  • (7)

    Relating prices, discounts, satisfaction, and disappointment as well as supplier reputations and consumer choices (Mengov, Egbert, Pulov, & Georgiev, 2008).

  • (8)

    Design and implementation of a brain emotional learning based intelligent controller (BELBIC), which is based on the mammalian midbrain, on a hardware platform (FPGA) (Jamali, Arami, Dehyadegari, Lucas, & Navabi, 2008).

In this paper, we present GEmA, a Generic Emotional Agent. The goal of our research is to create GEmA as a general computational model of emotion mechanisms. The model is implemented as a C# and JAVA library. The GEmA library is adaptable to different domains and can be used as a module in different realistic applications. The outputs of the model can be used to determine the behavior of emotional agents.

Section 2 outlines some existing emotional models; Section 3 describes the structure of GEmA; Section 4 presents the GEmA library; Section 5 describes its implementation; and finally Section 6 presents the conclusion and future work.

Section snippets

Review of some existing emotional models

Throughout the history of AI research, many models have been proposed to simulate the emotional process. In the following subsections, we focus on prominent models that use the OCC model (Ortony, Clore, & Collins, 1988) as part of their appraisal process and briefly discuss the following: Affective Reasoner (AR), EM Architecture, Fuzzy Logic Adaptive Model of Emotions (FLAME), Scalable Hybrid Architecture for the Mimicry of Emotions (SHAME), and Emotion and Adaptation (EMA).

Affective Reasoner (AR)

GEmA: A Generic Emotional Agent

The overall architecture of GEmA is illustrated in Fig. 1. The elements of the architecture are discussed in the sequel:

The Appraiser includes two elements, an event appraiser and an action appraiser, which evaluate the events and the actions of the agent and of other agents on the basis of the agent’s goals and standards. The OCC model (Ortony et al., 1988) is selected for event and action assessment because it includes comprehensive local and global variables to compute the intensity of emotions and methods
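To make the appraisal step concrete, the sketch below shows how a simplified, OCC-style event appraiser might weigh an event's desirability by the importance of the goals it affects. All names and signatures here are our own placeholders, not the GEmA API, and the actual appraiser also relies on the OCC local and global variables mentioned above.

// Simplified OCC-style event appraisal (illustrative placeholders, not the GEmA API).
import java.util.Map;

class EventAppraiser {
    /**
     * goalImportance: importance of each active goal, in [0, 1].
     * desirability:   how much the event helps (+) or hinders (-) each goal, in [-1, 1].
     * Returns an overall desirability; positive values feed joy, negative values feed distress.
     */
    static double appraiseEvent(Map<String, Double> goalImportance,
                                Map<String, Double> desirability) {
        double weighted = 0.0, totalImportance = 0.0;
        for (Map.Entry<String, Double> goal : goalImportance.entrySet()) {
            double d = desirability.getOrDefault(goal.getKey(), 0.0);
            weighted += goal.getValue() * d;
            totalImportance += goal.getValue();
        }
        return totalImportance == 0 ? 0.0 : weighted / totalImportance;
    }
}

An analogous action appraiser would score the agent’s own actions and those of other agents against the agent’s standards (praiseworthiness in OCC terms), feeding emotions such as pride, shame, admiration, and reproach.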

GEmA library

We implemented GEmA as a programming library in Microsoft C# and JAVA. The library includes four parts, as follows:

Emotion modeling implements two classes: Emotion and EmotionalCharacter. The Emotion class implements goal-based rules, standard-based rules, compound rules, and methods to update and to decay the intensity of emotions. EmotionalCharacter is an abstract class, and users must inherit this
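As a rough illustration of the usage pattern described above, the sketch below shows an Emotion class carrying an intensity that can be updated and decayed, and a user-defined character inheriting an abstract EmotionalCharacter class. Only the two class names and the inheritance requirement come from the text; every method and field shown here is a placeholder, not the actual GEmA library API.

// Hypothetical sketch of the subclassing pattern described above.
// Class shapes and method names are illustrative placeholders, not the actual GEmA API.
import java.util.ArrayList;
import java.util.List;

class Emotion {
    final String name;
    double intensity;                       // current intensity in [0, 1]

    Emotion(String name) { this.name = name; }

    // Raise (or lower) the intensity when an appraisal rule fires, clamped to [0, 1].
    void update(double delta) { intensity = Math.min(1.0, Math.max(0.0, intensity + delta)); }

    // Let the intensity fade toward zero over time.
    void decay(double rate) { intensity *= (1.0 - rate); }
}

abstract class EmotionalCharacter {
    protected final List<Emotion> emotions = new ArrayList<>();

    // Subclasses supply the goals and standards the appraiser evaluates against.
    protected abstract void defineGoalsAndStandards();
}

// A user-defined character inheriting the abstract class.
class StudentCharacter extends EmotionalCharacter {
    @Override
    protected void defineGoalsAndStandards() {
        // e.g. goal "solve the exercise", standard "respond politely"
    }
}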

GEmA’s implementation

Learning and tutoring involve emotional processes. Some of these emotions stand in the way of the learning process, while others help the learner master a practice or learn a theory. In building a Virtual Tutor system, we should avoid the negative emotions that arise during learning and implement ways to generate positive emotions (Poel et al., 2004). In this domain, we consider only the role of emotions in learning. Many other appraisal variables, such as controllability, dominance, and
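As a purely hypothetical illustration of this idea (the event names, weights, and reactions below are invented, not taken from the Virtual Tutor system), tutoring events could be mapped to appraisal values so that the tutor can counter negative emotions and reinforce positive ones:

// Hypothetical mapping of tutoring events to appraisal values
// (event names, weights, and reactions are invented for illustration only).
import java.util.Map;

class TutoringAppraisal {
    // Desirability of common tutoring events: positive values feed joy/hope,
    // negative values feed distress/fear.
    static final Map<String, Double> EVENT_DESIRABILITY = Map.of(
            "exercise_solved",   0.6,
            "hint_requested",   -0.1,
            "exercise_failed",  -0.5,
            "session_completed", 0.4);

    // The tutor counters negative emotions and reinforces positive ones.
    static String suggestTutorReaction(double desirability) {
        return desirability < 0
                ? "encourage_and_simplify_next_task"
                : "praise_and_increase_difficulty";
    }
}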

Conclusion

In this research, a new generic emotional model for emotional agents (GEmA: Generic Emotional Agent) is introduced. The main goal of GEmA is to involve emotions as a module in agents' decision making so that an agent can behave similarly to humans. A new computational method is also offered, based on the evaluation of events and actions. With respect to the goals and standards of the agent, computational rules are used to map the impact of events and actions onto emotional states. This

Acknowledgement

We appreciate Ms. Fatahi, who helped us in the evaluation of the Virtual Tutor system based on her master's thesis (Fatahi, 2008).

References (55)

  • Chown, E., Jones, R. M., & Henninger, A. E. (2002). An architecture for emotional decision-making agents. In Proceeding...
  • Custódio, L., Ventura, R., & Pinto-Ferrira, C. (1999). Artificial Emotions and emotion-based control systems. In...
  • Damasio, A. R. (1994). Descartes’ error: Emotion, reason and the human brain.
  • Dubs, H. H. (1942). The principle of insufficient reason. Philosophy of Science.
  • Elliott, C. (1992). The affective reasoner: A process model of emotions in a multi-agent system. PhD thesis. Institute...
  • El-Nasr, M. S., et al. (2000). FLAME – A fuzzy logic adaptive model of emotions. Autonomous Agents and Multi-agent Systems.
  • Fatahi, S. (2008). Design and implementation of a model based on emotion and personality in virtual learning. Master...
  • Gardner, H. (1983). Frames of mind.
  • Gareis, R. (2004). Emotional project management. In Proceeding of the PMI research conference, London (pp....
  • Ghasem-Aghaee, N., & Ören, T. I. (2003). Towards fuzzy agents with dynamic personality for human behavior simulation....
  • Ghasem-Aghaee, N., PoorMohamadBagher, L., Kaedi, M., & Ören, T. I. (2007). Anger filter in agent simulation of human...
  • Goleman, D. (1995). Emotional intelligence.
  • Gratch, J. (2000a). Emile: Marshalling passions in training and education. In Proceeding of the 4th international...
  • Gratch, J. (2000b). Modeling the interplay between emotion and decision-making. In Proceeding of the 9th conference on...
  • Gratch, J., et al. (2003). Fight the way you train: The role and limits of emotions in training for combat. Brown Journal of World Affairs.
  • Jamali, M. R., et al. (2008). Emotion on FPGA: Model driven approach. Expert Systems with Applications.
  • Jaques, P. A. (2004). Using an animated pedagogical agent to interact affectively with the student. PhD thesis. PGCC –...