Head of the project: Dominique Longin.
Funding: French National Research Agency (ANR), CONTINT 2008 program, contract No. ANR-08-CORD-005.
Date and duration: started on December 1st, 2008; duration 46 months.
The project concerns new-generation interaction systems in which the human user is at the core of the interaction. These systems are designed to be believable (i.e. not only trustworthy and honest, but also capable of giving an illusion of life). Several studies show that such systems can be designed only by integrating advanced emotion processing, in order to endow the system with the capabilities to understand and adapt to the user’s emotions, to reason about the user’s emotions, to plan its actions while anticipating their effects on the user’s emotions, and to express its own emotions. These capabilities are necessary prerequisites for a system that interacts with the user in a natural way.
The subject of emotions has been debated over the last twenty to thirty years in several disciplines such as psychology, philosophy, cognitive science and economics. More recently, computer science has started to focus on emotions. Several computational models of the role of emotions in cognition and of the expression of affective content have been proposed (e.g. the Affective Computing project at MIT or Kansei Information Processing in Japan). This research has formed the basis for the development of several prototypes, such as Embodied Conversational Agents (ECAs), to be used in different services (e.g. game platforms, simulators, tutoring agents, robotic assistants…).
In the domain of emotion expression and emotion recognition, research has focused on anthropomorphic systems that interact with a human user in a multi-modal way. Nevertheless, current systems consider only a few basic emotions such as joy, sadness, fear and surprise, without considering more complex emotions such as regret, guilt, envy and shame. Several theoretical works (commonly called “appraisal theories”) show that complex emotions (which are typical of humans) are closely related to epistemic attitudes (beliefs, predictions, expectations, etc.) and motivational attitudes (goals, desires, intentions, etc.).
Logic is well suited to support such reasoning, and several logics with well-known properties exist for representing these attitudes. The aim of this project is the study of complex emotions, i.e. emotions based on counterfactual reasoning and on reasoning about norms, responsibility, power and abilities.
- As a first step, we will investigate and formalize this kind of emotion (regret, shame, guilt, jealousy, reproach, etc.) in order to provide unambiguous definitions which an agent can use in its reasoning.
- As a second step, we will exploit these definitions in order to specify and implement a library of expressive speech acts, which are used to express feelings and emotions.
- As a third step, we will implement in an embodied conversational agent (ECA) a planning module which takes into account both the system’s own emotions and the emotions the system ascribes to the user. Once the system has chosen an action to perform (either physical or communicative), it will compute the information to express through the different modalities (facial expressions, bodily movements, language) and communicate it in a multi-modal way, possibly including an emotional content in its message.
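To illustrate the kind of unambiguous definition targeted in the first step, a complex emotion such as regret might be captured in a BDI-style modal language. The operators below (Bel, Des and a counterfactual “could have done” construct CHD) are assumptions chosen for illustration only, not the project’s actual formalization:

```latex
\[
\mathrm{Regret}_i\,\varphi
  \;\stackrel{\mathrm{def}}{=}\;
  \mathrm{Bel}_i\,\varphi
  \;\wedge\;
  \mathrm{Des}_i\,\neg\varphi
  \;\wedge\;
  \mathrm{Bel}_i\,\mathrm{CHD}_i\,\neg\varphi
\]
```

Reading: agent i regrets that φ if it believes that φ holds, desires that ¬φ, and believes that a different choice of its own could have brought about ¬φ. The last conjunct is the counterfactual component that distinguishes regret from mere sadness and illustrates why such emotions require reasoning about abilities and past choices.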
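The second and third steps can be sketched in miniature. The Python fragment below is a hypothetical illustration (the speech acts, templates and facial mappings are invented for the example, not part of the project’s actual library): it shows a small library of expressive speech acts indexed by emotion, and a planning-style step that combines the verbal channel with a facial channel into one multi-modal message.

```python
from dataclasses import dataclass
from enum import Enum


class Emotion(Enum):
    JOY = "joy"
    REGRET = "regret"
    REPROACH = "reproach"


@dataclass
class SpeechAct:
    """An expressive speech act: a performative, the emotion it conveys,
    and an utterance template instantiated with the triggering event."""
    performative: str
    emotion: Emotion
    template: str


# Hypothetical mini-library of expressive speech acts.
LIBRARY = [
    SpeechAct("apologize", Emotion.REGRET, "I am sorry that {event}."),
    SpeechAct("rejoice", Emotion.JOY, "I am glad that {event}."),
    SpeechAct("reproach", Emotion.REPROACH,
              "You should not have let {event} happen."),
]

# Hypothetical mapping from emotions to a facial-expression channel.
FACIAL = {
    Emotion.JOY: "smile",
    Emotion.REGRET: "lowered gaze",
    Emotion.REPROACH: "frown",
}


def select_act(emotion: Emotion, event: str) -> str:
    """Pick a speech act expressing the given emotion and instantiate it."""
    for act in LIBRARY:
        if act.emotion == emotion:
            return act.template.format(event=event)
    raise ValueError(f"no expressive speech act for {emotion}")


def multimodal_message(emotion: Emotion, event: str) -> dict:
    """Combine the verbal and facial channels for one communicative action."""
    return {"verbal": select_act(emotion, event), "facial": FACIAL[emotion]}
```

For instance, `multimodal_message(Emotion.REGRET, "the meeting was cancelled")` yields both the apology utterance and the matching facial expression, mirroring the idea that once an action is chosen, the system distributes its content across modalities.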