David Baxter PhD
Late Founder
Avatars As Communicators Of Emotions
ScienceDaily
July 10, 2008
Current interactive systems enable users to communicate with computers in many ways, but they do not take emotional communication into account. A PhD thesis presented at the University of the Basque Country puts forward the use of avatars, or virtual Internet characters, as an efficient form of non-verbal communication, focusing principally on emotional aspects.
Scientists have been working for decades to make interaction between people and computers more natural and intuitive. In fact, a great part of the success or failure of a computer application depends on its user interface. The way we communicate with an operating system, for example, has progressed considerably from the days when users had to type complicated command lines on a black-and-white screen to today's far more intuitive windowed interfaces. Systems are now being designed that combine three-dimensional graphics, computer vision and speech technologies, known as multimodal interaction systems. Amongst these, the most common are speech synthesisers and recognition systems that enable the user to communicate with the machine using verbal language.
Nevertheless, considering how people interact with each other every day, this technology should also take non-verbal communication into account, i.e. facial expressions and body gestures. The successful acceptance of a face-to-face message depends 7% on the words used, 38% on the way the voice is used (tone and volume) and 55% on gestures and facial expressions. This is why Ms Amalia Ortiz Nicolás considers it fundamental to include modules in multimodal interfaces that enable the interpretation and generation of non-verbal, and specifically emotional, communication.
In her PhD thesis, Avatars for emotional interaction, the use of avatars (virtual characters) is proposed as one of the best ways for computer systems to convey non-verbal information. Ms Amalia Ortiz holds a PhD in computer science and currently works at the VICOMTech-IK4 technology centre. Her research was supervised by Dr. Néstor Garay-Vitoria and Dr. Maria Teresa Linaza of the Computer Sciences and Artificial Intelligence Department in the Computer Science Faculty of the University of the Basque Country (UPV/EHU).
Tools for creating emotions
Within non-verbal communication, Ms Ortiz's thesis focuses on the emotional and affective aspects. Her basic aim was to determine whether avatars are capable of communicating emotions. As Dr. Ortiz points out in her thesis, an avatar is a virtual character that gives a system an appearance (face, eyes, body, voice, and so on) and a behaviour that emulates interaction between people.
After studying existing systems and architectures and observing that all of them focus on specific cases of interaction, Dr. Ortiz designed a generic architecture capable of supporting any type of emotional interaction using avatars. She also designed the tools needed so that both the user and the creator of an interactive system can generate and express emotions easily and intuitively: a tool for generating avatars without any prior knowledge, a system based on cognitive models of emotion, and an interpreting module that translates the commands and facial expressions. Once the different emotions had been generated, Dr. Ortiz implemented an animation model that enables the avatar to display each emotion together with its corresponding intensity.
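The article does not describe how the animation model represents emotions internally, but the idea of pairing each emotion with an intensity can be illustrated with a short sketch. The following Python snippet is purely hypothetical (the names Emotion, FACE_TARGETS and blend_expression are not taken from the thesis); it simply shows one common way of scaling facial animation controls by an emotion's intensity.

```python
# Illustrative sketch only: a minimal way to represent an emotion with an
# intensity and map it to facial animation parameters. All names here are
# hypothetical and not drawn from Dr. Ortiz's actual implementation.
from dataclasses import dataclass

# Hypothetical target values for a few facial controls (0.0 = neutral pose).
FACE_TARGETS = {
    "joy":      {"mouth_corner_up": 1.0,  "brow_raise": 0.3,  "eye_open": 0.6},
    "sadness":  {"mouth_corner_up": -0.8, "brow_raise": -0.4, "eye_open": -0.3},
    "surprise": {"mouth_corner_up": 0.1,  "brow_raise": 1.0,  "eye_open": 1.0},
}

@dataclass
class Emotion:
    name: str         # e.g. "joy"
    intensity: float  # 0.0 (neutral) to 1.0 (full expression)

def blend_expression(emotion: Emotion) -> dict:
    """Scale the facial control targets by the emotion's intensity."""
    targets = FACE_TARGETS.get(emotion.name, {})
    return {control: value * emotion.intensity
            for control, value in targets.items()}

if __name__ == "__main__":
    # A moderately intense expression of joy: each control is at half strength.
    print(blend_expression(Emotion("joy", 0.5)))
    # -> {'mouth_corner_up': 0.5, 'brow_raise': 0.15, 'eye_open': 0.3}
```

In a sketch like this, a renderer would apply the scaled control values to the avatar's face, so the same emotion can be shown at different strengths rather than simply switched on or off.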
Testing emotional interaction
In order to apply her work to real cases, Ms Ortiz validated the generic architecture by creating several applications that incorporate all the tools developed for generating and expressing emotions: the IGARTUBEITI, AVACHAT, SASTEC and ELEIN programmes, developed at VICOMTech-IK4.
The IGARTUBEITI system offers a virtual journey through history by means of digital storytelling. In this way, history is experienced and transmitted through emotions, via three-dimensional graphical reconstructions and a virtual guide's explanations of the historical circumstances. AVACHAT is a chat system in which users can communicate with each other not only through text but also through three-dimensional avatars, both verbally and non-verbally.
The SASTEC system, meanwhile, aims to provide a series of support mechanisms for users with cognitive disabilities. To this end, a set of memory exercises has been designed for these users in a way that encourages and enhances their autonomy; the system's user interfaces are built around emotional avatars. Finally, the ELEIN programme, aimed at e-learning environments, seeks to provide a novel way of delivering educational content over the Internet. It offers a three-dimensional teaching agent able to speak in real time and to incorporate the complete contents of a course.
After these applications were evaluated by end users, Dr. Ortiz concluded that users prefer interaction using avatars because they find it more pleasant, user-friendly and entertaining. Moreover, she observed that users were able to recognise most facial emotions and expressions, and that they believed information was better explained when an emotional avatar provided it. According to the data gathered, users of the online ELEIN course answered 13% more questions correctly when the concepts had previously been explained by an avatar, and the number of correct answers increased by a further 10% when the concepts had been explained by an emotional avatar.
Dr. Ortiz believes it would be worthwhile to add a module with an emotional voice synthesiser, since users' evaluations would likely be more positive if the avatar's tone of voice were in harmony with its facial expression. This PhD thesis, defended at the UPV/EHU, thus opens up new possibilities for voice synthesisers capable of simulating emotions and for body animation modules that express emotions through gestures.