Interactive Embodied Agents for Cultural Heritage and Archaeological Presentations
Keywords: Multimodal interaction, Virtual agent, Ambient intelligence, Virtual worlds, Cultural heritage, Archaeology
This paper presents Maxine, a powerful engine for developing applications with embodied animated agents. The engine, built on open source libraries, enables multimodal real-time interaction with the user via text, voice, images and gestures. Maxine virtual agents can establish emotional communication with the user through facial expressions and voice modulation, adapting their answers to the information gathered by the system: the noise level in the room, the observer’s position, the observer’s emotional state, etc. Moreover, the user’s emotions are captured and taken into account through image analysis. So far, Maxine virtual agents have been used as virtual presenters for Cultural Heritage and Archaeological shows.
BALDASSARRI, S., CEREZO, E. and SERON, F. (2007). An open source engine for embodied animated agents. In Proc. Congreso Español de Informática Gráfica: CEIG’07, pp. 89–98.
BERRY, D. et al. (2005). Evaluating a realistic agent in an advice-giving task. In International Journal of Human-Computer Studies, Vol. 63, pp. 304–327. http://dx.doi.org/10.1016/j.ijhcs.2005.03.006
BOFF, E. et al. (2005). An affective agent-based virtual character for learning environments. In Proceedings of the Workshop on Motivation and Affect in Educational Software, 12th International Conference on Artificial Intelligence in Education. Amsterdam, The Netherlands, pp. 1–8.
BURLESON, W. et al. (2004). A platform for affective agent research. In Proceedings of the Workshop on Empathetic Agents, International Conference on Autonomous Agents and Multiagent Systems. New York, USA.
CEREZO, E., BALDASSARRI, S. and SERON, F. (2007). Interactive agents for multimodal emotional user interaction. In Proc. of IADIS International Conference Interfaces and Human Computer Interaction, pp. 35–42.
CASSELL, J. et al. (eds.) (2000). Embodied Conversational Agents. MIT Press, Cambridge, USA.
EL-NASR, M. S. et al. (1999). A PET with evolving emotional intelligence. In Proceedings of the 3rd Annual Conference on Autonomous Agents. Seattle, USA, pp. 9–15. http://dx.doi.org/10.1145/301136.301150
GRAESSER, A. et al. (2005). AutoTutor: An intelligent tutoring system with mixed-initiative dialogue. In IEEE Transactions on Education, Vol. 48, Nº 4, pp. 612–618. http://dx.doi.org/10.1109/TE.2005.856149
KASAP, Z. and MAGNENAT-THALMANN, N. (2007). Intelligent virtual humans with autonomy and personality: State-of-the-art. In Intelligent Decision Technologies. IOS Press. https://doi.org/10.3233/IDT-2007-11-202
MARSELLA, S. C. et al. (2000). Interactive pedagogical drama. In Proceedings of the 4th International Conference on Autonomous Agents. Barcelona, Spain, pp. 301–308. http://dx.doi.org/10.1145/336595.337507
MIGNONNEAU, L. and SOMMERER, C. (2005). Designing emotional, metaphoric, natural and intuitive interfaces for interactive art, edutainment and mobile communications. In Computers & Graphics, Vol. 29, pp. 837–851. https://doi.org/10.1016/j.cag.2005.09.001
PRENDINGER, H. and ISHIZUKA, M. (2005). The empathic companion: A character-based interface that addresses users’ affective states. In Applied Artificial Intelligence, Vol. 19, pp. 267–285. http://dx.doi.org/10.1080/08839510590910174
ROSIS, F. et al. (2003). From Greta’s mind to her face: modelling the dynamics of affective states in a conversational embodied agent. In International Journal of Human-Computer Studies, Special Issue on Applications of Affective Computing in HCI, Vol. 59, pp. 81–118. http://dx.doi.org/10.1016/s1071-5819(03)00020-x
YUAN, X. and CHEE, S. (2005). Design and evaluation of Elva: an embodied tour guide in an interactive virtual art gallery. In Computer Animation and Virtual Worlds, Vol. 16, pp. 109–119. http://dx.doi.org/10.1002/cav.65
This journal is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.