Towards IMACA: Intelligent multimodal affective conversational agent

Amir Hussain*, Erik Cambria, Thomas Mazzocco, Marco Grassi, Qiu Feng Wang, Tariq Durrani

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding › peer-review



A key aspect of achieving natural interaction in machines is multimodality. Besides verbal communication, humans also interact through many other channels, e.g., facial expressions, gestures, eye contact, posture, and tone of voice. Such channels convey not only semantics but also emotional cues that are essential for interpreting the transmitted message. The importance of affective information, and the capability to manage it properly, has increasingly been recognised as fundamental to the development of a new generation of emotion-aware applications in scenarios such as e-learning, e-health, and human-computer interaction. To this end, this work investigates the adoption of different paradigms in the fields of text, vocal, and video analysis, in order to lay the basis for the development of an intelligent multimodal affective conversational agent.

Original language: English
Title of host publication: Neural Information Processing - 19th International Conference, ICONIP 2012, Proceedings
Number of pages: 8
Edition: PART 1
Publication status: Published - 2012
Externally published: Yes
Event: 19th International Conference on Neural Information Processing, ICONIP 2012 - Doha, Qatar
Duration: 12 Nov 2012 - 15 Nov 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 7663 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 19th International Conference on Neural Information Processing, ICONIP 2012


Keywords
  • AI
  • HCI
  • Multimodal Sentiment Analysis

