An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction

Abstract: In multimodal human-robot interaction (HRI), communication can be established through verbal, non-verbal, and/or para-verbal cues. The linguistic literature shows that para-verbal and non-verbal communication are naturally synchronized; however, the natural mechanism of this synchronization is still largely unexplored. This research focuses on the relation between non-verbal and para-verbal communication by mapping prosody cues to the corresponding metaphoric arm gestures. Our approach to synthesizing arm gestures uses coupled hidden Markov models (CHMMs), which can be seen as a collection of HMMs characterizing the segmented stream of prosodic features and the segmented streams of rotation features of the two arms' articulations. Experimental results with the Nao robot are reported.
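The coupling idea behind a CHMM can be illustrated with a minimal sketch: two Markov chains (one for the prosody stream, one for an arm-rotation stream) whose next states each depend on the current states of both chains. The state labels, transition values, and two-state setup below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_states = 2  # e.g. low/high pitch for the prosody chain,
              # small/large rotation for the arm chain (assumed labels)

# Coupled transition tensors: T[i, j, k] = P(next state of this chain = k
# | prosody state = i, arm state = j). Each row sums to 1.
T_prosody = np.array([[[0.8, 0.2], [0.6, 0.4]],
                      [[0.3, 0.7], [0.1, 0.9]]])
T_arm     = np.array([[[0.7, 0.3], [0.4, 0.6]],
                      [[0.5, 0.5], [0.2, 0.8]]])

def sample_chmm(length):
    """Forward-sample coupled state sequences for the two chains."""
    p, a = 0, 0  # initial states of the prosody and arm chains
    prosody_seq, arm_seq = [p], [a]
    for _ in range(length - 1):
        # Each chain's transition distribution is conditioned on BOTH
        # chains' current states -- this is the coupling.
        p_next = rng.choice(n_states, p=T_prosody[p, a])
        a_next = rng.choice(n_states, p=T_arm[p, a])
        p, a = p_next, a_next
        prosody_seq.append(p)
        arm_seq.append(a)
    return prosody_seq, arm_seq

prosody_seq, arm_seq = sample_chmm(10)
print(prosody_seq, arm_seq)
```

In the paper's setting, such coupled chains would be trained on segmented prosodic and joint-rotation feature streams rather than sampled from hand-set matrices; the sketch only shows the cross-chain dependency structure.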
Document type: Conference paper
The 14th IFAC Symposium on Information Control Problems in Manufacturing, May 2012, Bucharest, Romania. 〈10.3182/20120523-3-RO-2023.00364〉

Cited literature: [16 references]

https://hal-ensta.archives-ouvertes.fr/hal-01169980
Contributor: Amir Aly <>
Submitted on: Thursday, July 2, 2015 - 15:16:16
Last modified on: Friday, December 8, 2017 - 14:42:16
Long-term archiving on: Tuesday, April 25, 2017 - 20:23:27

File

Aly-INCOM2012.pdf
Files produced by the author(s)

Licence

Public domain

Citation

Amir Aly, Adriana Tapus. An Integrated Model of Speech to Arm Gestures Mapping in Human-Robot Interaction. The 14th IFAC Symposium on Information Control Problems in Manufacturing, May 2012, Bucharest, Romania. 〈10.3182/20120523-3-RO-2023.00364〉. 〈hal-01169980〉

Metrics

Record views: 101

File downloads: 82