
Authors: Jianfeng Xu; Yuki Nagai; Shinya Takayama and Shigeyuki Sakazawa

Affiliation: Media and HTML5 Application Laboratory, KDDI R&D Laboratories Inc., Japan

Keyword(s): Conversational Agents, Multimodal Synchronization, Gesture, Motion Graphs, Dynamic Programming.

Related Ontology Subjects/Areas/Topics: Agent Models and Architectures; Agents; Artificial Intelligence; Conversational Agents; Enterprise Information Systems; Human-Computer Interaction; Intelligent User Interfaces; Soft Computing; Vision and Perception

Abstract: Multimodal representation of conversational agents requires accurate synchronization of gesture and speech. For this purpose, we investigate the key issues in synchronization through a preceding case study, using them as a practical guideline for our algorithm design, and propose a two-step synchronization approach. Our case study reveals that two issues (i.e. duration and timing) play an important role in the manual synchronization of gesture with speech. Treating synchronization as a motion synthesis problem instead of the behavior scheduling problem used in conventional methods, we apply a motion graph technique with constraints on gesture structure for coarse synchronization in a first step, and refine the result by shifting and scaling the motion in a second step. This approach successfully synchronizes gesture and speech with respect to both duration and timing. We have confirmed that our system makes the creation of attractive content easier than manual creation of equal quality. In addition, subjective evaluation has demonstrated that the proposed approach achieves more accurate synchronization and higher motion quality than the state-of-the-art method.

CC BY-NC-ND 4.0


Paper citation in several formats:
Xu, J.; Nagai, Y.; Takayama, S. and Sakazawa, S. (2014). Accurate Synchronization of Gesture and Speech for Conversational Agents using Motion Graphs. In Proceedings of the 6th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART; ISBN 978-989-758-016-1; ISSN 2184-433X, SciTePress, pages 5-14. DOI: 10.5220/0004748400050014

@conference{icaart14,
author={Jianfeng Xu and Yuki Nagai and Shinya Takayama and Shigeyuki Sakazawa},
title={Accurate Synchronization of Gesture and Speech for Conversational Agents using Motion Graphs},
booktitle={Proceedings of the 6th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2014},
pages={5-14},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004748400050014},
isbn={978-989-758-016-1},
issn={2184-433X},
}

TY - CONF

JO - Proceedings of the 6th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART
TI - Accurate Synchronization of Gesture and Speech for Conversational Agents using Motion Graphs
SN - 978-989-758-016-1
IS - 2184-433X
AU - Xu, J.
AU - Nagai, Y.
AU - Takayama, S.
AU - Sakazawa, S.
PY - 2014
SP - 5
EP - 14
DO - 10.5220/0004748400050014
PB - SciTePress