Using Neuroimaging Data for Exploring Conversational Engagement in Human-Robot Interaction
About the project
This project will address interactions with novel conversational systems, such as social robots and digital assistants. These technologies can assist people in societal settings such as health care, elderly care, education, public spaces, and homes.
A telepresence system for human-robot interaction will be developed that allows participants to take part in natural conversations while physically located in a functional magnetic resonance imaging (fMRI) scanner. Each participant will interact directly with a human-like robot and with human actors while lying in the scanner.
This neuroimaging experiment will allow us to understand the cognitive processes underlying engagement and will enable an unprecedented in-depth evaluation of conversational engagement as a user state.
In everyday conversations, a speaker and a listener are involved in a joint project that relies on close coordination, requiring the continuous attention and, relatedly, the engagement of each participant, while additional bystanders may show less engagement in the conversation.
Previous research on human-human and human-robot interaction has identified four types of events, involving gesture and speech, that establish engagement: (1) joint directed gaze at objects, (2) mutual facial gaze, (3) back-and-forth conversation, and (4) short feedback, such as nods, while the speaker is talking. We will study the neural signatures underlying conversational engagement.
The researchers in the team represent the Department of Intelligent Systems at KTH EECS, and the Psychology and Linguistics departments at Stockholm University (SU).
Assistant Professor at Stockholm University, PI of the research project "Using Neuroimaging Data for Exploring Conversational Engagement in Human-Robot Interaction" at Digital Futures.