Social Human-Robot Interaction via Touch

20 November 2019, 14:00-15:00 @ Cibali Hall, Kadir Has University

Abstract: Compared to vision and speech, touch is an understudied but emerging modality in social human-robot interaction. Touch gestures can convey information about emotional state, and correct recognition of touch gestures is expected to improve affective interaction between humans and robots. In this seminar I will present our previous studies on touch gesture recognition and affect recognition through touch. We demonstrate that it is possible to recognize touch gestures well above chance level, and to infer emotional state with performance comparable to human-human touch interactions. I will conclude with a discussion of our ongoing work and the current challenges in this area.
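To illustrate the kind of pipeline the abstract describes, here is a minimal sketch of touch gesture classification. The gesture names, features (mean pressure, contact area, duration), and synthetic data are illustrative assumptions, not the methods or results reported in the talk; a simple nearest-centroid classifier stands in for whatever recognizer the actual studies used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gestures and per-gesture feature prototypes:
# [mean pressure, contact area, duration] (all values made up).
GESTURES = ["pat", "stroke", "squeeze"]
CENTERS = np.array([
    [0.6, 0.3, 0.4],   # pat: moderate pressure, small area, short
    [0.3, 0.5, 0.9],   # stroke: light pressure, larger area, long
    [0.9, 0.7, 0.6],   # squeeze: high pressure, large area, medium
])

def make_samples(n_per_class, noise=0.05):
    """Generate noisy synthetic feature vectors around each prototype."""
    X, y = [], []
    for label, center in enumerate(CENTERS):
        X.append(center + noise * rng.standard_normal((n_per_class, 3)))
        y.append(np.full(n_per_class, label))
    return np.vstack(X), np.concatenate(y)

def fit_centroids(X, y):
    """Nearest-centroid training: one mean feature vector per gesture."""
    return np.array([X[y == k].mean(axis=0) for k in range(len(GESTURES))])

def predict(centroids, X):
    """Assign each sample to the gesture with the closest centroid."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

X_train, y_train = make_samples(50)
X_test, y_test = make_samples(20)
centroids = fit_centroids(X_train, y_train)
accuracy = (predict(centroids, X_test) == y_test).mean()
print(f"test accuracy: {accuracy:.2f} (chance = {1/len(GESTURES):.2f})")
```

On this easily separable synthetic data the classifier lands well above the one-in-three chance level, which is the shape of the claim made in the abstract, though real touch-sensor data is far noisier.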

References
T. Ballı Altuğlu, K. Altun, "Recognizing touch gestures for human-robot interaction," Proceedings of the 17th International Conference on Multimodal Interaction, 9-13 November 2015, Seattle, WA, USA.
K. Altun, K. E. MacLean, "Recognizing affect in human touch of a robot," Pattern Recognition Letters, 66(1), pp. 31-40, November 2015.

Speaker Biography: Kerem Altun received his B.S. (1999) and M.S. (2002) degrees in Mechanical Engineering from Middle East Technical University, and his Ph.D. degree (2011) in Electrical and Electronics Engineering from Bilkent University. In 2011-2012 he worked as a postdoctoral research fellow in the Sensory Perception and Interaction Research Group in the Department of Computer Science at the University of British Columbia. He is currently an assistant professor in the Department of Mechanical Engineering at Işık University. His research interests include intelligent sensing, human-robot interaction, wearable sensing, multimodal interaction, sensor data fusion, affective computing, and machine learning.

Webpage: https://scholar.google.ca/citations?user=M0-UlkQAAAAJ&hl=en