Are You Listening: Implementing Non-Verbal Behaviour in a Robot

Issue Date
2015-07-27
Language
en
Abstract
This thesis investigated whether the number of active backchannels in a robot's non-verbal listening behaviour during a human-robot interaction influences the human's feeling of being listened to by the robot. This is an important issue because social robots are becoming more prevalent in everyday life, and people should feel comfortable while interacting with them. One way to improve these interactions is to implement non-verbal listening behaviour, but is it enough to implement simple head movements and vocalisations, which are relatively easy to realise, or are more complicated behaviours needed to give people the sensation of being listened to? To investigate this, a between-subjects experiment was designed in which one group of 10 test subjects interacted with a robot that responded only with uni-modal backchannels, while the other group interacted with a robot that responded with multi-modal backchannels. No significant differences were found between the two groups. Although the number of test subjects was small, the results give an indication of what a larger-scale study might find. Future research could include using a larger sample to obtain more reliable data, testing whether different robots yield different results, and implementing natural language processing, turn-taking, and a wider range of behaviours.
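The group comparison described above can be sketched as a two-sample significance test. The following is a minimal illustration, not the thesis's actual analysis: the rating data are hypothetical 7-point Likert scores invented for the example, and Welch's t statistic is one common choice for comparing two independent groups of this size.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variances
    standard_error = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / standard_error

# Hypothetical 7-point Likert ratings of "feeling listened to",
# one score per participant (10 per condition, matching the thesis design).
uni_modal   = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4]
multi_modal = [5, 4, 4, 5, 3, 5, 4, 4, 5, 4]

t = welch_t(multi_modal, uni_modal)
print(f"Welch t = {t:.3f}")  # small |t| -> no significant group difference
```

With samples this small, a |t| well below the critical value (roughly 2.1 at alpha = 0.05 for about 18 degrees of freedom) would be consistent with the thesis's finding of no significant difference between the uni-modal and multi-modal conditions.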
Faculty
Faculteit der Sociale Wetenschappen