‘Well, at least it tried’: The Role of Intentions and Outcomes in Ethically Evaluating Robots’ Actions
More and more people are using robots in domestic settings. How a robot is used by its owner depends, among other factors, on how the owner perceives the robot’s behaviour. If a human considers the robot to act in a moral way, the robot is more likely to be used to its fullest potential, as a robot that is perceived as moral is usually also perceived as trustworthy. Two important aspects in evaluating behaviour as moral or immoral are the intentions behind the behaviour and its outcomes. The main goal of this research was to find out how the intentions attributed to robots’ actions and the outcomes of those actions influence the moral evaluation of these actions. To investigate this, participants watched video clips showing a robot that appeared to have a good or bad intention, followed by an action leading to a good or bad outcome. It was found that both good intentions and good outcomes have a positive influence on ethical evaluations, while bad intentions and bad outcomes have a negative influence on these evaluations. Just as when evaluating humans’ actions, the influence of intentions on moral evaluations was found to be stronger than the influence of outcomes. This held for actions in both Care/Harm and Authority/Respect contexts.
Faculteit der Sociale Wetenschappen (Faculty of Social Sciences)