Values such as respect for autonomy, safety, enablement, independence, privacy and social connectedness should be reflected in the design of social robots. The same values should shape the process by which robots are introduced into the homes of older people to support independent living. These values may, however, be in tension. We explored what potential users thought about these values and how tensions between them could be resolved. With the help of partners in the ACCOMPANY project, 21 focus groups (123 participants) were convened in France, the Netherlands and the UK. These groups consisted of: (i) older people, (ii) informal carers and (iii) formal carers of older people. The participants were asked to discuss scenarios in which there is a conflict between older people and others over how a robot should be used, these conflicts reflecting tensions between values. Participants favoured compromise, persuasion and negotiation as means of reaching agreement. Roles and related role-norms for the robot were thought relevant to resolving tensions, as were hypothetical agreements between users and robot-providers before the robot is introduced into the home. Participants’ understanding of each of the values—autonomy, safety, enablement, independence, privacy and social connectedness—is reported. Participants tended to agree that autonomy often has priority over the other values, with the exception in certain cases of safety. The second part of the paper discusses how the values could be incorporated into the design of social robots and operationalised in line with the views expressed by the participants.
Number of pages: 20
Journal: Ethics and Information Technology
Publication status: Published - 1 Mar 2017
- Qualitative research
- Social robots