In the 2012 film "Robot & Frank", the protagonist, a retired cat burglar named Frank, suffers from the first symptoms of dementia. Worried and guilty, his son buys him a "home robot" that can talk, do household chores such as cooking and cleaning, and remind Frank to take his medicine. It is the kind of robot we are getting closer to building in the real world.
The film follows Frank, who is initially appalled at the idea of living with a robot, as he gradually comes to see the robot as both functionally useful and socially companionable. The film ends with a clear bond between man and machine, with Frank protecting the robot when the two get into trouble.
This is a fictional story, of course, but it challenges us to examine different kinds of human-robot bonds. My recent research on human-robot relationships examines this topic in detail, looking beyond sex robots and robot love affairs to examine that deepest and most meaningful of relationships: friendship.
My colleague and I have identified some potential risks – like abandoning human friends for robots – but we have also found several scenarios where robot companionship can constructively augment people's lives, leading to friendships that are directly comparable to human-to-human relationships.
Philosophy of friendship
Robotics philosopher John Danaher sets a very high bar for what friendship means. His starting point is "true" friendship, first described by the Greek philosopher Aristotle, who saw ideal friendship as premised on mutual goodwill, admiration, and shared values. In this sense, friendship is a partnership between equals.
Building a robot that can meet Aristotle's criteria is a substantial technical challenge and a considerable way off – as Danaher himself admits. Robots that may appear to come close, like Sophia from Hanson Robotics, base their behavior on a library of prepared responses: a humanoid chatbot rather than a conversational equal. Anyone who has gone a few rounds of back-and-forth with Alexa or Siri will know that AI still has a long way to go in this regard.
Video: the humanoid robot Sophia, developed by Hong Kong-based Hanson Robotics.
Aristotle also spoke of other forms of "imperfect" friendship – such as "utilitarian" and "pleasure" friendships – which he considered inferior to true friendship because they do not require symmetrical bonds and often give one party an unequal benefit. This form of friendship sets a relatively low bar that some robots – such as sex robots and robot pets – clearly already meet.
Artificial friends
For some, relating to robots is just a natural extension of relating to other things in our world – like people, pets, and possessions. Psychologists have even observed how people respond naturally and socially to media artefacts such as computers and televisions. Humanoid robots, you would think, are more personable than your home PC.
However, the field of "robot ethics" is far from unanimous on whether we can – or should – develop any form of friendship with robots. For an influential group of British researchers who drew up a set of "ethical principles of robotics", human-robot "companionship" is an oxymoron, and marketing robots as having social capabilities is dishonest and should be treated with caution – if not alarm. For these researchers, wasting emotional energy on entities that can only simulate emotions will always be less rewarding than forming human-to-human bonds.
But people are already developing attachments to simple robots – like vacuum-cleaning and lawn-trimming machines – that can be bought for less than the price of a dishwasher. A surprisingly large number of people give these robots nicknames – something they don't do for their dishwashers. Some even take their cleaning robots on vacation.
Other indications of emotional bonds with robots include the Shinto blessing ceremony held for Sony Aibo robot dogs that were dismantled for spare parts, and the US troops who fired a 21-gun salute and awarded medals to a bomb-disposal robot named "Boomer" after it was destroyed in action.
A military bomb disposal robot similar to ‘Boomer’. U.S. Marine Corps photo by Lance Cpl. Bobby J. Segovia / Wikimedia Commons
These stories, and the psychological evidence we have so far, make it clear that we can extend emotional connections to things that are very different from us, even when we know they are manufactured and pre-programmed. But do those connections constitute a friendship comparable to the kind shared between people?
True friendship?
A colleague and I recently reviewed the extensive literature on human-to-human relationships to understand how, and whether, the concepts we found could apply to the bonds we might form with robots. We found evidence that many coveted human-to-human friendships do not, in fact, live up to Aristotle's ideal.
We noted a wide range of relationships between people, from relatives and lovers to parents, caregivers, service providers, and the intense (but sadly one-sided) relationships we maintain with our celebrity heroes. Few of these relationships could be described as completely equal and, importantly, all of them evolve over time.
All of this means that expecting robots to form Aristotelian bonds with us sets a standard that even human relationships often fail to meet. We also observed forms of social connectedness that are rewarding and satisfying, yet fall far short of the ideal friendship outlined by the Greek philosopher.
We know that social interaction is rewarding in its own right, and that humans, as social mammals, have a strong need for it. It seems likely that relationships with robots could help satisfy the deep-seated urges we all feel for social connection – such as physical comfort, emotional support, and enjoyable social exchanges – currently provided by other humans.
Our paper also discussed some potential risks. These arise particularly in settings where interaction with a robot could come to replace interaction with people, or where people are denied a choice over whether to interact with a person or a robot – for example, in a care setting.
These are important concerns, but they are possibilities, not inevitabilities. In the literature we reviewed, we actually found evidence of the opposite effect: robots helping to foster social interaction with others, acting as icebreakers in groups, and helping people improve their social skills or boost their self-esteem.
Chances are that, as time goes on, many of us will simply follow Frank's path to acceptance: scoffing at first, before coming round to the idea that robots can be surprisingly good companions. Our research suggests that this is already happening – though perhaps not in a way that Aristotle would have endorsed.
This article by Tony Prescott, Professor of Cognitive Neuroscience and Director of the Sheffield Robotics Institute at the University of Sheffield, is republished from The Conversation under a Creative Commons license. Read the original article.
Published on February 16, 2021 – 10:51 UTC