
My Best Friend the Robot: Why We Humanize Machines

Thanks to artificial intelligence, robots are becoming more and more human-like. What happens to our relationships when chatbots like ChatGPT suddenly become a viable substitute for friends?
Stephanie Schnydrig/ch media

Lio is the name of a care robot that looked after two elderly women in a retirement home. It entertained them with games and jokes and engaged them in conversation. After a while, however, one of the women noticed that her fellow resident was much more affectionate with the robot than she was – and interacted with it far more often. Jealousy overcame her.

This is reported by psychologist Hartmut Schulze of the University of Applied Sciences Northwestern Switzerland (FHNW). The episode occurred during an investigation into the effects of social robots on the human psyche. "Machines inevitably evoke emotions in us," he says. We cannot help but humanize them.

This phenomenon is called anthropomorphism: we instinctively project human characteristics onto all sorts of objects and perceive in them something like a soul.

Most of us have probably cursed at a crashed computer, coaxed a decrepit car up a steep slope, or, in the 1990s, cried over a dead Tamagotchi.

In this regard, Schulze recalls the American soldiers in the Iraq war who used robots to defuse mines: robots that were blown up on the job were given a funeral, complete with a gun salute. "Some soldiers apparently even had tears in their eyes," says Schulze. Indeed, countless studies show that humans can feel sorry for robots.

"The more a technology imitates human behavior, the more socially we behave towards it," says Anne Scherer, assistant professor at the University of Zurich and founder of the AI consultancy DeltaLabs AG. The new artificial intelligence systems such as ChatGPT do just that: they speak in the same style we do, are entertaining, polite, sometimes funny, and sometimes even ask questions and answer with emojis.

“It triggers something in us,” says Scherer. She and her colleague Cindy Candrian have just published the book “You & AI: Everything about artificial intelligence and how it shapes our lives”.

For one of her experiments, Scherer developed two different chatbots: a social one that drops very human interjections like "um," "hm," and "aha" into the conversation, and a purely functional one that responds without filler words. "The results were crazy," says Scherer. With the humanized chatbot, participants suddenly thought about how they came across; it was important to them to make a good impression.

The chatbots asked the participants how many people they had had sex with. "Facing the social chatbot, the men boasted of having many sexual partners, while the women reported a much lower number," says Scherer. Both the men and the women thus answered in the way they felt was socially acceptable. It was different in conversation with the functional bot: there, women reported having had more sexual partners, while men admitted to fewer.

We also adhere to social norms with chatbots, for example when they reveal intimate information to us. "Then we feel obligated to give something back," says Scherer. Just as in a human relationship, along the lines of: you tell me a secret, and I'll reveal one to you.

Smartphone apps that use this human-like style of language have been around for a while, for example Replika. The app offers a virtual boyfriend or girlfriend who promises to be available at any time for very personal conversations, including about sexual preferences and desires.

Researchers still know little about how human-robot relationships differ from interpersonal ones. Initial studies indicate, among other things, that people find sexting (exchanging erotic messages) with a chatbot just as satisfying as with other people.

However, projecting too much humanity onto robots has a frightening downside: "The more we rely on machines, the less we interact with real people, and the lonelier and less empathetic we become towards others," says Anne Scherer. One reason may simply be that we grow impatient with our human counterparts, because it takes so much longer to get the right, and at the same time creative, answer from a person than from a machine.

The loss of empathy can even go so far that we come to see other people as less valuable than a robot. At least that is what a study by the Ludwig Maximilian University in Munich suggests. In it, the Munich researchers confronted the study participants with a moral dilemma: Would they endanger someone’s life to save a group of injured people? That someone was either a human, a robot clearly recognizable as a machine, or a robot with many human traits.

Result: If the robot was presented as a compassionate being with its own experiences and ideas, it was more difficult for the subjects to sacrifice it. Some participants’ empathy with the machine went so far that they were willing to sacrifice the group of people so that nothing would happen to the robot.

So it seems that we humans can take a robot deep into our hearts. But what about the other direction – what about the emotional world of the machines? Can they love us?

The New York Times columnist Kevin Roose has a bizarre story to tell about this. In February 2023, he chatted with the ChatGPT-based chatbot built into the Bing search engine. Eventually, out of the blue, the bot declared that it loved him.

It sounds absurd. And yet some scientists believe that AI machines can, at least in some sense, feel emotions – among them the cognitive researcher Eric Schulz of the Max Planck Institute for Biological Cybernetics.

In an experiment, he instilled fear in a chatbot, as he described to "Spiegel". He did this simply by asking the chatbot to describe, in as much detail as possible, a situation that would make it anxious and depressed. Once frightened, the bot reacted the way anxious people do, says Schulz: "Fear increases prejudice against everything unfamiliar." And such prejudices – against Black people, the elderly and people with disabilities – were reflected in the fearful chatbot's responses.

Former Google engineer Blake Lemoine likewise attributed feelings and consciousness to the company's chatbot LaMDA. That claim eventually cost him his job.

The psychologist Hartmut Schulze believes that, given the rapid development of AI, we should no longer see everything in black and white. "The strict division into human or machine is becoming increasingly difficult. Human and machine communication can hardly be told apart anymore," he says. Researchers are therefore increasingly talking about a third category: alongside human and machine there is now something in between, which experts call a "synthetic social entity".

The question remains: what place do we want to give this entity in our society? According to Schulze, this debate should be driven less by the fear of where machines might replace us and more by the question of where they can most usefully complement us.

"Unfortunately, there is currently a very strong focus on the negative sides of AI," he says. Yet there are many positives: interacting with robots can reduce feelings of loneliness in socially isolated people. For people with mental illness, robots can use their entertainment skills to ease depressive symptoms, mood swings and anxiety. And children with behavioral problems can, with the help of robots, learn to reduce their social fears towards others.

And yet one important element is missing from the human-robot relationship: mutual consideration. In interpersonal relationships, we learn to set our own needs aside and to compromise. With a robot, we lose that social ability to resolve conflicts. (aargauerzeitung.ch)
