(Stillness in the Storm Editor) “I don’t care what they think, so it doesn’t bother me.” How often have you used that to dismiss something someone thinks or says about you? We’ve all done it before. We’d like to think we can ignore the opinions of others, because, let’s face it, people aren’t always nice or respectful. But can we really ignore this? A recent study suggests we’re hardwired to care about what other people think of us, so much so that even when we know a robot is slinging the insults, it still affects us.
In the study below, scientists were trying to figure out how much we value the opinions of others. They devised a test in which participants would hear insulting statements made by a robot. Despite the fact that a non-human was making the statements, people still reacted adversely.
We already know, through various well-established discoveries, that our brain is designed to reward us for climbing the social ladder. When we feel that other people have good opinions of us, our brain feeds us more positive emotion, which reduces anxiety and stress and increases creativity and fluid intelligence.
We also know that the more disturbed your social attachments are (meaning your social relationships are unstable, or even antagonistic), the more neurotic your brain will make you, which can lead to mental illness.
We also know that when people have a bad opinion of us, our first reaction is often to dismiss it, devaluing the opinion by degrading the person who holds it.
“They’re an idiot, so what does it matter?”
“What do they know, anyway…”
“The lion doesn’t concern itself with the opinion of sheep.”
I would argue that these reactions are self-comforting mechanisms we use to devalue the worth of another person’s opinion in our own minds.
And the very fact that we do this with such consistency, across cultures, distances, and time, suggests a mechanism at work: we’re hardwired to care about what other people think of us, and trying to deny that by destroying the image of another person in our minds only creates self-imposed isolation that leads to greater mental illness.
The looking-glass self, a theory of selfhood developed in early 20th-century psychology, helps explain this phenomenon.
Perhaps we should learn how to contend with the opinions of others in a way that makes us use negative critiques in a constructive manner.
Perhaps, if we dare to embrace how others see us, we can discern their viewpoint, empathize with it, and in doing so, maintain our connection to them while also guarding against false judgments about ourselves.
For me, this information recalls spiritual wisdom about learning how not to take things personally, which is one of the Four Agreements.
Buy Book: The Four Agreements: A Practical Guide to Personal Freedom (A Toltec Wisdom Book)
– Justin
by Staff Writer, November 18th, 2019
Discouraging words from a robot can hinder a person’s ability to play a game.
Source: Carnegie Mellon University
Trash talking has a long and colorful history of flustering game opponents, and now researchers at Carnegie Mellon University have demonstrated that discouraging words can be perturbing even when uttered by a robot.
The trash talk in the study was decidedly mild, with utterances such as “I have to say you are a terrible player,” and “Over the course of the game your playing has become confused.” Even so, people who played a game with the robot — a commercially available humanoid robot known as Pepper — performed worse when the robot discouraged them and better when the robot encouraged them.
Lead author Aaron M. Roth said some of the 40 study participants were technically sophisticated and fully understood that a machine was the source of their discomfort.
“One participant said, ‘I don’t like what the robot is saying, but that’s the way it was programmed so I can’t blame it,’” said Roth, who conducted the study while he was a master’s student in the CMU Robotics Institute.
But the researchers found that, overall, human performance ebbed regardless of technical sophistication.
The study, presented last month at the IEEE International Conference on Robot & Human Interactive Communication (RO-MAN) in New Delhi, India, is a departure from typical human-robot interaction studies, which tend to focus on how humans and robots can best work together.
“This is one of the first studies of human-robot interaction in an environment where they are not cooperating,” said co-author Fei Fang, an assistant professor in the Institute for Software Research. It has enormous implications for a world where the number of robots and internet of things (IoT) devices with artificial intelligence capabilities is expected to grow exponentially. “We can expect home assistants to be cooperative,” she said, “but in situations such as online shopping, they may not have the same goals as we do.”
The study was an outgrowth of a student project in AI Methods for Social Good, a course that Fang teaches. The students wanted to explore the uses of game theory and bounded rationality in the context of robots, so they designed a study in which humans would compete against a robot in a game called “Guards and Treasures.” A so-called Stackelberg game, it is used by researchers to study rationality, and it is typical of the games used to study defender-attacker interaction in research on security games, an area in which Fang has done extensive work.
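For readers unfamiliar with the setup, here is a minimal sketch, in Python, of the defender-attacker structure that a Stackelberg security game models: the defender (the guard) commits to a randomized coverage strategy over targets first, and a rational attacker observes that strategy and best-responds. The payoff numbers and function names below are hypothetical illustrations, not the study’s actual “Guards and Treasures” implementation.

```python
# A minimal sketch of attacker best response in a Stackelberg security game.
# The defender commits to coverage probabilities first; the attacker observes
# them and picks the target with the highest expected payoff.
# All payoff values here are made up for illustration.

ATTACKER_REWARD = [6, 4, 8]      # hypothetical payoff if a target is hit while uncovered
ATTACKER_PENALTY = [-2, -1, -3]  # hypothetical payoff if the target turns out to be covered

def attacker_best_response(coverage):
    """Return the target index a fully rational attacker would choose,
    given the defender's coverage probability for each target."""
    expected = [
        c * pen + (1 - c) * rew
        for c, rew, pen in zip(coverage, ATTACKER_REWARD, ATTACKER_PENALTY)
    ]
    return max(range(len(expected)), key=lambda i: expected[i])

# Example: the defender guards target 2 most heavily, so a rational
# attacker shifts to the less-protected target 0.
print(attacker_best_response([0.2, 0.3, 0.5]))  # prints 0
```

In the study, deviations from this fully rational best response are one way to measure how “rational” a human player is, which is why the researchers could track whether performance changed under praise or criticism.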
Each participant played the game 35 times with the robot, while either soaking in encouraging words from the robot or getting their ears singed with dismissive remarks. Although the human players’ rationality improved as the number of games played increased, those who were criticized by the robot didn’t score as well as those who were praised.
It’s well established that an individual’s performance is affected by what other people say, but the study shows that humans also respond to what machines say, said Afsaneh Doryab, a systems scientist at CMU’s Human-Computer Interaction Institute (HCII) during the study and now an assistant professor in Engineering Systems and Environment at the University of Virginia. This machine’s ability to prompt responses could have implications for automated learning, mental health treatment and even the use of robots as companions, she said.
Future work might focus on nonverbal expression between robot and humans, said Roth, now a Ph.D. student at the University of Maryland. Fang suggests that more needs to be learned about how different types of machines — say, a humanoid robot as compared to a computer box — might evoke different responses in humans.
In addition to Roth, Fang and Doryab, the research team included Manuela Veloso, professor of computer science; Samantha Reig, a Ph.D. student in the HCII; Umang Bhatt, who recently completed a joint bachelor’s-master’s degree program in electrical and computer engineering; Jonathan Shulgach, a master’s student in biomedical engineering; and Tamara Amin, who recently finished her master’s degree in civil and environmental engineering.
Funding: The National Science Foundation provided some support for this work.
Media Contacts:
Byron Spice – Carnegie Mellon University
Original Research: The findings were presented at the IEEE International Conference on Robot & Human Interactive Communication (RO-MAN).
Stillness in the Storm Editor: Why did we post this?
Psychology is the study of the nature of the mind. Philosophy is the use of that mind in life. Both are critically important to understand, as they are aspects of the self. All you do and experience will pass through these gateways of being. The preceding information provides an overview of this self-knowledge, offering points to consider that people often don’t take the time to contemplate. By choosing to gain self-awareness, one can begin to see how one’s being works. With the wisdom of self-awareness, one has the tools to master one’s being and life in general, bringing order to chaos by navigating challenges with the capacity for right action.
– Justin
Not sure how to make sense of this? Want to learn how to discern like a pro? Read this essential guide to discernment, analysis of claims, and understanding the truth in a world of deception: 4 Key Steps of Discernment – Advanced Truth-Seeking Tools.
Source:
https://neurosciencenews.com/robots-trash-talk-15230/