
Why do we train robots to be self-aware?

In 2016, for the first time, the number of robots in homes, the military, shops and hospitals exceeded that used in industry. Rather than being concentrated in factories, robots have a growing presence in people’s homes and lives, a trend that is likely to increase as they become more sophisticated and “sensitive.”

“If we take the robot from a factory into a house, we want safety,” said Pablo Lanillos, assistant professor at Radboud University in the Netherlands.

And for machines to safely interact with people, they must become more like humans, experts like Lanillos say. He has designed an algorithm that allows robots to recognize themselves, much as humans do.

An important distinction between humans and robots is that our senses are flawed, feeding misleading information into our brains. “We have very imprecise proprioception (awareness of the position and movement of our body). For example, our muscles have sensors that are not accurate, compared to robots, which have very accurate sensors,” he said.

The human brain takes this imprecise information to guide our movements and understanding of the world, but robots are not used to dealing with uncertainty in the same way.

“In real situations, there are errors, differences between the world and the model of the world that the robot has,” Lanillos clarifies. “The problem we have in robots is that when you change any condition, the robot starts to fail.”
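One standard way robots cope with this gap between model and world is to weigh a noisy sensor reading against the model's own prediction. The snippet below is a minimal, hypothetical sketch of that idea for a single joint angle, in the style of a one-dimensional Kalman update; the numbers and function names are illustrative, not from the SELFCEPTION project.

```python
# Minimal 1-D sketch: fuse a noisy joint-angle reading with a model
# prediction, weighting each by its uncertainty (a Kalman-style update).
def fuse(prediction: float, pred_var: float,
         measurement: float, meas_var: float) -> tuple[float, float]:
    """Return the fused estimate and its remaining variance."""
    gain = pred_var / (pred_var + meas_var)   # how much to trust the sensor
    estimate = prediction + gain * (measurement - prediction)
    variance = (1.0 - gain) * pred_var        # fusing always reduces variance
    return estimate, variance

# Example: the model predicts 30 degrees (uncertain), the sensor reads
# 32 degrees (more precise), so the fused estimate lands near the sensor.
est, var = fuse(prediction=30.0, pred_var=4.0, measurement=32.0, meas_var=1.0)
```

When conditions change and the model's predictions degrade, the prediction variance grows and the update automatically leans harder on the sensors, which is one way a robot can keep functioning instead of failing outright.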

By the age of two, humans can tell the difference between their bodies and other objects in the world. But this calculation, which a two-year-old brain makes easily, is very complicated for a machine, and that makes it difficult for machines to navigate the world.

Self-recognition

The algorithm that Lanillos and his colleagues developed in a project called SELFCEPTION allows three different robots to distinguish their “bodies” from other objects.

Their test robots included one with arms covered in tactile skin, another with known sensory inaccuracies, and a commercial model. They wanted to see how the robots would respond, given their different ways of collecting “sensory” information.

One test that algorithm-assisted robots passed was the rubber hand illusion, originally used on humans. “We put a plastic hand in front of you, we cover your real hand and then we start to stimulate your covered hand and the fake hand that you can see,” the expert explains. In a matter of minutes, people start to think that the fake hand is their own.

The goal was to fool a robot with the same illusion that confuses humans. It is a measure of how well various sensors are integrated and how well the robot can adapt to new situations. Lanillos and his colleagues got a robot to experience the fake hand as its own, much as a human brain would.

The second test was the mirror test, originally proposed by primatologists. In this exercise, a red dot is placed on the forehead of an animal or person, who then looks in a mirror. Humans, and some animals such as monkeys, try to remove the red dot from their own face rather than from the mirror.


The test is a way to gauge how self-aware an animal or person is. Human children can generally pass it before their second birthday.

The team trained a robot to “recognize itself” in the mirror by connecting the movement of the limbs in the reflection with its own limbs. Now they are trying to get the robot to rub off the red dot.
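The core of that kind of self-recognition can be illustrated with a toy example: if the motion seen in the mirror closely tracks the robot's own motor commands, the reflection is probably “me.” The sketch below is a hypothetical simplification, not the team's algorithm; it uses a plain Pearson correlation between commanded and observed movement.

```python
# Toy sketch: decide whether motion seen in a mirror belongs to the robot
# by correlating its own motor commands with the movement it observes.
def correlation(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def is_self(commands, observed, threshold=0.9):
    """High command/observation correlation suggests the reflection is 'me'."""
    return correlation(commands, observed) > threshold

# The mirror image tracks the robot's commands; another agent does not.
commands = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5]
mirror   = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5]
other    = [0.3, -0.2, 0.1, 0.9, -0.4, 0.2]
```

A real system would of course work with camera images rather than neat numeric traces, and would have to handle delays and the mirror's left-right reversal, but the underlying cue is the same: my reflection moves when I do.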

The next step in this research is to integrate more sensors into the robot, and increase the amount of information it computes, to improve its perception of the world. A human has about 130 million receptors in the retina alone and 3,000 touch receptors in each fingertip. Dealing with such large amounts of data is one of the crucial challenges in robotics. “Working out how to combine all this information in a meaningful way will improve body awareness and understanding of the world,” said the expert.

Improving the way robots perceive time can also help them operate in a more human way, allowing them to more easily integrate into people’s lives. This is particularly important for assistive robots, which will interact with people and have to cooperate with them to accomplish tasks. These include service robots that have been suggested as a way to help care for the elderly.

“(Human) behavior, our interaction with the world, depends on our perception of time,” said Anil Seth, co-director of the Sackler Center for the Science of Consciousness at the University of Sussex, UK. “Having a good sense of time is important for any complex behavior.”


Sense of time

Professor Seth collaborated on a project called TimeStorm that examined how humans perceive time and how to use this knowledge to give machines a sense of time as well.

Putting a clock inside a robot would not give it temporal awareness, according to Professor Seth. “Humans, or animals, do not perceive time by having a clock in their head,” he said. There are biases and distortions in the way humans perceive time, he added.

Warrick Roseboom, a cognitive scientist also at the University of Sussex who spearheaded the university’s TimeStorm efforts, created a series of experiments to quantify how people experienced the passage of time.

“We asked humans to watch different videos from a few seconds to about a minute long and tell us how long they thought the video was,” Roseboom said. The videos were first-person recordings of everyday activities, such as walking around campus or sitting in a cafe. The subjects perceived durations different from the actual ones, depending on how busy the scene was.

Using this information, the researchers created a deep learning-based system that could mimic the human subjects’ perception of the videos’ length. “It worked really well,” said Professor Seth. “We were able to predict quite accurately how humans would perceive duration with our system.”
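The intuition behind such a model can be sketched in a few lines: subjective duration is driven by how much the scene changes, not by a wall clock. The toy below is a hypothetical illustration, not the researchers' system; it counts salient frame-to-frame changes in tiny “frames” of raw pixel values, whereas the actual work tracked changes in the activity of a deep image-classification network.

```python
# Toy sketch: perceived time accumulates with salient scene changes,
# not with wall-clock time. Frames are flat lists of pixel intensities.
def perceived_duration(frames, threshold=0.1, unit=0.5):
    """Accumulate one 'unit' of subjective time per salient change."""
    ticks = 0
    for prev, cur in zip(frames, frames[1:]):
        change = sum(abs(a - b) for a, b in zip(prev, cur)) / len(prev)
        if change > threshold:          # only salient changes count
            ticks += 1
    return ticks * unit

# A busy scene (large frame-to-frame changes) feels longer than a
# static one, even though both contain the same number of frames.
busy   = [[0.0] * 4, [0.9] * 4, [0.1] * 4, [0.8] * 4]
static = [[0.5] * 4, [0.5] * 4, [0.5] * 4, [0.5] * 4]
```

This reproduces, in miniature, the effect reported in the experiments: walking around a busy campus is judged longer than the same number of seconds spent in a quiet cafe.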

A main focus of the project was to investigate and demonstrate that machines and humans work together with the same expectations of time.

The researchers were able to show this with a demonstration of robots assisting in meal preparation, serving food according to customer preferences, something that requires an understanding of human time perception, planning, and remembering what has already been done.

TimeStorm’s follow-up project, Entiment, created software that companies can use to program robots with a sense of time for applications such as meal preparation and table cleaning.

In the past 10 years, the field of robot awareness has progressed significantly, Lanillos says, and the next decade will see even more advancements, with robots becoming more and more self-aware.

“I am not saying that the robot will be as conscious as a human is, in a reflective way, but it will be able to adapt its body to the world.”


Original article

This article was originally published in Horizon, the EU Research and Innovation Magazine
