

You may not believe it, but this robotic hand can imagine its next move  

Robots are starting to think about themselves.

Rupendra Brahambhatt
July 26, 2022 @ 4:08 pm


A team of researchers from Columbia University has demonstrated a method that allows a robot to learn a model of its own body. This self-modeling process enabled the robot to decide which movements were best suited to different circumstances, essentially thinking about its next move. 

Robot WidowX successfully avoids the obstacle and touches the sphere. Image credits: Hod Lipson/YouTube

Every change in our body posture or position is commanded by our nervous system (motor cortex). The human brain knows how the different body parts can move and therefore, it can plan and coordinate our every action before it happens. This is possible because the brain has maps and models of our entire body.

These maps allow the brain to guide the movement of our different body parts, produce well-coordinated motion, and even protect us from injury when we face obstacles in our path. Could we do the same thing for robots? Boyuan Chen, the lead author of a new study and an assistant professor at Duke University, believes so.

“We humans clearly have a notion of self. Somewhere inside our brain, we have a notion of self, a self-model that informs us what volume of our immediate surroundings we occupy, and how that volume changes as we move.”

Similar to how human body movements are guided using multiple brain maps, Boyuan and his team have demonstrated that a robot can also develop a kinematic model of itself.

A kinematic model is a mathematical description of a robot’s dimensions, movement capabilities and limitations, depth of field, and the workspace it can cover at any given time. Robot operators use it to control a machine’s actions. After self-modeling, however, a robot can control itself, because it becomes aware of how different motor commands trigger different body movements.
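To make the idea of a kinematic model concrete, here is a minimal sketch for a hypothetical two-joint planar arm. The link lengths are invented for illustration; the actual WidowX 200 has six degrees of freedom and a far richer model:

```python
import math

# Hypothetical link lengths in meters, chosen only for illustration.
L1, L2 = 0.2, 0.15

def forward_kinematics(theta1, theta2):
    """Map joint angles (radians) to the end effector's (x, y) position."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

# With both joints at zero, the arm is fully extended along the x-axis,
# so the end effector sits at L1 + L2 from the base.
x, y = forward_kinematics(0.0, 0.0)
print(round(x, 3), round(y, 3))  # 0.35 0.0
```

A self-modeling robot, in effect, learns a mapping like this from experience instead of having an engineer write it down.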

How did the scientists enable the robot to model itself?

Scientists have no way to see the brain maps formed inside a person’s mind, or what a person is thinking at any given moment; at least, we don’t have the technology yet. Similarly, if a robot imagines something, a scientist can’t observe it by simply peeking into the robot’s neural network. The researchers suggest that a robot’s brain is like a “black box”, so to find out whether a robot can model itself, they performed an interesting experiment. 

The different tests that confirmed the self-modeling ability of the robot. Image credits: Chen et al. 2022, Science Robotics

Describing the experiment in an interview with ZME Science, Hod Lipson, one of the authors of the study and the director of Columbia University’s Creative Machines Lab, explained: 

“You can imagine yourself, every human can imagine where they are in space but we don’t know exactly how this works. Nobody can look into the brain even of a mouse and say here is how the mouse sees itself.” 

So during their study, the researchers surrounded a robot arm called WidowX 200 with five cameras in a room. The live feed from all the cameras was connected to the robot’s neural network so the robot could see itself through the cameras. As WidowX performed different kinds of body movements in front of the live streaming cameras, it started observing how its different body parts behaved in response to different motor commands. 

After three hours, the robot stopped moving. Its deep neural network had collected all the information required to model the robot’s entire body. The researchers then performed another experiment to test if the robot had successfully modeled itself. They assigned a complex task to the robot that involved touching a 3D red sphere while avoiding a large obstacle in its path. 
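As a rough illustration of this kind of learning, the sketch below fits a toy linear self-model to simulated pairs of motor commands and observed positions. The actual study trained a deep neural network on camera observations, so everything here, the data, the one-dimensional model, and the learning rule, is a drastically simplified stand-in:

```python
import random

random.seed(0)

# Simulated experience: the robot issues a command and records what the
# cameras "saw". The hidden response (slope 2.0, offset 0.5) plus a little
# noise stands in for the robot's real, unknown body dynamics.
data = [(c, 2.0 * c + 0.5 + random.gauss(0, 0.01))
        for c in [i / 10 for i in range(20)]]

w, b = 0.0, 0.0   # self-model parameters, learned from scratch
lr = 0.05         # learning rate
for _ in range(2000):           # stochastic gradient descent on
    for cmd, seen in data:      # squared prediction error
        err = (w * cmd + b) - seen
        w -= lr * err * cmd
        b -= lr * err

# After training, (w, b) should land close to the hidden (2.0, 0.5).
print(round(w, 2), round(b, 2))
```

The point of the analogy: the robot never needs to be told its own geometry; it recovers a usable model purely from observing how its body responds to commands.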

Moreover, the robot had to touch the sphere with a particular body part (the end effector). To complete the task successfully, WidowX needed to propose and follow a safe trajectory that would let it reach the sphere without a collision. Remarkably, the robot did so without any human help, and for the first time, Boyuan Chen and his team showed that a robot can learn to model itself. 
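One simple way to picture this sort of obstacle-avoiding trajectory search: sample candidate via-points, reject any whose path segments pass through the obstacle, and keep the shortest safe route. The coordinates, the circular obstacle, and the sampling strategy below are all invented for illustration; the study's planner queried the learned self-model rather than an exact geometric check:

```python
import math
import random

random.seed(1)

start, goal = (0.0, 0.0), (1.0, 0.0)
obstacle, radius = (0.5, 0.0), 0.2   # circular obstacle blocking the direct path

def segment_hits(p, q):
    """Return True if segment p->q passes within `radius` of the obstacle."""
    px, py = p
    qx, qy = q
    ox, oy = obstacle
    dx, dy = qx - px, qy - py
    # Project the obstacle center onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((ox - px) * dx + (oy - py) * dy) / (dx * dx + dy * dy)))
    cx, cy = px + t * dx, py + t * dy
    return math.hypot(cx - ox, cy - oy) < radius

# Sample candidate via-points; keep the one giving the shortest safe path.
best, best_len = None, float("inf")
for _ in range(200):
    via = (random.uniform(0.0, 1.0), random.uniform(-0.6, 0.6))
    if segment_hits(start, via) or segment_hits(via, goal):
        continue
    length = math.dist(start, via) + math.dist(via, goal)
    if length < best_len:
        best, best_len = via, length

print(best, round(best_len, 3))
```

Because the direct path is blocked, any safe route is slightly longer than the straight-line distance; the search settles on a small detour around the obstacle.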

Self-modeling robots can advance the field of artificial intelligence

The WidowX robotic hand is not exactly an advanced machine; it can only perform a limited number of actions and movements. We are, however, heading toward a future run by robots and machines far more complex than WidowX. When asked whether any robot could learn to model itself using the same approach, Professor Lipson told ZME Science:

“We did it with a very simple cheap robot (WidowX 200) that we can just buy on Amazon but this should work on other things. Now the question is how complex a robot can be and will this still work? This work for a six-degree robot, will this work for a driverless car? Will this work for 18 motors, a spider robot? And that’s what we gonna do next, we gonna try to push this to see how far it can go.”

Image credits: Possessed Photography/Unsplash

Many recent AI-based innovations, such as drones, driverless cars, and humanoids like Sophia, perform multiple functions at the same time. If these machines learn to imagine themselves and others, including humans, this could lead to a robot revolution. The researchers believe that the ability to model themselves and others would allow robots to program, repair, and function on their own without human supervision.

“We rely on factory robots, we rely on drones, we rely more and more on these robots, and we can’t babysit all these robots all the time. We can’t always model them or program them, it’s a lot of work. We want the robots to model themselves and we are also interested in working on how robots can model other robots. So they can help each other, keep taking care of themselves, adapt, and be much more resilient and I think it’s gonna be important,” said Professor Lipson.  

The study is published in the journal Science Robotics.
