

You may not believe it, but this robotic hand can imagine its next move  

Robots are starting to think about themselves.

Rupendra Brahambhatt
July 26, 2022 @ 4:08 pm


A team of researchers from Columbia University has demonstrated a method that allows a robot to learn a model of its own body. This self-modeling process enabled the robot to decide which movements were best suited to different circumstances and, in effect, think about its next move.

Robot WidowX successfully avoids the obstacle and touches the sphere. Image credits: Hod Lipson/YouTube

Every change in our body posture or position is commanded by our nervous system (motor cortex). The human brain knows how the different body parts can move and therefore, it can plan and coordinate our every action before it happens. This is possible because the brain has maps and models of our entire body.

These maps allow the brain to guide the movement of our different body parts, provide us with well-coordinated motion, and even save us from injuries when we face obstacles in our path. Could we do the same thing for robots? Boyuan Chen, the lead author of a new study and an assistant professor at Duke University, believes so.

“We humans clearly have a notion of self. Somewhere inside our brain, we have a notion of self, a self-model that informs us what volume of our immediate surroundings we occupy, and how that volume changes as we move.”

Similar to how human body movements are guided by multiple brain maps, Chen and his team have demonstrated that a robot can also develop a kinematic model of itself.

A kinematic model is a mathematical description of a robot's dimensions, its movement capabilities and limitations, its depth of field, and the workspace it can cover at any given time. It is normally used by robot operators to control the actions of a machine. After self-modeling, however, a robot can control itself, because it becomes aware of how different motor commands trigger different body movements.
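To make the idea concrete, here is a minimal, hypothetical sketch (not taken from the study) of the kind of kinematic model an operator might write by hand: forward kinematics for a toy two-joint planar arm, which encodes the arm's dimensions and how motor commands (joint angles) map to a body pose.

```python
import math

def forward_kinematics(theta1, theta2, l1=0.2, l2=0.15):
    """Tip position of a two-link planar arm, given joint angles in radians.

    A toy kinematic model: it encodes the arm's dimensions (link lengths
    l1 and l2, in metres) and how motor commands map to where the arm ends up.
    """
    # Elbow position after rotating the first joint
    elbow_x = l1 * math.cos(theta1)
    elbow_y = l1 * math.sin(theta1)
    # Tip position after rotating the second joint relative to the first
    tip_x = elbow_x + l2 * math.cos(theta1 + theta2)
    tip_y = elbow_y + l2 * math.sin(theta1 + theta2)
    return tip_x, tip_y

# Example: command both joints to 45 degrees and ask where the tip lands
print(forward_kinematics(math.radians(45), math.radians(45)))
```

A human engineer usually derives such a model from the robot's specifications; the point of the Columbia work is that the robot learned an equivalent model of its own body from observation.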

How did the scientists enable the robot to model itself?

There is no way scientists can see the brain maps formed inside a person's mind, or what a person is thinking at any given moment — at least, we don't have the technology yet. Similarly, if a robot imagines something, a scientist can't see it simply by peeking into the robot's neural network. The researchers describe a robot's brain as a "black box", so to find out whether a robot can model itself, they performed an interesting experiment.

The different tests that confirmed the self-modeling ability of the robot. Image credits: Chen et al. 2022, Science Robotics

Describing the experiment in an interview with ZME Science, one of the authors of the study and the director of Columbia University's Creative Machines Lab, Hod Lipson, explained:

“You can imagine yourself, every human can imagine where they are in space but we don’t know exactly how this works. Nobody can look into the brain even of a mouse and say here is how the mouse sees itself.” 

So during their study, the researchers surrounded a robot arm called WidowX 200 with five cameras in a room. The live feed from all the cameras was connected to the robot’s neural network so the robot could see itself through the cameras. As WidowX performed different kinds of body movements in front of the live streaming cameras, it started observing how its different body parts behaved in response to different motor commands. 
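The paper's actual architecture is more involved, but the core idea can be sketched as a neural network that takes the motor commands (joint angles) together with a point in space and predicts whether that point is occupied by the robot's body, trained on what the cameras observe. The layer sizes, training loop, and placeholder data below are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class SelfModel(nn.Module):
    """Toy self-model: given joint angles and a 3D query point,
    predict whether that point lies inside the robot's body."""
    def __init__(self, num_joints=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_joints + 3, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, 1),  # logit: occupied or not
        )

    def forward(self, joint_angles, query_points):
        return self.net(torch.cat([joint_angles, query_points], dim=-1))

model = SelfModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Placeholder training data: in the real experiment, the occupancy labels
# would come from the five camera views of the moving arm.
joints = torch.rand(256, 6)                       # random motor commands
points = torch.rand(256, 3)                       # random query points in the workspace
occupied = torch.randint(0, 2, (256, 1)).float()  # 1 if the point is on the body

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(joints, points), occupied)
    loss.backward()
    optimizer.step()
```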

After three hours, the robot stopped moving. Its deep neural network had collected all the information required to model the robot’s entire body. The researchers then performed another experiment to test if the robot had successfully modeled itself. They assigned a complex task to the robot that involved touching a 3D red sphere while avoiding a large obstacle in its path. 

Moreover, the robot had to touch the sphere with a particular body part (the end effector). To complete the task successfully, WidowX needed to propose and follow a safe trajectory that would let it reach the sphere without a collision. The robot did it without any human help, and Chen and his team showed, for the first time, that a robot can learn to model itself.
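One way a learned self-model can be used for this kind of task is to sample candidate joint trajectories, reject any whose predicted body volume would overlap the obstacle, and keep the collision-free one that brings the end effector closest to the target. The helper functions and sampling strategy below are assumptions for illustration, not the planner described in the paper.

```python
import numpy as np

def plan_collides(self_model_occupancy, trajectory, obstacle_points):
    """Reject a candidate trajectory if the predicted body volume
    overlaps the obstacle at any step.

    self_model_occupancy(joint_angles, point) is assumed to return the
    learned self-model's probability that the point is inside the body.
    """
    for joint_angles in trajectory:
        for point in obstacle_points:
            if self_model_occupancy(joint_angles, point) > 0.5:
                return True
    return False

def pick_trajectory(self_model_occupancy, candidates, obstacle_points,
                    end_effector_position, target):
    """Among sampled trajectories, keep the collision-free ones and return
    the one whose final end-effector position is closest to the target."""
    best, best_dist = None, np.inf
    for trajectory in candidates:
        if plan_collides(self_model_occupancy, trajectory, obstacle_points):
            continue
        dist = np.linalg.norm(end_effector_position(trajectory[-1]) - target)
        if dist < best_dist:
            best, best_dist = trajectory, dist
    return best
```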

Self-modeling robots can advance the field of artificial intelligence

The WidowX robotic arm is not exactly an advanced machine; it can only perform a limited number of actions and movements. But we are looking toward a future run by robots and machines far more complex than WidowX. When asked whether any robot could learn to model itself using the same approach, Professor Lipson told ZME Science:

“We did it with a very simple cheap robot (WidowX 200) that we can just buy on Amazon but this should work on other things. Now the question is how complex a robot can be and will this still work? This work for a six-degree robot, will this work for a driverless car? Will this work for 18 motors, a spider robot? And that’s what we gonna do next, we gonna try to push this to see how far it can go.”

Image credits: Possessed Photography/Unsplash

Many recent AI-based innovations, such as drones, driverless cars, and humanoids like Sophia, perform multiple functions at the same time. If these machines learn to imagine themselves and others, including humans, this could lead to a robot revolution. The researchers believe that the ability to model themselves and others would allow robots to program, repair, and function on their own without human supervision.

“We rely on factory robots, we rely on drones, we rely more and more on these robots, and we can’t babysit all these robots all the time. We can’t always model them or program them, it’s a lot of work. We want the robots to model themselves and we are also interested in working on how robots can model other robots. So they can help each other, keep taking care of themselves, adapt, and be much more resilient and I think it’s gonna be important,” said Professor Lipson.  

The study is published in the journal Science Robotics.

