ZME Science

You may not believe it, but this robotic hand can imagine its next move  

Robots are starting to think about themselves.

by Rupendra Brahambhatt
July 26, 2022
in Future, News, Robotics, Studies, Technology

A team of researchers from Columbia University has demonstrated a method that allows a robot to learn a model of its own body. This self-modeling process enabled the robot to decide which movements were best suited to different circumstances and, essentially, to think about its next move.

Robot WidowX successfully avoids the obstacle and touches the sphere. Image credits: Hod Lipson/YouTube

Every change in our body posture or position is commanded by our nervous system (motor cortex). The human brain knows how the different body parts can move and therefore, it can plan and coordinate our every action before it happens. This is possible because the brain has maps and models of our entire body.

These maps allow the brain to guide the movement of our different body parts, provide us with well-coordinated motion, and even save us from injuries when we face obstacles in our path. Could we do the same thing for robots? Boyuan Chen, the lead author of a new study and an assistant professor at Duke University, believes so.

“We humans clearly have a notion of self. Somewhere inside our brain, we have a notion of self, a self-model that informs us what volume of our immediate surroundings we occupy, and how that volume changes as we move.”

Similar to how human body movements are guided by multiple brain maps, Chen and his team have demonstrated that a robot can also develop a kinematic model of itself.

A kinematic model is a mathematical description of a robot’s dimensions, its movement capabilities and limitations, its depth of field, and the workspace it can cover at any given time. Robot operators use it to control the actions of a machine. After self-modeling, however, a robot can control itself, because it becomes aware of how different motor commands trigger different body movements.
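To make the idea concrete, a kinematic model can be sketched in a few lines of code. The example below is a hypothetical illustration, not the study’s code: a forward-kinematics function for a simple planar arm that predicts where the end effector will end up for a given set of joint angles.

```python
import math

def forward_kinematics(lengths, angles):
    """Predict the (x, y) position of a planar arm's end effector.

    A classic kinematic model: given link lengths and joint angles
    (each angle relative to the previous link), it computes where
    the arm's tip ends up.
    """
    x, y, theta = 0.0, 0.0, 0.0
    for length, angle in zip(lengths, angles):
        theta += angle                    # accumulate joint rotations
        x += length * math.cos(theta)     # advance along this link
        y += length * math.sin(theta)
    return x, y

# A two-link arm with both joints straight points along the x-axis.
print(forward_kinematics([1.0, 1.0], [0.0, 0.0]))  # (2.0, 0.0)
```

With a model like this, an operator (or the robot itself) can predict the outcome of a command before executing it.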

How did the scientists enable the robot to model itself?

There is no way for scientists to see the brain maps formed inside a person’s mind, or what a person thinks at any given moment — at least, we don’t have the technology yet. Similarly, if a robot imagines something, a scientist can’t see it by simply peeking into the robot’s neural network. The researchers describe a robot’s brain as a “black box”, so to find out whether a robot can model itself, they performed an interesting experiment.

The different tests that confirmed the self-modeling ability of the robot. Image credits: Boyuan et al. 2022, Science Robotics

Describing the experiment in an interview with ZME Science, Hod Lipson, one of the authors of the study and the director of Columbia University’s Creative Machines Lab, explained:


“You can imagine yourself, every human can imagine where they are in space but we don’t know exactly how this works. Nobody can look into the brain even of a mouse and say here is how the mouse sees itself.” 

During the study, the researchers surrounded a robotic arm called WidowX 200 with five cameras. The live feed from all the cameras was connected to the robot’s neural network, so the robot could see itself through them. As WidowX performed different kinds of body movements in front of the cameras, it observed how its different body parts responded to different motor commands.

After three hours, the robot stopped moving. Its deep neural network had collected all the information required to model the robot’s entire body. The researchers then performed another experiment to test if the robot had successfully modeled itself. They assigned a complex task to the robot that involved touching a 3D red sphere while avoiding a large obstacle in its path. 
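The loop described above — issue motor commands, watch the result through the cameras, fit a predictive model — can be caricatured in a few lines. This is a deliberately tiny stand-in: the study trained a deep neural network on camera footage, while the sketch below fits a single-parameter “body” model by gradient descent. `TRUE_GAIN` and `observe` are invented placeholders for the real robot and its cameras.

```python
import random

# Hypothetical stand-in for the study's setup: the robot's body is an
# unknown system it must discover by "motor babbling" and observation.
TRUE_GAIN = 0.8  # the hidden property of the "body"

def observe(command):
    # What the cameras report after issuing a motor command (noisy).
    return TRUE_GAIN * command + random.gauss(0.0, 0.01)

# 1. Motor babbling: issue random commands and record what the body does.
random.seed(0)
data = [(c, observe(c)) for c in (random.uniform(-1, 1) for _ in range(200))]

# 2. Fit the self-model by minimizing squared prediction error (SGD).
gain, lr = 0.0, 0.1
for _ in range(50):
    for command, observed in data:
        error = gain * command - observed
        gain -= lr * error * command   # gradient step on error**2 / 2

# 3. The learned self-model now predicts outcomes of unseen commands.
print(round(gain, 2))  # ≈ 0.8: the model has "discovered" its own body
```

The real system is vastly more complex — it must learn the full 3D shape of the arm in every joint configuration — but the principle is the same: prediction error drives the model toward the robot’s true body.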

Moreover, the robot had to touch the sphere with a particular body part (its end effector). To complete the task successfully, WidowX needed to propose and follow a safe trajectory that would allow it to reach the sphere without a collision. Remarkably, the robot did it without any human help, and for the first time, Chen and his team showed that a robot can learn to model itself.
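Once a robot has a self-model, obstacle avoidance reduces to querying that model: propose a motion, predict where the body will be at each step, and veto any trajectory that intersects the obstacle. The snippet below is a hypothetical one-joint analogue of WidowX’s task, not the authors’ planner; `arm_tip` stands in for the learned self-model.

```python
import math

def arm_tip(angle, length=1.0):
    # Self-model stand-in: predicts the tip position of a 1-joint arm.
    return (length * math.cos(angle), length * math.sin(angle))

def collides(point, obstacle_center, obstacle_radius):
    return math.dist(point, obstacle_center) < obstacle_radius

def plan(start, goal, obstacle, radius, steps=100):
    """Sweep the joint from start to goal, vetoing the whole trajectory
    if the self-model predicts any step would enter the obstacle."""
    path = []
    for i in range(steps + 1):
        angle = start + (goal - start) * i / steps
        if collides(arm_tip(angle), obstacle, radius):
            return None  # unsafe trajectory; the planner must try another
        path.append(angle)
    return path

# A direct sweep passes through the obstacle near angle pi/2 ...
print(plan(0.0, math.pi, (0.0, 1.0), 0.2) is None)       # True
# ... but sweeping the other way (via -pi/2) stays clear.
print(plan(0.0, -math.pi, (0.0, 1.0), 0.2) is not None)  # True
```

Both sweeps end with the tip at the same place; the self-model is what lets the planner tell the safe route from the unsafe one before moving.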

Self-modeling robots can advance the field of artificial intelligence

The WidowX robotic arm is not exactly an advanced machine; it can only perform a limited number of actions and movements. We generally look forward to a future run by robots and machines far more complex than WidowX. When asked whether any robot could learn to model itself using the same approach, Professor Lipson told ZME Science:

“We did it with a very simple cheap robot (WidowX 200) that we can just buy on Amazon but this should work on other things. Now the question is how complex a robot can be and will this still work? This work for a six-degree robot, will this work for a driverless car? Will this work for 18 motors, a spider robot? And that’s what we gonna do next, we gonna try to push this to see how far it can go.”

Image credits: Possessed Photography/Unsplash

Many recent AI-based innovations, such as drones, driverless cars, and humanoids like Sophia, perform multiple functions at the same time. If these machines learn to imagine themselves and others, including humans, this could lead to a robot revolution. The researchers believe that the ability to model themselves and others would allow robots to program, repair, and function on their own, without human supervision.

“We rely on factory robots, we rely on drones, we rely more and more on these robots, and we can’t babysit all these robots all the time. We can’t always model them or program them, it’s a lot of work. We want the robots to model themselves and we are also interested in working on how robots can model other robots. So they can help each other, keep taking care of themselves, adapt, and be much more resilient and I think it’s gonna be important,” said Professor Lipson.  

The study is published in the journal Science Robotics.

Tags: AI, neural network, self-modeling robot

Rupendra Brahambhatt

Rupendra Brahambhatt is an experienced journalist and filmmaker covering culture, science, and entertainment news for the past five years. With a background in Zoology and Communication, he has been actively working with some of the most innovative media agencies in different parts of the globe.


© 2007-2025 ZME Science - Not exactly rocket science. All Rights Reserved.
