

Robot workspace to get human touch remotely

Assemble, stitch, fasten, box, seal, ship. But what gets done when the assemblers, stitchers, and the rest are stuck at home?

Nancy Cohen
December 22, 2020 @ 12:21 am


It’s been fairly easy for some to adopt a remote working model during the pandemic, but manufacturing and warehouse workers have had it rougher — some tasks just need people to be physically present in the workplace.

But now, one team is working on a solution for the traditional factory floor that could allow more workers to carry out their labor from home.

The proposed human-in-the-loop assembly system. The robot workspace can be manipulated remotely. Image credits: Columbia Engineering.

Columbia Engineering announced that researchers have won a grant to develop the project titled “FMRG: Adaptable and Scalable Robot Teleoperation for Human-in-the-Loop Assembly.” The project’s raw ingredients include machine perception, human-computer interaction, human-robot interaction, and machine learning.

The researchers have come up with a “physical-scene-understanding algorithm” that converts camera observations of the robot workspace into a virtual 3D scene representation.
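The announcement doesn’t spell out the algorithm itself, but its job can be described at the interface level: camera observations go in, a structured virtual scene comes out. Here is a minimal Python sketch of that contract; the function name and scene format are illustrative assumptions, not the team’s actual code.

```python
# Hypothetical sketch of the contract the article describes: camera
# observations in, a virtual 3D scene out. Names are illustrative only.
import numpy as np


def understand_scene(rgb: np.ndarray, depth: np.ndarray) -> dict[str, np.ndarray]:
    """Map one RGB-D observation of the workspace to a virtual 3D scene.

    Here the scene is just a mapping from object identifiers to estimated
    4x4 poses; a real system would also recover shape and physical
    attributes using learned perception models.
    """
    scene: dict[str, np.ndarray] = {}
    # ... detection, shape completion, and pose estimation would go here ...
    return scene
```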

Handling 3D models

The system analyzes the robot worksite and converts it into a virtual physical-scene representation. Each object is represented by a 3D model that mimics its shape, size, and physical attributes. A human operator then specifies the assembly goal by manipulating these virtual 3D models.
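To make that concrete, here is a hedged sketch of what such an object model and an operator-specified goal might look like. The class, its fields, and the example objects are all hypothetical; the real system’s representation is not published in the announcement.

```python
# Illustrative only: a virtual object model and an assembly goal built by
# editing object poses in the virtual scene. Names are not the project's API.
from dataclasses import dataclass, field

import numpy as np


@dataclass
class ObjectModel:
    """Virtual stand-in for one real object in the robot workspace."""
    object_id: str
    mesh: np.ndarray                    # estimated surface geometry (N x 3 vertices)
    pose: np.ndarray                    # current 4x4 pose in the workspace frame
    extents: np.ndarray                 # bounding-box size in meters
    physics: dict = field(default_factory=dict)  # e.g. {"mass": 0.2, "friction": 0.6}


# Two objects as estimated by scene understanding (poses at the origin here).
bracket = ObjectModel("bracket", np.zeros((0, 3)), np.eye(4), np.array([0.1, 0.1, 0.02]))
screw = ObjectModel("screw", np.zeros((0, 3)), np.eye(4), np.array([0.01, 0.01, 0.03]))

# The operator drags the virtual screw so it sits on top of the bracket in
# the 3D/VR interface; the edited poses become the assembly goal.
screw_target = bracket.pose.copy()
screw_target[2, 3] += bracket.extents[2]   # lift by the bracket's height
assembly_goal = {bracket.object_id: bracket.pose, screw.object_id: screw_target}
```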

Given the task goals and the robot configuration, a reinforcement learning algorithm infers a planning policy. The algorithm also estimates its own probability of success and uses that estimate to decide when to request human assistance; otherwise, it carries out the work automatically.
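The announcement doesn’t say how that success estimate is computed or what threshold triggers a handoff, but the decision logic it describes is easy to sketch: the learned policy proposes an action along with an estimated success probability, and the system defers to the remote operator only when that estimate drops below a cutoff. In this sketch, the threshold value and all function names are assumptions.

```python
# Hypothetical confidence-gated execution step; the 0.8 threshold and
# all names here are illustrative assumptions, not project details.
from typing import Callable, Tuple

Action = dict   # placeholder for whatever the robot command looks like


def execute_step(
    policy: Callable[[dict], Tuple[Action, float]],
    scene: dict,
    ask_human: Callable[[dict], Action],
    confidence_threshold: float = 0.8,
) -> Action:
    """Pick the next action autonomously, or defer to the remote operator.

    `policy` returns (proposed_action, estimated_success_probability).
    If the estimate falls below the threshold, the system requests
    human assistance instead of acting on its own.
    """
    action, p_success = policy(scene)
    if p_success < confidence_threshold:
        # Low confidence: hand the decision to the human in the loop.
        return ask_human(scene)
    return action
```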

The project is led by Shuran Song, an assistant professor of computer science at Columbia University. She said the system they envision will allow workers who are not trained roboticists to operate the robots, a prospect she finds exciting.

“I am excited to see how this research could eventually provide greater job access to workers regardless of their geographical location or physical ability.”

Automation for the future

The team received $3.7 million in funding from the National Science Foundation (NSF). According to the NSF, the award period runs from January 1, 2021, to an estimated end date of December 31, 2025. The award abstract highlights the positive impact such an effort could have on businesses and workers:

“The research will benefit both the manufacturing industry and the workforce by increasing access to manufacturing employment and improving working conditions and safety. By combining human-in-the-loop design with machine learning, this research can broaden the adoption of automation in manufacturing to new tasks. Beyond manufacturing, the research will also lower the entry barrier to using robotic systems for a wide range of real-world applications, such as assistive and service robots.”

The abstract also notes that the team is collaborating with NYDesigns and LaGuardia Community College “to translate research results to industrial partners and develop training programs to educate and prepare the future manufacturing workforce.”

Song is directing the design of the vision-based perception and machine learning algorithms for physical scene understanding. Steven Feiner, professor of computer science at Columbia University, is developing the 3D and VR user interface. Matei Ciocarlie, associate professor of mechanical engineering at Columbia University, is building the robot learning and control algorithms. Before joining the faculty, Ciocarlie was a scientist at Willow Garage and at Google, and he contributed to the development of the open-source Robot Operating System (ROS).

A takeaway: news about robots often draws hand-wringing over automation costing people their jobs. Here is a project that, once complete, has the potential to use robotics to complement human capabilities instead.

Nancy Cohen is a contributing author. Want to get involved like Nancy and send your story to ZME Science? Check out our contact and contribute page.

