We’ve all been seeing automation take over more and more tasks and occupations over our lifetimes. But now, a new paper describes the development of a robot that can mimic the human movements of a graffiti artist.
Graduate students at the Georgia Institute of Technology have harnessed motion capture technology to record a human artist's painting motions and program a robot to reproduce them. Christened GTGraffiti, the system can mimic an artist's movements to repaint their works.
Although robots are already well-established in applications such as manufacturing, biomedicine, automobiles, construction, agriculture, and the military, the GTGraffiti system showcases what they can do in the world of art, a domain traditionally considered exclusively human.
“The arts, especially painting or dancing, exemplify some of the most complex and nuanced motions humans can make,” says robotics Ph.D. student Gerry Chen, who devised the project.
“So if we want to create robots that can do the highly technical things that humans do, then creating robots that can dance or paint are great goals to shoot for. These are the types of skills that demonstrate the extraordinary capabilities of robots and can also be applied to a variety of other applications.”
To create the robot, the team started with motion capture technology, recording human artists while they painted. This data let the researchers observe the types of motions needed to produce spray-painted art. Two human artists took part in this step; they were asked to paint the alphabet in a 'bubble letter' graffiti style. As they painted, the motions of their hands across the canvas were recorded, along with the movements of the spray cans themselves.
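To give a sense of what this processing step involves, here is a toy sketch of how finite differences can recover stroke speed from a stream of timestamped hand positions. The 120 Hz sample rate and the synthetic trace are assumptions for illustration, not data from the paper.

```python
# Illustrative only: motion capture yields timestamped (time, x, y) samples;
# finite differences between consecutive samples recover the hand's speed.
# A synthetic trace moving at a constant 0.3 m/s along x, sampled at 120 Hz.
samples = [(t / 120.0, 0.3 * t / 120.0, 0.1) for t in range(5)]

def speeds(samples):
    """Approximate speed between each pair of consecutive samples."""
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        out.append(dist / (t1 - t0))
    return out

print(speeds(samples))  # roughly 0.3 m/s for every interval in this toy trace
```

Differencing the speeds again in the same way would yield accelerations, the other quantity the team analyzed.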
Afterward, this data was processed to analyze the speed, acceleration, and amplitude of each individual motion. These figures told the team what the robot would need to be capable of, guiding the design of its physical body. Portability and accuracy were also taken into account at this stage. In the end, the team settled on a cable-driven design. Cable-driven robots employ a system of motors, pulleys, and cables to perform motions; they already see wide use in a variety of fields and are easily scalable.
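The appeal of a cable-driven design is that its inverse kinematics are simple: to place the spray can at a point, each cable just needs to be paid out to the straight-line distance from its anchor. The sketch below assumes a hypothetical planar frame with four corner-mounted motors; the dimensions are made up for illustration.

```python
import math

# Hypothetical 2D cable-driven painter: four motors at the corners of a
# 4 m x 3 m frame reel cables in and out to position the spray can.
ANCHORS = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]  # metres

def cable_lengths(x, y):
    """Inverse kinematics: required cable length from each anchor
    to reach the end effector at (x, y)."""
    return [math.hypot(x - ax, y - ay) for ax, ay in ANCHORS]

# At the centre of the frame, every cable spans half the diagonal (2.5 m).
print(cable_lengths(2.0, 1.5))
```

Scaling such a robot up is largely a matter of moving the anchors and lengthening the cables, which is part of why the design suits large painted surfaces.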
The third step involved translating the artists’ compositions into electrical signals that the robots could read. These were collated together into a digital library of characters that can be altered in size and perspective, or combined to produce words for the robot to paint. For the study, the robot was tasked with painting its own artist ‘tag’: “ASL”.
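A character library of this kind can be imagined as a dictionary of strokes that get scaled and shifted to spell out a word. The letter shapes below are crude placeholders, not the paper's actual bubble-letter data, and the composition function is a simplified sketch.

```python
# Toy character library: each letter is a list of strokes, each stroke a
# polyline of (x, y) points inside a 1x1 unit box. Shapes are placeholders.
LIBRARY = {
    "A": [[(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)], [(0.25, 0.5), (0.75, 0.5)]],
    "S": [[(1.0, 1.0), (0.0, 0.75), (1.0, 0.25), (0.0, 0.0)]],
    "L": [[(0.0, 1.0), (0.0, 0.0), (1.0, 0.0)]],
}

def compose_word(word, scale=1.0, spacing=0.2):
    """Scale each letter and shift it along x to spell out the word."""
    strokes, x_offset = [], 0.0
    for ch in word:
        for stroke in LIBRARY[ch]:
            strokes.append([(x_offset + x * scale, y * scale) for x, y in stroke])
        x_offset += scale * (1.0 + spacing)
    return strokes

plan = compose_word("ASL", scale=0.5)  # strokes for the robot's own tag
```

The resulting list of scaled strokes is what would then be converted into the motor commands the robot executes.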
Once the software has charted the exact motions needed to paint an artwork, there is still a risk: competing movements between the motors could damage the robot's mechanisms. To prevent this, the team programmed the central computer to recalculate motor commands 1,000 times per second, ensuring the robot cannot harm itself.
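The core of that safety idea can be sketched as a fast loop that clamps each commanded cable tension into a safe band, so the cables stay taut without motors pulling destructively against one another. The update rate matches the article's 1,000 Hz figure, but the tension limits and clamping rule are illustrative assumptions, not the paper's actual controller.

```python
# Sketch of the safety recalculation: at 1,000 Hz, re-derive motor commands
# and clamp cable tensions into a safe band so competing motors can never
# pull hard enough to damage the mechanism. Limits are made-up examples.
DT = 0.001                 # 1,000 updates per second
T_MIN, T_MAX = 5.0, 60.0   # newtons: cables must stay taut but not overloaded

def clamp_tensions(desired):
    """Clamp each requested cable tension into [T_MIN, T_MAX]."""
    return [min(max(t, T_MIN), T_MAX) for t in desired]

# A naive command might request a slack cable (0 N) and an overload (80 N);
# the clamp keeps both within safe bounds.
print(clamp_tensions([0.0, 80.0, 20.0, 20.0]))  # [5.0, 60.0, 20.0, 20.0]
```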
Chen says his work is motivated by his personal hope that people will come to see robots as helpful to humanity rather than as job-stealers or outright threats, as they are often depicted in movies. In the future, he hopes to use GTGraffiti to study artists painting graffiti in the wild, and to use that data to reproduce an artwork if it were ever painted over or destroyed.
“Graffiti is an art form that is inherently meant to be seen by the masses,” Chen said. “In that respect, I feel hopeful that we can use graffiti to communicate this idea—that robots working together with humans can make positive contributions to society. The robot is not generating the art itself, but rather working together with the human artist to enable them to achieve more than they could without the robot.”
“We hope that our research can help artists compose artwork that, executed by a superhuman robot, communicates messages more powerfully than any piece they could have physically painted themselves,” said Chen.
The paper “GTGraffiti: Spray painting graffiti art from human painting motions with a cable driven parallel robot” has been published on the preprint server arXiv.