
For over two decades, surgical robots have been an extra pair of steady hands. They help human doctors stitch, cut, and probe with superhuman precision, but always under the surgeon’s command.
Now, technology has advanced far enough that robots can perform surgery on their own.
Researchers from Johns Hopkins, Stanford, and Columbia universities have unveiled a robot that doesn’t just follow orders. It thinks, adjusts, and cuts, all on its own.
“We were able to perform a surgical procedure with a really high level of autonomy,” Axel Krieger, assistant professor of mechanical engineering at Johns Hopkins, told the Guardian. “What we’ve done here is really a full procedure.”
The team’s creation, called SRT-H, removed gallbladders from pig cadavers in eight separate procedures. The robot completed each one with 100% accuracy—no human intervention required. It identified arteries and ducts, clipped them carefully, and snipped them free with surgical scissors.
If that sounds like science fiction, it isn’t. It’s just today’s cutting edge of surgical AI.
Not Just a Task—A Procedure
Autonomous robots have tied sutures and completed needlework in the past. But those were narrow tasks—think tying a knot rather than sewing a shirt.
This time, the robot tackled a full phase of a cholecystectomy, the surgical removal of the gallbladder. It’s one of the most common minimally invasive procedures in the world, performed more than 700,000 times a year in the U.S. alone.
Each operation demands intricate maneuvering in a tight space. Surgeons must identify the cystic duct and artery, clip both in specific places, then cut between the clips to remove the organ. Botched steps can spill bile or damage surrounding tissues.
The SRT-H system rose to the challenge by using a combination of language-conditioned imitation learning and a two-layer AI brain. A high-level “planner” issued task-level instructions in natural language. A low-level controller translated those instructions into physical movements.
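The division of labor between the two layers can be pictured as a simple loop: the planner names the next step in plain language, and the controller turns that instruction into motion. The following is a minimal Python sketch of that idea only; every function, step name, and data structure here is hypothetical and not taken from the SRT-H system.

```python
# Hypothetical sketch of a two-layer "planner + controller" loop.
# Nothing here reflects the actual SRT-H codebase.

def high_level_planner(observation, step_index, plan):
    """Issue the next task-level instruction in natural language."""
    return plan[step_index]

def low_level_controller(instruction, observation):
    """Translate an instruction into a (simulated) motor command."""
    return {"instruction": instruction, "action": f"execute: {instruction}"}

def run_procedure(plan):
    log = []
    observation = "endoscope frame"  # stand-in for camera input
    for step in range(len(plan)):
        instruction = high_level_planner(observation, step, plan)
        command = low_level_controller(instruction, observation)
        log.append(command["action"])
    return log

steps = [
    "grab gallbladder",
    "clip first clip left tube",
    "cut between clips",
]
print(run_procedure(steps))
```

The point of the split is that each layer can be trained and debugged separately: the planner reasons about *what* to do next, while the controller only ever has to solve the narrower problem of *how* to do one named step.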
“This advancement moves us from robots that can execute specific surgical tasks to ones that truly understand surgical procedures,” Krieger told Reuters.
In every one of the eight trials, SRT-H completed all 17 surgical steps without error. That included grabbing, clipping, cutting, switching tools, and self-correcting (an average of six times per procedure).
How It Learned to Cut
The robot’s medical education began in a database. Researchers fed SRT-H over 16,000 surgical demonstrations, compiled from dozens of pig gallbladders. These included both optimal actions and recoveries from mistakes.
The robot watched human surgeons at work, so to speak: cameras mounted on the robotic wrists and an overhead endoscope captured close-up footage of the tools interacting with tissue. Unlike earlier systems built around rigid, pre-programmed motions, SRT-H learned from what it saw.
From there, the high-level policy issued instructions like “clip third clip right tube,” while a low-level transformer mapped each instruction to a series of hand movements. The robot knew when it needed a new clip. It even asked for a tool change.
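In spirit, language-conditioned imitation learning pairs an instruction and an observation with the expert action recorded at that moment, and at run time recalls the action whose context best matches the current one. Here is a toy pure-Python sketch of that matching idea; the instructions, observations, and actions are invented for illustration and bear no relation to the real SRT-H training data.

```python
# Toy language-conditioned imitation learner: given demonstrations of
# (instruction, observation, action), return the action whose recorded
# observation is closest to the current one. Purely illustrative.

demonstrations = [
    ("clip third clip right tube", (0.2, 0.7), "close clip applier"),
    ("cut between clips", (0.5, 0.5), "actuate scissors"),
    ("grab gallbladder", (0.8, 0.1), "close grasper"),
]

def imitate(instruction, observation):
    """Pick the demonstrated action for this instruction, preferring
    the demonstration recorded in the most similar situation."""
    candidates = [d for d in demonstrations if d[0] == instruction]
    if not candidates:
        return "request human takeover"  # fall back on unseen instructions
    best = min(
        candidates,
        key=lambda d: sum((a - b) ** 2 for a, b in zip(d[1], observation)),
    )
    return best[2]

print(imitate("cut between clips", (0.45, 0.55)))   # → actuate scissors
print(imitate("suture the incision", (0.0, 0.0)))   # → request human takeover
```

The real system replaces this lookup with a trained transformer, but the interface is the same: language in, motion out, with a fallback when the situation falls outside what the demonstrations covered.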
The result was a smooth, measured performance.
Compared to human surgeons, SRT-H was slower but more precise. Its motions were less jerky. Its instrument paths were shorter. In direct comparisons, surgeons completed tasks faster, but the robot delivered neater movements.
Where Does It Fit in the Operating Room?
Today’s surgical robots, like the widely used da Vinci teleoperated system, are completely controlled by humans. They act as extensions of the surgeon’s hands, offering more precision but no independence. Such systems are handy when, say, a patient in New York needs a delicate procedure that only a specialist in Sydney can perform.
That’s changing.
SRT-H represents a jump in what engineers call the “level of autonomy” (LoA). While da Vinci sits at LoA 0 (no autonomy), SRT-H operates at LoA 4: high autonomy with human supervision. Full autonomy, LoA 5, would mean handling everything from incision to closure without human input.
The authors of the study see their robot as a milestone.
“Our work demonstrates a greater step forward,” they wrote, “requiring no special fixtures, tracking markers, or specialized surgical devices.”
The Royal College of Surgeons of England called the advance “an exciting development that shows great promise.” Nuha Yassin, who leads on robotic surgery at the College, added a note of caution:
“The next step must involve a careful exploration of the nuances… to assess how these findings can be safely and effectively translated into a human pilot.”
John McGrath, chair of NHS England’s robotics steering committee, agreed the results are “impressive” and “novel,” but said true clinical deployment is still years away.

Promise and Practical Limits
For now, SRT-H has only operated on organs removed from dead pigs. That sidesteps the unpredictable elements of live surgery: bleeding, breathing, unexpected anatomy, and the fogging of camera lenses. The team acknowledges this.
“Translating from ex-vivo to in-vivo brings several challenges,” they wrote in their paper. “Further studies are required to confirm this.”
There are hardware limitations, too. The robot used bulky wrist cameras that wouldn’t currently fit through laparoscopic ports. And while SRT-H was capable of asking for tool changes, it still relied on a human to reload clips and change instruments.
The team is exploring ways to make the robot safer. That includes adding uncertainty estimates to flag when it might be confused, and offering options for surgeons to step in mid-procedure. SRT-H can already be overridden via voice or touchscreen commands.
One day, autonomous surgical systems could operate under a doctor’s distant supervision, or even perform surgeries in remote or underserved areas, replicating top surgical skill at scale across the globe.
The findings appeared in Science Robotics.