Experimental surgery performed by AI-driven surgical robot

Intuitive Surgical, an American medical-device company, introduced the da Vinci surgical robot in the late 1990s, and it became groundbreaking teleoperation equipment. Expert surgeons could operate on patients remotely, manipulating the robotic arms and their surgical tools based on a video feed from the da Vinci's built-in cameras and endoscopes.

Now, Johns Hopkins University researchers have put a ChatGPT-like AI in charge of a da Vinci robot and trained it to perform gallbladder-removal surgery.

KUKA surgeries

The idea of putting a computer behind the wheel of a surgical robot is not entirely new, but earlier attempts mostly relied on pre-programmed actions. "The program told the robot exactly how to move and what to do. It worked like in these KUKA robotic arms, welding cars on factory floors," says Ji Woong Kim, a robotics researcher who led the study on autonomous surgery. To improve on that, a team led by Axel Krieger, an assistant professor of mechanical engineering at Johns Hopkins University, built STAR, the Smart Tissue Autonomous Robot. In 2022, it successfully performed surgery on a live pig.

But even STAR couldn't operate without specially marked tissues and a predetermined plan. Its key advance was that its AI could adjust this plan based on the feed from cameras.

The new robot can do considerably more. "Our current work is much more flexible," Kim says. "It is an AI that learns from demonstrations." The new system, called SRT-H (Surgical Robot Transformer), was developed by Kim and his colleagues, Krieger adds.

The first change they made was to the hardware. Instead of using a custom robot like STAR, the new work relied on the da Vinci robot, which has become a de facto industry standard in teleoperated surgery, with over 10,000 units already deployed in hospitals worldwide. The second change was the software driving the system. It relies on two transformer models, the same architecture that powers ChatGPT. One is a high-level policy module, responsible for task planning and ensuring the procedure goes smoothly. The low-level module executes the tasks issued by the high-level module, translating its instructions into specific trajectories for the robotic arms.
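The two-level design described above can be sketched in code. This is a minimal illustration of the hierarchy, not the researchers' implementation: the class names, the subtask list, and the placeholder waypoints are all assumptions made for the example, and the real SRT-H models are trained transformers conditioning on video, not hand-written rules.

```python
# Illustrative sketch of a hierarchical surgical policy: a high-level
# planner emits natural-language subtasks, and a low-level controller
# turns each subtask into arm trajectories. Names are hypothetical.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Observation:
    """Stand-in for the endoscope image and robot state the models would see."""
    step: int

class HighLevelPolicy:
    """Plans the procedure as a sequence of language instructions."""
    SUBTASKS = [
        "grab the gallbladder",
        "clip the cystic duct",
        "cut the cystic duct",
        "detach the gallbladder",
    ]

    def next_instruction(self, obs: Observation) -> str:
        # A real transformer would condition on the camera feed and could
        # reorder or repeat steps; here we simply walk the plan.
        return self.SUBTASKS[min(obs.step, len(self.SUBTASKS) - 1)]

class LowLevelPolicy:
    """Translates one instruction into waypoints for the robotic arms."""

    def trajectory(self, instruction: str, obs: Observation) -> List[Tuple[float, float, float]]:
        # Placeholder (x, y, z) waypoints; the real model regresses
        # trajectories from images plus the instruction text.
        return [(0.0, 0.0, float(i)) for i in range(3)]

def run_procedure(steps: int) -> List[str]:
    """Drive the two modules for a fixed number of steps, logging each one."""
    high, low = HighLevelPolicy(), LowLevelPolicy()
    log = []
    for t in range(steps):
        obs = Observation(step=t)
        instruction = high.next_instruction(obs)       # task planning
        waypoints = low.trajectory(instruction, obs)   # execution
        log.append(f"{instruction}: {len(waypoints)} waypoints")
    return log
```

The separation matters: the high-level module reasons about *what* to do next in language, while the low-level module handles *how*, which is the same division of labor the article attributes to SRT-H.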



