Delivering small but perceptible buzzes of electrical current to the fingertips can give surgeons an accurate perception of distance to contact when they use robotic arms for surgery, researchers report.
The insight enabled users to control robotic fingers precisely enough to land gently on fragile surfaces.
Steady hands and uninterrupted, sharp vision are critical when performing surgery on delicate structures like the brain or hair-thin blood vessels. While surgical cameras have improved what surgeons see during operative procedures, the "steady hand" still needs enhancement: new surgical technologies, including sophisticated surgeon-guided robotic hands, cannot prevent accidental injuries when operating close to fragile tissue.
The researchers say that the new technique might offer an effective way to help surgeons reduce inadvertent injuries during robot-assisted operative procedures.
“One of the challenges with robotic fingers is ensuring that they can be controlled precisely enough to softly land on biological tissue,” says Hangue Park, assistant professor in the electrical and computer engineering department at Texas A&M University.
“With our design, surgeons will be able to get an intuitive sense of how far their robotic fingers are from contact, information they can then use to touch fragile structures with just the right amount of force.”
Robot-assisted surgical systems, also known as telerobotic surgical systems, are physical extensions of a surgeon. By controlling robotic fingers with movements of their own fingers, surgeons can perform intricate procedures remotely, expanding the number of patients to whom they can provide medical attention. Also, the tiny size of the robotic fingers means that surgeries are possible with much smaller incisions, since surgeons need not make large cuts to accommodate their hands inside the patient's body during operations.
To move their robotic fingers precisely, surgeons rely on live video streamed from cameras fitted on the telerobotic arms. They look at monitors to match their finger movements with those of the telerobotic fingers, learning where the robotic fingers are in space and how close they are to each other.
However, Park notes that visual information alone is not enough to guide fine finger movements, which is critical when the fingers are in close vicinity of the brain or other delicate tissue.
“Surgeons can only know how far apart their actual fingers are from each other indirectly, that is, by looking at where their robotic fingers are relative to each other on a monitor,” Park says. “This roundabout view diminishes their sense of how far apart their actual fingers are from each other, which then affects how they control their robotic fingers.”
To address this problem, Park and his team came up with an alternate way to deliver distance information that is independent of visual feedback. By delivering electrical current pulses of varying frequency to the fingertips via gloves fitted with stimulation probes, the researchers trained users to associate the frequency of the pulses with distance: rising pulse frequencies signaled shrinking distance to a test object.
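To make the idea concrete, here is a minimal sketch in Python of one plausible distance-to-frequency mapping. It is not the authors' implementation: the linear mapping, the 20 mm working range, and the 10-100 Hz frequency band are all illustrative assumptions, chosen only to show how pulse frequency could rise as a fingertip nears contact.

    def distance_to_frequency(distance_mm: float,
                              max_distance_mm: float = 20.0,
                              min_freq_hz: float = 10.0,
                              max_freq_hz: float = 100.0) -> float:
        """Return a pulse frequency that rises as the fingertip nears contact.

        All ranges here are assumed for illustration, not values from the study.
        """
        # Clamp the sensed distance to the working range.
        d = min(max(distance_mm, 0.0), max_distance_mm)
        # Closer distance -> higher frequency (simple linear interpolation).
        closeness = 1.0 - d / max_distance_mm
        return min_freq_hz + closeness * (max_freq_hz - min_freq_hz)

    print(distance_to_frequency(15.0))  # far from contact: low frequency (32.5 Hz)
    print(distance_to_frequency(1.0))   # near contact: high frequency (95.5 Hz)

Any monotonic mapping would serve the same purpose; the key property is that the user can feel the frequency climb as the robotic finger approaches the surface.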
They then tested whether users receiving current stimulation along with visual information about closing distance on their monitors did better at estimating proximity than those who received visual information alone.
Park and his team also tailored their technology to each user's sensitivity to electrical current frequencies. In other words, if a user could perceive a wider range of current frequencies, the distance information was delivered in finer frequency steps, maximizing the accuracy of proximity estimation.
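The sketch below, continuing the earlier example, illustrates this per-user tailoring under stated assumptions: a user who can discriminate a wider band of frequencies gets more (and therefore finer) distance steps. The discrimination threshold ("jnd_hz", a just-noticeable difference), the per-user frequency bands, and the 20 mm range are hypothetical values, not figures from the paper.

    def build_frequency_steps(user_min_hz: float,
                              user_max_hz: float,
                              jnd_hz: float,
                              max_distance_mm: float = 20.0):
        """Quantize distance into as many steps as the user can reliably tell apart."""
        n_steps = max(1, int((user_max_hz - user_min_hz) / jnd_hz))
        step_mm = max_distance_mm / n_steps
        # Each distance bin maps to one discriminable pulse frequency.
        return [(i * step_mm, user_min_hz + i * jnd_hz) for i in range(n_steps)]

    # A more sensitive user (wider band, smaller jnd) gets finer distance resolution.
    print(len(build_frequency_steps(10.0, 100.0, jnd_hz=5.0)))   # 18 steps
    print(len(build_frequency_steps(20.0, 60.0, jnd_hz=10.0)))   # 4 steps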
The researchers found that users receiving electrical pulses were more aware of the proximity to underlying surfaces and could lower their force of contact by around 70%, performing much better than the other group. Overall, they observed that proximity information delivered through mild electric pulses was about three times more effective than the visual information alone.
Park says the new approach has the potential to significantly increase maneuverability during surgery while minimizing risks of unintended tissue damage. He also says their technique would add little to the existing mental load of surgeons during operative procedures.
“Our goal was to come up with a solution that would improve the accuracy in proximity estimation without increasing the burden of active thinking needed for this task,” he says.
“When our technique is ready for use in surgical settings, physicians will be able to intuitively know how far their robotic fingers are from underlying structures, which means that they can keep their active focus on optimizing the surgical outcome of their patients.”
The study appears in Scientific Reports.
Other contributors to the research are from Texas A&M University and Ewha Womans University in South Korea.
Source: Texas A&M University