Then, they suit up for data collection. Each surgeon dons a special lab coat that holds a variety of wired sensors. Three motion sensors — so fine they fit under surgical gloves — poke out of the sleeves and are secured to the thumbs, index fingers and wrists with a piece of tape. Finally, Pugh sets up audio and video recordings, which run as the surgeon operates. The integrated approach to data collection not only shows how the surgeons’ hands move, but also how they talk through tricky parts of a procedure and how their brain waves spike or dip.
The Proof Is in the Data Pattern
So far, the idea of surgical wearables has been met with mixed reactions, Pugh said. Mostly, there’s a sense of excitement and an eagerness to participate, she said. But there’s concern too — namely, that they would be used to unfairly judge a surgeon’s skills during a difficult procedure. It’s true that the wearables could be used to one day test surgical skill, but to Pugh, it would be a mistake to limit the data to that purpose.
“To me, collecting surgical data is less about evaluating the skill of a surgeon and far more about quantifying what it took to take care of a specific patient,” Pugh said.
She gives an example: Patients in intensive care units often need a central line, a type of IV that can withdraw fluid or deliver medicine. But inserting a central line into the vein of a frail 90-year-old patient is very different from doing so in a morbidly obese patient, or in a patient who has already had multiple lines placed during previous care. "We all know the difference as practiced physicians, but there's no data to show it," Pugh said. "We walk around with more detailed data about our bank accounts than about how we perform clinical procedures, which are 10 times more complex."
Pugh and her team are still just getting off the starting blocks, but the data they’ve collected — through early pilot studies and at a handful of medical and surgical conferences — have already started to yield intriguing insights through data patterns.
Instead of parsing every dataset of a surgery, Pugh and her team look for overarching trends. The motion-tracking sensors feed visual data back to a computer, allowing the researchers to see movement patterns of a surgeon’s hands, including where they pause and where they spend more time.
"People would ask me, 'Why would you want to measure surgical technique? Everyone operates so differently.' But our data essentially shows the opposite. Whether surgeons use different instruments or add their own finesse to a procedure doesn't really matter," Pugh said. The overall movement patterns generated at the end are very similar, so long as there aren't complications, such as abnormal patient anatomy or the rare surgical error.
Such data patterns can show where surgeries hit a snag. In one procedure, for instance, surgeons who operate without complications produce a movement pattern that, at the end, looks roughly like the body and wings of a butterfly. Those who run into trouble might produce a pattern with lopsided wings, or one with two bodies. "The motion sensors that track that surgeon's fingers and hands produce a very visual result," Pugh said. "And what's even more interesting to see is that there doesn't seem to be a correlation with instrument choice or whether the surgeon switched step 5 for step 6; it's the patient's anatomy that most accurately correlates to the end pattern."
Big (Data) Dreams
The intertwining data streams from various wearables on the surgeon’s body can reveal quite a bit about the procedure and the patient on the table, but more than that, Pugh and her colleagues see it as a data-first approach to teaching, learning and improvement.
“The innovative research led by Dr. Pugh’s team will provide incredible data-informed insights into surgeon efficiency of motion, tactile pressure and cognitive load while performing a variety of medical and surgical tasks,” said Mary Hawn, MD, professor and chair of surgery. “These types of data could be used to identify when a surgeon has mastered a procedure and when there may be a deficit.”
Some of the wearable applications are still a ways off, Pugh said, as the technology is now only used for procedures on mannequins and tissue bits. But there is one wearable Pugh has tested in the operating room: the EEG sensor.
During two surgeries, a gallbladder removal and an appendectomy, Pugh volunteered to stick the brain-wave-reading sensor onto her own forehead. "First we just need to verify that it works in the OR and that the data comes in successfully," Pugh said. And, so far, it does. Through the EEG data, Pugh's team could see that the peaks of her brain waves while operating corresponded with the most trying moments of the surgery, while lower-level activity synchronized with menial surgical tasks, like suturing.
After a successful surgery, Pugh closed the patient and left the OR, forgetting to remove the long strip on her forehead. “My colleagues who are aware of my research saw the EEG sensor and immediately knew what I had been doing,” she said. Now, Pugh’s getting peppered with the same question: When can others test out the technology?
“This is an entirely new data endeavor; we’re learning in real time how best to propel this work, analyze the data and fast-track it in a safe way so that other surgeons can begin to use it in their ORs, too,” Pugh said. “Right now, it’s just me who’s tested it during surgery, but my big dream is to have this be routine. I can’t tell you all the ways the data will be used, but it will definitely improve the care we provide.”