When Augmented Reality Became Reality
25 March 2021
“We are living in one of the most exciting times in the history of surgery and medicine,” says Daniel Sciubba, who with his team accomplished the world’s first augmented reality (AR)-supported spine surgeries earlier this year.
“What inspires me most is innovation.” The answer comes without hesitation when Sciubba talks about inspiration. “Whether it happens with surgery, research, teaching, or raising children—seeing things that can be done better than they are being done now.”
Tell us about your idea to develop an AR system platform for spine surgery.
Some years ago, I had the opportunity to become involved in the development of AR for spine surgery. This was extremely inspiring. The question was, to put it simply: how do we combine the natural way we operate every day, looking directly at the patient, with the benefits of navigation, in a less bulky way?
Augmented reality is different from virtual reality. Virtual reality is as if you closed your eyes and were surrounded by a completely new environment. You don't see the world around you; you see an environment someone created.
Augmented reality is different: you see the world around you, but with things added to it at a specific time and place. So, if you were driving down the street and had to turn left, an arrow would appear on the street, maybe on the ground, to guide you. Imagine seeing the sign on the street and not on your car's navigation screen—that would be augmented reality. Our idea was to find out whether we could create something like that for spine surgery.
In what kind of surgery was the AR system used and why was this specific surgery chosen?
The first surgery was an open lumbar fusion case. We wanted to show a case that went smoothly from beginning to end. We knew that whether or not the augmented reality provided us with what we thought it could, the surgery would go fine.
The second surgery was much more advanced: an en bloc resection of a chordoma, in which we used the navigation not only to place screws but also to plan our osteotomies around the tumor, so that we could be as accurate and safe as possible, getting the tumor out with good oncological margins while minimizing damage to the local tissues.
One can see from the very first case, which was an open lumbar fusion, to the more advanced tumor and MIS operations, that the platform is extremely robust, and I think has saved us time and energy. It is adding value to these very challenging cases.
How was the patient chosen? Did it require any special arrangements?
The augmented reality system went through FDA clearance based on a predicate of prior navigation. When talking to patients about AR, we used that same protocol. In other words, we told patients, "We use navigation at times in the operating room, and this is no different." But we still wanted to educate them that this was a new technology when we first used it.
The patients we chose were extremely excited about the opportunity to be involved and saw that this might benefit their surgery. They also knew, as we told them, that if the technology did not work, it would not affect their outcome in any way. I think this is part of the partnership we had, not only with the government but also with the patients themselves, to make this happen. Everyone has been on the same page with the partnership to innovate.
The AR system consists of the following:
- Wireless headset – Think of the headset as the wand or probe used in conventional navigation; wearing it is basically like wearing a headlight.
- Wireless foot pedal – This is for changing the view on your augmented display. If you want to see nothing, just look through the headset and you will see the patient as you normally would. If you press a button on the pedal, or configure it differently, you can add in different components such as sagittal navigation, axial navigation, et cetera, or you can have a complete topographic 3D reconstruction of the spine overlaid on the spine in the operating room.
- Patient reference frame – A clamp placed on the patient's spine, already in place when the patient comes into the operating room.
- Unlike other navigation systems, no external camera is needed; the cameras and the navigation points are inside the headset.
You can quickly and easily turn the AR on and off or layer in the amount of augmented reality that you would like to add to your augmented world while operating.
It tracks where your head is in space so that, as you move your head around looking at the patient, you can see through the augmented reality the spine, or at least a 3D reconstruction of the spine, in real time.
Was there any backup used during the surgery?
In the first few patients, we did use a backup: we had prepared to do the surgery without augmented reality and without any navigation at all.
In other words, we placed screws freehand and corroborated the placement of our hardware using intraoperative imaging, with x-rays or even intraoperative CT scans.
Without the AR platform, it would have been business as usual. So, as a backup, we had our normal standard. We felt very confident that, if there were any problems with the system, the patient would be protected, because we would revert to the normal safe operations that we have been doing for decades.
Dr. Daniel Michael Sciubba is professor of neurological surgery, oncology, and orthopaedic surgery at the Johns Hopkins University School of Medicine. He serves as director of spine tumor and spinal deformity research in the Department of Neurosurgery.
Sciubba specializes in the surgical treatment of complex spinal conditions including tumors, degenerative spine diseases, spinal deformities, and scoliosis, employing minimally invasive techniques when possible.
He looks to create and capitalize on innovative opportunities at the crossroads of healthcare, medical sciences, and business. His goal is to make transformative improvements to the current care paradigms for such challenging conditions while still treating patients with precise individualized care.
Sciubba is the AO Spine North America Research Committee Chairperson, a valued member of the AO Spine Knowledge Forum, and has participated in several AO Spine studies.
How do you think AR will shape the future of spine surgeons' work?
The most basic thing it will give us is an easier, faster, and more accurate operation, in a way that is simpler and more intuitive than any navigation system to date.
It's going to be much more natural for a surgeon to be looking at the patient rather than a monitor when navigating. It is going to be much more natural for the surgeon to put their hands on the patient, rather than having a robot in between.
There are so many times in the operating room where we don't see what we want to see and we have to assume, or we have to use our judgment. For example, making maneuvers around structures that are too delicate because we can't see around them, or manipulating things without actually seeing through all the bone, the vessels, or the spinal canal, and knowing where our instrument is and where it's going to be.
If we can see those things, we can be less invasive. If we can be less invasive, we can do surgeries that we've never done before in a minimally invasive manner. AR is going to become more robust and more refined; we are going to be able to see things and do things that we found too difficult to do in the past.
What new technologies do you think could be launched in the next few years?
There will be novel innovations that we've not yet thought of, but which will have an even greater impact by combining technologies we already have.
Currently, predictive analytics means taking pictures of patients in the clinic or the operating room and running a series of analyses to try to figure out angles, failures, complications, and outcomes in a way that the average human cannot compute. So, rather than augmented reality just showing me pictures and navigating, it's giving me guidance or judgment about what I should be doing when I'm looking at the incision. AR is giving me red flags of areas to avoid. It's giving me green target areas that I should address by showing opportunity based on big data and machine learning data.
I can imagine a time in the future when the benefits of robotics combined with augmented reality and predictive analytics will revolutionize the way surgeries are done. Surgeons may not interact with patients in the operating room in the old-fashioned way, but will use real-time input from machines, from computer analysis, and from improved imaging to gain insight into the best way a surgery should be done.
But even more inspirational and aspirational is that you should look for what you can provide as you see different technologies working together. It is through our combined experience that we can innovate together, something that the AO has laid its foundation on for decades. If we continue trying new things together, look at them critically, and continue to advance, we're helping each other, our fellow man, and all of our patients simultaneously.