The following insight came from my podcast interview with Ryan Ribeira, where he talked about how virtual reality is changing the healthcare industry.
Whether you are trained to be a doctor, a nurse, or an EMT, there are generally three phases to your training. There’s the didactic phase, there’s the skills building, and then there’s a simulation where you’re bringing it all together, practicing your skills in simulated encounters.
Generally speaking, in medical training now, that’s done with either a mannequin or a standardized patient, which is a trained actor. What SimX does is, instead of using that mannequin or a trained actor, you put on a headset, and you’ve got a virtual patient in front of you. So it’s a way for you to get reps in, especially for those medical conditions that are important to know how to treat but that you’re not going to see very often in real life. So it helps doctors, nurses, and EMTs be prepared for those kinds of circumstances when they come.
It’s much more immersive that way and more cost-efficient. If you’re using a mannequin, for example, they can’t have a missing limb, they can’t even have rashes, they can’t have neurologic symptoms. Even a trained actor can’t do that very well either. But you can do all of that in VR.
Also, the environmental complexity you can introduce adds a ton of value, because it doesn’t have to be an ED; we have cases in moving transport helicopters, or where you’re pulling someone out of a burning car and resuscitating them on the ground. And then, even beyond that, the ability to add psychosocial elements is powerful. We have cases where you’re, you know, resuscitating a child, and their parents are there, and they’re crying, and you have to explain to them what’s happening while you resuscitate the baby. Cases like those are what make real medicine hard. It’s not just knowing what to do. It’s about managing some of that psychosocial complexity while you’re doing what you need to do for the patient. And that’s hard to incorporate into traditional simulation methods but easy to do in VR.
My academic background was in Quality Improvement and Patient Safety. I worked for CMS, the Centers for Medicare and Medicaid Services, building packages of metrics for regulatory programs to try to make care safer. As I went through that process, I realized these are essential programs, but they improve safety by 5% or 10%, not the 80% we need to get to eventually.
We have to look at examples like the airline industry, which successfully made flying across the country safer than driving down the street. And the way they did that is through simulation: a high volume of high-quality simulation. Both of those elements are important. They do it a lot, and the simulators are a one-to-one match with the environment pilots will actually work in. So it was pretty clear to me as I was going through my training that we did not have that in healthcare. We have a similar problem to solve: all these low-frequency, high-consequence events that we need to prepare for, just like pilots do. But our simulation tools are so limited; they’re unrealistic and logistically hard to use.
A mannequin is a 150-pound robot that you have to keep in a specialized room. We were never going to get to the realism or the volume we needed if we were going to make care 75-80% safer. I was very familiar with upcoming VR tech in 2012. Back then, you couldn’t get a development kit for any VR headset yet, but we knew it was coming, and this would be a good use case. It was something that was going to make simulation a lot more accessible and a lot more realistic.
People in medical education circles have been writing about virtual reality simulation as the future of healthcare training since the 90s. That literature significantly influenced me. I was training and hearing about how this was going to be the future.
We were the first ones to build VR products for this use case. There was a group of cutting-edge academicians for whom this was going to be a familiar topic. Our task was not to convince them that VR is useful, but that the technology was mature enough to use. A lot of that is done through hands-on experience. We found great early success just by going to educational conferences and trade shows and putting it on people’s heads. They’d come out of the headset all smiles, realizing that the technology was finally ready for primetime. So that was very helpful.
Frankly, we didn’t do any early marketing. Some of that was the limitation of a scrappy, bootstrapped startup and the lack of resources. But we still had a lot of big-name institutions find us on the internet and reach out. As expected, the Mayo Clinic, the University of Pennsylvania, and Northwestern were the academic centers that moved early in this space. So we built our business model around the assumption that that’s how it would be.
What we did is we marketed this as a platform, and we would make custom cases for people. We would work with the education experts at these institutions to build the actual scenario content. But they signed a contract saying that we could then take those cases and resell them to other institutions. So it’s a win-win: we’re able to attach some of these early customers’ names to some of the first significant VR curricula out there, and follow-on customers, often from smaller, less-resourced institutions, get access to really high-quality scenarios built by simulation experts at these big-name institutions for a meager cost. So that’s how we went about things, and how we still go about things, slowly building up the curriculum. We’ve got about 140 cases in the system now.