The Edward F. and Barbara A. Bell Endowed Chair at Cleveland Clinic discussed a new virtual reality tool that quantifies prodromal changes in activities of daily living in Parkinson disease without inducing the motion sickness that has limited earlier VR systems.
Before an official diagnosis of Parkinson disease (PD), several signs and symptoms often precede the disease and have a major impact on quality of life. Previous research has identified hyposmia, constipation, mood disorders, and REM sleep behavior disorder, among others, as well-characterized prodromal symptoms. Early detection and treatment of these prodromal symptoms are essential for high-quality care, as there is no specific test to confirm a diagnosis of the condition.
Previous research has suggested that a decline in the performance of instrumental activities of daily living (IADLs) may serve as a prodromal marker of neurological disease; however, existing clinical and performance-based IADL assessments are not feasible to integrate into routine clinical practice. To address this, researchers at Cleveland Clinic developed a virtual reality (VR) experience to determine whether neurodegenerative diseases, such as PD, can be identified before symptoms start.
In the study, led by Jay Alberts, PhD, patients wore a VR headset and walked on an omnidirectional treadmill, navigating through a virtual grocery store to complete simple and complex tasks. The complex experience had additional scenarios that increased the cognitive and motor demands of the tasks to better represent the continuum of activities associated with real-world shopping.
Alberts, the Edward F. and Barbara A. Bell Endowed Chair at the Cleveland Clinic Lerner Research Institute, believes there is significant opportunity for VR in the medical field, including for patients with neurodegenerative disorders. In a new iteration of NeuroVoices, Alberts discussed the build of this VR platform, how it exposes and quantifies errors in IADLs, and how it differs from prior VR technologies that failed to take hold. He also provided insight on the biggest take-home points clinicians should be aware of from the research, including significant difficulties with multitasking and cognitive function.
Jay Alberts, PhD: Virtual reality, different than augmented reality, is very immersive. You are immersed in the environment, and that's a huge advantage. It's a great technology. In fact, in 2008, the National Academy of Engineering issued these grand challenges, one of which was to integrate virtual reality into medicine. We have not delivered on that grand challenge yet because it's used for teaching here and there and some other things, but fundamentally, we haven't leveraged it very well. The reason is that people get sick. There's this nauseous feeling because there's a disconnect between visual information and somatosensory information. That's what they call the locomotion problem in the VR world. That has really been a barrier. And so, that was what we were trying to focus on: to overcome that barrier with this platform.
We worked with Infinadeck, the omnidirectional treadmill company, and Allen Park Lab, to build the virtual reality shopping task. From a technology perspective, the omnidirectional treadmill allows someone to walk in any direction. Different than a standard treadmill, where you have linear motion only, here you can make a turn, go backwards, everything. You can walk in circles if you want. The way it works is that it's basically a treadmill on 1000 treadmills; it has a linear component as well as a rotational component. You put a little puck on the back of someone's waist, and a system monitors the position of that puck. The treadmill then moves in such a way that it is always trying to keep the puck in the middle of the treadmill. If you move this way, the treadmill is going to move that way at an angle, so it's using both the linear and rotational components at the same time.
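The puck-centering behavior Alberts describes is, at its core, a feedback control loop. The sketch below is purely illustrative and assumes a simple proportional controller; it is not Infinadeck's actual control code, and the gain and speed-cap values are invented for the example.

```python
import math

# Illustrative sketch (NOT Infinadeck's implementation): a proportional
# controller that commands belt velocity so the tracked waist "puck"
# stays near the center of the treadmill platform.

def center_puck(puck_x, puck_y, gain=1.5, max_speed=2.0):
    """Return (vx, vy) belt velocities that pull the puck back toward (0, 0).

    puck_x, puck_y: puck offset from platform center, in meters.
    gain: proportional gain (assumed value, for illustration only).
    max_speed: belt speed cap in m/s (also assumed).
    """
    # Command velocity opposite to the puck's displacement,
    # so the belt carries the walker back toward center.
    vx = -gain * puck_x
    vy = -gain * puck_y

    # Cap the commanded speed for safety.
    speed = math.hypot(vx, vy)
    if speed > max_speed:
        scale = max_speed / speed
        vx *= scale
        vy *= scale
    return vx, vy

# If the walker steps 0.4 m forward (positive y), the belt pulls back:
vx, vy = center_puck(0.0, 0.4)
```

A real system would combine this kind of planar correction with the rotational component Alberts mentions, plus smoothing so belt accelerations feel natural underfoot.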
The exciting part of that is that we think this can eliminate that locomotion problem and decrease or eliminate nausea and sickness. In fact, that's what we see. We've tested older adults, patients with Parkinson disease, and younger adults as well. Generally, they have very little VR sickness, which is super encouraging. That was our first step, and then we started to build this virtual reality shopping task. When you think about Parkinson disease, and even other neurological diseases, the things that cause patients problems are instrumental activities of daily living. These are the things that allow someone to be independent: can you drive? Can you go to the store and get groceries? Can you prepare your food? Can you do mail sorting, bill paying, and things like that? If you look at those types of tasks, they have a strong cognitive component and a strong motor component. It's really a dual task; these two things have to be performed simultaneously.
When we developed this virtual reality shopping task, we first did a task analysis, breaking shopping down into individual items and determining whether each is cognitive or motor, and then replicated that in a set environment in a high-fidelity manner. You'd be amazed at how much time we spent on the background noise of a grocery store. It was hours and hours of debating what is most appropriate. It makes it real and very immersive. The fact that someone can now physically navigate a virtual environment without getting sick is an accomplishment. In terms of where we are now and where we go from here, I'm very enthusiastic about the paradigm because it provides us a way to objectively quantify important instrumental activities of daily living. We can quantify the cognitive component, the motor component, and other aspects of function as well.
The first thing was that there's a misconception that people who have neurological diseases, or even older adults, can't use technology. That was absolutely untrue. They embraced this system, they used it like champs, and they had very good usability scores. This misconception about older adults not being able to use technology goes back to the early 80s, when they were not yet older adults, and they had a VCR on their TV that was always flashing 12:00. They couldn't set the time on the VCR, which is what prompted people to say, "Oh, see, they can't do that!" But what I would argue is that the technology was horribly designed. It was a horrible interface. The first take-home is that older adults and patients with PD can use technology and will embrace it. We have to design technology with the end user in mind.
A couple of other results were interesting. One was that individuals with Parkinson disease tend to have very clear issues with dual tasking and start to freeze. We identified freezing when they were occasionally looking at the list. Even though they're not looking at an item to purchase, just trying to remember what's on the list, we see these issues. Rather than continuing to walk while looking at the list, they tend to stop and have a freezing episode. That's interesting because it's a dual task. People have talked about this a lot in terms of what patients report, but it is very difficult to replicate in a clinical environment. Clinical environments are very sterile; we want clean, efficient patient flow without distraction. It's difficult to elicit these freezes or changes in postural stability in the clinical environment, and if you can't see it, it can be more difficult to treat those types of postural instabilities and freezing episodes. That's another interesting finding.
The number of list activations among individuals with Parkinson disease was significantly different from that of older adults and young adults. The way they slowed down, and the way we could induce freezing in some cases when they had to navigate a narrow space, is something we can now see. I'm excited about that. Not that I want people to freeze, but I'm excited because we have a new project, funded by the Michael J. Fox Foundation, in which we'll be recording from the subthalamic nucleus of individuals with deep brain stimulation while they're walking through a home environment and doing the grocery tasks. What we'll be able to do is identify the neural signature associated with freezing of gait. Once we identify that neural signature, I think we can potentially use DBS [deep brain stimulation] to change it. We're very excited about that. Ken Baker and I are working on that project right now.
Transcript edited for clarity.