Slip on a VR headset, and an ordinary room transforms into a foggy mountain landscape dotted with colourful landmarks.


[Photo: Annie gesturing towards her scientific poster at CAN]

This virtual world is part of a new tool developed by Annie Kim, a Master’s student supervised by Dr. Manu Madhav, aimed at diagnosing Alzheimer’s disease earlier by detecting subtle navigation problems. Her team’s virtual reality approach could make screening more accurate and accessible, from clinics to care homes and even people’s living rooms.


What is Alzheimer’s disease?

Alzheimer’s disease begins with a buildup of abnormal proteins in the brain, known as amyloid plaques and tau tangles, which slowly destroy memory and thinking skills, eventually becoming severe enough to impair daily functioning. There is currently no cure, but early recognition helps manage symptoms and delay the progression to dementia.

Diagnosing Alzheimer’s disease can be challenging. Tests like spinal taps and bloodwork are invasive and often inaccessible to vulnerable populations. Non-invasive tools usually focus on episodic memory, the recall of personal experiences, but episodic memory also declines naturally with age, making it an unreliable marker.

“By the time you see episodic memory deficits, the disease has often progressed significantly,” Annie adds. “This highlights the need for more specific, earlier detection methods.”


Navigation and Alzheimer’s disease

Recent research suggests that spatial navigation deficits are more specific to Alzheimer’s disease than episodic memory decline and could allow for earlier diagnosis.

“During spatial navigation, we utilize this neural representation of our environment called the cognitive map,” Annie explains. “This cognitive map is formed by the firing of various different cells in the hippocampal formation, and this is the area that’s affected early on in the disease progress.”


The Egocentric and the Allocentric

While there are many factors involved in successful navigation, two are particularly popular with researchers: the egocentric and allocentric reference frames.

The egocentric reference frame relies on self-centered navigation—using your own body as the reference point to track where you are in the environment. It involves continuously updating your location based on self-motion cues. For example, you might think “I went straight, turned left, then turned right.”

“The analogy I like to use involves Google Maps’ navigation mode,” Annie remarks. “When you start your trip, the rotating arrow points towards the direction you’re facing and tells you where to go, relative to your current position.”

With allocentric reference frames, navigation is world-centered and based on fixed landmarks, independent of your location. This is more like a traditional map, which represents the spatial relationships of landmarks within a region. For example, the DMCBH is located in front of the UBC Hospital and beside the Detwiller Pavilion.

“In real world navigation, we use both of these reference frames,” Annie explains. “We often switch between the two, integrating both of them to figure out where we are in the environment, where we need to go and how we can get there most efficiently.”

Healthy aging often affects allocentric navigation, while egocentric navigation remains relatively intact. Alzheimer’s impairs both, along with the ability to switch between them. Evaluating these differences may allow for earlier, more accurate diagnosis.
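
To make the distinction concrete, here is a minimal Python sketch, purely illustrative and not part of the lab’s actual task: egocentric path integration estimates position by accumulating self-motion cues (turns and steps), while an allocentric map simply stores fixed landmark coordinates. All names and numbers are hypothetical.

```python
import math

# Egocentric (path integration): estimate your position by accumulating
# self-motion cues -- how much you turned and how far you stepped --
# with no reference to external landmarks.
def path_integrate(moves, start=(0.0, 0.0), heading=0.0):
    """moves: list of (turn_in_radians, step_length) self-motion cues."""
    x, y = start
    for turn, step in moves:
        heading += turn                  # update the direction you face
        x += step * math.cos(heading)    # advance along the current heading
        y += step * math.sin(heading)
    return (x, y), heading

# Allocentric (map-based): locations are stored relative to fixed world
# landmarks, independent of where the navigator is standing or facing.
landmarks = {"lighthouse": (40.0, 25.0), "cabin": (-15.0, 60.0)}

# "I went straight, turned left, then turned right."
moves = [(0.0, 10.0), (math.pi / 2, 5.0), (-math.pi / 2, 8.0)]
position, heading = path_integrate(moves)
print(f"Egocentric estimate of my position: {position}")
print(f"Allocentric location of the cabin: {landmarks['cabin']}")
```

Notice that if any turn or step is mis-sensed, the egocentric estimate drifts while the landmark map stays fixed, which is one reason navigators benefit from switching between the two frames.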


Limitations of current navigation tests

Many of the standard tests for spatial navigation deficits used in clinics and research are paper-based or done on a computer. Crucially, they tend to be confined to a two-dimensional environment, which comes with important limitations.

“Real life navigation involves a lot of different sensory cues,” Annie notes. “How we sense our bodies moving through the environment, what we are seeing, and what we are actually experiencing in the world.”

These 2D tests fail to capture the complexities of real-life navigation, lacking the immersiveness required to accurately assess spatial navigation skills. Moreover, they often fail to cleanly isolate the allocentric and egocentric components, involving tasks that can be solved using either strategy, which muddles the results. Lastly, traditional tests offer limited difficulty levels, minimizing complexity and the ability to detect subtler changes.

“We wanted to create a task which could capture more complex aspects of navigation,” Annie says. “Observing how behaviour changes in response to more challenging tasks could reveal deeper insights into how spatial memory is affected.”


The journey to a VR diagnostic tool

By leveraging the immersive nature of virtual reality, Annie and her team aimed to create a tool that could be used in clinics to assess spatial memory and support the clinical diagnosis of Alzheimer’s disease.

“One of the things we really prioritized was the translational and clinical impact of this tool,” Annie emphasizes. “To do this, we decided to make the task as accessible as possible and open source so that we can eventually share it with others.”

This informed the decision to build the entire task in the Unity game engine, a commercially available game-development platform, and to run it on a consumer VR headset, the Meta Quest 3.

In collaboration with the UBC Emerging Media Lab and undergraduate software developer Inzaghi Moniaga, the lab developed the final virtual reality task, featuring high-quality graphics and a user-friendly interface.


How does the task work?

Seated in a comfortable swivel chair, participants place the VR headset over their eyes and are immediately immersed in a rich mountainous landscape surrounded by trees and fog. After a brief interactive tutorial, the task begins.

Participants find themselves surrounded by white walls and windows, in a place called the starting room. Gazing out into the landscape, they can observe the locations of a towering red lighthouse and a cozy-looking cabin. Once they are ready, they can step out into the hallway, at which point fog descends and obscures the outside view. By rotating their body and moving a joystick, participants can stroll through this virtual world.

“As they walk down the hallway, they must rely on egocentric navigation,” Annie explains. “They’re recounting: how far did I move straight, and how often did I turn left or right?”

Upon reaching the end of the hallway, participants must point towards where they think they started. Since they have no landmarks to base their judgment on, they must rely on self-motion cues. After their guess, the actual start location is revealed, and participants must then identify where they think the lighthouse and cabin are. The accuracy of the start-location guess serves as a measure of egocentric navigation, while the identification of landmark locations assesses allocentric navigation as participants piece together the global map.
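
The article doesn’t spell out how pointing accuracy is scored. A common metric in tasks of this kind is angular pointing error, the shortest angle between the direction a participant points and the true direction to the target; here is a minimal sketch of that metric, with hypothetical coordinates and function names:

```python
import math

def pointing_error_deg(observer, pointed_deg, target):
    """Shortest angular difference (in degrees) between where a participant
    pointed and the true direction from their position to the target.
    observer, target: (x, y) world coordinates; pointed_deg: the angle of
    the participant's pointing response."""
    dx, dy = target[0] - observer[0], target[1] - observer[1]
    true_deg = math.degrees(math.atan2(dy, dx))   # true direction to target
    error = abs(pointed_deg - true_deg) % 360.0   # wrap into [0, 360)
    return min(error, 360.0 - error)              # take the shorter way round

# Hypothetical example: from the end of the hallway at (18, 5), a participant
# points at 170 degrees, but the starting room at (0, 0) actually lies at
# roughly 196 degrees -- an egocentric pointing error of about 26 degrees.
print(pointing_error_deg((18.0, 5.0), 170.0, (0.0, 0.0)))
```

Averaged across trials, and across hallways of increasing complexity, an error measure like this could provide the graded difficulty the team was aiming for.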


A virtual future for diagnosis?

Currently, the tool is in the early stages of development and validation. For Annie’s thesis, the VR task was mainly tested in healthy young and older adults. The hope is that it can eventually be used with Alzheimer’s patients to identify specific differences and impairments in navigation.

“The ultimate goal would be to have virtual reality headsets available in clinics, in care homes or even in the comfort of someone’s house,” Annie envisions. “It could be used as a screening tool for measuring navigation abilities, or as a monitoring tool to assess changes over time.”

Accessibility is central to the NC4 lab’s vision for this VR tool, and it has been built into the design from the start.

“Even if people don’t have access to very elaborate medical tests like neuroimaging scans or blood work, they still deserve access to care,” Annie says. “This would be a tool that could be implemented a little bit more easily in those kinds of places.”