AI-enabled Imaging, Emerging Imaging

Bringing the outside world into the scanner: functional MRI in an immersive, interactive fully integrated Virtual Reality environment

Project ID: 2020_038

1st supervisor: Tomoki Arichi, King’s College London
2nd supervisor: Jo Hajnal, King’s College London
Additional supervisor: Bernhard Kainz, Imperial College London

Aim of the PhD Project:

  • Develop a completely immersive Virtual Reality (VR) experience that is fully integrated with the MRI scanner to achieve a new kind of multi-modal data capability for functional MRI experiments.
  • Exploit the new functionality in pilot fMRI studies and develop novel analysis tools to explore the resulting multimodal data.

Project Description / Background:

In recent years, functional MRI (fMRI) has become the tool of choice in neuroscientific and clinical applications for studying patterns of neural activity across the living brain in a completely safe and non-invasive way. In a typical experiment, the MR signal is sampled as a subject receives a distinct sensory stimulus or performs a structured task. However, whilst this approach has provided new and important insights into human brain function and organisation, constraints imposed by the MR environment and hardware limitations mean that experiments are often unsuitable for subjects (particularly children) who cannot cooperate with specific conditions, or for those who may find the alien MR environment distressing. As a result, existing fMRI studies are only representative of populations who can meet certain criteria, and only under specific “unnatural” conditions.

With these factors in mind, we have recently developed a non-intrusive MR compatible VR system which can provide users with an immersive, interactive simulated environment whilst lying inside the confines of the MRI scanner. Subjects are immersed in the visual environment via an MR compatible projector placed inside the scanner bore, projecting directly into the VR headset. They receive auditory stimulation via active noise-cancelling headphones. A pair of MR compatible cameras mounted inside the VR headset then provides real-time information about visual behaviour and head position. An external microphone and camera allow the subject to interact through voice and even gestures. Together, this means that subjects can be fully immersed in a new environment within which they not only receive stimuli but crucially can also actively interact. The system therefore not only has huge potential for relaxing vulnerable subjects, but additionally represents an important new platform for conducting a new generation of “natural” fMRI experiments, such as those studying fundamental (but hitherto poorly understood) cognitive processes like social communication, which are central to conditions such as autism.

This project would ideally suit a student with a background in computer science, bioengineering or electrical engineering and an interest in their application to neuroscience and clinical practice. The first challenge of the project will be to fuse the complex VR stimuli and subject responses with MRI data and physiological parameters to achieve a new generation of multimodal fMRI studies. This will involve learning and applying skills in signal processing, physics, and engineering. The student will then explore how to optimally extract information from the diverse data types (video, audio, MR images, metadata). This will involve applying advanced computer vision and machine learning methods. Lastly, this work will all be combined in pilot fMRI studies on which analysis strategies can be explored and refined. Together, this will necessitate that the student works closely with the multidisciplinary supervisory team and learns to use unconventional thinking to solve the considerable challenges inherent to working in the MRI scanner environment and with vulnerable subjects.
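A core step in fusing VR event streams with fMRI data is placing every stimulus or response timestamp onto the scanner's volume timeline. As a minimal sketch of that idea (all function names and parameters here are illustrative assumptions, not the project's actual pipeline), events recorded on a shared clock can be mapped to the fMRI volume being acquired at that moment and turned into a simple stimulus regressor:

```python
# Hypothetical sketch: aligning VR event timestamps to fMRI volumes.
# Assumes the VR system and scanner share a common clock; names are illustrative.

def events_to_volume_indices(event_times_s, scan_start_s, tr_s, n_volumes):
    """Map event onset times (seconds) to the index of the volume being acquired.

    event_times_s : event onsets on the shared clock
    scan_start_s  : time at which acquisition of the first volume began
    tr_s          : repetition time (seconds per volume)
    n_volumes     : number of volumes in the run
    Events falling outside the run are dropped.
    """
    indices = []
    for t in event_times_s:
        vol = int((t - scan_start_s) // tr_s)
        if 0 <= vol < n_volumes:
            indices.append(vol)
    return indices

def boxcar_regressor(volume_indices, n_volumes):
    """Build a simple 0/1 stimulus regressor over the run (no HRF convolution)."""
    reg = [0.0] * n_volumes
    for vol in volume_indices:
        reg[vol] = 1.0
    return reg
```

In a real analysis the boxcar would be convolved with a haemodynamic response function and entered into a GLM; the sketch only shows the timestamp bookkeeping that any such multimodal fusion rests on.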

Figure 1: The MR compatible Virtual Reality system that will be used in the project, shown in use. It consists of an MR compatible projector placed on the examination table, which enters the scanner bore, and a VR headset mounted on the receive head coil. A pair of infrared cameras integrated into the headset precisely tracks subject eye and head movements. The subject wears active noise-cancelling headphones and can also talk into a microphone attached to the head coil.
