1st Supervisor: Emma Robinson, King’s College London
2nd Supervisor: Jonathan O’Muircheartaigh, King’s College London
Clinical Champion: Serena Counsell, King’s College London
Industrial Supervisors: Nikos Paragios (Therapanacea) and Thais Roque
Aims of the PhD Project:
- Develop novel algorithms for image registration, consolidating concepts from discrete optimisation and deep learning
- Incorporate adaptive regularisation and biomechanical models of tissue growth to support generalisable prediction of large-deformation mappings
- Investigate opportunities for clinical translation, monitoring i) impaired fetal brain development and ii) growth of glioblastoma tumours
Project Description / Background:
During the third trimester of pregnancy, the human brain undergoes a period of rapid growth and maturation. Any disruption to this natural process, for example following preterm birth or as a result of congenital heart defects, therefore risks long-term neurological impairment [1,2].
The goal of this project is to develop novel deep learning tools for monitoring brain growth from longitudinally acquired Magnetic Resonance Imaging (MRI) of the fetal and neonatal brain, in order to support the development of novel biomarkers predictive of neurodevelopmental outcome.
This is a highly challenging problem: not only do brain shape and composition change dramatically over this period, but the differing constraints of fetal and neonatal imaging also create inconsistencies in image contrast.
Traditional approaches for mapping correspondences between images work by solving an image-matching problem that seeks to maximise the similarity of features in both images, subject to constraints that enforce smooth and biomechanically plausible deformations. Several recent studies have demonstrated significant success for deep learning in this domain [3,4,5].
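For reference, the matching problem described above is typically posed as the minimisation of an image-dissimilarity term plus a penalty that keeps the deformation smooth. The following Python (PyTorch) sketch shows how such an objective might be written in an unsupervised deep-registration setting, in the spirit of [4]; the tensor shapes, the simple MSE similarity and the weighting lambda_smooth are illustrative assumptions rather than the method proposed here.

# Minimal sketch (PyTorch, assumed available): an unsupervised registration loss
# combining an intensity-similarity term with a smoothness penalty on the
# predicted displacement field. All shapes and weights are illustrative.
import torch
import torch.nn.functional as F


def warp(moving, disp):
    """Warp a 3D image (B, 1, D, H, W) by a displacement field (B, 3, D, H, W).

    Displacements are assumed to be in normalised [-1, 1] coordinates, with
    channels ordered (x, y, z) to match grid_sample's convention."""
    B, _, D, H, W = moving.shape
    # Identity sampling grid in normalised coordinates, shape (B, D, H, W, 3)
    grid = F.affine_grid(
        torch.eye(3, 4).unsqueeze(0).repeat(B, 1, 1), (B, 1, D, H, W),
        align_corners=False)
    new_grid = grid + disp.permute(0, 2, 3, 4, 1)  # add per-voxel displacement
    return F.grid_sample(moving, new_grid, align_corners=False)


def smoothness(disp):
    """Penalise spatial gradients of the displacement field (diffusion regulariser)."""
    dz = (disp[:, :, 1:] - disp[:, :, :-1]).pow(2).mean()
    dy = (disp[:, :, :, 1:] - disp[:, :, :, :-1]).pow(2).mean()
    dx = (disp[:, :, :, :, 1:] - disp[:, :, :, :, :-1]).pow(2).mean()
    return dz + dy + dx


def registration_loss(fixed, moving, disp, lambda_smooth=0.1):
    """Image-matching objective: dissimilarity after warping + weighted smoothness."""
    warped = warp(moving, disp)
    similarity = F.mse_loss(warped, fixed)   # simple intensity similarity
    return similarity + lambda_smooth * smoothness(disp)

In a full framework, the displacement field would be predicted by a convolutional network from the fixed and moving images, and the similarity term would be replaced by a contrast-robust or learned metric, as discussed below.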
The challenge with mapping correspondences between fetal and neonatal scans, however, is that images are typically matched through correspondence (or overlap) of shape, whereas fetal brains, over the period of interest, begin smooth and featureless and then rapidly develop complex and highly variable folding patterns prior to birth [6,7]. At the same time, correspondence of shape is typically assessed from image similarity metrics which assume a degree of consistency in image contrast. However, fetal and neonatal scans require very different imaging protocols, since the fetus is scanned free-moving in the fluid sac of the uterus, whereas neonates are scanned stationary, in natural sleep, within the scanner. This leads to very different contrasts and noise profiles, which also vary across images due to changes in tissue maturation.
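To make the contrast problem concrete, the short Python (NumPy) sketch below compares a simple correlation-based similarity with a histogram-based mutual-information estimate for two copies of the same synthetic "image", one of which has been passed through a non-monotone intensity remapping, loosely mimicking a change of MRI contrast; the random data, the remapping and the bin count are purely illustrative.

# Illustrative sketch (NumPy): why intensity-based similarity metrics assume
# consistent contrast. A non-monotone intensity change destroys simple
# correlation while a mutual-information estimate still detects the
# deterministic relationship. The synthetic "image" is an assumption.
import numpy as np


def mutual_information(a, b, bins=32):
    """Histogram-based mutual information (in nats) between two flattened images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                    # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))


rng = np.random.default_rng(0)
image = rng.random((64, 64))                        # stand-in for a brain slice
remapped = (image - 0.5) ** 2                       # non-monotone contrast change

print("correlation:", np.corrcoef(image.ravel(), remapped.ravel())[0, 1])
print("mutual information:", mutual_information(image, remapped))

The correlation falls close to zero even though the two arrays depict identical structure, while the mutual information remains clearly non-zero; metrics of the latter kind, or learned alternatives, are what cross-contrast fetal-neonatal matching requires.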
The objective of this project is therefore to develop a novel deep learning tool for large-deformation image registration, with a specific focus on fetal brain development, making unique contributions to the integration of physics-inspired biophysical models [8,9] and multimodal metric learning within deep-discrete image registration frameworks [3].
In this way, the project will combine the tremendous potential of convolutional neural networks to perform unsupervised metric learning [5] with the power of discrete (graph-based) optimisation to train robust image registration frameworks [3,10-15]: frameworks well suited to large-deformation modelling [3], to relatively small and heterogeneous datasets [3,10-15], to noisy or cross-modal alignment [10-12], and to addressing missing correspondences [15]. Ambiguities during image matching will be resolved by incorporating biophysical models of cortical brain growth [6,7] and by training image similarity metrics to match both image shape and tissue microstructure.
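As a rough illustration of the discrete side of such a framework (in the spirit of [3]), the Python (PyTorch) sketch below evaluates matching costs over a small quantised grid of candidate displacements on learned feature maps and reads out a displacement with a differentiable soft-argmin; in a full deep-discrete framework these costs would instead be regularised over a graph (for example with mean-field-style smoothing) before the read-out. The feature maps, capture range, sum-of-squared-differences cost and border handling are all simplifying assumptions.

# Minimal sketch (PyTorch): the discrete-optimisation view of registration.
# For every voxel of a coarse feature map, a matching cost is computed over a
# small, quantised set of integer candidate displacements. The feature network,
# capture range and cost function are illustrative assumptions.
import torch
import torch.nn.functional as F


def displacement_cost_volume(feat_fixed, feat_moving, radius=3):
    """Return per-voxel costs over (2*radius+1)**3 integer displacements.

    feat_fixed, feat_moving: (B, C, D, H, W) feature maps from a (learned) CNN.
    Output: (B, (2r+1)**3, D, H, W) sum-of-squared-difference costs."""
    costs = []
    for dz in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                # Shift the moving features by one candidate displacement.
                # torch.roll wraps around at the border; a real implementation
                # would pad instead.
                shifted = torch.roll(feat_moving, shifts=(dz, dy, dx), dims=(2, 3, 4))
                costs.append(((feat_fixed - shifted) ** 2).sum(dim=1))
    return torch.stack(costs, dim=1)


def soft_argmin_displacement(cost, radius=3, temperature=1.0):
    """Differentiable read-out: expectation of the candidate displacements under
    a softmin over the cost volume (a stand-in for graph-based regularisation)."""
    r = radius
    offsets = torch.stack(torch.meshgrid(
        torch.arange(-r, r + 1), torch.arange(-r, r + 1), torch.arange(-r, r + 1),
        indexing="ij"), dim=-1).reshape(-1, 3).float()        # ((2r+1)**3, 3)
    probs = F.softmax(-cost / temperature, dim=1)             # (B, K, D, H, W)
    return torch.einsum("bkdhw,kc->bcdhw", probs, offsets)    # (B, 3, D, H, W)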
The desired outcome is a set of highly innovative solutions that harness the power of modern artificial intelligence methods while offering robustness, excellent generalisation and interpretability. This approach will deliver precise models of cortical growth and sub-cortical microstructure development, which will be used to derive biomarkers predictive of cognitive outcome following preterm birth or diagnosis of congenital heart defects.
We expect the best candidates for this project to have a strong numerical and programming background. Time permitting, the techniques will also be adapted for monitoring brain tumour growth through industry placements [16].
Figure 1: Left – dramatic patterns of brain growth and folding in the third trimester; by term, many of the cortical folds are present despite the brain being much smaller than that of an adult. Right – a schematic of the proposed project.
References:
- Counsell, Serena J., et al. Brain 131.12 (2008): 3201-3208. https://doi.org/10.1093/brain/awn268
- Kelly, Christopher J., et al. Scientific reports 7.1 (2017): 1-10. https://doi.org/10.1038/s41598-017-14939-z
- Heinrich, MP. MICCAI 2019. https://arxiv.org/abs/1907.10931
- Dalca, Adrian V., et al. Medical image analysis 57 (2019): 226-236. https://doi.org/10.1016/j.media.2019.07.006
- Xu, Zhe, et al. MICCAI, 2020. https://doi.org/10.1007/978-3-030-59716-0_22
- Tallinen, T., et al. PNAS 111.35 (2014): 12667-12672. https://doi.org/10.1073/pnas.1406015111
- Garcia, Kara E., et al. PNAS 115.12 (2018): 3156-3161. https://doi.org/10.1073/pnas.1715451115
- Han, J., et al. PNAS 115.34 (2018): 8505-8510. https://doi.org/10.1073/pnas.1718942115
- Raissi, M., et al. Journal of Computational Physics 378 (2019): 686-707. https://doi.org/10.1016/j.jcp.2018.10.045
- Robinson, EC., et al. Neuroimage 100 (2014): 414-426. https://doi.org/10.1016/j.neuroimage.2014.05.069
- Robinson, EC., et al. Neuroimage 167 (2018): 453-465. https://doi.org/10.1016/j.neuroimage.2017.10.037
- Ferrante, E., et al. IEEE journal of biomedical and health informatics 23.4 (2018): 1374-1384.
- Ferrante, E and N Paragios. International Journal of Computer Vision 126.1 (2018): 36-58. https://doi.org/10.1007/s11263-017-1040-8
- Glocker, B., et al. Medical image analysis 12.6 (2008): 731-741. https://doi.org/10.1016/j.media.2008.03.006
- Parisot, S., et al. Medical image analysis 18.4 (2014): 647-659. https://doi.org/10.1016/j.media.2014.02.006
- Roque, T., et al. IEEE transactions on medical imaging 37.3 (2017): 724-732. https://doi.org/10.1109/TMI.2017.2779811