Thesis

Deep learning networks for automatic brain tumour segmentation from MRI data

Creator
Rights statement
Awarding institution
  • University of Strathclyde
Date of award
  • 2023
Thesis identifier
  • T16715
Person Identifier (Local)
  • 201752348
Qualification Level
Qualification Name
Department, School or Faculty
Abstract
  • Early diagnosis and appropriate treatment planning are key to improving the survival rates of brain tumour patients. Radiotherapy (RT) is a common treatment for brain tumours, and RT planning requires segmentation of a gross tumour volume (GTV). Manual segmentation of brain tumours by expert oncologists or clinicians is time-consuming and subject to intra- and inter-observer variability. This research presents novel image processing and deep learning methods for automatic segmentation of brain tumour regions from MRI data. The MRI data of brain tumour patients from the Brain Tumour Segmentation (BraTS) datasets from 2018 to 2021 are used in this study.

A 2D deep neural network for semantic segmentation of brain tumour regions from 2D axial multimodal (T1, T1Gd, T2, and FLAIR) MRI slices is presented. This network is trained and tested on expert manual consensus labels from the BraTS 2018 dataset. The network has an architecture similar to U-Net, consisting of a stream of down-sampling blocks that extract features and reduce the image resolution, followed by a stream of up-sampling blocks that recover the image resolution, integrate features, and classify pixels. The proposed network improves feature extraction by introducing two-pathway feature extraction in the first down-sampling block, extracting local and global features directly from the input images (illustrative sketches of such a two-pathway block and of the DSC metric follow this abstract). Transposed convolution is employed in the up-sampling path. The proposed network was evaluated for the segmentation of five tumour regions: whole tumour (WT), tumour core (TC), necrotic and non-enhancing tumour (NCR/NET), edema (ED), and enhancing tumour (ET). The modified U-Net achieved mean Dice Similarity Coefficients (DSC) of 0.83, 0.62, 0.45, 0.69, and 0.70 for WT, TC, NCR/NET, ED, and ET, respectively, a 9% improvement over the original U-Net's performance. The 2D predicted segmentations obtained from the proposed network are stacked to visualise the tumour volume.

A novel deep neural network called 2D TwoPath U-Net for multi-class segmentation of brain tumour regions is then described. This network features improved two-pathway feature extraction that provides cascaded local and global features from 2D multimodal MRI input. It was trained on MRI data from the BraTS 2019 dataset and tested on MRI data from the BraTS 2020 dataset. Data augmentation and different training strategies, including the use of full-size images and patches, were employed to improve the predicted segmentation. The segmentations produced by the network combine all tumour intra-structures (NCR/NET, ED, ET) to form the WT and TC regions, achieving mean DSC of 0.72 and 0.66 for WT and TC, respectively.

Finally, a novel 3D deep neural network for segmentation of brain tumour regions from MRI data, called 3D TwoPath U-Net, is described. The network has a structure similar to the 2D TwoPath U-Net and uses two-pathway feature extraction to capture local and global features from volumetric MRI data from the BraTS 2021 dataset. The volumetric data were created using the T1Gd and FLAIR modalities. Because a 3D deep neural network entails a significantly higher computational cost, cropped voxels from the volumetric MRI were used to reduce the input resolution, and high-performance GPUs were employed to implement the network. The proposed network achieved mean DSC of 0.87, 0.70, and 0.58 for WT, TC, and ET segmentation, respectively, a 25% improvement over the previous segmentation results obtained using the 2D approach. Moreover, the smooth 3D tumour volume generated from the network's output provides a more visually representative depiction of the tumour.
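The two-pathway feature extraction is only outlined in the abstract above. As a minimal, purely illustrative PyTorch sketch, one plausible form of such a block pairs a small-kernel local path with a large-kernel global path and concatenates their feature maps; the class name TwoPathBlock, the kernel sizes, and the channel counts are assumptions for illustration, not the thesis implementation. The four input channels correspond to the T1, T1Gd, T2, and FLAIR modalities, and 240x240 is the standard BraTS in-plane slice resolution.

```python
# Illustrative sketch only; not the thesis implementation.
import torch
import torch.nn as nn

class TwoPathBlock(nn.Module):
    """Hypothetical first encoder block with two parallel paths.

    A local path (small receptive field) and a global path (large
    receptive field) process the same input; their feature maps are
    concatenated. Kernel sizes and channel counts are assumptions.
    """

    def __init__(self, in_channels=4, out_channels=64):
        super().__init__()
        half = out_channels // 2
        # Local path: 3x3 kernels capture fine, local detail.
        self.local_path = nn.Sequential(
            nn.Conv2d(in_channels, half, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Global path: 7x7 kernels capture wider spatial context.
        self.global_path = nn.Sequential(
            nn.Conv2d(in_channels, half, kernel_size=7, padding=3),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        # Concatenate local and global features along the channel axis.
        return torch.cat([self.local_path(x), self.global_path(x)], dim=1)

# Example: one batch of 2D axial slices, 4 MRI modalities as channels.
x = torch.randn(1, 4, 240, 240)      # BraTS slices are 240x240 in-plane
features = TwoPathBlock()(x)         # -> torch.Size([1, 64, 240, 240])
```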
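Every quantitative result above is a mean Dice Similarity Coefficient (DSC). As a minimal NumPy sketch (the thesis's exact evaluation pipeline is not shown here), the DSC for one binary region mask can be computed as 2|P ∩ T| / (|P| + |T|):

```python
import numpy as np

def dice_similarity_coefficient(pred, target, eps=1e-7):
    """DSC = 2|P & T| / (|P| + |T|); 1.0 means perfect overlap.

    `eps` guards against division by zero when both masks are empty.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Example: score a predicted whole-tumour (WT) mask against an expert label.
pred = np.zeros((240, 240), dtype=bool)
label = np.zeros((240, 240), dtype=bool)
pred[100:140, 100:140] = True    # 40x40 predicted region
label[110:150, 110:150] = True   # 40x40 reference region, offset by 10
print(dice_similarity_coefficient(pred, label))  # 0.5625 (900 px overlap)
```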
Advisor / supervisor
  • Soraghan, John
  • Di-Caterina, Gaetano
Resource Type
DOI
