Thesis
Deep learning-assisted visualisation of cellular structure in microscopy
- Awarding institution
- University of Strathclyde
- Date of award
- 2025
- Thesis identifier
- T17482
- Person Identifier (Local)
- 201771929
- Abstract
- Within a few years, deep learning (DL) has become an important staple of image processing in microscopy, often outperforming classical methods. In particular, the ability of an artificial neural network (ANN) to learn a complex mapping function between image domains has found wide application. Its use in fields such as image restoration, image segmentation, and artificial labelling allows cell biologists to extract more knowledge from their microscopy data. However, the performance of an ANN for such tasks is limited by the quality of the training data. In practice, high-quality ground truth data can be difficult to acquire with microscopy owing to the many limitations of imaging setups and labelling protocols. Alternatively, target data for training can be generated through manual annotation, but this is often unfeasible as it is extremely time- and labour-intensive and requires a high level of expert knowledge. In this thesis, methods for image-to-image translation are introduced that were developed with these limitations in mind, with the main aim of enhancing the microscopic visualisation of cellular structures routinely studied in cell biology, such as the actin cytoskeleton and the microtubule network. Two methods are presented. The first, Label2Label (L2L), is a new restoration method in fluorescence microscopy that enhances the image contrast of cellular structures. L2L does not require clean target data for training. Instead, it capitalises on the clear differences in image quality observable between non-identical fluorescent labels to systematically train a network to increase structural contrast. The idea of dual-labelling a cell to acquire training data makes L2L relatively straightforward to implement in practice. Moreover, the results show that a multi-scale structural similarity loss function can enhance the performance of L2L when implemented carefully. Furthermore, the use of artificial labelling in interference reflection microscopy (IRM) is explored. In artificial labelling, an ANN is trained to translate between images acquired with the label-free IRM technique and fluorescence images. The two image types differ in their specificity: contrast in label-free images stems from all cell components, whereas fluorescence images are ideally highly specific to the labelled cellular target. Consequently, with artificial labelling, specific cellular structures can be selectively visualised in label-free images. For the translation between IRM images and fluorescence images of a focal adhesion marker, different cases were explored in which the target data exhibited either low or high contrast of the focal adhesions and were paired or unpaired with the IRM inputs. The results presented in this work show that a network can be successfully trained for artificial labelling in IRM and that high-quality paired target images are not necessarily required for the task. Here, a newly developed framework, the so-called 2LGAN, a variant of a generative adversarial network, performed best in such a case. Notably, L2L and artificial labelling in IRM outperform classical methods as a pre-processing step for downstream quantitative image analysis. Both methods could have an important impact in the future on the visualisation of other cellular structures not explored in this work. One such structure is the plasma membrane, which is difficult to visualise with microscopy but is of high interest in cell biology.
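
The abstract notes that a multi-scale structural similarity (SSIM) loss can improve L2L when implemented carefully. As a rough illustration only, the sketch below combines a simplified multi-scale SSIM term with an L1 term in PyTorch; the function names, the scale averaging, the α weighting, and the single-channel assumption are illustrative choices and are not taken from the thesis.

```python
import torch
import torch.nn.functional as F

def gaussian_window(size: int = 11, sigma: float = 1.5) -> torch.Tensor:
    """1D Gaussian kernel, normalised to sum to 1."""
    coords = torch.arange(size, dtype=torch.float32) - size // 2
    g = torch.exp(-(coords ** 2) / (2 * sigma ** 2))
    return (g / g.sum()).view(1, 1, -1)

def ssim(x: torch.Tensor, y: torch.Tensor, window_size: int = 11,
         data_range: float = 1.0) -> torch.Tensor:
    """Mean SSIM between two single-channel image batches of shape (N, 1, H, W)."""
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    w1d = gaussian_window(window_size)
    # Build a 2D Gaussian window (1, 1, k, k) as the outer product of the 1D kernel.
    window = (w1d.transpose(1, 2) @ w1d).unsqueeze(0).to(x.device, x.dtype)
    pad = window_size // 2

    mu_x = F.conv2d(x, window, padding=pad)
    mu_y = F.conv2d(y, window, padding=pad)
    sigma_x = F.conv2d(x * x, window, padding=pad) - mu_x ** 2
    sigma_y = F.conv2d(y * y, window, padding=pad) - mu_y ** 2
    sigma_xy = F.conv2d(x * y, window, padding=pad) - mu_x * mu_y

    ssim_map = ((2 * mu_x * mu_y + c1) * (2 * sigma_xy + c2)) / \
               ((mu_x ** 2 + mu_y ** 2 + c1) * (sigma_x + sigma_y + c2))
    return ssim_map.mean()

def ms_ssim(x: torch.Tensor, y: torch.Tensor, scales: int = 3) -> torch.Tensor:
    """Simplified multi-scale SSIM: average SSIM over successively downsampled images.
    (The canonical MS-SSIM uses per-scale weights; this is an illustrative shortcut.)"""
    values = []
    for s in range(scales):
        values.append(ssim(x, y))
        if s < scales - 1:
            x = F.avg_pool2d(x, 2)
            y = F.avg_pool2d(y, 2)
    return torch.stack(values).mean()

def ms_ssim_l1_loss(pred: torch.Tensor, target: torch.Tensor,
                    alpha: float = 0.84) -> torch.Tensor:
    """Weighted sum of (1 - MS-SSIM) and L1; alpha is a hypothetical mixing weight."""
    return alpha * (1.0 - ms_ssim(pred, target)) + (1.0 - alpha) * F.l1_loss(pred, target)
```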
- Advisor / supervisor
- Hansen, Carsten Gram
- McConnell, Gail
- Date Created
- 2024
Relations
Items
| Title | Date Uploaded | Visibility |
|---|---|---|
| File | 2025-09-23 | Embargo |