Using deep learning to generate synthetic B-mode musculoskeletal ultrasound images

Cronin, Neil (ORCID: 0000-0002-5332-1188), Finni, Taija and Seynnes, Olivier R. (2020) Using deep learning to generate synthetic B-mode musculoskeletal ultrasound images. Computer Methods and Programs in Biomedicine, 196, Article 105583. doi:10.1016/j.cmpb.2020.105583

Text (Published version): 8927-Cronin-(2020)-Using-deep-learning-to-generate-synthetic-B-mode.pdf
Available under License Creative Commons Attribution 4.0.
Abstract

Background and objective: Deep learning approaches are common in image processing, but often rely on supervised learning, which requires a large volume of training images, usually accompanied by hand-crafted labels. As labelled data are often not available, it would be desirable to develop methods that allow such data to be compiled automatically. In this study, we used a Generative Adversarial Network (GAN) to generate realistic B-mode musculoskeletal ultrasound images, and tested the suitability of two automated labelling approaches.

Methods: We used a model comprising two GANs, each trained to transfer an image from one domain to another. The two inputs were a set of 100 longitudinal images of the gastrocnemius medialis muscle, and a set of 100 synthetic segmentation masks featuring two aponeuroses and a random number of 'fascicles'. The model output was a set of synthetic ultrasound images and an automated segmentation of each real input image. This automated segmentation process was the first of the two approaches we assessed. The second approach involved synthesising ultrasound images and then feeding them into an ImageJ/Fiji-based automated algorithm, to determine whether it could detect the aponeuroses and muscle fascicles.

Results: Histogram distributions were similar between real and synthetic images, but synthetic images displayed less variation between samples and a narrower range. Mean entropy values were statistically similar (real: 6.97, synthetic: 7.03; p = 0.218), but the range was much narrower for synthetic images (6.91–7.11 versus 6.30–7.62). When comparing GAN-derived and manually labelled segmentations, intersection-over-union values (denoting the degree of overlap between aponeurosis labels) varied between 0.0280 and 0.612 (mean ± SD: 0.312 ± 0.159), and pennation angles were higher for the GAN-derived segmentations (25.1° vs. 19.3°; p < 0.001). For the second segmentation approach, the algorithm generally performed equally well on synthetic and real images, yielding pennation angles within the physiological range (13.8–20°).

Conclusions: We used a GAN to generate realistic B-mode ultrasound images, and extracted muscle architectural parameters from these images automatically. This approach could enable the generation of large labelled datasets for image segmentation tasks, and may also be useful for data sharing. Automatic generation and labelling of ultrasound images minimises user input and overcomes several limitations associated with manual analysis.
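To illustrate the kind of input described in the Methods (100 synthetic masks with two aponeuroses and a random number of 'fascicles'), the sketch below builds one such mask in Python. It is an illustrative assumption, not the authors' code: image size, line thicknesses, fascicle counts and the class values (0 = background, 1 = aponeurosis, 2 = fascicle) are all placeholders.

```python
"""Minimal sketch (not the published code): a synthetic segmentation mask with
two near-horizontal aponeuroses and a random number of oblique 'fascicles'."""
import numpy as np

rng = np.random.default_rng(0)

def draw_line(mask, x0, y0, x1, y1, value, thickness=2):
    """Rasterise a straight line onto the mask by dense sampling."""
    n = int(max(abs(x1 - x0), abs(y1 - y0)) * 2) + 1
    xs = np.linspace(x0, x1, n).round().astype(int)
    ys = np.linspace(y0, y1, n).round().astype(int)
    for dy in range(-thickness, thickness + 1):
        yy, xx = ys + dy, xs
        keep = (yy >= 0) & (yy < mask.shape[0]) & (xx >= 0) & (xx < mask.shape[1])
        mask[yy[keep], xx[keep]] = value
    return mask

def synth_mask(height=256, width=512):
    mask = np.zeros((height, width), dtype=np.uint8)
    # Two aponeuroses: nearly horizontal lines near the top and bottom of the image.
    top = rng.integers(20, 60)
    bottom = rng.integers(height - 60, height - 20)
    draw_line(mask, 0, top, width - 1, top + rng.integers(-10, 10), value=1, thickness=3)
    draw_line(mask, 0, bottom, width - 1, bottom + rng.integers(-10, 10), value=1, thickness=3)
    # A random number of fascicles running obliquely between the aponeuroses.
    angle = np.deg2rad(rng.uniform(10, 25))        # pennation-like angle (assumed range)
    for _ in range(rng.integers(10, 25)):          # random fascicle count (assumed range)
        x0 = rng.integers(0, width)
        length = (bottom - top) / np.tan(angle)
        draw_line(mask, x0, bottom, x0 + length, top, value=2, thickness=1)
    return mask

if __name__ == "__main__":
    m = synth_mask()
    print(m.shape, np.unique(m))   # e.g. (256, 512) [0 1 2]
```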
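The Methods describe two GANs, each translating between the mask and ultrasound domains, which (together with the CycleGAN keyword) implies a cycle-consistent objective. The PyTorch fragment below sketches that generator-side objective under stated assumptions: the tiny convolutional networks, batch shapes and the cycle weight of 10 stand in for the real architectures and hyperparameters, and the discriminator update is omitted.

```python
"""Minimal sketch (assumptions, not the published implementation) of a
CycleGAN-style generator objective for mask <-> ultrasound translation."""
import torch
import torch.nn as nn

def tiny_net(in_ch, out_ch):
    # Placeholder for a real backbone (e.g. ResNet generator / PatchGAN discriminator).
    return nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(16, out_ch, 3, padding=1))

G = tiny_net(1, 1)       # mask -> synthetic ultrasound image
F = tiny_net(1, 1)       # ultrasound -> predicted mask (automated segmentation)
D_us = tiny_net(1, 1)    # discriminator for the ultrasound domain
D_mask = tiny_net(1, 1)  # discriminator for the mask domain

mse, l1 = nn.MSELoss(), nn.L1Loss()
real_us = torch.rand(4, 1, 128, 256)    # stand-in batch of real ultrasound images
real_mask = torch.rand(4, 1, 128, 256)  # stand-in batch of synthetic masks

fake_us, fake_mask = G(real_mask), F(real_us)

# Least-squares adversarial terms: each generator tries to fool its discriminator.
adv = mse(D_us(fake_us), torch.ones_like(D_us(fake_us))) + \
      mse(D_mask(fake_mask), torch.ones_like(D_mask(fake_mask)))

# Cycle consistency: mask -> ultrasound -> mask and ultrasound -> mask -> ultrasound.
cyc = l1(F(fake_us), real_mask) + l1(G(fake_mask), real_us)

loss_generators = adv + 10.0 * cyc   # cycle weight of 10 is the common CycleGAN default
loss_generators.backward()
```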
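The Results compare real and synthetic images via histogram entropy, and GAN-derived versus manual labels via intersection-over-union. The snippet below shows one plausible way to compute both quantities; it is a sketch, not the authors' analysis pipeline, and assumes 8-bit greyscale images and boolean aponeurosis masks loaded elsewhere.

```python
"""Minimal sketch of the two comparison metrics mentioned in the Results."""
import numpy as np

def shannon_entropy(image_u8):
    """Entropy (bits) of an 8-bit greyscale image's intensity histogram."""
    counts = np.bincount(image_u8.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def iou(mask_a, mask_b):
    """Intersection-over-union of two boolean segmentation masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    return float(np.logical_and(a, b).sum() / union) if union else 1.0

# Toy usage with random data standing in for real/synthetic images and labels.
img = np.random.default_rng(1).integers(0, 256, (256, 512), dtype=np.uint8)
print(round(shannon_entropy(img), 2))   # close to 8.0 for uniform noise
print(iou(img > 100, img > 120))        # partial overlap between two thresholds
```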

Item Type: Article
Article Type: Article
Uncontrolled Keywords: Ultrasound; Muscle; Deep learning; Medical imaging; Generative adversarial network; cycleGAN; Synthetic image
Subjects: Q Science > QA Mathematics > QA76 Computer software
Q Science > QP Physiology
R Medicine > RM Therapeutics. Pharmacology > RM695 Physical medicine. Physical therapy, including massage, exercise, occupational therapy, hydrotherapy, phototherapy, radiotherapy, thermotherapy, electrotherapy
Divisions: Schools and Research Institutes > School of Education and Science
Research Priority Areas: Health, Life Sciences, Sport and Wellbeing
Depositing User: Rhiannon Goodland
Date Deposited: 28 Oct 2020 16:56
Last Modified: 31 Aug 2023 09:07
URI: https://eprints.glos.ac.uk/id/eprint/8927
