Enhancing Photoacoustic Visualisation of Clinical Needles with Deep Learning

Mengjie Shi, Zhaoyang Wang, Tianrui Zhao, Simeon J. West, Adrien E. Desjardins, Tom Vercauteren, Wenfeng Xia

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review



Photoacoustic imaging has shown great potential for guiding various minimally invasive procedures by providing information complementary to ultrasound imaging, visualising critical tissue targets as well as surgical tools such as metallic needles with rich optical contrast. The use of light-emitting diodes (LEDs) as excitation sources further accelerates the clinical translation of this modality owing to their affordability and portability, but their low pulse energy degrades image quality. In this work, we propose a framework based on a modified U-Net to enhance the visualisation of clinical metallic needles with a commercial LED-based photoacoustic and ultrasound imaging system. The framework includes the generation of semi-synthetic training datasets combining simulated data and in vivo measurements. The trained neural network was evaluated on needle insertions into a blood-vessel-mimicking phantom and into pork joint tissue ex vivo. The proposed framework significantly enhanced needle visualisation with photoacoustic imaging, achieving 4.3- and 3.2-fold higher signal-to-noise ratios (SNRs) than conventional reconstructions, which could be helpful for guiding minimally invasive procedures.
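The SNR figures above compare needle conspicuity before and after enhancement. The abstract does not specify how SNR was computed, but a common convention in photoacoustic image assessment is the mean amplitude inside a needle region of interest divided by the standard deviation of a background region. A minimal sketch of that convention, with synthetic data and hypothetical region coordinates (all assumptions, not the paper's actual evaluation code):

```python
import numpy as np

def needle_snr(image, needle_mask, background_mask):
    """SNR in linear scale: mean needle amplitude over background noise std.
    This is one common definition; the paper's exact metric is not stated."""
    signal = image[needle_mask].mean()
    noise = image[background_mask].std()
    return signal / noise

# Synthetic stand-in for a reconstructed photoacoustic image:
# unit-variance background noise plus a bright needle-like stripe.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (128, 128))
img[60:68, 30:100] += 8.0  # simulated needle signal (hypothetical amplitude)

needle = np.zeros(img.shape, dtype=bool)
needle[60:68, 30:100] = True          # needle region of interest (assumed)
background = np.zeros(img.shape, dtype=bool)
background[:40, :] = True              # noise-only region (assumed)

snr = needle_snr(img, needle, background)
```

With this definition, a "4.3-fold higher SNR" simply means the ratio computed on the network output is 4.3 times the ratio computed on the conventional reconstruction of the same frame.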

Original language: English
Title of host publication: 2021 IEEE International Ultrasonics Symposium (IUS)
Number of pages: 4
Publication status: Published - 16 Nov 2021

Publication series

Name: IEEE International Ultrasonics Symposium, IUS
Publisher: IEEE
ISSN (Print): 1948-5719


