TY - JOUR
T1 - Improving needle visibility in LED-based photoacoustic imaging using deep learning with semi-synthetic datasets
AU - Shi, Mengjie
AU - Zhao, Tianrui
AU - West, Simeon J.
AU - Desjardins, Adrien E.
AU - Vercauteren, Tom
AU - Xia, Wenfeng
N1 - Funding Information:
This work was funded in whole, or in part, by the Wellcome Trust, United Kingdom [203148/Z/16/Z, WT101957, 203145Z/16/Z], the Engineering and Physical Sciences Research Council (EPSRC), United Kingdom (NS/A000027/1, NS/A000050/1, NS/A000049/1), and the King’s–China Scholarship Council PhD Scholarship Program (K-CSC) (202008060071). For the purpose of open access, the author has applied a CC BY public copyright licence to any Author Accepted Manuscript version arising from this submission.
Publisher Copyright:
© 2022 The Authors
PY - 2022/6
AB - Photoacoustic imaging has shown great potential for guiding minimally invasive procedures through accurate identification of critical tissue targets and invasive medical devices such as metallic needles. The use of light-emitting diodes (LEDs) as excitation light sources accelerates clinical translation owing to their affordability and portability. However, needle visibility in LED-based photoacoustic imaging is compromised, primarily because of the low optical fluence delivered by LEDs. In this work, we propose a deep learning framework based on U-Net to improve the visibility of clinical metallic needles with an LED-based photoacoustic and ultrasound imaging system. To address both the difficulty of acquiring ground truth for real data and the limited realism of purely simulated data, the framework includes the generation of semi-synthetic training datasets that combine simulated data representing needle features with in vivo measurements providing tissue background. The trained neural network was evaluated on needle insertions into blood-vessel-mimicking phantoms, pork joint tissue ex vivo, and measurements on human volunteers. Compared with conventional reconstruction, this deep learning-based framework substantially improved needle visibility in photoacoustic imaging in vivo by suppressing background noise and image artefacts, achieving 5.8-fold and 4.5-fold improvements in signal-to-noise ratio and modified Hausdorff distance, respectively. The proposed framework could therefore help reduce complications during percutaneous needle insertions by enabling accurate identification of clinical needles in photoacoustic imaging.
UR - http://www.scopus.com/inward/record.url?scp=85128536203&partnerID=8YFLogxK
DO - 10.1016/j.pacs.2022.100351
M3 - Article
SN - 2213-5979
VL - 26
JO - Photoacoustics
JF - Photoacoustics
M1 - 100351
ER -