TY - JOUR
T1 - Enhancement of instrumented ultrasonic tracking images using deep learning
AU - Maneas, Efthymios
AU - Hauptmann, Andreas
AU - Alles, Erwin J
AU - Xia, Wenfeng
AU - Noimark, Sacha
AU - David, Anna L
AU - Arridge, Simon
AU - Desjardins, Adrien E
N1 - Funding Information:
This work was funded by the Wellcome (WT101957; 203145Z/16/Z; 203148/Z/16/Z) and the Engineering and Physical Sciences Research Council (EPSRC) (NS/A000027/1; NS/A000050/1; NS/A000049/1; EP/L016478/1; EP/M020533/1; EP/S001506/1), by the European Research Council (ERC-2012-StG, Proposal 310970 MOPHIM), by the Rosetrees Trust (PGS19-2/10006) and by the Academy of Finland (336796; 338408). A.L.D. is supported by the UCL/UCL Hospital National Institute for Health Research Comprehensive Biomedical Research Centre.
Publisher Copyright:
© 2022, The Author(s).
PY - 2022/9/3
Y1 - 2022/9/3
AB - PURPOSE: Instrumented ultrasonic tracking provides needle localisation during ultrasound-guided minimally invasive percutaneous procedures. Here, a post-processing framework based on a convolutional neural network (CNN) is proposed to improve the spatial resolution of ultrasonic tracking images. METHODS: The custom ultrasonic tracking system comprised a needle with an integrated fibre-optic ultrasound (US) transmitter and a clinical US probe for receiving those transmissions and for acquiring B-mode US images. For post-processing of tracking images reconstructed from the received fibre-optic US transmissions, a recently developed framework based on a ResNet architecture, trained with a purely synthetic dataset, was employed. A preliminary evaluation of this framework was performed with data acquired from needle insertions in the heart of a fetal sheep in vivo. The axial and lateral spatial resolutions of the tracking images were used as performance metrics of the trained network. RESULTS: Application of the CNN yielded improvements in the spatial resolution of the tracking images. In three needle insertions, in which the tip depth ranged from 23.9 to 38.4 mm, the lateral resolution improved from 2.11 to 1.58 mm, and the axial resolution improved from 1.29 to 0.46 mm. CONCLUSION: The results provide strong indications of the potential of CNNs to improve the spatial resolution of ultrasonic tracking images and thereby to increase the accuracy of needle tip localisation. These improvements could have broad applicability and impact across multiple clinical fields, which could lead to improvements in procedural efficiency and reductions in the risk of complications.
UR - http://www.scopus.com/inward/record.url?scp=85137525754&partnerID=8YFLogxK
U2 - 10.1007/s11548-022-02728-7
DO - 10.1007/s11548-022-02728-7
M3 - Article
C2 - 36057759
SN - 1861-6410
JO - International Journal of Computer Assisted Radiology and Surgery
JF - International Journal of Computer Assisted Radiology and Surgery
ER -