Robotic Intra-Operative Ultrasound: Virtual Environments and Parallel Systems

Research output: Contribution to journal › Article › peer-review

Shuangyi Wang, James Housden, Tianxiang Bai, Hongbin Liu, Junghwan Back, Davinder Singh, Kawal Rhode, Zeng-Guang Hou, Fei-Yue Wang

Original language: English
Article number: 9395537
Pages (from-to): 1095-1106
Number of pages: 12
Journal: IEEE/CAA Journal of Automatica Sinica
Volume: 8
Issue number: 5
DOIs
Published: May 2021

Bibliographical note

Funding Information: Manuscript received January 28, 2021; accepted February 27, 2021. This work was supported in part by the Key Research and Development Program 2020 of Guangzhou (202007050002), in part by the National Natural Science Foundation of China (62003339, U1811463), and in part by the Intel Collaborative Research Institute for Intelligent and Automated Connected Vehicles ("ICRI-IACV"). Recommended by Associate Editor Shouguang Wang. (Corresponding author: Fei-Yue Wang.) Citation: S. Y. Wang, J. Housden, T. X. Bai, H. B. Liu, J. Back, D. Singh, K. Rhode, Z.-G. Hou, and F.-Y. Wang, "Robotic intra-operative ultrasound: Virtual environments and parallel systems," IEEE/CAA J. Autom. Sinica, vol. 8, no. 5, pp. 1095–1106, May 2021. Publisher Copyright: © 2014 Chinese Association of Automation. Copyright 2021 Elsevier B.V., all rights reserved.

Abstract

Robotic intra-operative ultrasound has the potential to improve the conventional practice of diagnosis and procedure guidance, which is currently performed manually. Working towards automatic or semi-automatic ultrasound, the ability to define ultrasound views and the corresponding probe poses via intelligent approaches becomes crucial. Based on the concept of parallel systems, which incorporates the ingredients of artificial systems, computational experiments, and parallel execution, this paper uses a recently developed robotic trans-esophageal echocardiography (TEE) system as the study object to explore a method for developing the corresponding virtual environments and to present the potential applications of such systems. The proposed virtual system uses 3D Slicer as the main workspace and graphical user interface (GUI), the MATLAB engine to provide robotic control algorithms and customized functions, and the PLUS (Public software Library for UltraSound imaging research) toolkit to generate simulated ultrasound images. Detailed implementation methods are presented and the proposed features of the system are explained. Based on this virtual system, example uses and case studies demonstrate its capabilities when used together with the physical TEE robot. These include standard view definition and customized view optimization for pre-planning and navigation, as well as robotic control algorithm evaluations to facilitate real-time automatic probe pose adjustments. To conclude, the proposed virtual system would be a powerful tool to facilitate the further development and clinical use of robotic intra-operative ultrasound systems.
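To illustrate the kind of architecture the abstract describes, the sketch below shows, at a very high level, how the 3D Slicer Python environment could call a MATLAB-side robotic control routine through the MATLAB Engine API for Python and receive PLUS-simulated ultrasound frames over OpenIGTLink. This is a minimal, hypothetical sketch, not the authors' implementation: the MATLAB function name "tee_probe_ik", the node name "ProbeToHeart", and the localhost/18944 connection details are illustrative assumptions.

# Hypothetical sketch of the 3D Slicer / MATLAB engine / PLUS bridge described above.
import matlab.engine   # MATLAB Engine API for Python
import slicer, vtk     # available inside the 3D Slicer Python console

# Start a MATLAB session hosting the robotic control algorithms.
eng = matlab.engine.start_matlab()

# Connect to a running PLUS server that streams simulated ultrasound images
# (requires the SlicerOpenIGTLink extension; 18944 is a commonly used port).
connector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connector.SetTypeClient("localhost", 18944)
connector.Start()

def update_probe_pose(target_view):
    """Ask MATLAB for a probe pose reaching a desired TEE view and push it
    into the Slicer scene as a linear transform."""
    # "tee_probe_ik" stands in for a user-supplied MATLAB function (on the
    # MATLAB path) returning a 4x4 homogeneous probe pose for the view.
    pose = eng.tee_probe_ik(matlab.double(target_view), nargout=1)

    matrix = vtk.vtkMatrix4x4()
    for i in range(4):
        for j in range(4):
            matrix.SetElement(i, j, pose[i][j])

    transform_node = slicer.mrmlScene.AddNewNodeByClass(
        "vtkMRMLLinearTransformNode", "ProbeToHeart")
    transform_node.SetMatrixTransformToParent(matrix)
    return transform_node

In such a setup, the simulated images arriving over the OpenIGTLink connector can be placed under the "ProbeToHeart" transform so that probe pose adjustments computed in MATLAB are immediately reflected in the virtual scene.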

