King's College London Research Portal

Depth Estimation of Hard Inclusions in Soft Tissue by Autonomous Robotic Palpation Using Deep Recurrent Neural Network

Research output: Contribution to journal › Article

Bo Xiao, W. Xu, J. Guo, H. Lam, G. Jia, Wuzhou Hong, Hongliang Ren

Original language: Undefined/Unknown
Pages (from-to): 1-9
Number of pages: 9
Journal: IEEE Transactions on Automation Science and Engineering
Accepted/In press: 18 Feb 2020

Abstract

Accurately detecting tumors and estimating their depth is essential in the surgical removal of tumors. In robot-assisted surgery, autonomous robotic palpation has the potential to provide more precise detection and depth estimation of tumors, with less intrusion into the normal tissue surrounding them. In this article, by mimicking the human finger touch, we propose a tactile-sensing-based deep recurrent neural network (DRNN) with a long short-term memory (LSTM) architecture to improve the accuracy of detecting tumors embedded in soft tissue and estimating their depth. In the experimental setup, hard inclusions simulate the tumors, while a phantom tissue fabricated from silicone simulates the soft tissue. During each experiment, the force-sensor readings and the displacement of the robot's palpation probe are recorded for detection and depth estimation. The sequential force and displacement data collected over one complete palpation cycle are fed through the proposed DRNN with a deep LSTM architecture, whose cell states capture the temporal dependencies in the sequence. A softmax classifier then determines whether a hard inclusion is present and estimates its depth. Experiments on 396 real data sets demonstrate a detection accuracy of 99.2% and a depth estimation accuracy of 95.8% on the testing data set, outperforming other widely used methods.
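The inference pipeline the abstract describes (a sequence of force and probe-displacement samples passed through LSTM cells, with a softmax classifier on the final hidden state) can be sketched in plain NumPy as below. The hidden width, the class count (one "no inclusion" class plus four hypothetical depth bins), and the random weights are illustrative assumptions, not the paper's trained model or layer sizes.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    # One LSTM cell update; input, forget, candidate, and output
    # gate pre-activations are stacked in W, U, b.
    z = W @ x + U @ h + b
    H = h.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[0:H]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2 * H]))    # forget gate
    g = np.tanh(z[2 * H:3 * H])              # cell candidate
    o = 1.0 / (1.0 + np.exp(-z[3 * H:4 * H]))  # output gate
    c_new = f * c + i * g                    # cell state carries temporal context
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def classify_palpation(sequence, params):
    """sequence: (T, 2) array of [force, probe displacement] per time step."""
    W, U, b, W_out, b_out = params
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    for x in sequence:                       # unroll over one palpation cycle
        h, c = lstm_step(x, h, c, W, U, b)
    return softmax(W_out @ h + b_out)        # class probabilities

# Hypothetical sizes: hidden width 8; 5 classes
# ("no inclusion" + 4 assumed depth bins).
H, D_in, n_classes = 8, 2, 5
params = (rng.normal(size=(4 * H, D_in)) * 0.1,
          rng.normal(size=(4 * H, H)) * 0.1,
          np.zeros(4 * H),
          rng.normal(size=(n_classes, H)) * 0.1,
          np.zeros(n_classes))

seq = rng.normal(size=(50, 2))               # one simulated palpation cycle
probs = classify_palpation(seq, params)
print(probs.shape)                           # (5,); probabilities sum to 1
```

The final hidden state summarizes the whole force-displacement trajectory, which is why a single softmax layer on top of it suffices for both the presence decision and the depth class; in practice the paper stacks multiple LSTM layers, which this single-cell sketch omits.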

