3-D pose estimation of articulated instruments in robotic minimally invasive surgery

M. Allan*, S. Ourselin, D. J. Hawkes, J. D. Kelly, D. Stoyanov

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

62 Citations (Scopus)

Abstract

Estimating the 3-D pose of instruments is an important part of robotic minimally invasive surgery, both for automating basic procedures and for providing safety features such as virtual fixtures. Image-based methods of 3-D pose estimation provide a non-invasive, low-cost solution compared with methods that incorporate external tracking systems. In this paper, we extend our recent work on estimating rigid 3-D pose with silhouette and optical flow-based features to incorporate the articulated degrees-of-freedom (DOFs) of robotic instruments within a gradient-based optimization framework. Validation of the technique is provided with a calibrated ex-vivo study from the da Vinci Research Kit (DVRK) robotic system, where we perform quantitative analysis of the errors for each DOF of our tracker. Additionally, we perform several detailed comparisons with recently published techniques that combine visual methods with kinematic data acquired from the joint encoders. Our experiments demonstrate that our method is competitively accurate while relying solely on image data.
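To illustrate the gradient-based optimization described in the abstract, the following is a minimal sketch, not the authors' implementation: a hypothetical 2-DOF planar instrument whose joint angles are refined by gradient descent on an image-derived residual (here a simple tip-position error standing in for the paper's silhouette and optical-flow cost terms). All function names, link lengths, and step sizes are illustrative assumptions.

```python
import numpy as np

def forward_kinematics(theta):
    # Hypothetical 2-DOF planar articulated instrument:
    # returns the tip position for joint angles theta = (t1, t2).
    l1, l2 = 1.0, 0.5  # assumed link lengths (illustrative)
    x = l1 * np.cos(theta[0]) + l2 * np.cos(theta[0] + theta[1])
    y = l1 * np.sin(theta[0]) + l2 * np.sin(theta[0] + theta[1])
    return np.array([x, y])

def residual(theta, observed_tip):
    # Image-based cost: squared distance between the predicted and
    # observed tip (a stand-in for silhouette/optical-flow terms).
    return float(np.sum((forward_kinematics(theta) - observed_tip) ** 2))

def estimate_pose(observed_tip, theta0, lr=0.1, iters=2000, eps=1e-6):
    # Gradient descent over the articulated DOFs, using central
    # finite differences for the gradient of the residual.
    theta = np.array(theta0, dtype=float)
    for _ in range(iters):
        grad = np.zeros_like(theta)
        for i in range(len(theta)):
            d = np.zeros_like(theta)
            d[i] = eps
            grad[i] = (residual(theta + d, observed_tip)
                       - residual(theta - d, observed_tip)) / (2 * eps)
        theta -= lr * grad
    return theta
```

In the paper the residual is built from image features and the optimization also covers the rigid pose; this sketch keeps only the articulated DOFs to show the basic refine-by-gradient loop.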

Original language: English
Pages (from-to): 1204-1213
Number of pages: 10
Journal: IEEE Transactions on Medical Imaging
Volume: 37
Issue number: 5
DOIs
Publication status: Published - 1 May 2018

Keywords

  • articulated pose estimation
  • robotic surgery
  • surgical instrument detection
