King's College London Research Portal

Real-time segmentation of non-rigid surgical tools based on deep learning and tracking

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

Standard

Real-time segmentation of non-rigid surgical tools based on deep learning and tracking. / García-Peraza-Herrera, Luis C.; Li, Wenqi; Gruijthuijsen, Caspar; Devreker, Alain; Attilakos, George; Deprest, Jan; Poorten, Emmanuel Vander; Stoyanov, Danail; Vercauteren, Tom; Ourselin, Sébastien.

Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016 Held in Conjunction with MICCAI 2016, Revised Selected Papers. Vol. 10170 LNCS. Springer Verlag, 2017. p. 84-95 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10170 LNCS).

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

Harvard

García-Peraza-Herrera, LC, Li, W, Gruijthuijsen, C, Devreker, A, Attilakos, G, Deprest, J, Poorten, EV, Stoyanov, D, Vercauteren, T & Ourselin, S 2017, Real-time segmentation of non-rigid surgical tools based on deep learning and tracking. in Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016 Held in Conjunction with MICCAI 2016, Revised Selected Papers. vol. 10170 LNCS, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 10170 LNCS, Springer Verlag, pp. 84-95, 3rd International Workshop on Computer-Assisted and Robotic Endoscopy, CARE 2016 held in conjunction with 19th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2016, Athens, Greece, 17/10/2016. https://doi.org/10.1007/978-3-319-54057-3_8

APA

García-Peraza-Herrera, L. C., Li, W., Gruijthuijsen, C., Devreker, A., Attilakos, G., Deprest, J., ... Ourselin, S. (2017). Real-time segmentation of non-rigid surgical tools based on deep learning and tracking. In Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016 Held in Conjunction with MICCAI 2016, Revised Selected Papers (Vol. 10170 LNCS, pp. 84-95). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10170 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-319-54057-3_8

Vancouver

García-Peraza-Herrera LC, Li W, Gruijthuijsen C, Devreker A, Attilakos G, Deprest J et al. Real-time segmentation of non-rigid surgical tools based on deep learning and tracking. In Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016 Held in Conjunction with MICCAI 2016, Revised Selected Papers. Vol. 10170 LNCS. Springer Verlag. 2017. p. 84-95. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). https://doi.org/10.1007/978-3-319-54057-3_8

Author

García-Peraza-Herrera, Luis C. ; Li, Wenqi ; Gruijthuijsen, Caspar ; Devreker, Alain ; Attilakos, George ; Deprest, Jan ; Poorten, Emmanuel Vander ; Stoyanov, Danail ; Vercauteren, Tom ; Ourselin, Sébastien. / Real-time segmentation of non-rigid surgical tools based on deep learning and tracking. Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016 Held in Conjunction with MICCAI 2016, Revised Selected Papers. Vol. 10170 LNCS. Springer Verlag, 2017. pp. 84-95 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).

BibTeX

@inbook{a574718e987b4bea8edc2753e806bba9,
title = "Real-time segmentation of non-rigid surgical tools based on deep learning and tracking",
abstract = "Real-time tool segmentation is an essential component in computer-assisted surgical systems. We propose a novel real-time automatic method based on Fully Convolutional Networks (FCN) and optical flow tracking. Our method exploits the ability of deep neural networks to produce accurate segmentations of highly deformable parts along with the high speed of optical flow. Furthermore, the pre-trained FCN can be fine-tuned on a small amount of medical images without the need to hand-craft features. We validated our method using existing and new benchmark datasets, covering both ex vivo and in vivo real clinical cases where different surgical instruments are employed. Two versions of the method are presented, non-real-time and real-time. The former, using only deep learning, achieves a balanced accuracy of 89.6{\%} on a real clinical dataset, outperforming the (non-real-time) state of the art by 3.8{\%} points. The latter, a combination of deep learning with optical flow tracking, yields an average balanced accuracy of 78.2{\%} across all the validated datasets.",
author = "Garc{\'i}a-Peraza-Herrera, {Luis C.} and Wenqi Li and Caspar Gruijthuijsen and Alain Devreker and George Attilakos and Jan Deprest and Poorten, {Emmanuel Vander} and Danail Stoyanov and Tom Vercauteren and S{\'e}bastien Ourselin",
year = "2017",
month = "2",
day = "22",
doi = "10.1007/978-3-319-54057-3_8",
language = "English",
isbn = "9783319540566",
volume = "10170 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "84--95",
booktitle = "Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016 Held in Conjunction with MICCAI 2016, Revised Selected Papers",
address = "Germany",

}
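
As a rough illustration of the approach summarised in the abstract above (a fully convolutional network providing accurate but comparatively slow segmentations, with dense optical flow propagating the most recent tool mask across the frames in between), the following Python sketch interleaves the two components. It is not the authors' implementation: segment_with_fcn is a hypothetical stand-in for their fine-tuned FCN, and the fixed refresh interval and OpenCV's Farneback flow are simplifying assumptions.

# Illustrative sketch only, not the paper's code: a slow-but-accurate FCN is
# refreshed periodically, and dense optical flow warps its latest tool mask
# forward on the frames in between to keep the pipeline real-time.
import cv2
import numpy as np

def propagate_mask(prev_gray, curr_gray, prev_mask):
    # Dense Farneback optical flow from the previous frame to the current one.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Backward warp: sample the previous mask at the positions the flow points from.
    map_x = (grid_x - flow[..., 0]).astype(np.float32)
    map_y = (grid_y - flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_mask, map_x, map_y, cv2.INTER_NEAREST)

def realtime_tool_segmentation(frames, segment_with_fcn, fcn_every=5):
    # segment_with_fcn(frame) -> binary uint8 mask; hypothetical FCN wrapper.
    prev_gray, mask = None, None
    for i, frame in enumerate(frames):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if mask is None or i % fcn_every == 0:
            mask = segment_with_fcn(frame)                 # accurate, slow
        else:
            mask = propagate_mask(prev_gray, gray, mask)   # fast tracking step
        prev_gray = gray
        yield mask

The paper integrates the two components more tightly than this fixed interleaving; the sketch is only meant to convey the division of labour between the accurate network and the fast tracker.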

RIS (suitable for import to EndNote)

TY - CHAP

T1 - Real-time segmentation of non-rigid surgical tools based on deep learning and tracking

AU - García-Peraza-Herrera, Luis C.

AU - Li, Wenqi

AU - Gruijthuijsen, Caspar

AU - Devreker, Alain

AU - Attilakos, George

AU - Deprest, Jan

AU - Poorten, Emmanuel Vander

AU - Stoyanov, Danail

AU - Vercauteren, Tom

AU - Ourselin, Sébastien

PY - 2017/2/22

Y1 - 2017/2/22

N2 - Real-time tool segmentation is an essential component in computer-assisted surgical systems. We propose a novel real-time automatic method based on Fully Convolutional Networks (FCN) and optical flow tracking. Our method exploits the ability of deep neural networks to produce accurate segmentations of highly deformable parts along with the high speed of optical flow. Furthermore, the pre-trained FCN can be fine-tuned on a small amount of medical images without the need to hand-craft features. We validated our method using existing and new benchmark datasets, covering both ex vivo and in vivo real clinical cases where different surgical instruments are employed. Two versions of the method are presented, non-real-time and real-time. The former, using only deep learning, achieves a balanced accuracy of 89.6% on a real clinical dataset, outperforming the (non-real-time) state of the art by 3.8% points. The latter, a combination of deep learning with optical flow tracking, yields an average balanced accuracy of 78.2% across all the validated datasets.

AB - Real-time tool segmentation is an essential component in computer-assisted surgical systems. We propose a novel real-time automatic method based on Fully Convolutional Networks (FCN) and optical flow tracking. Our method exploits the ability of deep neural networks to produce accurate segmentations of highly deformable parts along with the high speed of optical flow. Furthermore, the pre-trained FCN can be fine-tuned on a small amount of medical images without the need to hand-craft features. We validated our method using existing and new benchmark datasets, covering both ex vivo and in vivo real clinical cases where different surgical instruments are employed. Two versions of the method are presented, non-real-time and real-time. The former, using only deep learning, achieves a balanced accuracy of 89.6% on a real clinical dataset, outperforming the (non-real-time) state of the art by 3.8% points. The latter, a combination of deep learning with optical flow tracking, yields an average balanced accuracy of 78.2% across all the validated datasets.

UR - http://www.scopus.com/inward/record.url?scp=85013895020&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-54057-3_8

DO - 10.1007/978-3-319-54057-3_8

M3 - Conference paper

AN - SCOPUS:85013895020

SN - 9783319540566

VL - 10170 LNCS

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 84

EP - 95

BT - Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016 Held in Conjunction with MICCAI 2016, Revised Selected Papers

PB - Springer Verlag

ER -
