Iterative Closest Labeled Point for tactile object shape recognition

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

40 Citations (Scopus)

Abstract

Tactile data and kinesthetic cues are two important sensing sources in robot object recognition and are complementary to each other. In this paper, we propose a novel algorithm named Iterative Closest Labeled Point (iCLAP) to recognize objects using both tactile and kinesthetic information. iCLAP first assigns distinct label numbers to different local tactile features. These label numbers, together with the features' associated 3D positions, form a 4D point cloud of the object, so that the two sensing modalities are merged into a synthesized perception of the touched object. To recognize an object, the partial 4D point cloud obtained from a number of touches is iteratively matched against all reference cloud models to identify the best fit. An extensive evaluation study with 20 real objects shows that the proposed iCLAP approach outperforms approaches using either sensing modality alone, with a substantial recognition rate improvement of up to 18%.
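
The abstract describes the algorithm only at a high level; the exact distance metric and matching procedure are defined in the full paper. As a rough, hypothetical sketch of the idea, the Python snippet below treats each 4D point as [x, y, z, label] (labels assumed precomputed, e.g. by clustering local tactile features), finds correspondences by nearest-neighbor search in the 4D space, and re-estimates a rigid transform on the 3D coordinates only, since the label dimension is discrete and should not be rotated. The label_weight scaling, the k-d tree lookup, and the SVD-based (Kabsch) alignment are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def iclap_match(partial_4d, reference_4d, n_iters=30, label_weight=1.0):
    """Align a partial [x, y, z, label] cloud to one reference model and
    return the mean 4D nearest-neighbor residual (lower = better fit).
    Sketch only: correspondences are found in 4D, but the rigid transform
    is estimated and applied on the 3D coordinates alone."""
    src = partial_4d.astype(float).copy()
    ref = np.hstack([reference_4d[:, :3],
                     label_weight * reference_4d[:, 3:4]])
    tree = cKDTree(ref)
    for _ in range(n_iters):
        query = np.hstack([src[:, :3], label_weight * src[:, 3:4]])
        _, idx = tree.query(query)          # closest labeled points
        tgt = reference_4d[idx, :3]
        # Kabsch/SVD rigid alignment of the spatial coordinates.
        mu_s, mu_t = src[:, :3].mean(0), tgt.mean(0)
        H = (src[:, :3] - mu_s).T @ (tgt - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # avoid a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src[:, :3] = src[:, :3] @ R.T + t   # labels stay untouched
    dists, _ = tree.query(np.hstack([src[:, :3],
                                     label_weight * src[:, 3:4]]))
    return float(dists.mean())

def recognize(partial_4d, reference_models):
    """Pick the reference object whose cloud model fits the touches best."""
    return min(reference_models,
               key=lambda name: iclap_match(partial_4d,
                                            reference_models[name]))
```

Recognition then follows the abstract's description directly: the partial 4D cloud is matched against every stored reference model and the object with the smallest alignment residual is reported as the best fit.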

Original language: English
Title of host publication: IROS 2016 - 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3137-3142
Number of pages: 6
Volume: 2016-November
ISBN (Electronic): 9781509037629
DOIs
Publication status: Published - 28 Nov 2016
Event: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2016 - Daejeon, Korea, Republic of
Duration: 9 Oct 2016 – 14 Oct 2016

Conference

Conference: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2016
Country/Territory: Korea, Republic of
City: Daejeon
Period: 9/10/2016 – 14/10/2016
