Rotation and translation invariant object recognition with a tactile sensor

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

21 Citations (Scopus)

Abstract

In this paper, a novel approach is proposed to recognise different objects, invariant to their translation and rotation, using a tactile sensor attached to a robotic arm. Because the sensor is small relative to the tested objects, the robot must touch each object multiple times at different positions and is prone to moving or rotating it, which makes object recognition during manipulation more difficult. To address this, translation- and rotation-invariant local tactile features are extracted to represent objects; a dictionary of k words is learned by unsupervised k-means clustering, and a histogram codebook is then used to identify objects. The proposed system has been validated by classifying real objects with data from an off-the-shelf tactile sensor, achieving an average overall accuracy of 91.2% with only 10 touches and a dictionary size of 50 clusters.
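The dictionary-plus-codebook pipeline described in the abstract is a bag-of-features scheme: cluster local tactile features into k dictionary words, then represent each object as a histogram of word occurrences over its touches. The sketch below illustrates the idea with plain NumPy on synthetic data; the feature dimensionality, the toy k-means routine, and the two synthetic "objects" are illustrative assumptions, not the paper's actual sensor features or parameters.

```python
import numpy as np

def kmeans(features, k, iters=20, seed=0):
    """Plain Lloyd's k-means: cluster local feature vectors into a k-word dictionary."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iters):
        # Assign each feature to its nearest dictionary word.
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each word as the mean of its assigned features.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return centers

def histogram_code(touch_features, centers):
    """Encode one object's touches as a normalised histogram over dictionary words."""
    dists = np.linalg.norm(touch_features[:, None, :] - centers[None, :, :], axis=2)
    hist = np.bincount(dists.argmin(axis=1), minlength=len(centers)).astype(float)
    return hist / hist.sum()

# Toy demo: two well-separated synthetic "objects", 10 touches each,
# each touch yielding a 4-D local feature vector (hypothetical numbers).
rng = np.random.default_rng(1)
obj_a = rng.normal(0.0, 0.3, size=(10, 4))
obj_b = rng.normal(10.0, 0.3, size=(10, 4))
centers = kmeans(np.vstack([obj_a, obj_b]), k=4)
h_a = histogram_code(obj_a, centers)
h_b = histogram_code(obj_b, centers)
# The two objects' codebook histograms are clearly distinct, so a simple
# nearest-histogram comparison can tell them apart.
print(np.abs(h_a - h_b).sum())
```

At recognition time, a new object's histogram would be matched against stored object histograms (e.g. by nearest neighbour), which is what makes the representation robust to where and in what orientation the object is touched.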

Original language: English
Title of host publication: IEEE SENSORS 2014, Proceedings
Editors: Francisco J. Arregui
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1030-1033
Number of pages: 4
Edition: December
ISBN (Electronic): 9781479901616
DOIs
Publication status: Published - 12 Dec 2014
Event: 13th IEEE SENSORS Conference, SENSORS 2014 - Valencia, Spain
Duration: 2 Nov 2014 - 5 Nov 2014

Publication series

Name: Proceedings of IEEE Sensors
Number: December
Volume: 2014-December
ISSN (Print): 1930-0395
ISSN (Electronic): 2168-9229

Conference

Conference: 13th IEEE SENSORS Conference, SENSORS 2014
Country/Territory: Spain
City: Valencia
Period: 2/11/2014 - 5/11/2014
