CobNet: cross attention on object and background for few-shot segmentation

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Original language: English
Title of host publication: 2022 26th International Conference on Pattern Recognition, ICPR 2022
Pages: 39-45
Number of pages: 7
ISBN (Electronic): 9781665490627
DOIs
Published: 29 Nov 2022
Event: International Conference on Pattern Recognition, Montréal, Québec, Canada
Duration: 22 Aug 2022 – 25 Aug 2022
https://www.icpr2022.com/

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 2022-August
ISSN (Print): 1051-4651

Conference

Conference: International Conference on Pattern Recognition
Country/Territory: Canada
City: Montréal
Period: 22/08/2022 – 25/08/2022
Internet address: https://www.icpr2022.com/

Bibliographical note

Publisher Copyright: © 2022 IEEE.


Abstract

Few-shot segmentation aims to segment images containing objects from previously unseen classes using only a few annotated samples. Most current methods focus on using object information extracted, with the aid of human annotations, from support images to identify the same objects in new query images. However, background information can also be useful to distinguish objects from their surroundings. Hence, some previous methods also extract background information from the support images. In this paper, we argue that such information is of limited utility, as the background in different images can vary widely. To overcome this issue, we propose CobNet, which utilises information about the background that is extracted from the query images without annotations of those images. Experiments show that our method achieves mean Intersection-over-Union scores of 61.4% and 37.8% for 1-shot segmentation on PASCAL-5i and COCO-20i respectively, outperforming previous methods. It is also shown to produce state-of-the-art performance of 53.7% for weakly-supervised few-shot segmentation, where no annotations are provided for the support images.
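The core idea stated in the abstract (object cues pooled from annotated support images, background cues mined from the unannotated query image itself, combined via cross attention) can be sketched roughly as below. This is a minimal illustrative PyTorch sketch, not the authors' CobNet implementation: the module name, the dissimilarity-based background mining, and the single-prototype design are assumptions made here for clarity.

# Hypothetical sketch of cross attention over an object prototype (from the
# annotated support image) and a background prototype mined from the query
# image itself. Names and design details are illustrative assumptions, not
# the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ObjectBackgroundCrossAttention(nn.Module):
    def __init__(self, dim: int = 256):
        super().__init__()
        self.to_q = nn.Conv2d(dim, dim, kernel_size=1)  # queries from query-image features
        self.to_k = nn.Linear(dim, dim)                  # keys from the two prototypes
        self.to_v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, query_feat, support_feat, support_mask):
        # query_feat:   (B, C, H, W) features of the query image
        # support_feat: (B, C, H, W) features of the support image
        # support_mask: (B, 1, H, W) binary annotation of the support object
        B, C, H, W = query_feat.shape

        # Object prototype: masked average pooling over the annotated support object.
        obj_proto = (support_feat * support_mask).sum(dim=(2, 3)) / \
                    (support_mask.sum(dim=(2, 3)) + 1e-6)              # (B, C)

        # Background prototype: pooled from the query image, weighted by how
        # dissimilar each query location is to the object prototype. This is a
        # simple stand-in for mining background cues without query annotations.
        sim = F.cosine_similarity(query_feat,
                                  obj_proto[:, :, None, None], dim=1)  # (B, H, W)
        bg_weight = (1.0 - sim).clamp(min=0).unsqueeze(1)               # (B, 1, H, W)
        bg_proto = (query_feat * bg_weight).sum(dim=(2, 3)) / \
                   (bg_weight.sum(dim=(2, 3)) + 1e-6)                   # (B, C)

        # Cross attention: every query location attends over the two prototypes.
        protos = torch.stack([obj_proto, bg_proto], dim=1)              # (B, 2, C)
        q = self.to_q(query_feat).flatten(2).transpose(1, 2)            # (B, HW, C)
        k = self.to_k(protos)                                           # (B, 2, C)
        v = self.to_v(protos)                                           # (B, 2, C)
        attn = torch.softmax(q @ k.transpose(1, 2) * self.scale, dim=-1)  # (B, HW, 2)
        out = (attn @ v).transpose(1, 2).reshape(B, C, H, W)            # (B, C, H, W)
        return out

The design point mirrored from the abstract is that the background prototype is computed from the query image rather than from support-image backgrounds, which the paper argues vary too widely across images to be reliable.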
