King's College London

Research portal

CobNet: cross attention on object and background for few-shot segmentation

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Original language: English
Title of host publication: 2022 26th International Conference on Pattern Recognition, ICPR 2022
Number of pages: 7
ISBN (Electronic): 9781665490627
Published: 29 Nov 2022
Event: International Conference on Pattern Recognition, Montréal, Québec, Canada
Duration: 22 Aug 2022 – 25 Aug 2022

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651


Conference: International Conference on Pattern Recognition

Bibliographical note

Publisher Copyright: © 2022 IEEE.

Abstract


Few-shot segmentation aims to segment images containing objects from previously unseen classes using only a few annotated samples. Most current methods focus on using object information extracted, with the aid of human annotations, from support images to identify the same objects in new query images. However, background information can also be useful for distinguishing objects from their surroundings. Hence, some previous methods also extract background information from the support images. In this paper, we argue that such information is of limited utility, as the background in different images can vary widely. To overcome this issue, we propose CobNet, which utilises background information extracted from the query images themselves, without requiring annotations for those images. Experiments show that our method achieves mean Intersection-over-Union scores of 61.4% and 37.8% for 1-shot segmentation on PASCAL-5i and COCO-20i respectively, outperforming previous methods. It is also shown to produce a state-of-the-art performance of 53.7% for weakly-supervised few-shot segmentation, where no annotations are provided for the support images.
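The abstract does not specify CobNet's architecture, but the core mechanism it names — cross attention between object features (from annotated support images) and background features (from unannotated query images) — can be illustrated with a minimal scaled-dot-product sketch. Everything below is a hypothetical illustration, not the paper's implementation: the shapes, the NumPy backbone, and the choice to treat query-image pixel features as attention queries against support-object prototypes are all assumptions.

```python
import numpy as np

def cross_attention(q, k, v):
    """Scaled dot-product cross attention: rows of q attend over rows of k/v.

    Illustrative only -- CobNet's actual attention design is not given in
    the abstract.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                   # (n_q, n_k) similarity
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                              # (n_q, d) attended feats

# Hypothetical feature maps (16 query-image locations, 8 support-object
# prototypes, 64-dim features); real dimensions would come from a backbone.
rng = np.random.default_rng(0)
query_feats = rng.standard_normal((16, 64))  # unannotated query image
support_obj = rng.standard_normal((8, 64))   # annotated support objects
attended = cross_attention(query_feats, support_obj, support_obj)
print(attended.shape)  # (16, 64)
```

In this reading, each query-image location aggregates the support-object prototypes it resembles most, which is one plausible way object cues from support images could be transferred to an unannotated query image.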

