King's College London

Research portal

Crowd Counting via Scale-Adaptive Convolutional Neural Network

Research output: Contribution to journal › Conference paper › peer-review

Lu Zhang, Miaojing Shi, Qiaobo Chen

Original language: English
Pages (from-to): 1113-1121
Journal: IEEE Winter Conference on Applications of Computer Vision (WACV)
Early online date: 7 May 2018
E-pub ahead of print: 7 May 2018


  • Crowd counting via scale_ZHANG_Epub7May2018_GREEN AAM

    1711.04433.pdf, 3.35 MB, application/pdf

    Uploaded date: 11 May 2020

    Version: Accepted author manuscript

    © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

King's Authors


The task of crowd counting is to automatically estimate the number of pedestrians in crowd images. To cope with the scale and perspective changes that commonly exist in crowd images, state-of-the-art approaches employ multi-column CNN architectures to regress density maps of crowd images. Multiple columns have different receptive fields corresponding to pedestrians (heads) of different scales. We instead propose a scale-adaptive CNN (SaCNN) architecture with a backbone of fixed small receptive fields. We extract feature maps from multiple layers and adapt them to have the same output size; we combine them to produce the final density map. The number of people is computed by integrating the density map. We also introduce a relative count loss along with the density map loss to improve the network generalization on crowd scenes with few pedestrians, on which most representative approaches perform poorly. We conduct extensive experiments on the ShanghaiTech, UCF_CC_50 and WorldExpo'10 datasets, as well as a new dataset, SmartCity, that we collected for crowd scenes with few people. The results demonstrate significant improvements of SaCNN over the state-of-the-art.
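The pipeline the abstract describes (adapt multi-layer feature maps to one size, fuse them into a density map, integrate for the count, and penalize relative count error) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the upsampling, averaging fusion, and the `eps` guard in the relative loss are simplifying assumptions standing in for the learned layers in SaCNN.

```python
import numpy as np

def upsample_nearest(fmap, target_shape):
    """Nearest-neighbour upsampling so feature maps from different
    layers share one output size (the paper adapts maps with learned
    layers; this is a simplified stand-in)."""
    h, w = fmap.shape
    th, tw = target_shape
    rows = np.arange(th) * h // th
    cols = np.arange(tw) * w // tw
    return fmap[np.ix_(rows, cols)]

def fuse_density(feature_maps, target_shape):
    """Combine multi-layer maps into one density map by resizing each
    to a common size and averaging (a stand-in for learned fusion)."""
    maps = [upsample_nearest(f, target_shape) for f in feature_maps]
    return np.mean(maps, axis=0)

def crowd_count(density_map):
    """The head count is the integral (sum) of the density map."""
    return float(density_map.sum())

def relative_count_loss(pred_count, gt_count, eps=1.0):
    """Relative count loss |y_hat - y| / (y + eps); the eps guard is
    an assumption to keep the loss finite for empty scenes."""
    return abs(pred_count - gt_count) / (gt_count + eps)

# Example: two feature maps at different resolutions, fused to 4x4.
f_deep = np.ones((2, 2)) * 0.5     # coarse map from a deep layer
f_shallow = np.ones((4, 4)) * 0.5  # finer map from a shallower layer
density = fuse_density([f_deep, f_shallow], (4, 4))
count = crowd_count(density)                 # integral of the map
loss = relative_count_loss(count, gt_count=9.0)
```

The relative (rather than absolute) count loss is what helps on sparse scenes: an error of 2 people matters far more when the ground truth is 5 than when it is 500.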
