
Improved Knowledge Distillation for Crowd Counting on IoT Devices

EasyChair Preprint no. 10722

8 pages
Date: August 15, 2023

Abstract

Manual crowd counting for real-world problems is either impossible or results in wildly inaccurate estimates. Deep learning has been applied to address this issue; however, crowd counting is a computationally intensive task. Many crowd counting models employ large-scale deep convolutional neural networks (CNNs) to achieve higher accuracy, typically at the cost of performance and inference speed. This makes such approaches difficult to apply in real-world settings, e.g., on Internet-of-Things (IoT) devices. One way to tackle this problem is to compress models using pruning and quantization, or to use lightweight model backbones. However, such methods often result in a significant loss in accuracy. To address this, some studies have explored knowledge distillation methods that extract useful information from large state-of-the-art (teacher) models to guide/train smaller (student) models. However, knowledge distillation suffers from information loss caused by hint-transformers. Furthermore, teacher models may have a negative impact on student models. In this work, we propose a knowledge distillation method that uses self-transformed hints and loss functions that ignore outliers to tackle real-world and challenging crowd counting tasks. Our approach achieves an MAE of 77.24 and an MSE of 276.17 on the JHU-CROWD++ test set. This is comparable to state-of-the-art deep crowd counting models, but at a fraction of the original model size and complexity, making the solution suitable for IoT devices.
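The combination of teacher-guided distillation with an outlier-ignoring loss described above can be illustrated with a minimal sketch. This is not the paper's actual formulation; the function names, the `alpha` weighting between ground-truth and teacher terms, and the `trim_ratio` used to discard the largest per-element errors are all illustrative assumptions, shown here with NumPy arrays standing in for predicted density maps.

```python
import numpy as np

def trimmed_mse(pred, target, trim_ratio=0.1):
    """MSE that discards the largest-error fraction of elements,
    so a few extreme (outlier) errors do not dominate the loss.
    Note: trim_ratio=0.1 is an illustrative choice, not from the paper."""
    errors = (pred - target).ravel() ** 2
    keep = max(1, int(errors.size * (1 - trim_ratio)))
    kept = np.sort(errors)[:keep]  # keep only the smallest errors
    return kept.mean()

def distillation_loss(student_map, teacher_map, gt_map,
                      alpha=0.5, trim_ratio=0.1):
    """Weighted sum of ground-truth supervision and a teacher 'hint' term,
    both computed with the outlier-robust loss above (hypothetical weighting)."""
    gt_term = trimmed_mse(student_map, gt_map, trim_ratio)
    hint_term = trimmed_mse(student_map, teacher_map, trim_ratio)
    return (1 - alpha) * gt_term + alpha * hint_term
```

With `trim_ratio=0.1`, a single extreme error among ten otherwise-perfect predictions is dropped entirely, which is one simple way to keep noisy teacher outputs from dominating student training.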

Keyphrases: crowd counting, deep learning, knowledge distillation

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:10722,
  author = {Zuo Huang and Richard Sinnott},
  title = {Improved Knowledge Distillation for Crowd Counting on IoT Devices},
  howpublished = {EasyChair Preprint no. 10722},
  year = {EasyChair, 2023}}