Abstract
Knowledge distillation is an attractive approach for learning compact deep
neural networks, in which a lightweight student model is trained by distilling
knowledge from a complex teacher model. Attention-based knowledge distillation
is a specific form of intermediate feature-based knowledge distillation that
uses attention mechanisms to encourage the student to better mimic the teacher.
However, most of the previous attention-based distillation approaches perform
attention in the spatial domain, which primarily affects local regions in the
input image. This may not be sufficient when we need to capture the broader
context or global information necessary for effective knowledge transfer. In
the frequency domain, since each frequency component is determined from all
pixels of the spatial-domain image, it can carry global information about the
image.
Inspired by the benefits of the frequency domain, we propose a novel module
that functions as an attention mechanism in the frequency domain. The module
consists of a learnable global filter that can adjust the frequencies of the
student's features under the guidance of the teacher's features, encouraging
the student's features to exhibit patterns similar to the teacher's. We then
propose an enhanced knowledge review-based distillation model
by leveraging the proposed frequency attention module. Extensive
experiments with various teacher and student architectures on image
classification and object detection benchmark datasets show that the proposed
approach outperforms other knowledge distillation methods.
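The core idea of a global filter in the frequency domain can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the function name, the filter shape, and the use of a real-valued FFT are assumptions made for illustration; in the actual method the filter would be learned under the guidance of the teacher's features.

```python
import numpy as np

def frequency_attention(student_feat, global_filter):
    """Re-weight a feature map in the frequency domain (illustrative sketch).

    student_feat:  (C, H, W) real-valued student feature map.
    global_filter: (C, H, W//2 + 1) complex weights -- a hypothetical shape
                   matching numpy's rfft2 output; in the paper these weights
                   would be learned under teacher guidance.
    """
    # Each frequency bin aggregates information from all spatial positions,
    # so re-weighting bins modulates the global structure of the features.
    freq = np.fft.rfft2(student_feat, axes=(-2, -1))
    freq = freq * global_filter  # element-wise frequency re-weighting
    # Transform back to the spatial domain with the original spatial size.
    return np.fft.irfft2(freq, s=student_feat.shape[-2:], axes=(-2, -1))

# Toy usage: an all-ones filter acts as the identity, so the output
# equals the input feature map.
feat = np.random.randn(4, 8, 8)
filt = np.ones((4, 8, 5), dtype=complex)
out = frequency_attention(feat, filt)
```

Because multiplication in the frequency domain corresponds to a global (circular) convolution in the spatial domain, this single element-wise product lets every output position depend on every input pixel, unlike a local spatial attention map.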