Class Attention Map Distillation for Efficient Semantic Segmentation
Paper ID: 1152-MVIP2020
Authors:
Nader Karimi Bavandpour*, Shohreh Kasaei
Abstract:
In this paper, we introduce a novel way of capturing the information of a powerful, trained deep convolutional neural network and distilling it to a smaller network being trained. Unlike many other methods that operate on final layers, our method can successfully extract information suitable for distillation from the intermediate layers of a network by constructing class-specific attention maps and then forcing the student network to mimic those attention maps. We apply this method to state-of-the-art semantic segmentation architectures and show its effectiveness through experiments on the standard PASCAL VOC 2012 dataset.
Keywords:
Semantic Segmentation, Knowledge Distillation, Saliency Maps
Status: Paper Accepted (Oral Presentation)
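Note: the abstract does not give the exact formulation of the class-specific attention maps. The sketch below is one plausible reading, written in PyTorch: a channel-pooled spatial attention map is computed from an intermediate feature tensor, weighted by the teacher's per-class probability maps, and the student is trained to match the teacher's resulting class-wise attentions. The function names and the particular attention formula are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def class_attention_maps(features, class_probs, eps=1e-8):
    """Build one spatial attention map per class from intermediate features.

    features:    (B, C, h, w) intermediate feature maps
    class_probs: (B, K, H, W) per-class probability maps (e.g. softmaxed logits)

    Assumption (not from the paper): attention is the channel-pooled sum of
    squared activations, masked by each class's probability map and
    L2-normalized per class.
    """
    B, _, h, w = features.shape
    K = class_probs.shape[1]
    # Channel-pooled spatial attention: sum of squared activations.
    att = features.pow(2).sum(dim=1, keepdim=True)                 # (B, 1, h, w)
    # Resize class probabilities to the feature resolution.
    probs = F.interpolate(class_probs, size=(h, w),
                          mode="bilinear", align_corners=False)    # (B, K, h, w)
    class_att = (att * probs).view(B, K, -1)                       # (B, K, h*w)
    return class_att / (class_att.norm(p=2, dim=2, keepdim=True) + eps)

def class_attention_distillation_loss(student_feat, teacher_feat, teacher_probs):
    """L2 distance between student and teacher class attention maps.

    Assumes student and teacher features share the same spatial size;
    otherwise an adapter (e.g. a 1x1 conv or interpolation) would be needed.
    """
    s = class_attention_maps(student_feat, teacher_probs)
    t = class_attention_maps(teacher_feat, teacher_probs)
    return (s - t).pow(2).mean()
```

In a training loop under these assumptions, this loss would be added to the usual cross-entropy segmentation loss of the student, with the teacher's features and class probabilities computed under `torch.no_grad()`.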