Computer science
Entropy
Distillation
Artificial intelligence
Data mining
Pattern recognition
Authors
Jianwei Chen, Yongxuan Lai, Yifeng Zeng, Fan Yang
Source
Venue: International Conference on Computer Science and Education
Date: 2021-08-17
Identifier
DOI:10.1109/iccse51940.2021.9569689
Abstract
In recent years, state-of-the-art scene text detectors have used cumbersome segmentation models as their detection frameworks, but this type of detection model is difficult to deploy on mobile devices with limited computing resources. Knowledge distillation is an effective way to achieve a good compromise between detection accuracy and model complexity. However, existing knowledge distillation methods cannot make full use of the output characteristics of segmentation-based text detection, namely that each pixel on the segmentation map represents the probability that the pixel belongs to text. Visualizing the entropy maps corresponding to the segmentation maps shows that the difference between the information entropy maps of the student and teacher networks reflects the gap in their generalization ability. In this work, we propose a novel knowledge distillation via entropy map (KDEM). Specifically, the entropy map of the teacher network's segmentation map is used as knowledge to guide student network learning. To eliminate the possible adverse effects of entropy in non-target regions, we multiply the teacher network's information entropy map by the mask of the text region to extract the knowledge related to the target. Experiments on three benchmark datasets (MSRA-TD500, ICDAR 2015, and Total-Text) show that our proposed knowledge distillation via entropy map consistently improves the F1-score of the student network and outperforms three other mainstream knowledge distillation methods.
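The core idea described above can be sketched in a few lines: compute a per-pixel Bernoulli entropy map from each network's segmentation probability map, mask it to the text region, and penalize the discrepancy between teacher and student entropy maps. This is a minimal illustration, not the paper's implementation; the function names and the use of a mean-squared-error distillation loss are assumptions.

```python
import numpy as np

def binary_entropy_map(prob_map, eps=1e-8):
    """Per-pixel Bernoulli entropy of a segmentation probability map.

    Each pixel holds p = P(pixel is text), so its information entropy is
    H(p) = -p*log(p) - (1-p)*log(1-p), maximal at p = 0.5 (most uncertain).
    """
    p = np.clip(prob_map, eps, 1.0 - eps)  # avoid log(0)
    return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))

def masked_entropy_distill_loss(teacher_probs, student_probs, text_mask):
    """Distillation loss between teacher and student entropy maps,
    restricted to the text region (text_mask = 1 inside text, 0 outside).

    The MSE form is a hypothetical choice for illustration; the paper's
    exact loss formulation may differ.
    """
    h_teacher = binary_entropy_map(teacher_probs) * text_mask
    h_student = binary_entropy_map(student_probs) * text_mask
    n = max(float(text_mask.sum()), 1.0)  # normalize by masked pixel count
    return float(((h_teacher - h_student) ** 2).sum() / n)
```

Masking before computing the loss means pixels outside the text region contribute nothing, which matches the abstract's motivation of suppressing entropy from non-target regions.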