Anomaly detection
Anomaly (physics)
Computer science
Training set
Set (abstract data type)
Training (meteorology)
Artificial intelligence
Distillation
Machine learning
Data set
Data mining
Pattern recognition (psychology)
Chemistry
Physics
Organic chemistry
Meteorology
Programming language
Condensed matter physics
Authors
Hongbo Liu, Kai Li, Xiu Li, Yulun Zhang
Identifier
DOI:10.1109/icip46576.2022.9897777
Abstract
Anomaly Detection (AD) aims to find defective patterns or abnormal samples in data, and has been a hot research topic due to its many real-world applications. While various AD methods have been proposed, most assume the availability of a clean (anomaly-free) training set, which may be hard to guarantee in many real-world industrial applications. This motivates us to investigate Unsupervised Anomaly Detection (UAD), in which the training set includes both normal and abnormal samples. In this paper, we address the UAD problem by proposing a Self-Training and Knowledge Distillation (STKD) model. STKD combats anomalies in the training set by iteratively alternating between excluding samples with high anomaly probabilities and training the model on the purified training set. Although the model is trained on a cleaner training set, the anomalies that inevitably remain may still have a negative impact. STKD alleviates this by regularizing the model to respond similarly to a teacher model that has not been trained with noisy data. Experiments show that STKD consistently produces more robust performance across different levels of anomalies.
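The two ideas in the abstract — iteratively purging high-anomaly-score samples from the training set, and adding a distillation term that keeps the student close to a teacher untouched by the noisy data — can be illustrated with a toy sketch. This is not the authors' implementation: the distance-from-mean scorer, the drop fraction, and the distillation weight `lam` are all illustrative assumptions standing in for the learned model and the paper's actual hyperparameters.

```python
import numpy as np

def purify(train, n_iters=3, drop_frac=0.1):
    """Self-training-style purification: score every sample with a
    simple anomaly score (here, distance from the current mean),
    drop the highest-scoring fraction, and repeat on the survivors."""
    data = train.copy()
    for _ in range(n_iters):
        center = data.mean(axis=0)
        scores = np.linalg.norm(data - center, axis=1)
        # keep the (1 - drop_frac) lowest-score samples
        keep = scores.argsort()[: int(len(data) * (1 - drop_frac))]
        data = data[keep]
    return data

def distill_loss(student_scores, teacher_scores, task_loss, lam=0.5):
    """Distillation regularizer: penalize the student for responding
    differently from a teacher that never saw the noisy samples."""
    return task_loss + lam * np.mean((student_scores - teacher_scores) ** 2)

# toy data: a normal cluster near the origin plus a few injected anomalies
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(95, 2))
anomalies = rng.normal(8.0, 0.5, size=(5, 2))
train = np.vstack([normal, anomalies])

clean = purify(train)
print(len(clean), float(np.linalg.norm(clean.mean(axis=0))))
```

With these settings the injected outliers fall in the first dropped fraction, so the purified set is re-centered near the normal cluster; in the paper this scoring is done by the model being trained rather than by a fixed distance heuristic.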