Computer science
Artificial intelligence
Robustness (evolution)
Change detection
Machine learning
Feature extraction
Bottleneck
Pattern recognition (psychology)
Feature learning
Deep learning
Object detection
Data mining
Embedded systems
Biochemistry
Chemistry
Gene
Authors
Congcong Wang, Shouhang Du, Wei Sun, Deng-Ping Fan
Identifier
DOI:10.1109/jstars.2023.3288294
Abstract
Notable achievements have been made in remote sensing image change detection with sample-driven supervised deep learning methods. However, the large number of labeled samples these methods require is impractical to obtain in many applications, which is a major constraint on the development of supervised deep learning. Self-supervised learning, which constructs pretext tasks from unlabeled data for model pre-training, can largely alleviate this sample dilemma, and the design of the pretext task is key to downstream performance. In this work, an improved contrastive self-supervised pretext task that is better suited to downstream change detection is proposed. Specifically, an improved Siamese network with a change-detection-like architecture is trained to extract multi-level fused features from different image pairs, both globally and locally. On this basis, the contrastive loss between feature pairs is minimized to learn feature representations that are more valuable for downstream change detection. In addition, to further alleviate the scarcity of prior information and the heavy image noise in downstream few-sample change detection, we propose using variational information bottleneck (VIB) theory to impose an explicit regularization constraint on the model. Compared with other methods, our method is more robust and yields finer detection results, both quantitatively and qualitatively, on two publicly available datasets.
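The two loss terms the abstract mentions can be sketched generically. This is a minimal NumPy illustration, not the authors' implementation: `contrastive_loss` is a standard InfoNCE-style objective standing in for the paper's contrastive loss between Siamese feature pairs, and `vib_kl` is the Gaussian KL term that VIB theory uses as an explicit regularizer; the `temperature` parameter and all function names are assumptions for illustration.

```python
import numpy as np

def contrastive_loss(z1, z2, temperature=0.1):
    """InfoNCE-style contrastive loss between two batches of feature
    vectors, where positive pairs share the same row index. A generic
    stand-in for a contrastive pretext objective."""
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                 # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Minimizing this pulls positive (diagonal) pairs together and
    # pushes negatives apart.
    return -np.mean(np.diag(log_prob))

def vib_kl(mu, log_var):
    """KL( N(mu, diag(sigma^2)) || N(0, I) ), averaged over the batch:
    the explicit regularization term suggested by the variational
    information bottleneck."""
    return 0.5 * np.mean(
        np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=1)
    )
```

In a VIB-regularized setup, the total training loss would be the task loss plus a weighted `vib_kl` term, with the weight trading off compression against task accuracy.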