Journal: IEEE Transactions on Geoscience and Remote Sensing [Institute of Electrical and Electronics Engineers]  Date: 2024-01-01  Volume: 62, Pages: 1-16  Citations: 4
Identifier
DOI:10.1109/tgrs.2024.3376352
Abstract
Semantic segmentation of remote sensing images is vital in remote sensing technology. High-quality models for this task require a vast number of images, yet manual annotation is time-consuming and labor-intensive. This has catalyzed the emergence of semi-supervised semantic segmentation methods. However, the complexity of foreground categories in remote sensing images poses a challenge to maintaining prediction consistency. Moreover, inherent characteristics such as intra-class variation and inter-class similarity cause a certain degree of confusion among features of different classes in the feature space, which affects the final classification results. To improve the model's consistency and optimize feature-based category classification, this paper proposes a new semi-supervised semantic segmentation framework that combines consistency regularization and contrastive learning. For consistency regularization, the proposed method incorporates dual teacher networks, introduces ClassMix for image augmentation, and uses confidence levels to integrate the predictions of the two teachers. By introducing perturbations at both the network and image levels while maintaining consistency, the predictive power and generalization ability of the model are enhanced. For contrastive learning, Positive-Unlabeled Learning (PU-Learning) is employed to mitigate mis-sampling when selecting features. At the same time, higher weights are assigned to more challenging negative samples, raising the difficulty of feature learning and enhancing the discriminative capability of the final feature representation space. Extensive experiments on the ISPRS Vaihingen dataset and the challenging iSAID dataset underscore the superior performance of our approach.
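As a rough illustration of the consistency-regularization side described above, the sketch below shows (i) a ClassMix-style mixing step that pastes a random half of the classes present in one image onto another, and (ii) a per-pixel, confidence-based fusion of two teacher networks' predictions into a single pseudo-label. The tensor shapes, the max-softmax confidence measure, and the function names (classmix, fuse_teacher_predictions) are assumptions made for illustration, not the paper's exact implementation.

```python
import torch

def classmix(img_a, img_b, label_a):
    """ClassMix-style mixing (sketch): copy pixels belonging to a random half
    of the classes present in image A onto image B, guided by A's label or
    pseudo-label. img_a, img_b: (C, H, W) floats; label_a: (H, W) long."""
    classes = torch.unique(label_a)
    k = max(1, classes.numel() // 2)
    chosen = classes[torch.randperm(classes.numel())[:k]]
    mask = torch.isin(label_a, chosen)           # (H, W) bool: pixels to paste
    mixed = torch.where(mask.unsqueeze(0), img_a, img_b)
    return mixed, mask


def fuse_teacher_predictions(logits_t1, logits_t2):
    """Confidence-based fusion of two teacher outputs (sketch): per pixel,
    keep the class distribution of the teacher with the higher max-softmax
    confidence, then derive a pseudo-label and a confidence map.
    logits_t1, logits_t2: (B, num_classes, H, W)."""
    prob1 = logits_t1.softmax(dim=1)
    prob2 = logits_t2.softmax(dim=1)
    conf1 = prob1.max(dim=1).values              # (B, H, W)
    conf2 = prob2.max(dim=1).values
    take_t1 = (conf1 >= conf2).unsqueeze(1)      # (B, 1, H, W)
    fused = torch.where(take_t1, prob1, prob2)
    pseudo_label = fused.argmax(dim=1)           # (B, H, W)
    confidence = fused.max(dim=1).values         # can gate the unlabeled loss
    return pseudo_label, confidence
```

Per-pixel confidence gating is only one plausible reading of "utilizes confidence levels to integrate the predictions"; a confidence-weighted average of the two teachers' probabilities would be an equally reasonable variant under the same description.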