Computer science
RGB color model
Robustness (evolution)
Artificial intelligence
Segmentation
Dempster-Shafer theory
Sensor fusion
Data mining
Ground truth
Pattern recognition (psychology)
Machine learning
Biochemistry
Chemistry
Gene
Authors
Qingwang Wang, Cheng Yin, Haochen Song, Tao Shen, Yanfeng Gu
Source
Journal: IEEE Geoscience and Remote Sensing Letters
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume: 20, Pages: 1-5
Citations: 12
Identifiers
DOI:10.1109/lgrs.2023.3322452
Abstract
In real-world scenarios, the information quality provided by RGB and thermal (RGB-T) sensors often varies across samples. This variation negatively impacts the ability of semantic segmentation models to utilize complementary information from RGB-T modalities, resulting in decreased accuracy and fusion credibility. Dynamically estimating the uncertainty of each modality for different samples could help the model perceive such variation in information quality and then provide guidance for a reliable fusion. With this in mind, we propose a novel uncertainty-guided trustworthy fusion network (UTFNet) for RGB-T semantic segmentation. Specifically, we design an uncertainty estimation and evidential fusion (UEEF) module to quantify the uncertainty of each modality and then utilize the uncertainty to guide the information fusion. In the UEEF module, we introduce the Dirichlet distribution to model the distribution of the predicted probabilities, parameterized with evidence from each modality, and then integrate them with the Dempster-Shafer theory (DST). Moreover, illumination evidence gathering (IEG) and multi-scale evidence gathering (MEG) modules are designed to gather more reliable evidence by considering illumination and multi-scale target information, respectively. In the IEG module, we calculate the illumination probability and model it as the illumination evidence. The MEG module collects evidence for each modality across multiple scales. Both qualitative and quantitative results demonstrate the effectiveness of our proposed model in accuracy, robustness, and trustworthiness. The code will be accessible at https://github.com/KustTeamWQW/UTFNet.
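The evidential fusion the abstract describes, in which per-modality evidence parameterizes a Dirichlet distribution whose resulting beliefs and uncertainty are combined with Dempster-Shafer theory, can be sketched in a minimal form. This is an illustration of the general evidential-fusion technique under standard subjective-logic conventions, not the authors' UTFNet implementation; the function names and the toy evidence values are hypothetical.

```python
def evidence_to_opinion(evidence):
    """Turn non-negative per-class evidence e_k into a subjective-logic opinion.

    Dirichlet parameters: alpha_k = e_k + 1, strength S = sum(alpha).
    Belief mass: b_k = e_k / S; uncertainty mass: u = K / S, so sum(b) + u = 1.
    """
    num_classes = len(evidence)
    alpha = [e + 1.0 for e in evidence]
    strength = sum(alpha)
    beliefs = [e / strength for e in evidence]
    uncertainty = num_classes / strength
    return beliefs, uncertainty


def ds_combine(b1, u1, b2, u2):
    """Reduced Dempster-Shafer combination of two opinions (e.g. RGB and thermal).

    Conflicting belief mass C = sum over i != j of b1_i * b2_j is discarded,
    and the remaining mass is renormalized by 1 - C.
    """
    num_classes = len(b1)
    conflict = sum(b1[i] * b2[j]
                   for i in range(num_classes)
                   for j in range(num_classes) if i != j)
    norm = 1.0 - conflict
    beliefs = [(b1[k] * b2[k] + b1[k] * u2 + b2[k] * u1) / norm
               for k in range(num_classes)]
    uncertainty = (u1 * u2) / norm
    return beliefs, uncertainty


if __name__ == "__main__":
    # Toy example: a confident "RGB" opinion and a vague "thermal" opinion.
    rgb_b, rgb_u = evidence_to_opinion([9.0, 1.0, 0.0])   # low uncertainty
    thermal_b, thermal_u = evidence_to_opinion([1.0, 1.0, 1.0])  # high uncertainty
    fused_b, fused_u = ds_combine(rgb_b, rgb_u, thermal_b, thermal_u)
    # The fused opinion still favors class 0, and its uncertainty is lower
    # than that of either modality alone.
    print(fused_b, fused_u)
```

A practical consequence of this scheme is that a sample where one sensor yields little evidence (high uncertainty) contributes less to the fused belief, which is exactly the uncertainty-guided behavior the abstract motivates.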