Multiscale feature U-Net for remote sensing image segmentation

Keywords: computer science, segmentation, feature extraction, artificial intelligence, overfitting, image segmentation, pattern recognition, encoder, remote sensing, convolution, computer vision, artificial neural network
Authors
Youhua Wei,Xuzhi Liu,Jingxiong Lei,Ruihan Yue,Jun Feng
Source
Journal: Journal of Applied Remote Sensing [SPIE]
Volume/Issue: 16 (01) · Cited by: 6
Identifier
DOI:10.1117/1.jrs.16.016507
Abstract

The segmentation and extraction of buildings in high-resolution remote sensing images have promising applications in military, civil, and other fields. With its deep encoder–decoder structure, U-Net is a frequently used model for high-precision image segmentation. However, the design of U-Net makes it hard to retain detailed edge information when segmenting buildings. Specifically, the low-level features extracted from the shallow layers and the abstract features extracted from the deep layers cannot be fully merged, resulting in inaccurate segmentation. To address this problem, we design a new multiscale feature extraction module that extracts target information through three convolution kernels of different scales. Taking U-Net as the baseline and replacing its skip connections with this module, we propose a multiscale feature extraction U-Net. The method performs secondary feature extraction on the shallow feature information in the skip connections, refines detailed information, and narrows the semantic gap between low-level and high-level features. It not only improves the network's ability to extract multiscale feature information, capturing the edge details of buildings in remote sensing images over a larger receptive field and across more layers, but also increases the number of skip connections, which reduces overfitting. Experimental results on the Massachusetts remote sensing data and the Massachusetts building data show that the proposed method offers significant improvements in precision and accuracy compared with the fully convolutional network (FCN), U-Net, SegNet, and high-resolution network (HRNet), achieving an F1 score of 88.73%, mean IoU of 91.15%, precision of 89.74%, accuracy of 97.36%, and recall of 87.74%.
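The core idea of the module described in the abstract is to process a skip-connection feature map with three convolutions of different kernel sizes and merge the results. The paper's actual module uses learned convolutions inside a trained network; the sketch below is not the authors' code but a minimal NumPy illustration of the multiscale principle, with fixed mean filters of sizes 3×3, 5×5, and 7×7 standing in for the learned kernels, and channel-wise stacking standing in for feature merging.

```python
import numpy as np

def conv2d_same(x, k):
    """Naive single-channel 2-D convolution with zero padding ("same" size)."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))  # zero-pad so output matches input
    h, w = x.shape
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def multiscale_extract(x, scales=(3, 5, 7)):
    """Apply one filter per scale and stack the responses channel-wise.

    Mean filters are illustrative placeholders (an assumption); in the
    paper's module these would be learned convolution kernels.
    """
    maps = []
    for s in scales:
        k = np.ones((s, s)) / (s * s)  # mean filter with an s x s receptive field
        maps.append(conv2d_same(x, k))
    return np.stack(maps)  # shape (len(scales), H, W)
```

Each output channel sees the same spatial location at a different receptive-field size, so fine edges (small kernel) and broader context (large kernel) are available side by side to the decoder that consumes the merged features.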
