
Multiscale feature U-Net for remote sensing image segmentation

Keywords: Computer science · Segmentation · Feature extraction · Artificial intelligence · Overfitting · Image segmentation · Pattern recognition · Encoder · Remote sensing · Convolution · Computer vision · Artificial neural network
Authors
Youhua Wei,Xuzhi Liu,Jingxiong Lei,Ruihan Yue,Jun Feng
Source
Journal: Journal of Applied Remote Sensing [SPIE - International Society for Optical Engineering]
Volume/Issue: 16 (01) · Citations: 6
Identifier
DOI: 10.1117/1.jrs.16.016507
Abstract

The segmentation and extraction of buildings in high-resolution remote sensing images have promising applications in military, civil, and other fields. With its deep encoder–decoder structure, U-Net is a frequently used model for high-precision image segmentation. However, the design of U-Net makes it hard to retain detailed edge information when segmenting buildings. Specifically, the low-level features extracted from the shallow layers and the abstract features extracted from the deep layers cannot be completely merged, resulting in inaccurate segmentation. In response to this problem, we design a new multiscale feature extraction module that extracts target information through three convolution kernels of different scales. Taking U-Net as the baseline and replacing its skip connections with this module, we propose a multiscale feature extraction U-Net. This method performs secondary feature extraction on the shallow feature information in the skip connections, refines detailed information, and narrows the semantic gap between low-level and high-level features. It not only improves the network's ability to extract multiscale feature information, capturing the edge details of buildings in remote sensing images over a larger receptive range and across more layers, but also increases the number of skip connections to reduce network overfitting. Experimental results on the Massachusetts remote sensing data and Massachusetts building data show that the proposed method offers significant improvement in precision and accuracy compared with the fully convolutional network (FCN), U-Net, SegNet, and high-resolution network (HRNet), with an F1 score of 88.73%, mean IoU of 91.15%, precision of 89.74%, accuracy of 97.36%, and recall of 87.74%.
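The core idea of the multiscale module — running convolution kernels of several sizes in parallel over the same feature map and stacking the responses — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the kernel sizes `(1, 3, 5)` and the placeholder averaging kernels are assumptions for demonstration, and a real network would use learned kernels and channel-wise fusion.

```python
import numpy as np

def conv2d_same(x, k):
    """Naive 'same'-padded 2D convolution of a single-channel map x with kernel k."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    H, W = x.shape
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def multiscale_features(x, sizes=(1, 3, 5)):
    """Extract features at several kernel scales and stack them along a new channel axis.

    Placeholder averaging kernels stand in for the learned kernels of the actual module.
    """
    maps = []
    for s in sizes:
        k = np.ones((s, s)) / (s * s)  # hypothetical fixed kernel; learned in practice
        maps.append(conv2d_same(x, k))
    return np.stack(maps, axis=0)  # shape: (len(sizes), H, W)

x = np.arange(16.0).reshape(4, 4)
feats = multiscale_features(x)
print(feats.shape)  # (3, 4, 4): one feature map per kernel scale
```

The 1x1 branch passes the input through unchanged (preserving fine shallow detail), while the larger kernels aggregate wider context, which is the mechanism the paper relies on to narrow the gap between shallow and deep features at the skip connections.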
