Footprint
Extraction (chemistry)
Boundary (topology)
Decomposition
Computer science
Net (polyhedron)
Artificial intelligence
High resolution
Geology
Remote sensing
Mathematics
Geometry
Chemistry
Paleontology
Chromatography
Mathematical analysis
Organic chemistry
Author
Y. Li, Danfeng Hong, Chenyu Li, Jing Yao, Jocelyn Chanussot
Source
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Date: 2024-02-05
Volume/Pages: 209: 51-65
Cited by: 9
Identifier
DOI: 10.1016/j.isprsjprs.2024.01.022
Abstract
The extraction of building footprints is a highly challenging and important task in remote sensing (RS) image-based geospatial object detection and recognition. Due to the strong coupling between the body and boundary of buildings in RS images, most currently advanced deep learning models remain limited in building footprint extraction and inevitably hit a performance bottleneck. To this end, we propose a novel High-resolution Decoupled Network, HD-Net for short, for precise building footprint extraction in RS. HD-Net follows the well-known high-resolution network (HRNet) architecture, whose parallel multi-scale information interaction largely alleviates the coupling between body and boundary. More specifically, our HD-Net introduces multiple stacked multi-scale feature fusion (MFF) modules, where each MFF module combines the deep supervision technique with a feature decoupling–recoupling (FDR) module. The FDR module untangles the coupled features into two distinct components, body and boundary, yielding feature maps enriched with semantic information. This configuration enables a step-wise refinement of building extraction and boundary prediction, ensuring both the overall continuity of buildings and the precision of their boundaries. Experiments conducted on three widely used building datasets, i.e., Massachusetts, WHU, and Inria, demonstrate that HD-Net achieves the most competitive results with a minimal parameter count. In detail, HD-Net outperforms the contour-guided and local structure-aware network (CGSANet), a state-of-the-art algorithm using a hybrid loss function and a deep supervision strategy, with intersection over union (IoU) improvements of 0.40%, 0.95%, and 0.73% on the respective datasets. The code of HD-Net will be made freely available at https://github.com/danfenghong/ISPRS_HD-Net.
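To make the decoupling idea concrete, below is a minimal PyTorch-style sketch of a feature decoupling-recoupling step that splits a coupled feature map into a smooth "body" component and a residual "boundary" component, then fuses them back. This is an illustration only, not the paper's actual FDR module: the class name FeatureDecoupleRecouple, the body_gate, the low-pass smoothing scheme, and the channel sizes are all hypothetical assumptions made for this sketch.

```python
# Illustrative sketch of a feature decoupling-recoupling step (assumed design,
# not the exact FDR module described in the paper).
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureDecoupleRecouple(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Hypothetical gate estimating how strongly each location belongs to
        # the building body (smooth interior) rather than its boundary.
        self.body_gate = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.Sigmoid(),
        )
        # 1x1 conv to recouple body and boundary into a single feature map.
        self.recouple = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor):
        # Low-pass view of the features: downsample then upsample so that only
        # slowly varying "body" content survives.
        smooth = F.interpolate(
            F.avg_pool2d(x, kernel_size=2),
            size=x.shape[2:], mode="bilinear", align_corners=False,
        )
        body = self.body_gate(smooth) * x   # interior / body component
        boundary = x - body                 # high-frequency boundary residue
        fused = self.recouple(torch.cat([body, boundary], dim=1))
        # Body and boundary maps are also returned, e.g. so that separate
        # deep-supervision heads could be attached to each of them.
        return fused, body, boundary


if __name__ == "__main__":
    fdr = FeatureDecoupleRecouple(channels=64)
    feats = torch.randn(1, 64, 128, 128)
    fused, body, boundary = fdr(feats)
    print(fused.shape, body.shape, boundary.shape)  # all (1, 64, 128, 128)
```

In this reading, separately supervising the body and boundary outputs would mirror the abstract's step-wise refinement of building extraction and boundary prediction, while the recoupled map is what a subsequent multi-scale fusion stage would consume.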