Keywords
Remote sensing
Computer science
Extraction (chemistry)
Image resolution
Artificial intelligence
Computer vision
Geology
Chemistry
Chromatography
Authors
Ruoyu Yang,Yanfei Zhong,Yinhe Liu,Xiaoyan Lu,Liangpei Zhang
Identifier
DOI:10.1109/tgrs.2024.3387945
Abstract
Road occlusion seriously affects the connectivity of extracted roads and has a negative effect in practical applications. The dense road occlusion problem, caused by high-rise buildings and street trees, is more serious and more distinctive than the simple occlusion caused by low buildings and scattered trees. Existing methods mainly address road occlusion by enhancing the encoder's ability to capture the long-range connectivity features of roads. Unfortunately, these methods can only handle small and sparse road occlusions, and they cannot deal with the dense occlusions caused by dense high-rise buildings or trees. In this article, to solve the dense road occlusion problem, an occlusion-aware road extraction network, namely OARENet, is proposed for road extraction from high-resolution remote sensing imagery. In OARENet, an occlusion-aware decoder (OADecoder) is designed by explicitly modeling the texture features of road regions with dense occlusions. The OADecoder is made up of a regular occlusion-aware (ROA) module and a stochastic occlusion-aware (SOA) module. The ROA module adopts different dilation rates to fit the texture features in the semantic feature maps. The SOA module uses stochastic convolutions to adaptively fit the spatial details of road regions with dense occlusions. To evaluate the dense occlusion problem, a dense occlusion road dataset (JHWV) was built and annotated. The experimental results obtained on the DeepGlobe dataset, the newly built JHWV dataset, and large-scale urban images demonstrate the superiority of OARENet, especially when faced with dense road occlusions. Code has been made available at: https://github.com/WanderRainy/OARENet.
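The core idea behind the ROA module, applying the same kernel at several dilation rates so that the receptive field grows without adding parameters, can be illustrated with a minimal 1D sketch. This is not the authors' implementation: OARENet's modules operate on 2D semantic feature maps inside a deep network, the names `dilated_conv1d` and `multi_dilation_block` are hypothetical, and fusing the branches by truncating to a common length and summing is an assumption made purely for illustration.

```python
def dilated_conv1d(signal, kernel, dilation):
    # Valid-mode 1D convolution with a dilation rate: kernel taps are
    # spaced `dilation` samples apart, so the receptive field spans
    # (len(kernel) - 1) * dilation + 1 samples with no extra weights.
    span = (len(kernel) - 1) * dilation
    return [
        sum(kernel[j] * signal[i + j * dilation] for j in range(len(kernel)))
        for i in range(len(signal) - span)
    ]

def multi_dilation_block(signal, kernel, rates=(1, 2, 4)):
    # Hypothetical ROA-style block: run the same kernel at several
    # dilation rates, then fuse the branches (here: truncate each
    # branch to the shortest common length and sum elementwise).
    branches = [dilated_conv1d(signal, kernel, r) for r in rates]
    n = min(len(b) for b in branches)
    return [sum(b[i] for b in branches) for i in range(n)]

# A difference kernel at dilation 1 compares neighbors; at dilation 2
# it compares samples two steps apart, capturing a wider context.
print(dilated_conv1d([1, 2, 3, 4, 5], [1, 0, -1], 1))  # [-2, -2, -2]
print(dilated_conv1d([1, 2, 3, 4, 5], [1, 0, -1], 2))  # [-4]
```

In a 2D setting the same principle lets a decoder branch see across an occluding object (a building footprint or tree crown) while a parallel small-dilation branch preserves local road texture, which is why stacking several rates in parallel is a common design for occlusion-aware feature fusion.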