Research Article

Cloud/shadow segmentation based on global attention feature fusion residual network for remote sensing imagery

Pages 2022-2045 | Received 18 Feb 2020, Accepted 09 Oct 2020, Published online: 29 Dec 2020

ABSTRACT

Cloud and cloud shadow segmentation in satellite imagery is a prerequisite for many remote sensing applications. Owing to the limited number of available spectral bands and the complexity of background information, traditional detection methods suffer from false detections, missed detections, and inaccurate boundary information in the segmentation results. To address these problems, a global attention fusion residual network is proposed to segment cloud and cloud shadow in satellite imagery. The proposed model adopts a Residual Network (ResNet) backbone to extract semantic information at different feature levels. To improve the network’s handling of boundary information, an improved atrous spatial pyramid pooling module is introduced to extract multi-scale deep semantic information. The deep semantic information is then fused with the shallow spatial information at multiple scales through a Global Attention Upsample mechanism, which improves the network’s ability to exploit both global and local features. Finally, a boundary refinement module predicts the boundaries of cloud and shadow, refining the boundary information. Experimental results on Sentinel-2 and Landsat (Land Remote-Sensing Satellite) imagery show that the proposed method surpasses existing methods in both segmentation accuracy and speed, which is of great significance for practical cloud and shadow segmentation.
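The abstract only names the fusion step; as an illustration, the following PyTorch sketch shows one common form of a Global Attention Upsample block, in which globally pooled deep semantic features gate the shallow spatial features channel-wise before the two branches are summed. This is a hypothetical reconstruction, not the authors’ released code: the class name, the sigmoid gating, the bilinear upsampling, and the channel arguments are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GlobalAttentionUpsample(nn.Module):
    """Hypothetical sketch of a global-attention upsample fusion block.

    Deep (low-resolution, semantic) features provide a global context
    vector that re-weights the shallow (high-resolution, spatial)
    features channel-wise; the upsampled deep features are then added.
    """

    def __init__(self, low_channels: int, high_channels: int):
        super().__init__()
        # 3x3 conv aligns the shallow branch to the deep branch's width
        self.conv_low = nn.Sequential(
            nn.Conv2d(low_channels, high_channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(high_channels),
        )
        # 1x1 conv on the globally pooled deep features -> channel weights
        self.gate = nn.Sequential(
            nn.Conv2d(high_channels, high_channels, 1, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        low = self.conv_low(low)
        # global average pooling collapses the deep map to a 1x1 context
        weights = self.gate(F.adaptive_avg_pool2d(high, 1))
        # bring the deep features up to the shallow branch's resolution
        high_up = F.interpolate(
            high, size=low.shape[2:], mode="bilinear", align_corners=False
        )
        return low * weights + high_up

# Toy usage: fuse a 512-channel deep map into a 256-channel shallow map.
gau = GlobalAttentionUpsample(low_channels=256, high_channels=512)
low = torch.randn(1, 256, 64, 64)   # shallow, high-resolution features
high = torch.randn(1, 512, 32, 32)  # deep, low-resolution features
out = gau(low, high)                # shape: (1, 512, 64, 64)
```

In the paper’s pipeline such a fusion would be applied at several scales between ResNet stages, with the improved atrous spatial pyramid pooling supplying the deepest multi-scale features; the exact wiring is not specified in the abstract.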

Disclosure statement

No potential conflict of interest was reported by the authors.

Data availability

The data and the code of this study are available from the corresponding author upon request.

Additional information

Funding

This work was supported by the National Natural Science Foundation of China [grant numbers 41875027, 42075130].
