
Crop field extraction from high resolution remote sensing images based on semantic edges and spatial structure map

Article: 2302176 | Received 30 Oct 2023, Accepted 15 Dec 2023, Published online: 24 Jan 2024

Abstract

Crop field boundary extraction from high-resolution remote sensing images is crucial for supporting agricultural production and planning. In recent years, deep convolutional neural networks (CNNs) have gained significant attention for edge detection tasks, while transformers have shown feature extraction and classification capabilities superior to CNNs owing to their self-attention mechanism. We propose a novel structure that combines full edge extraction with CNNs and enhanced connectivity with transformers, consisting of three stages: (a) preprocessing the training data; (b) training the semantic edge and spatial structure graph models; and (c) fusing the semantic edge and spatial structure graph outputs and vectorizing the result. To cater specifically to crop-field boundary extraction from high-resolution remote sensing images, we developed a CNN model called Densification D-LinkNet, whose full-scale skip connections and edge-guided module adapt well to diverse crop-field boundary features. Additionally, we employed a spatial graph structure generator (Relationformer) based on object detection that directly outputs the structural graph of the crop field boundary; its good connectivity is used to repair fragmented edges that may appear in semantic edge detection. Through multiple experiments and comparisons with other edge-detection methods, such as BDCN, DexiNed, PiDiNet, and EDTER, we demonstrate that the proposed method achieves at least a 9.77% improvement in boundary intersection over union (IoU) and a 2.07% improvement in polygon IoU on two customized datasets. These results indicate the effectiveness and robustness of our approach.
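The article does not reproduce the exact fusion and vectorization procedure of stage (c) here, but the idea can be illustrated with a minimal NumPy sketch: the structure graph predicted by the transformer (nodes and node-index edges) is rasterized onto the CNN's edge probability map, so that well-connected graph edges bridge gaps in the fragmented semantic edges. The function names, the additive fusion rule, and the toy data below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rasterize_graph_edges(nodes, edges, shape):
    """Rasterize graph edges (pairs of node indices) onto a binary mask.

    nodes: list of (row, col) pixel coordinates predicted by the graph model.
    edges: list of (i, j) index pairs connecting nodes.
    shape: (height, width) of the output mask.
    """
    mask = np.zeros(shape, dtype=np.float32)
    for i, j in edges:
        (r0, c0), (r1, c1) = nodes[i], nodes[j]
        # Sample enough points along the segment to cover every pixel.
        n = int(max(abs(r1 - r0), abs(c1 - c0))) + 1
        rr = np.linspace(r0, r1, n).round().astype(int)
        cc = np.linspace(c0, c1, n).round().astype(int)
        mask[np.clip(rr, 0, shape[0] - 1), np.clip(cc, 0, shape[1] - 1)] = 1.0
    return mask

def fuse_edge_and_graph(edge_prob, graph_mask, thr=0.5):
    """Fuse the CNN edge probability map with the rasterized structure graph.

    The graph term restores connectivity where the semantic edge map is
    fragmented; the simple additive rule is an illustrative choice only.
    """
    fused = np.clip(edge_prob + graph_mask, 0.0, 1.0)
    return (fused >= thr).astype(np.uint8)

# Toy example: a boundary whose middle section is missing in the CNN output
# but is bridged by one predicted graph edge between two corner nodes.
edge_prob = np.zeros((16, 16), dtype=np.float32)
edge_prob[8, 0:5] = 0.9    # left fragment of the boundary
edge_prob[8, 11:16] = 0.9  # right fragment; pixels 5..10 are missing
nodes = [(8, 4), (8, 11)]
graph_mask = rasterize_graph_edges(nodes, [(0, 1)], edge_prob.shape)
boundary = fuse_edge_and_graph(edge_prob, graph_mask)
assert boundary[8].all()   # row 8 is now fully connected
```

The fused binary boundary would then be traced and simplified into polygons in the vectorization step; that step is omitted here.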

Acknowledgments

Our experiments were based on two research datasets, both of which are proprietary to our laboratory. All authors thank the reviewers and editors for their helpful comments and suggestions.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The datasets used in this article can be downloaded at https://pan.baidu.com/s/1JWivVLK2RVURm1MgGcQOyg?pwd=dgt8 (extraction code: dgt8) and https://drive.google.com/drive/folders/1xHla26EzwOC_KiS2FkHcLkJSd7sQbhnk?usp=sharing. Some of the network code is publicly available at https://github.com/649064287/DDLNet-main.git.

Additional information

Funding

This work was supported in part by the National Key Research and Development Program of China under Grant 2018YFB0505300 and in part by the National Natural Science Foundation of China under Grants 41701472 and 41971375.