Research Article

Crop classification methods and influencing factors of reusing historical samples based on 2D-CNN

Pages 3278-3305 | Received 10 Mar 2023, Accepted 17 May 2023, Published online: 16 Jun 2023

ABSTRACT

Crop classification is a crucial task in agricultural remote sensing, and its accuracy relies heavily on field sampling. Reusing historical samples can minimize the reliance on annual field sampling for crop classification. Although previous research primarily focused on the classification accuracy achieved with reused historical samples, the underlying factors that influence this accuracy have not been adequately investigated. In this study, we employed a two-dimensional convolutional neural network (2D-CNN) model for crop classification with reused historical samples and investigated the factors influencing the classification accuracy. First, we calculated a normalized difference vegetation index (NDVI) time series from historical data to characterize crop growth patterns. Second, we assessed three different time-series construction methods. Then, we used the 2D-CNN model to automatically extract abstract features of crop growth patterns. Finally, we designed various strategies for reusing historical samples to explore the influencing factors. Experiments were conducted in Kuitun, Xinjiang Uygur Autonomous Region, China, from 2016 to 2020, employing a long time series of Sentinel-2 images as remote sensing data. Our results indicated that optimal 2D-CNN models using an irregular satellite image time series (irSITS) outperformed random forest models in inter-annual classifications (the overall accuracies for 2016–2020 were 0.78, 0.61, 0.89, 0.89, and 0.70, respectively). In addition, we identified that the primary factors affecting classification accuracy were 1) the time-series construction method used; 2) the crop growth patterns; and 3) the sample diversity. By reusing historical samples and considering the factors influencing classification accuracy, this study provides valuable insights into high-quality crop classification mapping under limited field sample conditions.
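The NDVI time-series step described above can be sketched as follows. NDVI is the standard ratio (NIR − Red) / (NIR + Red); for Sentinel-2 surface reflectance, NIR is band B8 and Red is band B4. The reflectance values below are hypothetical, chosen only to illustrate building a per-sample time series across acquisition dates; they are not from the study.

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red); eps guards against
    division by zero over dark or masked pixels."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Hypothetical Sentinel-2 reflectances (B8 = NIR, B4 = Red) for one sample
# point across three acquisition dates in a growing season:
b8 = [0.45, 0.52, 0.61]
b4 = [0.20, 0.15, 0.08]

# The resulting NDVI sequence characterizes the crop's growth pattern and
# becomes one row of the time-series input to the classifier.
series = [float(ndvi(n, r)) for n, r in zip(b8, b4)]
```

NDVI is bounded in [−1, 1], and a healthy crop canopy typically shows a rising curve through green-up, which is the growth-pattern signal the 2D-CNN abstracts features from.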

Nomenclature

2D-CNN = Two-dimensional convolutional neural network

DL = Deep learning

DT = Decision tree

OB = Optical bands

VI = Vegetation index

PM = Phenological metrics

NDVI = Normalized difference vegetation index

ML = Machine learning

SVM = Support vector machine

RF = Random forest

LSTM = Long short-term memory

RNN = Recurrent neural network

CNN = Convolutional neural network

1D-CNN = One-dimensional convolutional neural network

SITS = Satellite image time series

rSITS = Regular satellite image time series

irSITS = Irregular satellite image time series

GEE = Google Earth Engine

SR = Surface reflectance

RDCRMG = Raster dataset clean and reconstitution multi-grid

SY = Same-year

CY = Cross-year

OA = Overall accuracy

F1 = F1-score

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Third Comprehensive Scientific Expedition to Xinjiang (2021xjkk1403), the National Key Research and Development Program of China (2021YFC1523503), the National Natural Science Foundation of China (41971375), and the Key Research and Development Programme of Xinjiang Uygur Autonomous Region (2022B03001-3).
