Research Article

Deep learning versus Object-based Image Analysis (OBIA) in weed mapping of UAV imagery

Pages 3446-3479 | Received 15 Aug 2019, Accepted 19 Oct 2019, Published online: 06 Jan 2020
 

ABSTRACT

Rice is one of the most important food crops in the world, making it essential to ensure the quality and quantity of rice production. During cultivation, weeds are a key factor affecting rice yields. In recent years, chemical control has become the most widely used means of controlling weed infestation because of its effectiveness and efficiency. However, excessive use of herbicides has negative effects on rice quality as well as on the environment. An accurate weed cover map can provide supporting information for site-specific weed management (SSWM) applications, which may address the problems of traditional chemical control. In this work, unmanned aerial vehicle (UAV) imagery was captured on four different dates over two different rice fields. Object-based image analysis (OBIA) and deep learning approaches were applied to the weed mapping task on the UAV imagery. For the OBIA methods, multiresolution segmentation and an improved k-means method were applied to segment the imagery into objects; colour and texture features were extracted and concatenated into a feature vector; a back propagation (BP) neural network, a support vector machine (SVM), and a random forest were used for classification. After careful hyperparameter optimization and model selection, the best OBIA method achieved an accuracy of 66.6% mean intersection over union (MIU) on the testing set, with an inference speed of 2343.5 ms per image sample. For the deep learning approach, a fully convolutional network (FCN) was applied to the pixel-wise classification task; transfer learning was used, and four pretrained convolutional neural networks (AlexNet, VGGNet, GoogLeNet, and ResNet) were transferred to our dataset via fine-tuning.
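The OBIA pipeline described above (segment the image into objects, extract colour/texture features per object, then classify) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' implementation: the cluster count, the mean/std colour features, and the object labels are all placeholders, and k-means over pixel colours stands in for the multiresolution / improved k-means segmentation used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic 3-band "UAV image": 64x64 pixels, values in [0, 1].
img = rng.random((64, 64, 3))

# Step 1: segment the image into objects by clustering pixel colours
# (a simple stand-in for the segmentation step).
n_objects = 8
pixels = img.reshape(-1, 3)
segments = KMeans(n_clusters=n_objects, n_init=10, random_state=0).fit_predict(pixels)

# Step 2: build a feature vector per object -- here the per-band mean and
# standard deviation (the paper also concatenates texture features).
features = np.asarray([
    np.concatenate([pixels[segments == s].mean(axis=0),
                    pixels[segments == s].std(axis=0)])
    for s in range(n_objects)
])  # shape (8, 6)

# Step 3: classify each object (e.g. rice / weed / background) with a
# random forest; the labels here are random placeholders.
labels = rng.integers(0, 3, size=n_objects)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(features, labels)
pred = clf.predict(features)  # one class per object
```

In a real pipeline the object-level predictions would then be painted back onto the segment map to produce the weed cover map.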
The traditional skip architecture and fully connected conditional random fields (CRF) were used to improve the spatial detail of the FCN output; in addition, this work proposes a partially connected CRF as post-processing, which significantly accelerates inference relative to a fully connected CRF. Beyond the single improvement methods, hybrid improvement methods were applied and tested. Experimental results showed that the VGGNet-based FCN achieved the highest accuracy; among the improvement methods, the skip architecture and the newly proposed partially connected CRF each effectively improved accuracy, and the hybrid method (skip architecture plus partially connected CRF) further improved performance, achieving 80.2% MIU on the testing set with an inference speed of 326.8 ms per image sample. These results demonstrate that UAV remote sensing combined with deep learning can provide reliable support information for SSWM applications in rice fields.
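The accuracy figures quoted above are mean intersection over union (MIU), averaged over classes. A minimal implementation of that metric, with a toy two-class example (the function name and inputs are illustrative, not from the paper):

```python
import numpy as np

def mean_iou(pred, gt, n_classes):
    """Mean intersection over union across classes.

    For each class c, IoU = |pred==c AND gt==c| / |pred==c OR gt==c|;
    classes absent from both maps are skipped.
    """
    ious = []
    for c in range(n_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x2 label maps that agree on 3 of 4 pixels:
# class 0 IoU = 1/2, class 1 IoU = 2/3, so MIU = 7/12.
gt = np.array([[0, 0], [1, 1]])
pred = np.array([[0, 1], [1, 1]])
print(round(mean_iou(pred, gt, 2), 3))  # prints 0.583
```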

Author Contributions

Prof. Yubin Lan and Jizhong Deng designed the experiments; Huasheng Huang and Aqing Yang conducted the data collection; Jizhong Deng, Huasheng Huang and Aqing Yang performed the data analyzing, code programming and wrote the manuscript; Yali Zhang and Sheng Wen revised the manuscript.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This research was funded by the Guangdong Provincial Innovation Team for General Key Technologies in Modern Agricultural Industry [Grant No. 2019KJ133], the Science and Technology Planning Project of Guangdong Province, China [Grant No. 2017A020208046], the Leading Talents of Guangdong Province Program [Grant No. 2016LJ06G689], the Science and Technology Planning Project of Guangdong Province [Grant No. 2019B020214003], the 111 Project [Grant No. D18019], the Key Area Research and Development Planning Project of Guangdong Province [Grant No. 2019B020221001], the Science and Technology Planning Project of Guangdong Province, China [Grant No. 2018A050506073], the National Key Research and Development Plan, China [Grant No. 2016YFD0200700], the Science and Technology Planning Project of Guangdong Province, China [Grant No. 2016A020210100], the Science and Technology Planning Project of Guangdong Province, China [Grant No. 2017B010117010], and the Science and Technology Planning Project of Guangzhou City, China [Grant No. 201707010047].

