Crop classification for UAV visible imagery using deep semantic segmentation methods

Pages 10033-10057 | Received 23 Jun 2021, Accepted 17 Jan 2022, Published online: 26 Mar 2022

