Review Article

Urban land-use analysis using proximate sensing imagery: a survey

Pages 2129-2148 | Received 30 Nov 2020, Accepted 16 Apr 2021, Published online: 03 May 2021
 

ABSTRACT

Urban regions are complicated functional systems that are closely associated with and reshaped by human activities. The proliferation of online geographic information-sharing platforms and of mobile devices equipped with the Global Positioning System (GPS) has greatly increased the availability of proximate sensing images, i.e., images taken on or near the ground at a close distance to urban targets. Studies leveraging proximate sensing images have demonstrated great potential to address the need for local data in urban land-use analysis. This paper reviews and summarizes state-of-the-art methods and publicly available data sets from proximate sensing that support land-use analysis. We identify several research problems from the perspectives of obtaining examples to support model training and of integrating diverse data sets. Our discussion highlights the challenges, strategies, and opportunities of existing methods that use proximate sensing images in urban land-use studies.

Acknowledgement

The first and second authors contributed equally to this paper.

Supplemental data

Supplemental data for this article can be accessed here.

Data and codes availability statement

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Zhinan Qiao

Zhinan Qiao received her B.S. degree in Microelectronics from Xi'an University of Posts & Telecommunications in 2011, an M.S. degree in Computer Science from Georgia Southwestern State University in 2016, and an M.A. degree in Digital Media Art from Northeast Dianli University in 2017. She is currently working towards a Ph.D. degree in Computer Science and Engineering at the University of North Texas. Her research interests include machine learning and deep learning with a focus on computer vision, scene recognition, and geo-spatial information discovery.

Xiaohui Yuan

Xiaohui Yuan received a B.S. degree in Electrical Engineering from the Hefei University of Technology, China, in 1996 and a Ph.D. degree in Computer Science from Tulane University in 2004. He is an Associate Professor in the Department of Computer Science and Engineering at the University of North Texas. His research interests include computer vision, data mining, machine learning, and artificial intelligence. His research has been funded by the Air Force Lab, the National Science Foundation, the Texas Advanced Research Program, and Oak Ridge Associated Universities. He has published more than 180 peer-reviewed papers. Dr. Yuan is a recipient of the Ralph E. Powe Junior Faculty Enhancement Award in 2008 and the Air Force Summer Faculty Fellowship in 2011, 2012, and 2013, and was featured in the 2020 Higher Education Review Magazine and EurekAlert! Science News in 2017. He has served on the editorial boards of several international journals, served as a session chair at many conferences, and served as a panel reviewer for funding agencies including NSF, NASA, NIH, and the Louisiana Board of Regents' Research Competitiveness program. He is a senior member of IEEE.

