Research article

A multi-objective comparison of CNN architectures in Arctic human-built infrastructure mapping from sub-meter resolution satellite imagery

Pages 7670-7705 | Received 21 Aug 2023, Accepted 12 Nov 2023, Published online: 11 Dec 2023
 

ABSTRACT

Risk assessment of infrastructure exposed to ice-rich permafrost hazards is essential for climate change adaptation in the Arctic. As this process requires up-to-date, comprehensive, high-resolution maps of human-built infrastructure, gaps in such geospatial information and knowledge of the applications required to produce it must be addressed. Therefore, this study highlights the ongoing development of a deep learning approach to efficiently map the Arctic built environment by detecting nine different types of structures (detached houses, row houses, multi-story blocks, non-residential buildings, roads, runways, gravel pads, pipelines, and storage tanks) from recently acquired Maxar commercial satellite imagery (<1 m resolution). We conducted a multi-objective comparison of nine semantic segmentation architectures, focusing on generalization performance and computational cost. K-fold cross-validation was used to estimate the average F1-score of each architecture, and the Friedman Aligned Ranks test with the Bergmann-Hommel post-hoc procedure was applied to test for significant differences in generalization performance. ResNet-50-UNet++ performs significantly better than five of the other eight candidate architectures; no significant difference was found in the pairwise comparisons of ResNet-50-UNet++ to ResNet-50-MANet, ResNet-101-MANet, and ResNet-101-UNet++. We then conducted a high-performance computing scaling experiment to compare the number of service units and the runtime required for model inferencing on a hypothetical pan-Arctic scale dataset. We found that the ResNet-50-UNet++ model could save up to ~54% on service unit expenditure, or ~18% on runtime, when considering operational deployment of our mapping approach. Our results suggest that ResNet-50-UNet++ could be the most suitable of the nine examined architectures for deep learning-enabled Arctic infrastructure mapping efforts. Overall, our findings on the differences between the examined CNN architectures, together with our methodological framework for multi-objective architecture comparison, can provide a foundation for future pan-Arctic GeoAI infrastructure mapping efforts.
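The statistical comparison described above — estimating per-fold F1-scores via K-fold cross-validation, then testing for significant differences across architectures with a Friedman-type test — can be sketched as follows. This is an illustrative stand-in, not the paper's code: SciPy ships the classic Friedman chi-square test (`scipy.stats.friedmanchisquare`), whereas the study uses the Friedman Aligned Ranks variant with the Bergmann-Hommel post-hoc procedure, and the F1-scores below are synthetic placeholders.

```python
# Hypothetical sketch of the architecture comparison: rows are cross-validation
# folds (blocks), columns are candidate architectures (treatments). The paper
# uses the Friedman Aligned Ranks test with Bergmann-Hommel post-hoc correction;
# here we substitute SciPy's classic Friedman test purely for illustration.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)

# Synthetic per-fold F1-scores for three (of the nine) architectures,
# 5 cross-validation folds each — NOT values from the study.
f1_unetpp = rng.normal(loc=0.85, scale=0.02, size=5)  # ResNet-50-UNet++
f1_manet = rng.normal(loc=0.84, scale=0.02, size=5)   # ResNet-50-MANet
f1_fpn = rng.normal(loc=0.78, scale=0.02, size=5)     # a weaker baseline

# Null hypothesis: all architectures have the same median F1 across folds.
stat, p = friedmanchisquare(f1_unetpp, f1_manet, f1_fpn)
print(f"Friedman statistic = {stat:.3f}, p-value = {p:.4f}")
```

If the omnibus test rejects the null, a post-hoc procedure (Bergmann-Hommel in the paper) performs the pairwise architecture comparisons while controlling the family-wise error rate.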

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The Python code used for deep learning model development will be made available at https://github.com/eliasm56/Arctic-Infrastructure-Detection-Paper. The training dataset used in this study is not available due to restrictions on sharing commercial satellite imagery.

Additional information

Funding

This work is funded by the U.S. National Science Foundation’s Office of Polar Programs (NSF-OPP) (grant Nos. 1927723, 1927872, and 2052107). Furthermore, this work used the Delta supercomputer at the National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign through allocation #EES220055 from the Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS) program, which is supported by National Science Foundation grants #2138259, #2138286, #2138307, #2137603, and #2138296.
