
Domain adaptation for unsupervised change detection of multisensor multitemporal remote-sensing images

Pages 3902-3923 | Received 01 Mar 2019, Accepted 05 Nov 2019, Published online: 19 Jan 2020
 

ABSTRACT

With the advancement of space-based imaging technology, large numbers of images of different modalities and spatial, spectral, and temporal resolutions have become available. Although it has been clearly demonstrated that combining images from different sources improves results in remote-sensing applications, making full use of their benefits remains a challenge. These challenges become more complicated when extending to multitemporal analysis and very high resolution (VHR) images, which involve different acquisition conditions and, consequently, radiometric and geometric differences between images. To design methods that are robust to such data set shifts, the recent remote-sensing literature has considered solutions based on domain adaptation (DA). In this research, we propose an autoencoder-based DA method that fuses features of synthetic aperture radar (SAR) and optical images to benefit from their complementary information, while simultaneously aligning multitemporal features by reducing spectral and radiometric differences, making the multitemporal features more similar and thereby improving change detection (CD) accuracy. Furthermore, to cope with the problem of detecting changes in VHR images, we introduce a framework for obtaining the change map in which the differences caused by registration errors, as well as the geometric differences of VHR images, are minimized by making maximum use of the spatial information of the images. For this purpose, a multitemporal pair of co-registered images is assumed, each composed of optical multispectral channels and a SAR amplitude channel acquired over the same urban area before and after changes. Finally, the proposed change-detection strategy achieved improvements over common, simpler methods as well as over single-sensor CD methods, demonstrating the effectiveness of the proposed approach.
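To make the idea of autoencoder-based fusion and multitemporal alignment concrete, the following is a minimal illustrative sketch in PyTorch, not the authors' implementation: it maps per-pixel stacks of optical multispectral bands plus a SAR amplitude channel from both dates into a shared latent space, with reconstruction losses on both dates and a simple first-moment alignment term. The layer sizes, the choice of alignment loss, the weight align_weight, and the latent-distance change score are all assumptions made for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionAutoencoder(nn.Module):
    """Illustrative autoencoder for fused optical + SAR per-pixel features."""
    def __init__(self, n_bands: int = 9, latent_dim: int = 16):
        # n_bands: optical multispectral channels + 1 SAR amplitude channel (assumed)
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_bands, 32), nn.ReLU(),
            nn.Linear(32, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(),
            nn.Linear(32, n_bands),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def train_step(model, opt, x_t1, x_t2, align_weight=0.1):
    # x_t1, x_t2: (N, n_bands) per-pixel stacks for the pre- and post-change dates.
    recon1, z1 = model(x_t1)
    recon2, z2 = model(x_t2)
    # Reconstruction on both dates plus an assumed mean-feature alignment term
    # that pulls the two dates' latent statistics closer together.
    loss = (F.mse_loss(recon1, x_t1) + F.mse_loss(recon2, x_t2)
            + align_weight * F.mse_loss(z1.mean(dim=0), z2.mean(dim=0)))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# After training, one simple change score is the per-pixel latent distance:
#   score = torch.norm(model.encoder(x_t1) - model.encoder(x_t2), dim=1)
# A threshold on this score would give a change map; the paper's framework
# additionally exploits spatial information to reduce registration effects.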

Acknowledgements

The authors would like to thank DigitalGlobe, Astrium Services, and USGS for acquiring and providing the data set used in this study, and the IEEE GRSS Image Analysis and Data Fusion Technical Committee.

Disclosure statement

No potential conflict of interest was reported by the authors.

