ABSTRACT
With the advancement of space-based imaging technology, large volumes of images of different modalities and of different spatial, spectral, and temporal resolutions have become available. Although it has been clearly demonstrated that combining images from different sources improves results in remote-sensing applications, making full use of their benefits remains a challenge. These challenges become more complicated when the analysis is extended to multitemporal and very high resolution (VHR) images, which involve different acquisition conditions and, consequently, radiometric and geometric differences between images. To design methods that are robust to such data set shifts, the recent remote-sensing literature has considered solutions based on domain adaptation (DA). In this research, we propose an autoencoder-based DA method that fuses features of synthetic aperture radar (SAR) and optical images to benefit from their complementary information, while simultaneously aligning multitemporal features by reducing spectral and radiometric differences, thereby making multitemporal features more similar and improving change detection (CD) accuracy. Furthermore, to address the problem of detecting changes in VHR images, we introduce a framework for obtaining the change map in which differences caused by registration error, as well as the geometric differences of VHR images, are minimized by making maximum use of the spatial information of the images. For this purpose, a multitemporal pair of co-registered images, each composed of optical multispectral channels and a SAR amplitude channel acquired over the same urban area before and after the changes, is assumed. Finally, the proposed CD strategy yielded improvements over common, simpler methods as well as single-sensor CD methods, demonstrating the effectiveness of the proposed approach.
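The core idea of the abstract can be illustrated with a minimal sketch (not the authors' implementation): a shared autoencoder maps concatenated optical and SAR pixel features from both acquisition dates into one latent space, so that training on samples from both dates encourages latent multitemporal features to become more comparable; changes are then flagged by the distance between the latent features of the two dates. All layer sizes, the toy data, and the thresholding rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_autoencoder(X, n_latent=8, lr=0.05, epochs=300):
    """Single-hidden-layer autoencoder trained with plain gradient descent."""
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, n_latent))  # encoder weights
    b1 = np.zeros(n_latent)
    W2 = rng.normal(0, 0.1, (n_latent, d))  # decoder weights
    b2 = np.zeros(d)
    for _ in range(epochs):
        Z = np.tanh(X @ W1 + b1)            # latent (fused) features
        Xhat = Z @ W2 + b2                  # reconstruction
        err = Xhat - X
        # Backpropagate the mean-squared reconstruction error
        gW2 = Z.T @ err / n
        gb2 = err.mean(axis=0)
        dZ = (err @ W2.T) * (1.0 - Z**2)    # tanh derivative
        gW1 = X.T @ dZ / n
        gb1 = dZ.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda X: np.tanh(X @ W1 + b1)   # encoder only

# Toy data: 4 optical bands + 1 SAR amplitude per pixel, two dates with a
# simulated radiometric offset between acquisitions.
opt_t1 = rng.random((200, 4))
sar_t1 = rng.random((200, 1))
X_t1 = np.hstack([opt_t1, sar_t1])
X_t2 = X_t1 + 0.1                           # same scene, shifted radiometry

# Train one shared encoder on samples from both dates
encode = train_autoencoder(np.vstack([X_t1, X_t2]))

# Per-pixel distance between latent features of the two dates; a simple
# mean + 2*std threshold yields a binary change map.
diff = np.linalg.norm(encode(X_t1) - encode(X_t2), axis=1)
change_map = diff > diff.mean() + 2 * diff.std()
```

Training a single encoder on pooled samples from both dates is one simple way to push the two acquisitions toward a common representation; the paper's full method additionally exploits spatial information of the VHR images, which this pixel-wise sketch omits.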
Acknowledgements
The authors would like to thank DigitalGlobe, Astrium Services, and USGS for acquiring and providing the data set used in this study, and the IEEE GRSS Image Analysis and Data Fusion Technical Committee.
Disclosure statement
No potential conflict of interest was reported by the authors.