Research Article

A self-attention multi-scale convolutional neural network method for SAR image despeckling

Pages 902-923 | Received 05 May 2022, Accepted 20 Jan 2023, Published online: 24 Feb 2023
 

ABSTRACT

The speckle noise found in synthetic aperture radar (SAR) images severely degrades the efficiency of image interpretation, retrieval and other applications, so effective methods for despeckling SAR images are required. Traditional SAR image despeckling methods fail to balance the strength of speckle noise filtering against the retention of texture details. Deep-learning-based SAR image despeckling methods have been shown to have the potential to achieve this balance. Therefore, this study proposes a self-attention multi-scale convolutional neural network (SAMSCNN) method for SAR image despeckling. The advantage of the SAMSCNN method is that it considers both multi-scale feature extraction and a channel attention mechanism for the multi-scale fused features. In the SAMSCNN method, multi-scale features are extracted from SAR images through convolution layers of different depths. These are concatenated, and an attention mechanism is then introduced to assign different weights to features of different scales, yielding weighted multi-scale fused features. Finally, the despeckled SAR image is generated through global residual noise reduction and image structure fine-tuning. The despeckling experiments in this study involved a variety of scenes using simulated and real data. The performance of the proposed model was analysed using quantitative and qualitative evaluation methods and compared to the probabilistic patch-based (PPB), SAR block-matching 3-D (SAR-BM3D) and SAR-CNN methods. The experimental results show that the proposed method improves on the objective indices and shows clear advantages in visual quality compared to these classical methods. The method proposed in this study can provide key technical support for the practical application of SAR images.
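The abstract describes the pipeline only at a high level (multi-scale feature extraction, channel attention over the concatenated features, and a global residual step), so the following is a minimal PyTorch sketch of that idea rather than the authors' architecture; branch depths, channel widths and the squeeze-and-excitation-style attention design are assumptions for illustration.

# Minimal sketch of the pipeline described in the abstract: multi-scale feature
# extraction, channel attention over the fused features, and a global residual
# that predicts the speckle component to subtract. Layer counts, channel widths
# and the attention design are assumptions, not the paper's configuration.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style weighting of the fused multi-scale channels."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                      # global average pooling per channel
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                 # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)


class MultiScaleDespeckler(nn.Module):
    """Branches of different depths extract multi-scale features; their
    concatenation is re-weighted by channel attention; the head predicts a
    residual (speckle estimate) subtracted from the input via a global skip."""
    def __init__(self, feats: int = 32, depths=(2, 4, 6)):
        super().__init__()

        def branch(depth: int) -> nn.Sequential:
            layers = [nn.Conv2d(1, feats, 3, padding=1), nn.ReLU(inplace=True)]
            for _ in range(depth - 1):
                layers += [nn.Conv2d(feats, feats, 3, padding=1), nn.ReLU(inplace=True)]
            return nn.Sequential(*layers)

        self.branches = nn.ModuleList(branch(d) for d in depths)
        fused = feats * len(depths)
        self.attention = ChannelAttention(fused)
        self.head = nn.Conv2d(fused, 1, 3, padding=1)     # residual (noise) estimate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([b(x) for b in self.branches], dim=1)
        residual = self.head(self.attention(fused))
        return x - residual                               # global residual connection


if __name__ == "__main__":
    noisy = torch.rand(1, 1, 64, 64)                      # toy single-channel SAR patch
    print(MultiScaleDespeckler()(noisy).shape)            # torch.Size([1, 1, 64, 64])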

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Projects Funded by the Central Government to Guide Local Scientific and Technological Development [22ZY1QA005], the National Natural Science Foundation of China [42201459], the Young Doctoral Fund Project of Higher Education Institutions in Gansu Province [2022QB-058], a Research Project of the Transportation Department of Gansu Province [2021-31] and the Key R & D Programs – Industrial [21YF11GA008].

