Research Article

Self-calibrated dilated convolutional neural networks for SAR image despeckling

Pages 6483-6508 | Received 06 Apr 2022, Accepted 20 Oct 2022, Published online: 15 Nov 2022
 

ABSTRACT

The synthetic aperture radar (SAR) imaging process is inherently disturbed by speckle noise due to its coherent imaging mechanism. Speckle noise severely degrades SAR image quality, which significantly limits practical applications. Recently, convolutional neural networks (CNNs) have shown good potential for various image processing tasks. In this article, we propose a self-calibrated dilated convolutional neural network for SAR image despeckling, called SAR-SCDCNN. The main body of SAR-SCDCNN is formed by seven self-calibrated blocks (SeCaBlocks). First, in each SeCaBlock, the input features are split into two branches: one represents the contextual features in the original space, and the other represents those in the long-range space. Then, a down-up sampling operation and convolutions with hybrid dilation rates are employed to enlarge the receptive field. In the second branch, calibration weights are adaptively extracted from the features themselves. The two branch features are then concatenated and fused through a convolution. Finally, a skip connection between the input and output of each SeCaBlock gives full play to the expressive power of the deep network and enhances training stability. Experiments on synthetic speckled and real SAR images are conducted for objective quantitative and subjective visual evaluations of image quality. The results show that the proposed method effectively suppresses speckle noise while adequately preserving detailed features.
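The SeCaBlock described above can be sketched roughly as follows. This is an illustrative NumPy toy with a single block, two feature channels, and random weights; it is not the authors' implementation, and the names (`seca_block`, `conv2d`, `down2`, `up2`), the kernel sizes, dilation rates, and the use of a sigmoid to form the calibration weights are all assumptions made for the sketch.

```python
import numpy as np

def conv2d(x, k, dilation=1):
    """'Same'-padded 2-D convolution of one channel with an odd-sized kernel."""
    kh, kw = k.shape
    ph = (dilation * (kh - 1) + 1) // 2
    pw = (dilation * (kw - 1) + 1) // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * xp[i * dilation : i * dilation + x.shape[0],
                                j * dilation : j * dilation + x.shape[1]]
    return out

def down2(x):
    """2x2 average-pool downsampling."""
    H, W = x.shape
    return x[: H // 2 * 2, : W // 2 * 2].reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3))

def up2(x, shape):
    """Nearest-neighbour upsampling back to the original spatial size."""
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)[: shape[0], : shape[1]]

def seca_block(x, rng):
    """One SeCaBlock sketch on a feature map x of shape (2, H, W)."""
    sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
    k = lambda: rng.standard_normal((3, 3)) * 0.1  # random 3x3 kernels (toy weights)
    x1, x2 = x[0], x[1]                            # split input into two branches
    # Branch 1: contextual features in the original space,
    # hybrid dilation rates (1 then 2) to enlarge the receptive field.
    y1 = conv2d(conv2d(x1, k(), dilation=1), k(), dilation=2)
    # Branch 2: long-range space via down-up sampling; the sigmoid yields
    # calibration weights extracted adaptively from the features themselves.
    w = sigmoid(up2(conv2d(down2(x2), k()), x2.shape))
    y2 = conv2d(x2, k()) * w                       # self-calibration gating
    # Concatenate the two branch features and fuse through a 1x1 convolution.
    cat = np.stack([y1, y2])                       # shape (2, H, W)
    fuse = rng.standard_normal((2, 2)) * 0.1       # 1x1 conv as a channel mix
    out = np.tensordot(fuse, cat, axes=1)
    return out + x                                 # skip connection around the block

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 32, 32))
y = seca_block(x, rng)
print(y.shape)  # the block is shape-preserving: (2, 32, 32)
```

Because the skip connection and the 'same'-padded convolutions keep spatial dimensions unchanged, seven such blocks can be chained directly, as the abstract's overall network does.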

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Natural Science Foundation of Heilongjiang Province under Grant F2018008 and Grant JJ2019LH2160, the Foundation for Distinguished Young Scholars of Harbin under Grant 2017RAYXJ016, the Fundamental Research Funds for the Central Universities under Grant 3072021CFT0602, and the Fundamental Research Funds for the Central Universities–Research and Innovation Fund for Doctoral Students.
