Research Article

ISAR imaging enhancement: exploiting deep convolutional neural network for signal reconstruction

Pages 9447-9468 | Received 31 Mar 2020, Accepted 26 Jun 2020, Published online: 28 Oct 2020

ABSTRACT

Since radar imaging can be cast as an ill-posed inverse problem, it is appealing to introduce deep learning in place of regularized iterative algorithms to address the uncertainties present in the inverse synthetic aperture radar (ISAR) forward model. The obtained image quality is usually limited by the sparse representation and denoising performance achievable for the target scene. This paper presents a deep learning technique for radar echo signal recovery with enhanced ISAR imaging capability. If the convolutional neural network (CNN) is regarded as an imaging processor, its input and output are the echo data and the formed image, respectively. The trained CNN contains a variety of feature extractors that automatically extract and abstract high-level image feature representations. The main difficulty of applying CNNs to radar imaging, in fact, is that a large amount of appropriate training data cannot be obtained as easily as in other fields. A novel deep CNN (DCNN)-based method, U-net-based imaging, is proposed for ISAR image reconstruction using fewer training samples than existing CNN imaging networks. Several improvements are presented to adapt the network to imaging tasks. Experimental simulations are carried out to demonstrate the remarkable performance of the proposed method. In the presence of random noise and echo undersampling, the trained U-net achieves fast and accurate reconstruction of high-quality ISAR images.
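The abstract does not specify the network configuration, so the following is only a minimal sketch of the general idea: a U-net (here a hypothetical PyTorch class named EchoUNet, with illustrative depth and channel widths) that maps a complex echo matrix, fed as a two-channel real/imaginary tensor, directly to a single-channel formed image. None of the names or hyperparameters below come from the paper.

    # Minimal sketch, assuming the complex ISAR echo is split into
    # real/imaginary channels and the network outputs an image magnitude.
    # Depth, channel widths, and shapes are illustrative assumptions.
    import torch
    import torch.nn as nn

    def conv_block(in_ch, out_ch):
        # Two 3x3 convolutions with ReLU: the basic U-net building block.
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    class EchoUNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.enc1 = conv_block(2, 32)      # echo real/imag -> features
            self.enc2 = conv_block(32, 64)
            self.bottleneck = conv_block(64, 128)
            self.pool = nn.MaxPool2d(2)
            self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
            self.dec2 = conv_block(128, 64)    # 64 (skip) + 64 (upsampled)
            self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
            self.dec1 = conv_block(64, 32)
            self.head = nn.Conv2d(32, 1, 1)    # single-channel ISAR image

        def forward(self, x):
            e1 = self.enc1(x)
            e2 = self.enc2(self.pool(e1))
            b = self.bottleneck(self.pool(e2))
            # Skip connections concatenate encoder features with
            # upsampled decoder features at matching resolutions.
            d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
            d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
            return self.head(d1)

    # Example: a batch of four 128x128 (possibly undersampled) echo matrices.
    net = EchoUNet()
    echo = torch.randn(4, 2, 128, 128)   # channels = [real, imag]
    image = net(echo)                    # (4, 1, 128, 128) formed image

Such a network would typically be trained with a pixelwise loss (e.g. mean squared error) between the network output and a reference ISAR image; the skip connections are what allow reasonable reconstructions from comparatively few training samples.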

Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61571388). The authors are very grateful to the editor and reviewers for their constructive comments, which played an important role in improving this work.

Disclosure statement

No potential conflict of interest was reported by the authors.

