Research Article

NAS-YOLOX: a SAR ship detection using neural architecture search and multi-scale attention

Pages 1-32 | Received 30 Apr 2023, Accepted 06 Sep 2023, Published online: 04 Oct 2023
 

Abstract

Owing to its all-weather capability and high resolution, synthetic aperture radar (SAR) ship detection has been widely applied in military, civilian, and other domains. However, SAR-based ship detection suffers from limitations such as strong target scattering, multiple scales, and background interference, leading to low detection accuracy. To address these limitations, this paper presents a novel SAR ship detection method, NAS-YOLOX, which leverages the efficient feature fusion of the neural architecture search feature pyramid network (NAS-FPN) and the effective feature extraction of a multi-scale attention mechanism. Specifically, NAS-FPN replaces the PAFPN in the baseline YOLOX, greatly enhancing the fusion of multi-scale feature information, and a dilated convolution feature enhancement module (DFEM) is designed and integrated into the backbone network to enlarge the receptive field and strengthen target information extraction. Furthermore, a multi-scale channel-spatial attention (MCSA) mechanism is conceptualised to sharpen the focus on target regions, improve small-scale target detection, and adapt to multi-scale targets. Extensive experiments on the benchmark datasets HRSID and SSDD demonstrate that NAS-YOLOX achieves comparable or superior performance relative to other state-of-the-art ship detection models, reaching best accuracies of 91.1% and 97.2% AP0.5, respectively.
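The DFEM's claimed benefit rests on a standard property of dilated convolutions: stacking them enlarges the effective receptive field without adding parameters. As a minimal sketch of that property (the specific dilation rates 1, 2, 4 are illustrative assumptions, not taken from the paper), the receptive field of a stride-1 convolution stack grows by (k − 1) × dilation per layer:

```python
def receptive_field(layers):
    """Effective receptive field of a stride-1 convolution stack.

    layers: list of (kernel_size, dilation) tuples.
    With all strides equal to 1, each layer adds
    (kernel_size - 1) * dilation to the receptive field.
    """
    rf = 1
    for kernel_size, dilation in layers:
        rf += (kernel_size - 1) * dilation
    return rf

# Three plain 3x3 convolutions: receptive field 7.
plain = receptive_field([(3, 1), (3, 1), (3, 1)])

# Same depth and parameter count with hypothetical dilation
# rates 1, 2, 4: receptive field more than doubles to 15.
dilated = receptive_field([(3, 1), (3, 2), (3, 4)])

print(plain, dilated)  # 7 15
```

This is why a DFEM-style module can cover large ship targets and more surrounding context at the same network depth, which the abstract credits for improved target information extraction.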

Additional information

Funding

This research was funded by the agreement for the 2022 Graduate Top Innovative Talents Training Program at Shanghai Maritime University [grant number: 2022YBR005] and the Top-notch Innovative Talent Training Program for Graduate students of Shanghai Maritime University [grant number: 2021YBR008].