Research Article

Dual-input ultralight multi-head self-attention learning network for hyperspectral image classification

Pages 1277-1303 | Received 30 Oct 2023, Accepted 10 Jan 2024, Published online: 02 Feb 2024

ABSTRACT

In hyperspectral image (HSI) classification tasks, deep learning models have achieved remarkable success. However, most deep learning models are compute-intensive, requiring significant computing power, time, and other resources, so pursuing better results while conserving computational resources remains a challenge. Therefore, a novel dual-input ultralight multi-head self-attention learning network (DUMS-LN) is proposed for HSI classification. The proposed DUMS-LN consists of three core modules: a high-dimensional reduction module (HDRM), a lightweight multi-head self-attention (LMHSA) module, and a linearized hierarchical conversion module (LHCM). The HDRM serves as a pre-processing module that compresses the data efficiently and extracts combined spatial and spectral information from the raw input, providing cleaner and more comprehensive feature data for subsequent processing. The LMHSA module is the core computational unit of DUMS-LN; it is lightweight yet processes data more effectively than the traditional multi-head self-attention module. Finally, the LHCM divides the model into two phases, reducing the dimensionality of the feature data phase by phase so that the LMHSA module can perform feature extraction at different levels. Experiments on four benchmark HSI datasets show that the proposed DUMS-LN outperforms the comparison HSI classification algorithms in both speed and classification accuracy.
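The abstract does not specify the internals of the LMHSA module, so the following is only a minimal, hypothetical PyTorch sketch of one common way to make multi-head self-attention "lightweight": shrinking the per-head inner dimension so the query/key/value projections and attention maps cost less than in vanilla MHSA. The class name `LightweightMHSA` and the `reduction` parameter are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a lightweight multi-head self-attention block.
# NOT the paper's LMHSA: the abstract gives no internals, so this only
# illustrates the general idea of reducing the attention inner dimension.
import torch
import torch.nn as nn


class LightweightMHSA(nn.Module):
    """Multi-head self-attention with a reduced inner dimension (assumed design)."""

    def __init__(self, dim: int, num_heads: int = 4, reduction: int = 2):
        super().__init__()
        assert dim % (num_heads * reduction) == 0
        self.num_heads = num_heads
        # Smaller heads than vanilla MHSA: dim / (heads * reduction) per head.
        self.head_dim = dim // (num_heads * reduction)
        inner = self.num_heads * self.head_dim
        self.qkv = nn.Linear(dim, 3 * inner, bias=False)  # fused Q/K/V projection
        self.proj = nn.Linear(inner, dim)
        self.scale = self.head_dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g. tokens = pixels of an HSI patch.
        b, n, _ = x.shape
        qkv = self.qkv(x).reshape(b, n, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)            # each: (b, heads, n, head_dim)
        attn = (q @ k.transpose(-2, -1)) * self.scale   # (b, heads, n, n)
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.proj(out)


if __name__ == "__main__":
    x = torch.randn(2, 64, 128)        # 2 patches, 64 pixels, 128 spectral features
    y = LightweightMHSA(dim=128)(x)
    print(y.shape)                      # torch.Size([2, 64, 128])
```

With `reduction=2` and 4 heads, the fused QKV projection maps 128 features to 3 × 64 instead of 3 × 128, roughly halving the projection and attention cost while keeping the output dimension unchanged, which is the kind of trade-off a "lightweight" attention variant typically targets.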

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the National Natural Science Foundation of China under Grant 62071492.
