Research Article

A spectral-spatial attention aggregation network for hyperspectral imagery classification

Pages 7551-7580 | Received 23 Aug 2020, Accepted 24 Apr 2021, Published online: 25 Aug 2021
 

ABSTRACT

For the classification of hyperspectral imagery (HSI), convolutional neural networks (CNNs) learn discriminative spatial-spectral information better than traditional classification methods. However, when a CNN extracts HSI features through local receptive fields, the feature expression of the same pixel can become inconsistent across feature maps, which ultimately introduces noise into the classification results. To overcome this, we introduce an attention mechanism into the CNN model to improve feature expressiveness. We design a spectral-spatial attention aggregation network (SSAAN) for HSI classification that contains two attention branches. The squeeze-and-excitation spectral attention module (SESAM) automatically learns the importance of each feature channel of the HSI and, according to this importance, enhances the useful band features and suppresses the less useful ones. In the selective-kernel spatial attention module (SKSAM), 2D convolution kernels of different sizes first extract shallow, middle, and deep features from the principal components obtained after dimensionality reduction, and the pixel spatial information from the three paths is combined and aggregated; the feature maps produced by the different kernel sizes are then aggregated according to learned selection weights. Finally, the feature vectors from the spectral and spatial attention branches are concatenated to further improve the feature representation, and the classification result is obtained with a softmax function. Experimental results on three real HSI data sets show that the proposed SSAAN achieves better performance than state-of-the-art methods.
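The abstract describes two attention branches: a squeeze-and-excitation block that reweights spectral channels, and a selective-kernel block that fuses spatial features from convolutions of different sizes before the two branch vectors are concatenated and classified with softmax. The PyTorch-style sketch below is only an illustration of that general structure under our own assumptions; the module names, layer sizes, kernel choices, and fusion details are not taken from the authors' implementation.

```python
# Hypothetical sketch of the two attention branches described in the abstract.
# Not the authors' code; layer sizes and module names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SEChannelAttention(nn.Module):
    """Squeeze-and-excitation: reweight spectral (channel) features by learned importance."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):                                   # x: (B, C, H, W)
        s = x.mean(dim=(2, 3))                              # squeeze: global average pool -> (B, C)
        w = torch.sigmoid(self.fc2(F.relu(self.fc1(s))))    # excitation: per-channel weights
        return x * w.unsqueeze(-1).unsqueeze(-1)            # enhance/suppress each band feature


class SelectiveKernelSpatial(nn.Module):
    """Parallel convolutions with different kernel sizes, fused by learned selection weights."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv5 = nn.Conv2d(channels, channels, 5, padding=2)
        self.fc = nn.Linear(channels, channels // reduction)
        self.fc3 = nn.Linear(channels // reduction, channels)
        self.fc5 = nn.Linear(channels // reduction, channels)

    def forward(self, x):                                   # x: (B, C, H, W), e.g. a PCA-reduced HSI patch
        u3, u5 = self.conv3(x), self.conv5(x)                # branches with different receptive fields
        s = (u3 + u5).mean(dim=(2, 3))                       # fuse branches and squeeze -> (B, C)
        z = F.relu(self.fc(s))
        a = torch.softmax(torch.stack([self.fc3(z), self.fc5(z)], dim=1), dim=1)  # (B, 2, C)
        return (u3 * a[:, 0].unsqueeze(-1).unsqueeze(-1)
                + u5 * a[:, 1].unsqueeze(-1).unsqueeze(-1))  # aggregate maps by selection weights


class TwoBranchClassifier(nn.Module):
    """Concatenate pooled features from both branches and classify with softmax."""
    def __init__(self, channels, num_classes):
        super().__init__()
        self.spectral = SEChannelAttention(channels)
        self.spatial = SelectiveKernelSpatial(channels)
        self.head = nn.Linear(2 * channels, num_classes)

    def forward(self, x):
        fs = self.spectral(x).mean(dim=(2, 3))               # spectral-branch feature vector (B, C)
        fp = self.spatial(x).mean(dim=(2, 3))                # spatial-branch feature vector (B, C)
        logits = self.head(torch.cat([fs, fp], dim=1))       # joint representation
        return F.softmax(logits, dim=1)                      # per-class probabilities
```

As a design note, the squeeze step in both blocks uses global average pooling so that the attention weights summarize each channel over the whole patch; the actual SSAAN may use a different pooling, kernel set, or number of paths (the abstract mentions three).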

Acknowledgements

The authors would like to thank the Editor-in-Chief, the Associate Editor, the Technical Editor, and the reviewers for their insightful comments and suggestions, which significantly improved the quality and presentation of this article.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported in part by the National Natural Science Foundation of China under Grant 61977022; in part by the Science Foundation for Distinguished Young Scholars of Hunan Province under Grant 2020JJ2017; in part by the Natural Science Foundation of Hunan Province under Grant 2019JJ50211, Grant 2019JJ50212, Grant 2020JJ4340, and Grant 2020JJ4343; in part by the Key Research and Development Program of Hunan Province under Grant 2019SK2102; in part by the Foundation of Education Bureau of Hunan Province under Grant 19B245, Grant 19B237, and Grant 20B257; in part by the Engineering Research Center on 3-D Reconstruction and Intelligent Application Technology of Hunan Province under Grant 2019-430602-73-03-006049; in part by the Hunan Emergency Communication Engineering Technology Research Center under Grant 2018TP2022; and in part by the Guangxi Key Laboratory of Cryptography and Information Security under Grant GCIS201911.

