Research Article

Lightweight Spatial-Spectral Network Based on 3D-2D Multi-Group Feature Extraction Module for Hyperspectral Image Classification

Pages 3607-3634 | Received 18 Jan 2023, Accepted 31 May 2023, Published online: 04 Jul 2023

ABSTRACT

Each pixel in a hyperspectral image contains detailed spectral information across tens to hundreds of narrow bands captured by a hyperspectral sensor. Hyperspectral image classification (HSIC) is widely used in remote sensing image analysis. The convolutional neural network (CNN) is one of the most commonly used deep learning methods for visual data processing, and in recent years CNN-based hyperspectral classification methods have shown excellent performance. However, single-scale feature extraction may lose important detail information and cannot guarantee that the best spatial features are captured. Multi-scale feature extraction and increasingly deep network structures can improve classification accuracy, but they also introduce a large number of parameters, resulting in higher computational costs. In response to these problems, this paper proposes a lightweight spatial-spectral network based on a 3D-2D multi-group feature extraction module (MGFM) for HSIC. In the 3D-MGFM, the input feature maps are grouped and processed in parallel, and the information of each channel is then integrated by point-wise convolution to achieve spatial-spectral feature fusion. Dilated convolutions with different dilation rates are applied to the convolution kernels to mitigate the loss of feature information, allowing deep-level spatial-spectral features to be learned more effectively. In the 2D-MGFM, spectral feature information is selectively emphasized through SENet, and multi-scale features are extracted using grouped depthwise separable convolutions (DSC). Experimental results on four hyperspectral datasets demonstrate that the proposed method achieves better classification performance than other state-of-the-art methods.
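The 3D-MGFM described above — channel groups processed in parallel by dilated 3D convolutions, then fused by a point-wise convolution — can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the class name `MGFM3D`, the group count, and the choice of dilation rates 1..groups are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class MGFM3D(nn.Module):
    """Illustrative sketch (not the paper's code) of a 3D multi-group
    feature extraction module: input channels are split into groups, each
    group is processed by a 3D convolution with a different dilation rate,
    and a point-wise (1x1x1) convolution fuses the concatenated outputs
    for spatial-spectral feature fusion."""

    def __init__(self, channels: int, groups: int = 4):
        super().__init__()
        assert channels % groups == 0
        g = channels // groups
        # one 3D conv per group; dilation rates 1..groups are assumed values
        # (padding = dilation keeps the spatial/spectral size unchanged)
        self.branches = nn.ModuleList(
            nn.Conv3d(g, g, kernel_size=3, padding=d, dilation=d)
            for d in range(1, groups + 1)
        )
        # point-wise convolution integrating information across all channels
        self.fuse = nn.Conv3d(channels, channels, kernel_size=1)

    def forward(self, x):  # x: (N, C, bands, H, W)
        parts = torch.chunk(x, len(self.branches), dim=1)
        out = torch.cat([b(p) for b, p in zip(self.branches, parts)], dim=1)
        return self.fuse(out)

# toy batch of hyperspectral patches: 2 samples, 8 channels, 16 bands, 9x9
x = torch.randn(2, 8, 16, 9, 9)
y = MGFM3D(8, groups=4)(x)
print(tuple(y.shape))  # → (2, 8, 16, 9, 9)
```

Because `padding` equals `dilation` for a 3x3x3 kernel, every branch preserves the input size, so the parallel group outputs can be concatenated back to the original channel count before fusion.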

Disclosure statement

The authors declare no conflict of interest.

Additional information

Funding

This work was supported in part by the National Natural Science Foundation of China under Grant 62071168, in part by the Natural Science Foundation of Jiangsu Province under Grant BK20211201, and in part by the China Postdoctoral Science Foundation under Grant 2021M690885.

