Research Article

Hyperspectral image classification using Walsh-Hadamard transform-based key band selection and deep convolutional neural networks

Pages 1220-1249 | Received 14 Jun 2023, Accepted 12 Jan 2024, Published online: 07 Feb 2024
 

ABSTRACT

Hyperspectral images (HSIs) have transformed the field of remote sensing by providing researchers with a wealth of information about the Earth’s surface. However, analyzing these images is challenging due to the presence of overlapping areas, nested regions, and large intra-class variability. Hyperspectral image classification (HSIC) is a crucial step in identifying the various land cover classes present in HSIs. To enhance the accuracy of HSIC, researchers exploit the potential of three-dimensional convolutional neural networks (3D-CNNs). By jointly leveraging the spectral and spatial information present in HSIs, 3D-CNNs offer a promising solution to the challenges associated with HSIC. In this paper, a new key band selection method is proposed to improve the performance of the 3D-CNN model. The proposed method selects the most relevant key bands based on Walsh-Hadamard kernel strength features. These key bands are then used to extract overlapping 3D spatial patches, which serve as input to the proposed 3D-CNN model. The 3D-CNN model is evaluated on six standard benchmark datasets, and the results show that the proposed band selection method improves the performance of the 3D-CNN for HSIC.
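To make the pipeline described above concrete, the following sketch illustrates one plausible reading of it in Python. The abstract does not define the "Walsh-Hadamard kernel strength" measure precisely, so the score used here (total energy of each band's 2D Walsh-Hadamard coefficients) is an assumption, as are the transform order, the number of selected bands, and the patch size; none of these names or values come from the paper itself.

import numpy as np
from scipy.linalg import hadamard

def band_strength_scores(cube, order=64):
    # cube: HSI array of shape (height, width, bands).
    # Score each band by the energy of its 2D Walsh-Hadamard
    # coefficients -- an assumed stand-in for the paper's
    # "kernel strength" measure.
    h_mat = hadamard(order)                  # order must be a power of 2
    scores = np.zeros(cube.shape[2])
    for b in range(cube.shape[2]):
        block = cube[:order, :order, b]      # crop to a WHT-sized block (assumption)
        coeffs = h_mat @ block @ h_mat / order
        scores[b] = np.sum(coeffs ** 2)
    return scores

def select_key_bands(cube, k=30):
    # Keep the k highest-scoring bands, returned in spectral order.
    scores = band_strength_scores(cube)
    return np.sort(np.argsort(scores)[::-1][:k])

def overlapping_patches(cube, bands, patch=11, stride=1):
    # Yield overlapping 3D spatial patches over the selected key bands.
    sub = cube[:, :, bands]
    for i in range(0, sub.shape[0] - patch + 1, stride):
        for j in range(0, sub.shape[1] - patch + 1, stride):
            yield sub[i:i + patch, j:j + patch, :]

In this reading, the patches produced by overlapping_patches, paired with the class label of their centre pixel, would form the training input to the 3D-CNN, whose kernels convolve over the two spatial dimensions and the retained spectral dimension simultaneously.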

Acknowledgements

The authors thank the Vellore Institute of Technology, Vellore, for providing support in carrying out this research work.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The data presented in this study (the PaviaU, Indian Pines, and Salinas datasets) are openly available at https://doi.org/10.1109/LGRS.2020.3043710.

Author contributions

Conceptualization, L.P.G.G. and C.M.P.V.S.S.R.; methodology, D.S. and L.P.G.G.; software, L.P.G.G.; validation, L.P.G.G. and C.M.P.V.S.S.R.; writing – original draft preparation, L.P.G.G., C.M.P.V.S.S.R., and C.S.G.; writing – review and editing, C.S.G.; visualization, L.P.G.G.; supervision, B.K.C.; project administration, B.K.C.; funding acquisition, L.P.G.G. and B.K.C. All authors have read and agreed to the published version of the manuscript.

Additional information

Funding

This research was funded by SERB-SIRE, India (grant number SIR/2022/000321) and partially supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, and Forestry (IPET) through the Open Field Smart Agriculture Technology Short-term Advancement Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) (grant number 322032-3).
