Research Article

Multi-scale spatial-spectral Transformer for spectral reconstruction from RGB images

Pages 306-324 | Received 01 Jun 2023, Accepted 26 Nov 2023, Published online: 08 Jan 2024

ABSTRACT

Spectrometers can obtain the fine spectral reflectance of substances through high-resolution spectral sampling, which offers important advantages in identifying categories of ground substances. As a complement to high-cost spectrometers, reconstruction from RGB images provides an economical and convenient way to acquire hyperspectral images. Although current mainstream reconstruction methods based on deep convolutional networks can directly learn the inverse reconstruction mapping in a data-driven way, they still fall short in representing the multi-scale and long-range spatial-spectral correlations within hyperspectral images. To address these issues, we propose a novel multi-scale spatial-spectral Transformer network for spectral reconstruction. The proposed network consists of a cascade of a multi-scale spatial contextual module and a spatial-spectral fusion Transformer module. Specifically, the multi-scale spatial contextual module performs three-level feature extraction to learn the spatial contextual structure of each spectral band from the RGB image. The spatial-spectral fusion Transformer module employs three parallel spatial-spectral united Transformers and one fusion Transformer to enhance the spatial and spectral consistency of the reconstructed spectral image. Deploying the spatial-spectral united Transformers in parallel helps reduce the over-smoothing effect caused by stacking too many Transformers. The proposed network can therefore better recover hyperspectral images with both fine spatial structures and accurate spectral signatures. Comprehensive experiments show that our method achieves better reconstruction performance than state-of-the-art methods.
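The abstract frames spectral reconstruction as learning an inverse mapping from a 3-channel RGB pixel back to a many-band spectrum. The paper's Transformer network is not reproduced here; the following is a minimal sketch of that inverse problem using a plain linear least-squares baseline on synthetic data. The band count, pixel count, smooth spectral bases, and random camera-response matrix are all illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
B = 31          # assumed number of hyperspectral bands (e.g. 400-700 nm, 10 nm steps)
N = 5000        # assumed number of training pixels

# Synthetic smooth spectra: random combinations of a few smooth sinusoidal bases.
bases = np.stack([np.sin(np.linspace(0, np.pi * (k + 1), B)) for k in range(5)])
spectra = rng.random((N, 5)) @ bases + 1.0       # (N, B) ground-truth spectra

# An assumed fixed camera response projects each spectrum down to RGB.
R = np.abs(rng.normal(size=(B, 3)))              # (B, 3) response matrix
rgb = spectra @ R                                # (N, 3) observed RGB pixels

# Baseline inverse mapping: solve rgb @ W ~= spectra in the least-squares sense.
W, *_ = np.linalg.lstsq(rgb, spectra, rcond=None)
recon = rgb @ W                                  # (N, B) reconstructed spectra

rel_err = np.linalg.norm(recon - spectra) / np.linalg.norm(spectra)
print(f"relative reconstruction error: {rel_err:.3f}")
```

Because the 3-channel RGB observation discards most spectral degrees of freedom, a linear map cannot recover the spectra exactly; that residual ambiguity is precisely what data-driven models such as the proposed spatial-spectral Transformer exploit spatial context to resolve.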

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This research was supported in part by the National Natural Science Foundation of China under Grants 62276139, U2001211, and U21B2044.

