Acta Geodaetica et Cartographica Sinica ›› 2023, Vol. 52 ›› Issue (7): 1139-1147. doi: 10.11947/j.AGCS.2023.20220540

• Special Issue of Hyperspectral Remote Sensing Technology •

Hyperspectral image classification method based on hierarchical transformer network

ZHANG Yichao1,2, ZHENG Xiangtao1,3, LU Xiaoqiang1,3   

1. Key Laboratory of Spectral Imaging Technology CAS, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119, China;
    2. University of Chinese Academy of Sciences, Beijing 100049, China;
    3. College of Physics and Information Engineering, Fuzhou University, Fuzhou 350100, China
  • Received: 2022-08-25  Revised: 2023-04-16  Published: 2023-07-31
  • Supported by:
    The National Natural Science Foundation of China (No. 62271484); The National Science Fund for Distinguished Young Scholars (No. 61925112); The Key Research and Development Program of Shaanxi (No. 2023-YBGY-225)

Abstract: Hyperspectral image classification, which assigns each pixel to a predefined land-cover category, plays a crucial role in Earth science applications such as environmental mapping. In recent years, researchers have applied deep learning frameworks to hyperspectral image classification and achieved satisfactory results; however, these methods remain limited in their ability to extract spectral features. This paper proposes a hierarchical self-attention network (HSAN) for hyperspectral image classification based on the self-attention mechanism. First, a skip-layer self-attention module is constructed for feature learning; it leverages the Transformer self-attention mechanism to capture contextual information and strengthen the contribution of relevant information. Second, a hierarchical fusion scheme is designed to further alleviate the loss of relevant information during feature learning and to enhance the interaction among features at different hierarchical levels. Experimental results on the Pavia University and Houston2013 datasets demonstrate that the proposed framework outperforms other state-of-the-art hyperspectral image classification frameworks.
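The abstract does not give architectural details, so the following is only a minimal sketch of how a Transformer-style self-attention block with skip (residual) connections and a hierarchical fusion of intermediate features might be combined for per-patch classification. All class names, layer sizes, the pooling step, and the concatenation-based fusion are illustrative assumptions for this sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SkipLayerSelfAttention(nn.Module):
    """One Transformer-style self-attention block with skip (residual) connections.
    Layer sizes are illustrative; the abstract does not specify them."""
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, dim * 2), nn.GELU(), nn.Linear(dim * 2, dim))

    def forward(self, x):
        # Self-attention with a residual (skip) connection
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h)
        x = x + attn_out
        # Feed-forward sublayer with a residual connection
        return x + self.mlp(self.norm2(x))

class HierarchicalFusionClassifier(nn.Module):
    """Stacks several attention blocks and fuses their intermediate outputs
    before classification (one rough reading of 'hierarchical fusion')."""
    def __init__(self, num_bands, dim=64, depth=3, num_classes=9):
        super().__init__()
        self.embed = nn.Linear(num_bands, dim)
        self.blocks = nn.ModuleList([SkipLayerSelfAttention(dim) for _ in range(depth)])
        self.head = nn.Linear(dim * depth, num_classes)

    def forward(self, x):
        # x: (batch, tokens, num_bands), e.g. pixels of a small spatial patch
        x = self.embed(x)
        stages = []
        for blk in self.blocks:
            x = blk(x)
            stages.append(x.mean(dim=1))   # pool tokens at each hierarchy level
        fused = torch.cat(stages, dim=-1)  # fuse features from all levels
        return self.head(fused)

# Usage: a batch of 8 patches, 25 pixels (5x5) each, 103 bands (Pavia University), 9 classes
logits = HierarchicalFusionClassifier(num_bands=103, num_classes=9)(torch.randn(8, 25, 103))
print(logits.shape)  # torch.Size([8, 9])
```

Concatenating pooled features from every block is only one way to let shallow and deep levels interact; the paper's actual fusion rule may differ.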

Key words: hyperspectral image classification, transformer, self-attention mechanism, hierarchical fusion
