
Improving human activity recognition via graph attention network with linear discriminant analysis and residual learning

Hu, Lingyue
Zhao, Kailong
Ling, Bingo Wing-Kuen
Liang, Shangsong
Wei, Yiting
Department
Natural Language Processing
Type
Journal article
Date
2025
Language
English
Abstract
This paper proposes a model based on the graph attention network (GAT) with linear discriminant analysis (LDA) and residual learning, termed AResGAT, for improving human activity recognition (HAR). By applying an adversarial loss, the approach addresses the challenge of data scarcity and improves the robustness of the classification against adversarial attacks arising from operation in adversarial environments, thereby improving the reliability of the classification. Moreover, by constructing the graph from minimum spanning trees (MSTs), the proposed method efficiently captures the complex interactions among the nodes representing different parts of the human body. Furthermore, since the gradient is computed more efficiently, convergence of the training algorithm is accelerated; hence, the AResGAT mitigates the vanishing gradient problem encountered when training deep models. Finally, numerical simulation results show that the AResGAT significantly outperforms existing models on various datasets.
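The abstract states that the graph is constructed from minimum spanning trees over nodes representing body parts. As an illustrative sketch only (the feature vectors, Euclidean distance metric, and function names below are assumptions, not the paper's implementation), Prim's algorithm over pairwise distances yields the n − 1 MST edges used as the graph's connectivity:

```python
# Illustrative sketch only: MST-based graph construction over body-part nodes.
# The feature vectors and distance metric are assumptions, not the paper's setup.
import numpy as np

def mst_adjacency(features: np.ndarray) -> np.ndarray:
    """Build a symmetric 0/1 adjacency matrix whose edges form a minimum
    spanning tree (Prim's algorithm) over pairwise Euclidean distances."""
    n = features.shape[0]
    dist = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    adj = np.zeros((n, n), dtype=int)
    in_tree = {0}
    while len(in_tree) < n:
        # pick the cheapest edge leaving the current partial tree
        u, v = min(
            ((u, v) for u in in_tree for v in range(n) if v not in in_tree),
            key=lambda e: dist[e],
        )
        adj[u, v] = adj[v, u] = 1
        in_tree.add(v)
    return adj

rng = np.random.default_rng(0)
adj = mst_adjacency(rng.normal(size=(6, 3)))  # 6 hypothetical body-part nodes
print(adj.sum() // 2)  # an MST over 6 nodes has exactly 5 edges -> 5
```

The resulting sparse adjacency could then serve as the neighborhood structure over which a GAT layer computes attention, which is one way such an MST-based graph would feed a GAT-style model.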
Citation
L. Hu, K. Zhao, B. W.-K. Ling, S. Liang, and Y. Wei, “Improving human activity recognition via graph attention network with linear discriminant analysis and residual learning,” Biomed. Signal Process. Control, vol. 100, p. 107053, Feb. 2025, doi: 10.1016/j.bspc.2024.107053.
Source
Biomedical Signal Processing and Control
Keywords
Data scarcity, Graph attention network, Human activity recognition, Linear discriminant analysis, Residual learning
Publisher
Elsevier