Research Article

Lightweight multilayer interactive attention network for aspect-based sentiment analysis

Article: 2189119 | Received 12 Aug 2022, Accepted 06 Mar 2023, Published online: 03 Apr 2023
 

Abstract

Aspect-based sentiment analysis (ABSA) aims to automatically identify the sentiment polarity of specific aspect words in a given sentence or document. Existing studies have recognised the value of interactive learning in ABSA and have developed various methods that model aspect words and their contexts interactively. However, most of these methods model the interaction between aspect words and their contexts only shallowly, which can miss complex sentiment information. To address this issue, we propose a Lightweight Multilayer Interactive Attention Network (LMIAN) for ABSA. Specifically, we first employ a pre-trained language model to initialise the word embedding vectors. Second, an interactive computational layer is designed to build correlations between aspect words and their contexts; the degree of correlation is computed by multiple computational layers equipped with neural attention models. Third, we share parameters among the computational layers, which allows the model to learn complex sentiment features at a lower memory cost. Finally, we evaluate LMIAN on six publicly available sentiment analysis datasets. Extensive experiments show that LMIAN outperforms other advanced methods while consuming relatively little memory.
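The abstract does not spell out the implementation, but the core idea (an interactive attention layer applied repeatedly with shared parameters, so depth adds no parameters) can be illustrated with a minimal PyTorch sketch. The class names, the mean-pooled summaries, the residual updates, and the bilinear scoring are our own illustrative assumptions, not the authors' actual architecture:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedInteractiveAttention(nn.Module):
    """One interactive attention step between context and aspect.

    A single instance is reused across all layers (parameter sharing),
    so memory cost stays flat as the number of layers grows.
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear projections used to score each side against the other.
        self.w_c = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_a = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, context: torch.Tensor, aspect: torch.Tensor):
        # context: (batch, n_ctx, d); aspect: (batch, n_asp, d)
        ctx_summary = context.mean(dim=1, keepdim=True)  # (batch, 1, d)
        asp_summary = aspect.mean(dim=1, keepdim=True)   # (batch, 1, d)

        # Context words attend to the aspect summary, and vice versa.
        ctx_scores = torch.bmm(self.w_c(context), asp_summary.transpose(1, 2))
        asp_scores = torch.bmm(self.w_a(aspect), ctx_summary.transpose(1, 2))
        ctx_attn = F.softmax(ctx_scores, dim=1)  # (batch, n_ctx, 1)
        asp_attn = F.softmax(asp_scores, dim=1)  # (batch, n_asp, 1)

        # Residual updates so that stacked applications refine both sides.
        return context + ctx_attn * context, aspect + asp_attn * aspect


class LMIANSketch(nn.Module):
    """Hypothetical multilayer wrapper: one shared layer applied N times."""

    def __init__(self, hidden_dim: int = 768, num_layers: int = 3,
                 num_classes: int = 3):
        super().__init__()
        self.num_layers = num_layers
        self.interact = SharedInteractiveAttention(hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, context: torch.Tensor, aspect: torch.Tensor):
        # context/aspect would come from a pre-trained LM, e.g. BERT.
        for _ in range(self.num_layers):
            context, aspect = self.interact(context, aspect)
        pooled = torch.cat([context.mean(dim=1), aspect.mean(dim=1)], dim=-1)
        return self.classifier(pooled)  # polarity logits
```

Because `self.interact` is a single module invoked in a loop, increasing `num_layers` deepens the interaction without adding parameters, which is one plausible reading of the "lightweight" parameter-sharing strategy described above.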

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the National Natural Science Foundation of China [grant number 62076006], the University Synergy Innovation Programme of Anhui Province [grant number GXXT-2021-008], the Anhui Provincial Key R&D Programme [grant number 202004b11020029], and the Scientific Research Fund for Young Teachers of Anhui University of Science & Technology [grant number QNZD2021-02].