
A text classification method based on LSTM and graph attention network

Pages 2466-2480 | Received 16 Jun 2022, Accepted 17 Sep 2022, Published online: 06 Oct 2022
