An enhanced text classification model by the inverted attention orthogonal projection module

Article: 2173145 | Received 03 Oct 2022, Accepted 19 Jan 2023, Published online: 11 Feb 2023
