A text classification method based on a convolutional and bidirectional long short-term memory model

Pages 2108–2124 | Received 07 Mar 2022, Accepted 02 Jul 2022, Published online: 12 Jul 2022
