Research Article

Artificial Neural Networks for Educational Data Mining in Higher Education: A Systematic Literature Review


References

  • Abdel-Hamid, O., et al. 2014. Convolutional neural networks for speech recognition. IEEE/ACM Transactions on Audio, Speech, and Language Processing 22 (10):1533–45.
  • Abidi, S. M. R., M. Hussain, Y. Xu, and W. Zhang. 2018. Prediction of confusion attempting algebra homework in an intelligent tutoring system through machine learning techniques for educational sustainable development. Sustainability (Switzerland) 11:1. doi:https://doi.org/10.3390/su11010105.
  • Acevedo, Y. V. N., and C. E. M. Marín. 2014. System architecture based on learning analytics to educational decision makers toolkit. Advances in Computer Science and Engineering 13 (2):89–105.
  • Ali, L., M. Asadi, D. Gaševic, J. Jovanovic, and M. Hatala. 2013. Factors influencing beliefs for adoption of a learning analytics tool: An empirical study. Computers & Education 62 (2013):130–48. doi:https://doi.org/10.1016/j.compedu.2012.10.023.
  • Ali, L., M. Hatala, D. Gaševic, and J. Jovanovic. 2012. A qualitative evaluation of evolution of a learning analytics tool. Computers & Education 58 (1):470–89. doi:https://doi.org/10.1016/j.compedu.2011.08.030.
  • Alpaydin, E. 2010. Introduction to Machine Learning. London: The MIT Press.
  • Avella, J. T., M. Kebritchi, S. G. Nunn, T. Kanai, L. Smith, and J. Meinzen-Derr. June 2016. Learning analytics methods, benefits, and challenges in higher education: A systematic literature review. Online Learning 20 (4):201–11. doi:https://doi.org/10.24059/olj.v20i4.737.
  • Bahadır, E. June 2016. Using neural network and logistic regression analysis to predict prospective mathematics teachers’ academic success upon entering graduate education. Educational Sciences: Theory & Practice 16:943–64.
  • Barneveld, A. V., K. E. Arnold, and J. P. Campbell. 2012. Analytics in higher education: establishing a common language. Educause Learning Initiative ELI Paper 1.
  • Bassel, G. W., E. Glaab, J. Marquez, M. J. Holdsworth, and J. Bacardit. 2011. Functional network construction in arabidopsis using rule-based machine learning on large-scale data sets. The Plant Cell 23 (9):3101–16. doi:https://doi.org/10.1105/tpc.111.088153.
  • Bengio, Y. 2012. Practical recommendations for gradient-based training of deep architectures. In Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, ed. G. Montavon, G. B. Orr, and K. R. Müller, vol. 7700. 437–78. Berlin, Heidelberg: Springer. doi: https://doi.org/10.1007/978-3-642-35289-8_26.
  • Bengio, Y., A. Courville, and P. Vincent. 2013. Representation learning: a review and new perspectives. IEEE Transactions on Pattern Analysis and Machine Intelligence 35 (8):1798–828. doi:https://doi.org/10.1109/TPAMI.2013.50.
  • Bengio, Y., N. Boulanger-Lewandowski, and R. Pascanu, “Advances in optimizing recurrent networks”. IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, Canada, 8624–28, 2013.
  • Bengio, Y., R. Ducharme, P. Vincent, and C. Janvin. March 2003. A neural probabilistic language model. Journal of Machine Learning Research 3:1137–55.
  • Bernard, J., T. Chang, E. Popescu, and S. Graf, “Using artificial neural networks to identify learning styles”, International Conference on Artificial Intelligence in Education (AIED 2015), Madrid, Spain, 541–44, 2015.
  • Borkar, S., and K. Rajeswari. 2014. Attributes selection for predicting students’ academic performance using education data mining and artificial neural network. International Journal of Computer Applications 86 (10). doi:https://doi.org/10.5120/15022-3310.
  • Brocardo, M. L., I. Traore, I. Woungang, and M. S. Obaidat. 2017. Authorship verification using deep belief network systems. Int J Commun Syst 30 (12):e3259. doi:https://doi.org/10.1002/dac.3259.
  • Budgaga, W., M. Malensek, S. Pallickara, N. Harvey, S. Pallickara, and S. Pallickara. 2016. Predictive analytics using statistical, learning, and ensemble methods to support real-time exploration of discrete event simulations. Future Generation Computer Systems 56:360–74. doi:https://doi.org/10.1016/j.future.2015.06.013.
  • Burbaite, R., V. Stuikys, and R. Damasevicius (2013). Educational robots as collaborative learning objects for teaching computer science. IEEE International Conference on System Science and Engineering, London, UK, 211–16. doi:https://doi.org/10.1109/ICSSE.2013.6614661
  • Campagni, R., D. Merlini, R. Sprugnoli, and M. C. Verri. 2015. Data mining models for student careers. Expert Systems with Applications. 42:5508–21.
  • Chatti, M. A., and U. Schroeder. 2012. Design and implementation of a learning analytics toolkit for teachers. Educational Technology & Society 15 (3):58–76.
  • Chatti, M. A., V. Lukarov, H. Thüs, A. Muslim, A. M. F. Yousef, U. Wahid, C. Greven, A. Chakrabarti, and U. Schroeder. 2014. Learning analytics: Challenges and future research directions. eleed 10.
  • Chen, C. 2010. Curriculum assessment using artificial neural networks and support vector machine modelling approaches – A case study. IR Applications 29 (December 15):35–36.
  • Chicco, D., P. Sadowski, and P. Baldi, “Deep autoencoder neural networks for gene ontology annotation predictions”, 5th ACM Conference on Bioinformatics, Computational Biology, and Health Informatics (BCB ’14), California, USA, 533–40, 2014.
  • Cho, K., B. Van Merrienboer, C. Gulcehre, F. Bougares, H. Schwenk, and Y. Bengio. 2014. Learning phrase representations using RNN encoder-decoder for statistical machine translation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 1724–34. Doha, Qatar: Association for Computational Linguistics.
  • Choi, E., A. Schuetz, W. F. Stewart, and J. Sun. 2016. Using recurrent neural network models for early detection of heart failure onset. Journal of the American Medical Informatics Association 24 (2):361–70. PMID 27521897.
  • Ciresan, D., A. Giusti, L. M. Gambardella, and J. Schmidhuber, “Mitosis detection in breast cancer histology images using deep neural networks”. Proceedings MICCAI, Nagoya, Japan, 2013.
  • Cireşan, D., U. Meier, J. Masci, and J. Schmidhuber. 2011a. Multi-column deep neural network for traffic sign classification. Neural Networks. Selected Papers from IJCNN 32:333–38. doi:https://doi.org/10.1016/j.neunet.2012.02.023.
  • Ciresan, D., U. Meier, and J. Schmidhuber, “Multi-column deep neural networks for image classification”, 2012a IEEE Conference on Computer Vision and Pattern Recognition, 3642–49.
  • Ciresan, D., U. Meier, and J. Schmidhuber, “Multi-column deep neural networks for image classification”, 2012b IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 3642–49.
  • Ciresan, D. C., U. Meier, J. Masci, L. M. Gambardella, and J. Schmidhuber, “Flexible, high performance convolutional neural networks for image classification”. International Joint Conference on Artificial Intelligence, Barcelona, Spain, 2011a.
  • Cireşan, D. C., U. Meier, L. M. Gambardella, and J. Schmidhuber. 2010a. Deep, big, simple neural nets for handwritten digit recognition. Neural Computation 22 (12):3207–20.
  • Coates, A., H. Lee, and A. Y. Ng, “An analysis of single-layer networks in unsupervised feature learning”. International Conference on Artificial Intelligence and Statistics (AISTATS), Fort Lauderdale, USA, 2011.
  • Cooper, A. 2012. A brief history of analytics. JISC CETIS Analytics Series 1 (9): 21.
  • Cooper, A. R. 2013. Learning analytics interoperability - a survey of current literature and candidate standards. Cetis 1 (1): 2017.
  • Courville, A., J. Bergstra, and Y. Bengio, “A spike and slab restricted Boltzmann machine”, JMLR: Workshop and Conference Proceedings, Washington, USA, 15:233–41, 2011.
  • Courville, A., J. Bergstra, and Y. Bengio, “Unsupervised models of images by spike-and-slab RBMs”, 28th International Conference on Machine Learning, Washington, USA, 1–8, 2012.
  • Ciresan, D., A. Giusti, L. M. Gambardella, and J. Schmidhuber. 2012. Deep neural networks segment neuronal membranes in electron microscopy images. In Advances in Neural Information Processing Systems 25, ed. F. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger, 2843–51. Curran Associates, Inc.
  • Dahl, G., D. Yu, L. Deng, and A. Acero. 2012. Context-dependent pre-trained deep neural networks for large-vocabulary speech recognition. IEEE Transactions on Audio, Speech, and Language Processing 20 (1):30–42. doi:https://doi.org/10.1109/TASL.2011.2134090.
  • Damaševičius, R. (2009). Analysis of academic results for informatics course improvement using association rule mining. Paper presented at the Information Systems Development: Towards a Service Provision Society, Paphos, Cyprus, 357-363. doi:https://doi.org/10.1007/b137171_37
  • Dascalu, M., C. Bodea, R. I. Mogos, A. Purnus, and B. Tesila (2018). A survey on social learning analytics: Applications, challenges and importance doi:https://doi.org/10.1007/978-3-319-73459-0_5
  • Deng, L., J. Li, J.-T. Huang, K. Yao, D. Yu, F. Seide, M. Seltzer, G. Zweig, X. He, J. Williams, Y. Gong, and A. Acero. “Recent advances in deep learning for speech research at Microsoft”. 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, Canada, 8604–08.
  • Deng, L. 2014. Deep learning: methods and applications. Foundations and Trends in Signal Processing 7 (3–4):1–199. doi:https://doi.org/10.1561/2000000039.
  • Deng, L., and D. Yu. 2011. Deep convex net: A scalable architecture for speech pattern classification. Interspeech 2011: 2285–88.
  • Deng, L., D. Yu, and J. Platt, “Scalable stacking and learning for building deep architectures”, 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Kyoto, Japan, 2133–36, 2012.
  • Deng, L., G. Hinton, and B. Kingsbury. 2013. New types of deep neural network learning for speech recognition and related applications: An overview. 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Vancouver, Canada.
  • Deng, L., G. Tur, X. He, and D. Hakkani-Tür. 2012. Use of Kernel Deep Convex Networks and End-To-End Learning for Spoken Language Understanding. In IEEE Workshop on Spoken Language Technologies, December 2012, pp. 210-215.
  • Deng, L., O. Abdel-Hamid, and D. Yu. 2013. A deep convolutional neural network using heterogeneous pooling for trading acoustic invariance with phonetic confusion. 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, BC, Canada, 26-31 May 2013, pp. 8599-8603, IEEE.
  • Dietz-Uhler, B., and J. E. Hurn. 2013. Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning 12 (1). Spring 2013.
  • Drachsler, H., and W. Greller, “Privacy and analytics – It’s a DELICATE issue. A checklist to establish trusted learning analytics”, 6th Learning Analytics and Knowledge Conference, April 25–29, 2016, Edinburgh, UK, 89–98. doi:https://doi.org/10.1145/2883851.2883893.
  • Edwards, C. 2015. Growing pains for deep learning. Communications of the ACM 58 (7):14–16. doi:https://doi.org/10.1145/2771283.
  • Elkahky, A. M., Y. Song, and X. He. 2015. A multi-view deep learning approach for cross domain user modeling in recommendation systems. In Proceedings of the 24th International Conference on World Wide Web (WWW '15). International World Wide Web Conferences Steering Committee, Republic and Canton of Geneva, CHE, 278–288.
  • Fan, B., L. Wang, F. K. Soong, and L. Xie. 2015. Photo-Real Talking Head with Deep Bidirectional LSTM. In 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 19-24 April 2015, South Brisbane, QLD, Australia, pp. 4884-4888, IEEE.
  • Ferguson, R. 2012. Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning 4(5/6):304–17. January 2012. doi:https://doi.org/10.1504/IJTEL.2012.051816.
  • Ferreira, S. A., and A. Andrade. August 2014. Academic analytics: Mapping the genome of the university. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje 9 (3):98–105.
  • Fischer, A., and C. Igel. 2014. Training restricted Boltzmann machines: An introduction. Pattern Recognition 47:25–39.
  • Forouzanfar, M., H. R. Dajani, V. Z. Groza, M. Bolic, and S. Rajan, Comparison of feed-forward neural network training algorithms for oscillometric blood pressure estimation, 4th Int. Workshop Soft Computing Applications. Arad, Romania: IEEE, 2014.
  • Fournier, H., R. Kop, and H. Sitlia, “The value of learning analytics to networked learning on a personal learning environment”, 1st International Conference Learning Analytics and Knowledge, February 27-March 1, 2011, Banff, Alberta, Canada.
  • Gao, J., X. He, S. W. Yih, and L. Deng. 2014. Learning Continuous Phrase Representations for Translation Modeling. Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), June 2014, Baltimore, Maryland, USA, pp. 699–709. Association for Computational Linguistics.
  • Gers, F. A., and J. Schmidhuber. 2001. LSTM recurrent networks learn simple context free and context sensitive languages. IEEE TNN 12 (6):1333–40. doi:https://doi.org/10.1109/72.963769.
  • Gopnik, A. 2017. Making AI more human: Artificial intelligence has staged a revival by starting to incorporate what we know about how children learn. Scientific American 316 (6):60–65. doi:https://doi.org/10.1038/scientificamerican0617-60.
  • Gordon, N. A. 2016. Issues in retention and attainment in Computer Science. AdvancedHE, UK: Higher Education Academy.
  • Govindarajan, K., V. S. Kumar, D. Boulanger, and Kinshuk. 2016. Learning analytics solution for reducing learners’ course failure rate. In 2015 IEEE Seventh International Conference on Technology for Education (T4E), Warangal, India, 10–12 Dec. 2015, 83–90. IEEE.
  • Grasso, F., A. Luchetta, and S. Manetti. 2018. A multi-valued neuron based complex ELM neural network. Neural Processing Letters 48 (1):389–401. doi:https://doi.org/10.1007/s11063-017-9745-9.
  • Graupe, D. 2013. Principles of Artificial Neural Networks. 3rd Edition. Singapore: World Scientific Publishing Co Pte Ltd, 253–74.
  • Graupe, D., 2016. “Deep learning neural networks: Design and case studies”. World Scientific Publishing Co Inc. 57–110.
  • Graves, A., G. Wayne, M. Reynolds, T. Harley, I. Danihelka, A. Grabska-Barwińska, S. G. Colmenarejo, E. Grefenstette, T. Ramalho, and J. Agapiou. 2016. Hybrid computing using a neural network with dynamic external memory. Nature 538 (7626):471–76. doi:https://doi.org/10.1038/nature20101.
  • Greller, W., and H. Drachsler. 2012. Translating learning into numbers: toward a generic framework for learning analytics. Educational Technology and Society 15 (3):42–57.
  • Guenaga, M., and P. Garaizar. 2016 August. From analysis to improvement: Challenges and opportunities for learning analytics. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje 11(3): 146 - 147.
  • Hastie, T., R. Tibshirani, and J. Friedman. 2009. The Elements of Statistical Learning. New York: Springer-Verlag.
  • Hinton, G., L. Deng, D. Yu, G. Dahl, A.-R. Mohamed, N. Jaitly, A. Senior, V. Vanhoucke, P. Nguyen, T. Sainath, et al. 2012b. Deep neural networks for acoustic modeling in speech recognition — The shared views of four research groups. IEEE Signal Processing Magazine. 29(6):82–97. doi:https://doi.org/10.1109/MSP.2012.2205597.
  • Hinton, G., L. Deng, D. Yu, G. E. Dahl, A. R. Mohamed, N. Jaitly, A. Senior, V. Vanhoucke, P. Nguyen, and T. Sainath. 2012a. Deep neural networks for acoustic modeling in speech recognition: The Shared views of four research groups. IEEE Signal Processing Magazine 29 (6):82–97.
  • Hinton, G., and R. Salakhutdinov. 2012. A better way to pretrain deep Boltzmann machines. Advances in Neural Information Processing Systems 25.
  • Hinton, G. E. 2010. A practical guide to training restricted boltzmann machines. UTML TR. Tech. Rep 2010–13 29 (6).
  • Hinton, G. E., N. Srivastava, A. Krizhevsky, I. Sutskever, and R. R. Salakhutdinov. 2012. Improving neural networks by preventing co-adaptation of feature detectors. arXiv preprint arXiv:1207.0580.
  • Hoel, T., and W. Chen, “Learning analytics interoperability – Looking for low-hanging fruits”. In C.-C. Liu, H. Ogata, S. C. Kong, and A. Kashihara (Eds.), 22nd International Conference on Computers in Education, 3 (1):139–58. Nara, Japan, December 2014.
  • Hoel, T., and W. Chen, “Privacy in learning analytics – Implications for system architecture”. In T. Watanabe and K. Seta (Eds.), 11th International Conference on Knowledge Management (ICKM 2015), Osaka, Japan, 4–6 November 2015.
  • Hoel, T., and W. Chen. 2015a. Privacy-driven design of learning analytics applications – Exploring the design space of solutions for data sharing and interoperability. Journal of Learning Analytics, 139–58.
  • Höst, M., and A. Oručević-Alagić. 2011. A systematic review of research on open source software in commercial software product development. Information and Software Technology 53 (6):616–24. doi:https://doi.org/10.1016/j.infsof.2010.12.009.
  • Huang, P.-S., X. He, J. Gao, L. Deng, A. Acero, and L. Heck. “Learning deep structured semantic models for web search using clickthrough data”. Microsoft Research, 2013.
  • Hutchinson, B., L. Deng, and D. Yu. 2012. Tensor deep stacking networks. IEEE Transactions on Pattern Analysis and Machine Intelligence 35 (8):1944–57.
  • Hutter, M. 2012. One decade of universal artificial intelligence. Theoretical Foundations of Artificial General Intelligence, Atlantis Thinking Machines, vol. 4, Atlantis Press, 67–88. ISBN 978-94-91216-61-9.
  • Kalchbrenner, N., and P. Blunsom, “Recurrent continuous translation models”, EMNLP’2013.
  • Karamouzis, S. T., and A. Vrettos. 2008. An artificial neural network for predicting student graduation outcomes. World Congress on Engineering and Computer Science (WCECS 2008), San Francisco, USA, October 22–24, 991–94.
  • Kaur, P., M. Singh, and G. S. Josan. 2015. Classification and prediction based data mining algorithms to predict slow learners in education sector. Procedia Computer Science 57:500–08. doi:https://doi.org/10.1016/j.procs.2015.07.372.
  • Kitchenham, B., S. Charters, and L. Kuzniarz. August 2015. Guidelines for conducting systematic mapping studies in software engineering. Journal of Information and Software Technology Archive. 64(C):1–18. doi: https://doi.org/10.1016/j.infsof.2015.03.007.
  • Kobayashi, V., S. T. Mol, and G. Kismihók. 2014. Labour market driven learning analytics. Journal of Learning Analytics 1 (3):207–10. doi:https://doi.org/10.18608/jla.2014.13.24.
  • Komenda, M., M. Víta, C. Vaitsis, D. Schwarz, A. Pokorná, N. Zary, and L. Dušek. 2015. Curriculum mapping with academic analytics in medical and healthcare education. PLoS ONE 10 (12):e0143748. doi:https://doi.org/10.1371/journal.pone.0143748.
  • Kraft-Terry, S., and C. Kau. 2016. Manageable steps to implementing data-informed advising. NACADA Clearinghouse.
  • Krizhevsky, A., I. Sutskever, and G. Hinton. 2012. ImageNet classification with deep convolutional neural networks. NIPS 2012: Neural Information Processing Systems, Lake Tahoe, Nevada.
  • Kumar, G., and K. Kumar. 2012. The use of artificial-intelligence-based ensembles for intrusion detection: A review. Applied Computational Intelligence and Soft Computing 2012, Article ID 850160. doi:https://doi.org/10.1155/2012/850160.
  • Langley, P. 2011b. The changing science of machine learning. Machine Learning 82 (3):275–79.
  • Li, X., and X. Wu. 2015. Constructing long short-term memory based deep recurrent neural networks for large vocabulary speech recognition. In 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 4520–24. IEEE. doi:https://doi.org/10.1109/ICASSP.2015.7178826.
  • Lu, H., K. N. Plataniotis, and A. N. Venetsanopoulos. 2011. A survey of multilinear subspace learning for tensor data. Pattern Recognition 44 (7):1540–51. doi:https://doi.org/10.1016/j.patcog.2011.01.004.
  • Macfadyen, L. P., and S. Dawson. 2012. Numbers are not enough. why e-learning analytics failed to inform an institutional strategic plan. Educational Technology & Society 15 (3):149–63.
  • Marcus, G. 2017. Am I human?: Researchers need new ways to distinguish artificial intelligence from the natural kind. Scientific American 316 (3):58–63. doi:https://doi.org/10.1038/scientificamerican0317-58.
  • Marsh, J. A., J. F. Pane, and L. S. Hamilton, “Making sense of data-driven decision making in education: Evidence from recent RAND Research”, Santa Monica, CA, 2006.
  • Martinez, H. P., Y. Bengio, and G. N. Yannakakis. 2013. Learning deep physiological models of affect. IEEE Computational Intelligence Magazine 8 (2):20–33. doi:https://doi.org/10.1109/MCI.2013.2247823.
  • Mesnil, G., Y. Dauphin, K. Yao, Y. Bengio, L. Deng, D. Hakkani-Tur, X. He, L. Heck, G. Tur, D. Yu, and G. Zweig. “Using recurrent neural networks for slot filling in spoken language understanding”. IEEE Transactions on Audio, Speech, and Language Processing. 23 ( 3): 530–39, 2015.
  • Mikolov, T., M. Karafiát, L. Burget, J. Cernocký, and S. Khudanpur. 2010. Recurrent neural network based language model. In INTERSPEECH 2010, ed. T. Kobayashi, K. Hirose, and S. Nakamura, 1045–48. ISCA.
  • Mnih, V., K. Kavukcuoglu, D. Silver, A. A. Rusu, J. Veness, M. G. Bellemare, A. Graves, M. Riedmiller, A. K. Fidjeland, G. Ostrovski et al. 2015. Human-level control through deep reinforcement learning. Nature. 518(7540):529–33. doi:https://doi.org/10.1038/nature14236.
  • Mohamed, A., G. Dahl, and G. Hinton. 2012. Acoustic modeling using deep belief networks. IEEE Transactions on Audio, Speech, and Language Processing 20 (1):14–22. doi:https://doi.org/10.1109/TASL.2011.2109382.
  • Mohri, M., A. Rostamizadeh, and A. Talwalkar. 2012a. Foundations of Machine Learning. USA, Massachusetts: MIT Press.
  • Mohri, M., A. Rostamizadeh, and A. Talwalkar. 2012c. Foundations of Machine Learning. ISBN 9780262018258. USA, Massachusetts: MIT Press.
  • Mu, S., S. Shibata, T. Yamamoto, S. Goto, S. Nakashima, and K. Tanaka (2020). Experimental study on learning of neural network using particle swarm optimization in predictive fuzzy for pneumatic servo system doi:https://doi.org/10.1007/978-3-030-04946-1_32
  • Mukherjee, S., P. Niyogi, T. Poggio, and R. Rifkin. 2006. “Learning theory: Stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization”, Advances in Computational Mathematics. 25:161–93.
  • Murphy, K. P. 2012. Machine Learning: A Probabilistic Perspective. Cambridge, MA, USA: MIT Press.
  • Nguyen, A., J. Yosinski, and J. Clune, “Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images.” arXiv:1412.1897, 2014.
  • Oancea, B., R. Dragoescu, and S. Ciucu, “Predicting students’ results in higher education using neural networks”, Conference Paper. April 2013, Applied Information And Communication Technologies, Jelgava, Latvia, 2013.
  • Ognjanovic, I., D. Gasevic, and S. Dawson. 2016. Using institutional data to predict student course selections in higher education. Internet and Higher Education 29:49–62. doi:https://doi.org/10.1016/j.iheduc.2015.12.002.
  • Ojha, V. K., A. Abraham, and V. Snášel. 2017. Metaheuristic design of feedforward neural networks: A review of two decades of research. Engineering Applications of Artificial Intelligence 60:97–116.
  • Okewu, E., and O. Daramola (2017), “Design of a learning analytics system for academic advising in nigerian universities”, IEEE International Conference on Computing, Networking and Informatics (IEEE ICCNI 2017), Lagos, Nigeria, October 29 – 31
  • Oudeyer, P.-Y. 2010. On the impact of robotics in behavioral and cognitive sciences: From insect navigation to human cognitive development. IEEE Transactions on Autonomous Mental Development 2 (1):2–16. doi:https://doi.org/10.1109/TAMD.2009.2039057.
  • Palmer, S. 2013. Modelling engineering student academic performance using academic analytics. International Journal of Engineering Education 29 (1):132–38.
  • Papamitsiou, Z., and A. Economides. 2014. Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Educational Technology & Society 17 (4):49–64.
  • Pardo, A., and G. Siemens. April 2014. Ethical and privacy principles for learning analytics. British Journal of Educational Technology. 45(3):438–50. doi: https://doi.org/10.1111/bjet.12152.
  • Pavlin-Bernardić, N., S. Ravić, and I. P. Matić. 2016. The application of artificial neural networks in predicting children’s giftedness. Suvremena Psihologija 19 (1):49–59. doi:https://doi.org/10.21465/2016-SP-191-04.
  • Plauska, I., and R. Damaševičius. 2014. Educational robots for internet-of-things supported collaborative learning. In Information and software technologies. ICIST 2014. communications in computer and information science, ed. G. Dregvaite and R. Damasevicius, vol. 465, 346–58. Cham: Springer.
  • Poggio, T., S. Rakhlin, A. Caponnetto, and R. Rifkin. 2012. Statistical learning theory and applications. Lecture notes, class 2. https://cbmm.mit.edu/sites/default/files/documents/Class01_2019.pdf
  • Qianyin, X., and L. Bo. 2015. Multiple evaluation models for education based on artificial neural networks. International Journal of Hybrid Information Technology 8 (9):1–10. doi:https://doi.org/10.14257/ijhit.2015.8.9.01.
  • Qu, S., K. Li, S. Zhang, and Y. Wang. 2018. Predicting achievement of students in smart campus. IEEE Access 6:60264–73. doi:https://doi.org/10.1109/ACCESS.2018.2875742.
  • Rajani, S. 2011. Artificial intelligence – Man or machine. International Journal of Information Technology and Knowledge Management 4 (1):173–76.
  • Retalis, S., A. Papasalouros, Y. Psaromiligkos, S. Siscos, and T. Kargidis. 2006. Proceedings of the Fifth International Conference on Networked Learning, 1–8.
  • Salakhutdinov, R., J. B. Tenenbaum, and A. Torralba. 2013. Learning with hierarchical-deep models. IEEE Transactions on Pattern Analysis and Machine Intelligence 35 (8):1958–71.
  • Sainath, T. N., A. R. Mohamed, B. Kingsbury, and B. Ramabhadran, “Deep convolutional neural networks for LVCSR”. 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, Canada, 8614–18, 2013.
  • Sak, H., A. Senior, and F. Beaufays (2014a). “Long short-term memory recurrent neural network architectures for large scale acoustic modeling”. in Proceedings of the 15th Annual Conference of the International Speech Communication Association: Celebrating the Diversity of Spoken Languages, INTERSPEECH 2014, pp. 338–42, September 2014.
  • Sak, H., A. Senior, and F. Beaufays, (2014b). “Long short-term memory recurrent neural network architectures for large scale acoustic modeling”.
  • Scheffel, M., H. Drachsler, and M. Specht, “Developing an evaluation framework of quality indicators for learning analytics”. In Fifth International Conference on Learning Analytics and Knowledge (LAK ’15), 16–20. Poughkeepsie, NY, USA: ACM, 2015.
  • Scheffel, M., H. Drachsler, S. Stoyanov, and M. Specht. 2014. Quality indicators for learning analytics. Educational Technology and Society 17 (4):16–20.
  • Schmidhuber, J. 2015a. Deep learning in neural networks: An overview. Neural Networks 61:85–117.
  • Schmidhuber, J. 2015c. Deep learning in neural networks: An overview. Neural Networks 61:85–117.
  • Schmidhuber, J. 2015d. Deep learning in neural networks: an overview. Neural Networks 61:85–117.
  • Sclater, N., (2014), “Code of practice for learning analytics.” https://www.jisc.ac.uk/sites/default/files/jd0040_code_of_practice_for_learning_analytics_190515_v1.pdf
  • Seddon, J. 2008. Systems thinking in the public sector: the failure of the reform regime and a manifesto for a better way. Axminster: Triarchy Press.
  • Shahiri, A. M., W. Husain, and N. A. Rashid. 2015. A review on predicting student’s performance using data mining techniques. Procedia Computer Science 72:414–22. doi:https://doi.org/10.1016/j.procs.2015.12.157.
  • Shen, Y., X. He, J. Gao, L. Deng, and G. Mesnil. 2014. “A latent semantic model with convolutional-pooling structure for information retrieval”. Microsoft Research.
  • Shum, S. B., and R. Deakin Crick, “Learning dispositions and transferable competencies: Pedagogy, modelling and learning analytics”. In Proc. 2nd International Conference on Learning Analytics & Knowledge, Vancouver, BC, Canada, 29 April–2 May 2012, 92–101. New York: ACM, 2012.
  • Sidhu, G., and B. Caffo. 2014. Exploiting pitcher decision-making using reinforcement learning. The Annals of Applied Statistics 8 (2):926–55. Institute of Mathematical Statistics.
  • Siemens, G., D. Gasevic, C. Haythornthwaite, S. Dawson, S. B. Shum, R. Ferguson, E. Duval, K. Verbert, and R. S. J. D. Baker, “Open learning analytics: An integrated and modularized platform”, 2011.
  • Silver, D., A. Huang, C. J. Maddison, A. Guez, L. Sifre, G. Van Den Driessche, J. Schrittwieser, I. Antonoglou, V. Panneershelvam, M. Lanctot, et al. 2016. Mastering the game of Go with deep neural networks and tree search. Nature 529 (7587):484–89. doi:https://doi.org/10.1038/nature16961. PMID 26819042.
  • Sin, K., and L. Muthu. July 2015. Application of big data in education data mining and learning analytics – A literature review. ICTACT Journal On Soft Computing: Special Issue On Soft Computing Models For Big Data 5(4):04. doi: https://doi.org/10.21917/ijsc.2015.0145.
  • Singer, N., “Privacy concerns for ClassDojo and other tracking apps for school children”, New York Times, 2014.
  • Siri, A. 2015. Predicting students’ dropout at university using artificial neural networks. Italian Journal Of Sociology Of Education 7 (2): 225-247.
  • Slade, S., and P. Prinsloo. March 2013. Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist 57(10):1510–29. doi: https://doi.org/10.1177/0002764213479366.
  • Socher, R., “Recursive deep models for semantic compositionality over a sentiment treebank”. EMNLP 2013.
  • Socher, R., and C. Lin, “Parsing natural scenes and natural language with recursive neural networks”, 26th International Conference on Machine Learning, Bellevue, Washington, USA, 2011.
  • Štuikys, V., R. Burbaite, and R. Damaševičius. 2013. Teaching of computer science topics using meta-programming-based glos and lego robots. Informatics in Education 12 (1):125–42. doi:https://doi.org/10.15388/infedu.2013.09.
  • Suchithra, R., V. Vaidhehi, and N. E. Iyer. February 2015. Survey of learning analytics based on purpose and techniques for improving student performance. International Journal of Computer Applications 111 (1).
  • Şusnea, E. 2010. Using artificial neural networks in e-learning systems. U.P.B. Scientific Bulletin, Series C, 72 (4):91–100.
  • Sutskever, I., O. Vinyals, and Q. V. Le, “Sequence to sequence learning with neural networks”. NIPS’2014a.
  • Sutskever, I., O. Vinyals, and Q. Le. 2014b. Sequence to sequence learning with neural networks. Proc. NIPS, 3104–12.
  • Sze, V., Y. Chen, T. Yang, and J. Emer, “Efficient processing of deep neural networks: A tutorial and survey”. Proceedings of the IEEE 105 (12):2295–2329, 2017.
  • Szegedy, C., A. Toshev, and D. Erhan. 2013. Deep neural networks for object detection. Advances in Neural Information Processing Systems, 2553–61.
  • Tahmasebi, H., and A. Hezarkhani. 2012. A hybrid neural networks-fuzzy logic-genetic algorithm for grade estimation. Computers & Geosciences 42:18–27. doi:https://doi.org/10.1016/j.cageo.2012.02.004.
  • Takle, P. R., and N. Gawai. 2015. Identification of student’s behavior in higher education from social media by using opinion based memetic classifier. International Journal on Recent and Innovation Trends in Computing and Communication 3 (3):1074–78.
  • Tankelevičiene, L., and R. Damaševičius. 2010. Towards the development of genuine intelligent ontology-based e-learning systems. In IEEE International Conference on Intelligent Systems (IS 2010), London, UK, 79–84. doi:https://doi.org/10.1109/IS.2010.5548384.
  • Tkachenko, Y. 2015. Autonomous CRM control via CLV approximation with deep reinforcement learning in discrete and continuous action space. arXiv preprint arXiv:1504.01840.
  • Unterthiner, T., A. Mayr, G. Klambauer, and S. Hochreiter. 2015. Toxicity prediction using deep learning.
  • Vahdat, M., A. Ghio, L. Oneto, D. Anguita, M. Funk, and M. Rauterberg. 2015. Advances in learning analytics and educational data mining. In Proceedings of ESANN 2015, European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Bruges, Belgium, April 22–24.
  • Vapnik, V. N., and A. Y. Chervonenkis. 1971. On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and Its Applications 16 (2):264–80. doi:https://doi.org/10.1137/1116025.
  • Vincent, P., H. Larochelle, I. Lajoie, Y. Bengio, and P. Manzagol. 2010. Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion. The Journal of Machine Learning Research 11:3371–408.
  • Virvou, M., E. Alepis, and S. Sidiropoulos. A learning analytics tool for supporting teacher decision.
  • Action research and learning analytics in higher education. June 2014. Faculty of Mathematics, Computer Science and Natural Sciences.
  • Wallach, I., M. Dzamba, and A. Heifets. 2015. AtomNet: A deep convolutional neural network for bioactivity prediction in structure-based drug discovery.
  • Wang, T., and M. S. Y. Jong. 2016. Towards equitable quality education for all: Are MOOCs really a way out? In Proceedings of the 20th Global Chinese Conference on Computers in Education. Hong Kong: The Hong Kong Institute of Education.
  • Wanli, X., G. Rui, P. Eva, and G. Sean. 2015. Participation-based student final performance prediction model through interpretable genetic programming: Integrating learning analytics, educational data mining and theory. Computers in Human Behavior 47:168–81. doi:https://doi.org/10.1016/j.chb.2014.09.034.
  • Waxman, J. A., D. Graupe, and D. W. Carley. 2010. Automated prediction of apnea and hypopnea, using a LAMSTAR artificial neural network. American Journal of Respiratory and Critical Care Medicine 181 (7):727–33. doi:https://doi.org/10.1164/rccm.200907-1146OC.
  • Widrow, B., A. Greenblatt, Y. Kim, and D. Park. 2013. The no-prop algorithm: A new learning algorithm for multilayer neural networks. Neural Networks 37:182–88. doi:https://doi.org/10.1016/j.neunet.2012.09.020.
  • Yassine, S., S. Kadry, and M. Sicilia. 2016. A framework for learning analytics in Moodle for assessing course outcomes. In 2016 IEEE Global Engineering Education Conference (EDUCON), Abu Dhabi, United Arab Emirates, 261–66. doi:https://doi.org/10.1109/EDUCON.2016.7474563.
  • Yorek, N., and I. Ugulu. 2015. A CFBPN artificial neural network model for educational qualitative data analyses: Example of students’ attitudes based on Kellert’s typologies. Educational Research and Reviews 10 (18):2606–16.
  • Yu, D., and L. Deng. 2010. Roles of pre-training and fine-tuning in context-dependent DBN-HMMs for real-world speech recognition. In NIPS Workshop on Deep Learning and Unsupervised Feature Learning.
  • Yu, D., and L. Deng. 2014. Automatic speech recognition: A deep learning approach. London: Springer. ISBN 978-1-4471-5779-3.
  • Yuan, G., C. Ho, and C. Lin. 2012. Recent advances of large-scale linear classification. Proceedings of the IEEE 100 (9):2584–603. doi:https://doi.org/10.1109/JPROC.2012.2188013.
  • Zen, H., and H. Sak. 2015. Unidirectional long short-term memory recurrent neural network with recurrent output layer for low-latency speech synthesis. In Proceedings of ICASSP 2015, 4470–74.
  • Zengin, K., N. Esgi, E. Erginer, and M. E. Aksoy. 2011. A sample study on applying data mining research techniques in educational science: Developing a more meaning of data. Procedia - Social and Behavioral Sciences 15:4028–32. doi:https://doi.org/10.1016/j.sbspro.2011.04.408.
  • Zhong, S., Y. Liu, and Y. Liu. 2011. Bilinear deep learning for image classification. In Proceedings of the 19th ACM International Conference on Multimedia (MM ’11), 343–52. New York, NY, USA: ACM.
  • Zhu, W., J. Miao, and L. Qing. 2014. Constrained extreme learning machine: A novel highly discriminative random feedforward neural network. In 2014 International Joint Conference on Neural Networks (IJCNN), 800–07, Beijing, China.
  • Zhu, W., J. Miao, L. Qing, and G. B. Huang. 2015. Hierarchical extreme learning machine for unsupervised representation learning. In 2015 International Joint Conference on Neural Networks (IJCNN), 1–8, Killarney, Ireland.
  • Zissis, D., E. K. Xidias, and D. Lekkas. 2015. A cloud based architecture capable of perceiving and predicting multiple vessel behaviour. Applied Soft Computing 35:652–61. doi:https://doi.org/10.1016/j.asoc.2015.07.002.
