References
- Krittika, N. K. Vishvakarma, R. R. K. Sharma, and K. K. Lai, “Linking big data analytics to a few industrial applications: A conceptual review,” Journal of Information and Optimization Sciences, vol. 38, no. 6, pp. 803-812, 2017. doi: 10.1080/02522667.2017.1372130
- N. M. Masoud, “The impact of stock market performance upon economic growth,” International Journal of Economics and Financial Issues, vol. 3, no. 4, pp. 788-798, 2013.
- J. T. Connor, R. D. Martin, and L. E. Atlas, “Recurrent neural networks and robust time series prediction,” IEEE Transactions on Neural Networks, vol. 5, no. 2, pp. 240-254, 1994. doi: 10.1109/72.279188
- D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning representations by back-propagating errors,” Nature, vol. 323, no. 6088, pp. 533-536, 1986. doi: 10.1038/323533a0
- P. Kumar, S. K. Sahu, and A. P. Singh, “Generating image descriptions using capsule network,” Journal of Information and Optimization Sciences, vol. 40, no. 2, pp. 479-492, 2019. doi: 10.1080/02522667.2019.1587822
- Y. Bengio, P. Simard, and P. Frasconi, “Learning long-term dependencies with gradient descent is difficult,” IEEE Transactions on Neural Networks, vol. 5, no. 2, pp. 157-166, 1994. doi: 10.1109/72.279181
- S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Computation, vol. 9, no. 8, pp. 1735-1780, 1997. doi: 10.1162/neco.1997.9.8.1735
- D. M. Nelson, A. C. Pereira, and R. A. de Oliveira, “Stock market’s price movement prediction with LSTM neural networks,” IEEE International Joint Conference on Neural Networks (IJCNN), pp. 1419-1426, 2017.
- L. Finsveen, “Time-series predictions with Recurrent Neural Networks: Studying Recurrent Neural Networks predictions and comparing to state-of-the-art Sequential Monte Carlo methods,” Master’s thesis, NTNU, 2018.
- C. Olah, “Understanding LSTM networks,” colah.github.io, 2015.
- G. Williams, R. Baxter, H. He, S. Hawkins, and L. Gu, “A comparative study of RNN for outlier detection in data mining,” IEEE International Conference on Data Mining (ICDM), pp. 709-712, 2002.
- J. Chung, C. Gulcehre, K. Cho, and Y. Bengio, “Gated feedback recurrent neural networks,” International Conference on Machine Learning, pp. 2067-2075, 2015.
- H. Larochelle, D. Erhan, A. Courville, J. Bergstra, and Y. Bengio, “An empirical evaluation of deep architectures on problems with many factors of variation,” Proceedings of the 24th International Conference on Machine Learning, ACM, pp. 473-480, 2007.
- A. Graves and J. Schmidhuber, “Framewise phoneme classification with bidirectional LSTM and other neural network architectures,” Neural Networks, vol. 18, no. 5-6, pp. 602-610, 2005. doi: 10.1016/j.neunet.2005.06.042
- J. S. Bridle, “Probabilistic interpretation of feedforward classification network outputs, with relationships to statistical pattern recognition,” in Neurocomputing, Springer, Berlin, Heidelberg, pp. 227-236, 1990.
- “Yahoo! Finance,” Yahoo! Finance, 2012.
- G. Panchal and M. Panchal, “Review on methods of selecting number of hidden nodes in artificial neural network,” International Journal of Computer Science and Mobile Computing, vol. 3, no. 11, pp. 455-464, 2014.
- L. Golab, “Sliding Window Query Processing over Data Streams,” UWSpace, 2006.