
An integrated autoencoder-based filter for sparse big data

Pages 260–268 | Received 30 Nov 2019, Accepted 09 Apr 2020, Published online: 06 May 2020

References

  • Abraham, B., & Nair, M. S. (2018). Computer-aided diagnosis of clinically significant prostate cancer from MRI images using sparse autoencoder and random forest classifier. Biocybernetics and Biomedical Engineering, 38(3), 733–744. doi: 10.1016/j.bbe.2018.06.009
  • Alipanahi, B., Delong, A., Weirauch, M. T., & Frey, B. J. (2015). Predicting the sequence specificities of DNA- and RNA-binding proteins by deep learning. Nature Biotechnology, 33(8), 831–838. doi: 10.1038/nbt.3300
  • Araque, O., Corcuera-Platas, I., Sánchez-Rada, J. F., & Iglesias, C. A. (2017). Enhancing deep learning sentiment analysis with ensemble techniques in social applications. Expert Systems with Applications, 77, 236–246. doi: 10.1016/j.eswa.2017.02.002
  • Athey, S. (2017). Beyond prediction: Using big data for policy problems. Science, 355(6324), 483–485. doi: 10.1126/science.aal4321
  • Athey, S., Blei, D., Donnelly, R., Ruiz, F., & Schmidt, T. (2018). Estimating heterogeneous consumer preferences for restaurants and travel time using mobile location data. AEA Papers and Proceedings, 108, 64–67. doi: 10.1257/pandp.20181031
  • Baek, Y., & Kim, H. Y. (2018). ModAugNet: A new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module. Expert Systems with Applications, 113, 457–480. doi: 10.1016/j.eswa.2018.07.019
  • Bai, Y., Li, Y., Zeng, B., Li, C., & Zhang, J. (2019). Hourly PM2.5 concentration forecast using stacked autoencoder model with emphasis on seasonality. Journal of Cleaner Production, 224, 739–750. doi: 10.1016/j.jclepro.2019.03.253
  • Barfuss, W., Massara, G. P., Di Matteo, T., & Aste, T. (2016). Parsimonious modeling with information filtering networks. Physical Review E, 94(6), 062306. doi: 10.1103/PhysRevE.94.062306
  • Belohlavek, R., Outrata, J., & Trnecka, M. (2018). Toward quality assessment of Boolean matrix factorizations. Information Sciences, 459, 71–85. doi: 10.1016/j.ins.2018.05.016
  • Beyaz, A., Martínez Gila, D. M., Gómez Ortega, J., & Gámez García, J. (2019). Olive fly sting detection based on computer vision. Postharvest Biology and Technology, 150, 129–136. doi: 10.1016/j.postharvbio.2019.01.003
  • Busu, C., & Busu, M. (2019). Modeling the predictive power of the singular value decomposition-based entropy. Empirical evidence from the Dow Jones Global Titans 50 Index. Physica A: Statistical Mechanics and its Applications, 534(15), 120819. doi: 10.1016/j.physa.2019.04.055
  • Chai, J., Wang, Y., Wang, S., & Wang, Y. (2019). A decomposition–integration model with dynamic fuzzy reconstruction for crude oil price prediction and the implications for sustainable development. Journal of Cleaner Production, 229(20), 775–786. doi: 10.1016/j.jclepro.2019.04.393
  • Chan, T., Jia, K., Gao, S., Lu, J., Zeng, Z., & Ma, Y. (2015). PCANet: A simple deep learning baseline for image classification? IEEE Transactions on Image Processing, 24(12), 5017–5032. doi: 10.1109/TIP.2015.2475625
  • Chen, M., Hao, Y., Hwang, K., Wang, L., & Wang, L. (2017). Disease prediction by machine learning over big data from healthcare communities. IEEE Access, 5, 8869–8879. doi: 10.1109/ACCESS.2017.2694446
  • Deng, Y., Sander, A., Faulstich, L., & Denecke, K. (2019). Towards automatic encoding of medical procedures using convolutional neural networks and autoencoders. Artificial Intelligence in Medicine, 93, 29–42. doi: 10.1016/j.artmed.2018.10.001
  • Fayek, H. M., Lech, M., & Cavedon, L. (2017). Evaluating deep learning architectures for speech emotion recognition. Neural Networks, 92, 60–68. doi: 10.1016/j.neunet.2017.02.013
  • Fischer, T., & Krauss, C. (2018). Deep learning with long short-term memory networks for financial market predictions. European Journal of Operational Research, 270(2), 654–669. doi: 10.1016/j.ejor.2017.11.054
  • Frazier, D. T., Maneesoonthorn, W., Martin, G. M., & McCabe, B. P. M. (2019). Approximate Bayesian forecasting. International Journal of Forecasting, 35(2), 521–539. doi: 10.1016/j.ijforecast.2018.08.003
  • Fu, X., Wei, Y., Xu, F., Wang, T., Lu, Y., Li, J., & Huang, J. Z. (2019). Semi-supervised aspect-level sentiment classification model based on variational autoencoder. Knowledge-Based Systems, 171, 81–92. doi: 10.1016/j.knosys.2019.02.008
  • Görgel, P., & Simsek, A. (2019). Face recognition via deep stacked denoising sparse autoencoders (DSDSA). Applied Mathematics and Computation, 355, 325–342. doi: 10.1016/j.amc.2019.02.071
  • Grolinger, K., Capretz, M. A. M., & Seewald, L. (2016, December 5–8). Energy consumption prediction with big data: Balancing prediction accuracy and computational resources. 2016 IEEE International Congress on Big Data, Washington, DC, USA.
  • Hinton, G., Deng, L., Yu, D., Dahl, G. E., Mohamed, A.-r., Jaitly, N., Senior, A., Vanhoucke, V., Nguyen, P., Sainath, T., & Kingsbury, B. (2012). Deep neural networks for acoustic modeling in speech recognition: The shared views of four research groups. IEEE Signal Processing Magazine, 29(6), 82–97. doi: 10.1109/MSP.2012.2205597
  • Jia, F., Lei, Y., Guo, L., Lin, J., & Xing, S. (2018). A neural network constructed by deep learning technique and its application to intelligent fault diagnosis of machines. Neurocomputing, 272, 619–628. doi: 10.1016/j.neucom.2017.07.032
  • Koren, Y., Bell, R., & Volinsky, C. (2009). Matrix factorization techniques for recommender systems. Computer, 42(8), 30–37. doi: 10.1109/MC.2009.263
  • Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2012, December 3–8). ImageNet classification with deep convolutional neural networks. Paper presented at the NIPS’12: Proceedings of the 25th International Conference on Neural Information Processing Systems, North Miami Beach, FL.
  • Liu, Z., & Sullivan, C. J. (2019). Prediction of weather induced background radiation fluctuation with recurrent neural networks. Radiation Physics and Chemistry, 155, 275–280. doi: 10.1016/j.radphyschem.2018.03.005
  • Lv, Y., Duan, Y., Kang, W., Li, Z., & Wang, F. Y. (2015). Traffic flow prediction with big data: A deep learning approach. IEEE Transactions on Intelligent Transportation Systems, 16(2), 865–873. doi: 10.1109/TITS.2014.2345663
  • Lv, S.-X., Peng, L., & Wang, L. (2018). Stacked autoencoder with echo-state regression for tourism demand forecasting using search query data. Applied Soft Computing, 73, 119–133. doi: 10.1016/j.asoc.2018.08.024
  • Mahdi, M., & Genc, V. M. I. (2018). Post-fault prediction of transient instabilities using stacked sparse autoencoder. Electric Power Systems Research, 164, 243–252. doi: 10.1016/j.epsr.2018.08.009
  • Mei, L., Zixian, L., Xiaopeng, L., & Yiliu, L. (2019). Dynamic risk assessment in healthcare based on Bayesian approach. Reliability Engineering & System Safety, 189, 327–334. doi: 10.1016/j.ress.2019.04.040
  • Mohamed, A., Dahl, G. E., & Hinton, G. (2012). Acoustic modeling using deep belief networks. IEEE Transactions on Audio, Speech, and Language Processing, 20(1), 14–22. doi: 10.1109/TASL.2011.2109382
  • Peng, W., & Xin, B. (2019). A social trust and preference segmentation-based matrix factorization recommendation algorithm. EURASIP Journal on Wireless Communications and Networking, 2019(1), 272. doi: 10.1186/s13638-019-1600-4
  • Qi, L., Wang, R., Hu, C., Li, S., He, Q., & Xu, X. (2019). Time-aware distributed service recommendation with privacy-preservation. Information Sciences, 480, 354–364. doi: 10.1016/j.ins.2018.11.030
  • Qing yang, G., Shuang, W., & Ya-Ru, H. (2019). Deep learning network for multiuser detection in satellite mobile communication system. Computational Intelligence and Neuroscience, 2019, Article ID 8613639. doi: 10.1155/2019/8613639
  • Sedhain, S., Menon, A. K., Sanner, S., & Xie, L. (2015, May 18–22). AutoRec: Autoencoders meet collaborative filtering. WWW '15: 24th International World Wide Web Conference, Florence, Italy.
  • Sehovac, L., Nesen, C., & Grolinger, K. (2019, July 8–13). Forecasting building energy consumption with deep learning: A sequence to sequence approach. 2019 IEEE International Congress on Internet of Things (ICIOT), Milan, Italy.
  • Shang, J., Zheng, Y., Tong, W., Chang, E., & Yu, Y. (2014, August 24–27). Inferring gas consumption and pollution emission of vehicles throughout a city. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, NY, USA.
  • Shen, F., Chao, J., & Zhao, J. (2015). Forecasting exchange rate using deep belief networks and conjugate gradient method. Neurocomputing, 167, 243–253. doi: 10.1016/j.neucom.2015.04.071
  • Song, G., Wang, Z., Han, F., Ding, S., & Iqbal, M. A. (2018). Music auto-tagging using deep recurrent neural networks. Neurocomputing, 292, 104–110. doi: 10.1016/j.neucom.2018.02.076
  • Tănăsescu, A., & Popescu, P. G. (2019). A fast singular value decomposition algorithm of general k-tridiagonal matrices. Journal of Computational Science, 31, 1–5. doi: 10.1016/j.jocs.2018.12.009
  • Tang, J., Chen, X., Hu, Z., Zong, F., Han, C., & Li, L. (2019). Traffic flow prediction based on combination of support vector machine and data denoising schemes. Physica A: Statistical Mechanics and its Applications, 534(15), 120642. doi: 10.1016/j.physa.2019.03.007
  • Ting, F. F., Tan, Y. J., & Sim, K. S. (2019). Convolutional neural network improvement for breast cancer classification. Expert Systems with Applications, 120, 103–115. doi: 10.1016/j.eswa.2018.11.008
  • Tong, H., Liu, B., & Wang, S. (2018). Software defect prediction using stacked denoising autoencoders and two-stage ensemble learning. Information and Software Technology, 96, 94–111. doi: 10.1016/j.infsof.2017.11.008
  • Vincent, P., Larochelle, H., Bengio, Y., & Manzagol, P.-A. (2008, July 5–9). Extracting and composing robust features with denoising autoencoders. ICML '08: The 25th Annual International Conference on Machine Learning, Helsinki, Finland.
  • Vincent, P., Larochelle, H., Lajoie, I., Bengio, Y., & Manzagol, P.-A. (2010). Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion. Journal of Machine Learning Research, 11(12), 3371–3408. http://www.jmlr.org/papers/volume11/vincent10a/vincent10a.pdf
  • Wang, J., Peng, B., & Zhang, X. (2018a). Using a stacked residual LSTM model for sentiment intensity prediction. Neurocomputing, 322, 93–101. doi: 10.1016/j.neucom.2018.09.049
  • Wang, K., Qi, X., Liu, H., & Song, J. (2018b). Deep belief network based k-means cluster approach for short-term wind power forecasting. Energy, 165, 840–852. doi: 10.1016/j.energy.2018.09.118
  • Wang, K., Xu, L., Huang, L., Wang, C.-D., & Lai, J.-H. (2019). SDDRS: Stacked discriminative denoising auto-encoder based recommender system. Cognitive Systems Research, 55, 164–174. doi: 10.1016/j.cogsys.2019.01.011
  • Wei, Y., Park, J. H., Qiu, J., Wu, L., & Jung, H. Y. (2017). Sliding mode control for semi-Markovian jump systems via output feedback. Automatica, 81, 133–141. doi: 10.1016/j.automatica.2017.03.032
  • Wei, Y., Qiu, J., Karimi, H., & Ji, W. (2018). A novel memory filtering design for semi-Markovian jump time-delay systems. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 48(12), 2229–2241. doi: 10.1109/TSMC.2017.2759900
  • Wei, Y., Qiu, J., Karimi, H., & Wang, M. (2015). Model approximation for two-dimensional Markovian jump systems with state-delays and imperfect mode information. Multidimensional Systems and Signal Processing, 26(3), 575–597. doi: 10.1007/s11045-013-0276-x
  • Xu, Z., Lv, T., Liu, L., Zhang, Z., & Tan, J. (2019). A regression-type support vector machine for k-class problem. Neurocomputing, 340, 1–7. doi: 10.1016/j.neucom.2019.02.033
  • Xu, F., Tse, W. t. P., & Tse, Y. L. (2018). Roller bearing fault diagnosis using stacked denoising autoencoder in deep learning and Gath–Geva clustering algorithm without principal component analysis and data label. Applied Soft Computing, 73, 898–913. doi: 10.1016/j.asoc.2018.09.037
  • Zhang, L., Jiao, L., Ma, W., Duan, Y., & Zhang, D. (2019). PolSAR image classification based on multi-scale stacked sparse autoencoder. Neurocomputing, 351(25), 167–179. doi: 10.1016/j.neucom.2019.03.024
  • Zheng, Y., Capra, L., Wolfson, O., & Yang, H. (2014). Urban computing: Concepts, methodologies, and applications. ACM Transactions on Intelligent Systems and Technology, 5(3), 38. doi: 10.1145/2629592
