Synergy of physics-based reasoning and machine learning in biomedical applications: towards unlimited deep learning with limited data

Article: 1582361 | Received 29 Aug 2018, Accepted 04 Feb 2019, Published online: 14 Mar 2019

References

  • Toga AW, Kuttler KG, Simpson KJ, et al. Automated detection of physiologic deterioration in hospitalized patients. J Am Med Inform Assoc. 2015 Nov;22:204–256.
  • Denaxas SC, Morley KI. Big biomedical data and cardiovascular disease research: opportunities and challenges. Eur Heart J - Qual Care Clin Outcomes. 2015 July 1;1:9–16.
  • Cano A. A survey on graphic processing unit computing for large-scale data mining. WIREs Data Min Knowl Discovery. 2018;8:e1232.
  • Grover P, Kar AK. Big data analytics: a review on theoretical contributions and tools used in literature. J Flex Syst Manag. 2017;18:203.
  • LeCun Y, Bengio Y, Hinton G. Deep learning. Nature. 2015 May;521:436–444.
  • Deng L, Yu D. Deep learning: methods and applications. Found Trends Signal Process. 2014 June;7:197–387.
  • Friedman J. Greedy function approximation: a gradient boosting machine. Ann Stat. 2001 Oct;29:1189–1232.
  • Chen T, Guestrin C. XGBoost: a scalable tree boosting system. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining; 2016 Aug 13–17; San Francisco, CA, USA. p. 785–794.
  • Bishop CM. Pattern recognition and machine learning. New York, NY: Springer-Verlag; 2006.
  • Gavrishchaka VV. Boosting-based frameworks in financial modeling: application to symbolic volatility forecasting. In: Fomby TB, Terrell D, editors. Econometric analysis of financial and economic time series advances in econometrics. Vol. 20, part 2. Bingley, UK: Emerald Group Publishing Limited; 2006. p. 123–151.
  • Gavrishchaka VV, Koepke ME, Ulyanova ON. Ensemble learning frameworks for the discovery of multi-component quantitative models in biomedical applications. Proc ICCMS. 2010;4:329–336.
  • Gavrishchaka VV, Senyukova OV. Robust algorithmic detection of cardiac pathologies from short periods of RR data. In: Pham TD, Jaim LC, editors. Knowledge-based systems in biomedicine and computational life science, studies in computational intelligence. Vol. 450. Heidelberg, Germany: Springer; 2013. p. 137–153.
  • Senyukova OV, Gavrishchaka VV. Ensemble decomposition learning for optimal utilization of implicitly encoded knowledge in biomedical applications. Proc Comput Intell Bioinf. 2011;4:69–73.
  • Senyukova O, Gavrishchaka V, Koepke M. Universal multi-complexity measures for physiological state quantification in intelligent diagnostics and monitoring systems. In: Pham TD, Ichikawa K, Oyama-Higa M, et al., editors. Biomedical informatics and technology, ACBIT 2013, communications in computer and information science. Vol. 404. Berlin, Germany: Springer-Verlag Berlin; 2014. p. 76–90.
  • Voss A, Schulz S, Schroeder R, et al. Methods derived from nonlinear dynamics for analysing heart rate variability. Philos Trans R Soc A. 2009 Jan;367:277–296.
  • Belair J, Glass L, An der Heiden U, et al. Dynamical disease: mathematical analysis of human illness. New York: AIP Press; 1995.
  • Kantz H, Schreiber T. Nonlinear time series analysis. Cambridge, UK: Cambridge University Press; 1997.
  • Schapire RE. The design and analysis of efficient learning algorithms [PhD dissertation]. Cambridge, MA: Massachusetts Institute of Technology; 1992.
  • Friedman J, Hastie T, Tibshirani R. Additive logistic regression: a statistical view of boosting (With discussion and a rejoinder by the authors). Ann Stat. 2000 Apr;28:337–407.
  • Gavrishchaka V, Senyukova O, Davis K. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics. In: Sun C, Bednardz T, Pham TD, et al., editors. Advances in experimental medicine and biology. Vol. 823. Cham, Switzerland: Springer International Publishing; 2015. p. 107–126.
  • Hinton GE, Salakhutdinov RR. Reducing the dimensionality of data with neural networks. Science. 2006 July;313:504–507.
  • Gehring J, Miao Y, Metze F, et al. Extracting deep bottleneck features using stacked auto-encoders. International Conference on Acoustics, Speech, and Signal Processing; 2013 May 26–30; Vancouver, Canada. p. 3377–3381.
  • Gavrishchaka V, Yang Z, Miao R, et al. Advantages of hybrid deep learning frameworks in applications with limited data. Int J Mach Learn Comput. 2018;8:549–558.
  • Che Z, Purushotham S, Khemani R, et al. Distilling knowledge from deep networks with applications to healthcare domain. arXiv:1512.03542 [stat.ML]. 2015 Dec 11.
  • Zhou Z-H, Feng J. Deep forest: towards an alternative to deep neural networks. Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence; 2017 Aug 19–25; Melbourne, Australia. p. 3553–3559.
  • Filonenko E, Seeram E. Big data: the next era of informatics and data science in medical imaging: a literature review. J Clin Exp Radiol. 2018;1:1.
  • Ågren R. On metabolic networks and multi-omics integration [PhD thesis]. Gothenburg, Sweden: Department of Chemical and Biological Engineering, Chalmers University of Technology; 2013.
  • Grimbs S. Towards structure and dynamics of metabolic networks [PhD thesis]. Potsdam, Germany: University of Potsdam; 2009.
  • Saha R, Chowdhury A, Maranas CD. Recent advances in the reconstruction of metabolic models and integration of omics data. Curr Opin Biotechnol. 2014;29:39–45.
  • Hastie T, Tibshirani R, Friedman J. The elements of statistical learning, Springer Series in Statistics. New York, NY: Springer New York Inc; 2001.
  • Fiete IR. Learning and coding in biological neural networks [PhD thesis]. Cambridge, MA: Harvard University; 2003.
  • Aljadeff J, Lansdell BJ, Fairhall AL, et al. Analysis of neuronal spike trains, deconstructed. Neuron. 2016;91:221–259. doi:10.1016/j.neuron.2016.05.039.
  • Kohonen T. Self-organization and associative memory. New York: Springer; 1989.
  • Rumelhart DE, Hinton GE, Williams RJ. Learning representations by back-propagating errors. Nature. 1986 Oct;323:533–536.
  • Werbos PJ. Backpropagation through time: what it does and how to do it. Proc IEEE. 1990 Oct;78:1550–1560.
  • Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput. 1997 Nov;9(8):1735–1780.
  • Kolmogorov AN. On the representation of continuous functions of many variables by superposition of continuous functions of one variable and addition. Dokl Akad Nauk SSSR. 1957;114:953–956.
  • Cybenko G. Approximation by superpositions of a sigmoidal function. Math Control Signals Syst. 1989 Dec;2:303–314.
  • Saxe AM, McClelland JL, Ganguli S. Exact solutions to the nonlinear dynamics of learning in deep linear neural networks. arXiv:1312.6120v3 [cs.NE]. 2014 Feb 19.
  • Dziak JJ, Coffman DL, Lanza ST, et al. Sensitivity and specificity of information criteria. PeerJ Preprints. 2017;5:e1103v3. doi:10.7287/peerj.preprints.1103v3.
  • Vapnik V. The nature of statistical learning theory. Heidelberg: Springer Verlag; 1995.
  • Vapnik V. Statistical learning theory. New York: Wiley; 1998.
  • Ratsch G. Robust boosting via convex optimization: theory and applications [PhD thesis]. Potsdam, Germany: University of Potsdam; 2001.
  • Scholkopf B, Tsuda K, Vert JP, editors. Kernel methods in computational biology (computational molecular biology). Cambridge, MA: MIT Press; 2004.
  • Gavrishchaka VV. Boosting-based framework for portfolio strategy discovery and optimization. New Math Nat Comput. 2006;2:315.
  • Freund Y, Schapire RE. A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Syst Sci. 1997;55:119.
  • Valiant LG. A theory of the learnable. Commun ACM. 1984;27:1134.
  • Elder JF IV. The generalization paradox of ensembles. J Comput Graph Stat. 2003;12:853–864.
  • Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. In: Pereira F, Burges CJC, Bottou L, et al., editors. NIPS'12 Proceedings of the 25th International Conference on Neural Information Processing Systems; 2012 Dec 3–6; Lake Tahoe, NV. p. 1097–1105.
  • Banerjee S, Gavrishchaka VV. Multimoment convecting flux tube model of the polar wind system with return current and microprocesses. J Atmos Sol Terr Phys. 2007 Nov;69:2071–2080.
  • Erhan D, Bengio Y, Courville A, et al. Why does unsupervised pre-training help deep learning? J Mach Learn Res. 2010;11:625–660.
  • Shin H, Roth HR, Gao M, et al. Deep convolutional neural networks for computer-aided detection: CNN architectures, dataset characteristics and transfer learning. IEEE Trans Med Imaging. 2016 May;35:1285–1298.
  • Christodoulidis S, Anthimopoulos M, Ebner L, et al. Multisource transfer learning with convolutional neural networks for lung pattern analysis. IEEE J Biomed Health Inform. 2017 Jan;21:76–84.
  • Huang Z, Pan Z, Lei B. Transfer learning with deep convolutional neural network for SAR target classification with limited labeled data. Remote Sens. 2017 Aug;9:907.
  • Bart E, Ullman S. Single-example learning of novel classes using representation by similarity. Proceedings of the British Machine Vision Conference; 2005 September 5-8; Oxford Brookes University, Oxford.
  • Horton W, Doxas I. A low-dimensional energy-conserving state space model for substorm dynamics. J Geophys Res (Space Phys). 1996;101:27223–27237.
  • Gavrishchaka VV, Ganguli SB. Optimization of the neural-network geomagnetic model for forecasting large-amplitude substorm events. J Geophys Res. 2001;106:6247–6257.
  • Kotani K, Struzik ZR, Takamasu K, et al. Model for complex heart rate dynamics in health and diseases. Phys Rev E. 2005;72:041904.
  • Gavrishchaka VV, Ganguli SB. Volatility forecasting from multiscale and high-dimensional market data. Neurocomputing. 2003;55(1–2):285–305.
  • Gavrishchaka VV, Banerjee S. Support vector machine as an efficient framework for stock market volatility forecasting. Comput Manage Sci. 2006;3:147–160.
  • Costa M, Goldberger A, Peng C-K. Multiscale entropy analysis of biological signals. Phys Rev E. 2005;71:021906.
  • Ramanna R, Tchalekian R. Simulation: a must for autonomous driving. NVIDIA GPU Technology Conference (GTC); 2018; Washington, DC. Talk ID: S8859.
  • Goodfellow IJ, Pouget-Abadie J, Mirza M, et al. Generative adversarial nets. NIPS'14 Proceedings of the 27th International Conference on Neural Information Processing Systems; 2014 Dec 8–13; Montreal, Canada. Vol. 2, p. 2672–2680.
  • Aliper A, Plis S, Artemov A, et al. Deep learning applications for predicting pharmacological properties of drugs and drug repurposing using transcriptomic data. Mol Pharmaceutics. 2016;13:2524–2530.
  • Kadurin A, Aliper A, Kazennov A, et al. The cornucopia of meaningful leads: applying deep adversarial autoencoders for new molecule development in oncology. Oncotarget. 2017;8(7):10883–10890.
  • Peng C-K, Havlin S, Stanley HE, et al. Quantification of scaling exponents and crossover phenomena in nonstationary heartbeat time series. Chaos. 1995 Sep;5:82–87.
  • Costa M, Goldberger AL, Peng C-K. Multiscale entropy analysis of biological signals. Phys Rev E. 2005 Feb;71:021906.
  • Makowiec D, Dudkowska A, Zwierz M, et al. Scale invariant properties in heart rate signals. Acta Phys Pol B. 2006;37:1627–1639.
  • Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology. Heart rate variability: standards of measurement, physiological interpretation, and clinical use. Circulation. 1996 Mar;93:1043–1065.
  • Hausdorff JM, Mitchell SL, Firtion R, et al. Altered fractal dynamics of gait: reduced stride-interval correlations with aging and Huntington’s disease. J Appl Physiol. 1997;82:262–269.
  • Hausdorff JM, Lertratanakul A, Cudkowicz ME, et al. Dynamic markers of altered gait rhythm in amyotrophic lateral sclerosis. J Appl Physiol. 2000;88:2045–2053.
  • Damouras S, Chang MD, Sejdic E, et al. An empirical examination of detrended fluctuation analysis for gait data. Gait Posture. 2010;31:336–340.
  • Ota L, Uchitomi H, Suzuki K, et al. Relationship between fractal property of gait cycle and severity of Parkinson’s disease. IEEE/SICE International Symposium on System Integration; 2011 Dec; Kyoto, Japan.
  • Biswas AK, Scott WA, Sommerauer JF, et al. Heart rate variability after acute traumatic brain injury in children. Crit Care Med. 2000;28:3907–3912.
  • Yang AC, Hong CJ, Tsai SJ. Heart rate variability in psychiatric disorders. Taiwanese J Psychiatry. 2010;24:99–109.
  • Akinci A, Celiker A, Baykal E, et al. Heart rate variability in diabetic children: sensitivity of the time- and frequency-domain methods. Pediatr Cardiol. 1993;14:140–146.
  • Baumert M, Brechtel L, Lock J, et al. Heart rate variability, blood pressure variability, and baroreflex sensitivity in overtrained athletes. Clin J Sport Med. 2006;16:412.
  • Smrcka P, Bittner R, Vysoky P, et al. Fractal and multifractal properties of heartbeat interval series in extremal states of the human organism. Meas Sci Rev. 2003;3:13–15.
  • Senyukova O, Gavrishchaka V, Sasonko M, et al. Generic ensemble-based representation of global cardiovascular dynamics for personalized treatment discovery and optimization. In: Nguyen NT, Iliadis L, Manolopoulos Y, et al., editors. Computational Collective Intelligence: 8th International Conference, ICCCI 2016; 2016 Sep 28–30; Halkidiki, Greece. Vol. 9875. p. 197–207.
  • Onnela J-P, Chakraborti A, Kaski K, et al. Dynamics of market correlations: taxonomy and portfolio analysis. Phys Rev E. 2003;68:056110.
  • Theodoridis S, Koutroumbas K. Pattern recognition. San Diego, CA: Academic Press; 1998.
  • Senyukova O, Gavrishchaka V, Tulnova K. Multi-expert evolving system for objective psychophysiological monitoring and fast discovery of effective personalized therapies. Proceedings of the IEEE Conference on Evolving and Adaptive Intelligent Systems; 2017; Ljubljana, Slovenia. p. 1–8.
  • Hausdorff J, Zemany L, Peng C, et al. Maturation of gait dynamics: stride-to-stride variability and its temporal organization in children. J Appl Physiol. 1999;86:1040–1047.
  • Hausdorff J, Lertratanakul A, Cudkowicz ME, et al. Dynamic markers of altered gait rhythm in amyotrophic lateral sclerosis. J Appl Physiol. 2000;88:2045–2053.