
Error analysis of the moving least-squares regression learning algorithm with β-mixing and non-identical sampling

Pages 1586-1602 | Received 21 Jun 2018, Accepted 19 Jun 2019, Published online: 04 Jul 2019

