Research Articles

Regularized Nyström Subsampling in Covariate Shift Domain Adaptation Problems

Pages 165-188 | Received 03 Sep 2023, Accepted 05 Feb 2024, Published online: 23 Feb 2024

References

  • Rudi, A., Camoriano, R., Rosasco, L. (2015). Less is more: Nyström computational regularization. In: Cortes, C., Lawrence, N., Lee, D., Sugiyama, M., Garnett, R., eds. Advances in Neural Information Processing Systems, Vol. 28. Curran Associates, Inc., pp. 1648–1656; also arXiv:1507.04717 [stat.ML].
  • Lu, S., Mathé, P., Pereverzyev-Jr., S. (2019). Analysis of regularized Nyström subsampling for regression function of low smoothness. Anal. Appl. 17(6):931–946. DOI: 10.1142/S0219530519500039.
  • Kriukova, G., Tkachenko, P., Pereverzyev, S. (2016). On the convergence rate and some application of a regularized ranking algorithm. J. Complexity 33:14–29. DOI: 10.1016/j.jco.2015.09.004.
  • Schölkopf, B., Smola, A. J. (2002). Learning with Kernels. Cambridge, MA: MIT Press, 648 p.
  • Hastie, T., Tibshirani, R., Friedman, J. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. New York: Springer, 745 p.
  • Vapnik, V. N. (1998). Statistical Learning Theory. New York: Wiley, 736 p.
  • Wilson, G., Cook, D. (2020). A survey of unsupervised deep domain adaptation. ACM Trans. Intell. Syst. Technol. 11:1–46. DOI: 10.1145/3400066.
  • Shimodaira, H. (2000). Improving predictive inference under covariate shift by weighting the log-likelihood function. J. Stat. Plan. Inference 90:227–244. DOI: 10.1016/S0378-3758(00)00115-4.
  • Huang, J., Gretton, A., Borgwardt, K., Schölkopf, B., Smola, A. (2006). Correcting sample selection bias by unlabeled data. Adv. Neural Inf. Process. Syst. 19:601–608.
  • Gizewski, E. R., Mayer, L., Moser, B. A., Nguyen, D. H., Pereverzyev-Jr, S., Pereverzyev, S. V., Shepeleva, N., Zellinger, W. (2022). On a regularization of unsupervised domain adaptation in RKHS. Appl. Comput. Harmon. Anal. 57:201–227. DOI: 10.1016/j.acha.2021.12.002.
  • Pereverzyev, S. (2022). An Introduction to Artificial Intelligence Based on Reproducing Kernel Hilbert Spaces. Cham: Springer, 152 p.
  • Bakushinskii, A. B. (1967). A general method of constructing regularizing algorithms for a linear ill-posed equation in Hilbert space. USSR Comput. Math. Math. Phys. 7:279–287. DOI: 10.1016/0041-5553(67)90047-X.
  • Lu, S., Pereverzev, S. V. (2013). Regularization Theory. Selected Topics. Inverse and Ill-posed Problems Series, Vol. 58. Berlin, Boston: Walter de Gruyter GmbH.
  • Mathé, P., Pereverzev, S. V. (2003). Geometry of linear ill-posed problems in variable Hilbert scales. Inverse Probl. 19:789–803. DOI: 10.1088/0266-5611/19/3/319.
  • Kanamori, T., Hido, S., Sugiyama, M. (2009). A least-squares approach to direct importance estimation. J. Mach. Learn. Res. 10:1391–1445.
  • Sugiyama, M., Müller, K.-R. (2005). Input-dependent estimation of generalization error under covariate shift. Stat. Decis. 23:249–279.
  • De Vito, E., Rosasco, L., Caponnetto, A., De Giovannini, U., Odone, F. (2005). Learning from examples as an inverse problem. J. Mach. Learn. Res. 6:883–904.
  • Mathé, P., Pereverzev, S. V. (2002). Moduli of continuity for operator valued functions. Numer. Funct. Anal. Optim. 23(5–6):623–631. DOI: 10.1081/NFA-120014755.
  • Mathé, P., Pereverzev, S. V. (2003). Discretization strategy for ill-posed problems in variable Hilbert scales. Inverse Probl. 19(6):1263–1277. DOI: 10.1088/0266-5611/19/6/003.
  • Plato, R., Vainikko, G. M. (1990). On the regularization of projection methods for solving ill-posed problems. Numer. Math. 57:63–79. DOI: 10.1007/BF01386397.
  • Myleiko, G. L., Pereverzyev-Jr., S., Solodky, S. G. (2019). Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions. Anal. Appl. 17:453–475. DOI: 10.1142/S021953051850029X.
  • Kato, T. (1995). Perturbation Theory for Linear Operators. Berlin, Heidelberg: Springer, 623 p.
  • Kriukova, G., Pereverzyev-Jr., S., Tkachenko P. (2016). Nyström type subsampling analyzed as a regularized projection. Inverse Probl. 33(7):074001. DOI: 10.1088/1361-6420/33/7/074001.
  • Wang, C., Hu, T., Jiang, S. (2023). Pairwise learning problems with regularization networks and Nyström subsampling approach. Neural Netw. 157:176–192.
  • Chen, H. (2012). The convergence rate of a regularized ranking algorithm. J. Approx. Theory 164:1513–1519. DOI: 10.1016/j.jat.2012.09.001.
  • Lin, S.-B., Guo, X., Zhou, D.-X. (2017). Distributed learning with regularized least squares. J. Mach. Learn. Res. 18:1–31.
  • Minh, H. Q. (2010). Some properties of Gaussian reproducing kernel Hilbert spaces and their implications for function approximation and learning theory. Constr. Approx. 32:307–338. DOI: 10.1007/s00365-009-9080-0.
  • Vainikko, G. M., Veretennikov, A. Yu. (1986). Iteration Procedures in Ill-posed Problems. Moscow: Nauka.
