Abstract
This paper studies the regularized learning algorithm associated with the least-squares loss and a reproducing kernel Hilbert space. The goal is an error analysis for the regression problem in learning theory. Upper and lower error bounds are estimated simultaneously, and together they yield the optimal learning rate. The upper bound depends on the covering number and the approximation property of the reproducing kernel Hilbert space; the lower bound depends on the entropy number of the set containing the regression function. Moreover, the rate is independent of the choice of the index q of the regularization term.
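For concreteness, the regularized least-squares scheme referred to in the abstract can be written in the following standard form (the notation below is a common convention and is assumed here, not quoted from the paper): given a sample z = {(x_i, y_i)}_{i=1}^m and a reproducing kernel Hilbert space H_K with norm ‖·‖_K, the estimator is

```latex
f_{z,\lambda} \;=\; \arg\min_{f \in \mathcal{H}_K}
\left\{ \frac{1}{m} \sum_{i=1}^{m} \bigl(f(x_i) - y_i\bigr)^2
\;+\; \lambda \, \|f\|_K^{q} \right\},
\qquad \lambda > 0,\ q > 0,
```

where λ is the regularization parameter and q is the index of the regularization term mentioned in the abstract; the stated result is that the optimal learning rate does not depend on the choice of q.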
Acknowledgements
The research was supported by the National Natural Science Foundation of China (Nos. 90818020, 60873206) and the Natural Science Foundation of Zhejiang Province of China (No. Y7080235).