Abstract
In this paper, we investigate the multiscale support vector regression (SVR) method for the approximation of functions in Sobolev spaces on bounded domains. The Vapnik ϵ-insensitive loss function, which is well established in learning theory, is introduced to replace the standard ℓ2 loss function of multiscale least squares methods. A convergence analysis is presented to verify the validity of the multiscale SVR method with scaled versions of compactly supported radial basis functions. Error estimates for noisy observation data are also derived, demonstrating the robustness of the proposed algorithm. Numerical simulations support the theoretical predictions.
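To illustrate the key modeling choice described above, the following minimal sketch contrasts the Vapnik ϵ-insensitive loss with the standard ℓ2 loss, and includes a scaled compactly supported radial basis function (here the Wendland C2 function is used as an illustrative example; the paper's specific kernels and parameter choices are not reproduced):

```python
import numpy as np

def eps_insensitive_loss(residual, eps=0.1):
    """Vapnik eps-insensitive loss: zero inside the eps-tube, linear outside."""
    return np.maximum(0.0, np.abs(residual) - eps)

def l2_loss(residual):
    """Standard least squares loss used in multiscale least squares methods."""
    return residual ** 2

def wendland_c2(r, delta=1.0):
    """Scaled Wendland C2 compactly supported RBF: (1 - r/delta)_+^4 (4 r/delta + 1).

    The scale delta plays the role of the support radius, which a multiscale
    method would shrink from level to level (illustrative choice here).
    """
    s = np.abs(r) / delta
    return np.maximum(1.0 - s, 0.0) ** 4 * (4.0 * s + 1.0)

# Residuals inside the eps-tube incur no eps-insensitive loss at all,
# whereas the l2 loss penalizes every nonzero residual.
residuals = np.array([-0.3, -0.05, 0.0, 0.08, 0.5])
print(eps_insensitive_loss(residuals))  # [0.2 0.  0.  0.  0.4]
print(l2_loss(residuals))
print(wendland_c2(np.array([0.0, 0.5, 2.0]), delta=1.0))  # support radius 1
```

The insensitivity zone is what makes the SVR formulation robust to small observation noise: residuals within the ϵ-tube are ignored rather than accumulated, unlike under the ℓ2 loss.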
Notes
Dedicated to Professor Bernd Hofmann for his 60th birthday.
This work was supported by the National Natural Science Foundation of China [grant number 11101093] and the Chinese Ministry of Education [grant number 20110071120001].