Applicable Analysis
An International Journal
Volume 91, 2012 - Issue 5
Original Articles

Least-squares regularized regression with dependent samples and q-penalty

Pages 979-991 | Received 23 Oct 2010, Accepted 25 Jan 2011, Published online: 23 Mar 2011
 

Abstract

Least-squares regularized learning algorithms for regression have been well studied in the literature when the sampling process is independent and the regularization term is the square of the norm in a reproducing kernel Hilbert space (RKHS). Some analysis has also been carried out for dependent sampling processes, or for regularizers given by the qth power of the function norm (q-penalty) with 0 < q ≤ 2. The purpose of this article is to conduct an error analysis of the least-squares regularized regression algorithm when the sampling sequence is weakly dependent, satisfying an exponentially decaying α-mixing condition, and when the regularizer takes the q-penalty with 0 < q ≤ 2. We use a covering number argument and derive learning rates in terms of the α-mixing decay, an approximation condition and the capacity of balls of the RKHS.
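For orientation, the algorithm the abstract refers to is commonly written as the following regularized minimization over the RKHS. This is a standard formulation sketched here under generic notation (sample size m, regularization parameter λ, Mercer kernel K); the paper's own notation and assumptions may differ.

\[
f_{\mathbf{z},\lambda} \;=\; \arg\min_{f \in \mathcal{H}_K} \left\{ \frac{1}{m} \sum_{i=1}^{m} \bigl( f(x_i) - y_i \bigr)^2 \;+\; \lambda \,\|f\|_K^{\,q} \right\}, \qquad 0 < q \le 2,
\]

where \(\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}\) is the (here, α-mixing) sample, \(\mathcal{H}_K\) is the RKHS induced by the kernel K, and \(\lambda > 0\) is the regularization parameter. Taking q = 2 recovers the classical squared-RKHS-norm penalty mentioned in the first sentence of the abstract.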

AMS Subject Classification:
