Abstract
A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This article introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. In the proposed method, which we call principal coordinate ridge regression, one regresses the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast selection of the optimal tuning parameter and for extensions to multiple functional predictors, exponential family responses, and mixed-effects models. In an application to signature verification data, principal coordinate ridge regression, with dynamic time warping distance used to define the principal coordinates, is shown to outperform a functional generalized linear model. Supplementary materials for this article are available online.
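The core idea of the abstract — extract leading principal coordinates (classical multidimensional scaling) from a distance matrix among the predictors, then fit a ridge-penalized regression of the response on those coordinates — can be sketched as follows. This is a minimal illustration in Python with NumPy, not the authors' R implementation; the function names (`principal_coordinates`, `pc_ridge`) and the fixed ridge parameter `lam` are assumptions for the sketch, whereas the paper's implementation is built on generalized additive modeling software with automatic tuning parameter selection.

```python
import numpy as np

def principal_coordinates(D, k):
    """Classical MDS: leading k principal coordinates from an n-by-n distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered (pseudo-)Gram matrix
    vals, vecs = np.linalg.eigh(B)             # eigendecomposition (ascending order)
    idx = np.argsort(vals)[::-1][:k]           # keep the k largest eigenvalues
    vals, vecs = vals[idx], vecs[:, idx]
    # scale eigenvectors by sqrt of eigenvalues (clip small negatives from non-Euclidean D)
    return vecs * np.sqrt(np.clip(vals, 0.0, None))

def pc_ridge(D, y, k, lam):
    """Ridge regression of scalar response y on leading k principal coordinates of D."""
    X = principal_coordinates(D, k)
    Xc = np.column_stack([np.ones(len(y)), X])  # add intercept column
    P = np.diag([0.0] + [1.0] * k)              # ridge penalty; intercept unpenalized
    beta = np.linalg.solve(Xc.T @ Xc + lam * P, Xc.T @ y)
    return beta, X
```

With a Euclidean distance on one-dimensional inputs, the leading principal coordinate recovers the (centered) inputs up to sign, so a linear response is fit essentially exactly when the penalty is small; any distance (e.g. dynamic time warping between curves, as in the signature application) can be substituted for `D`.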
Supplementary Materials
R code for analyses: Code to reproduce the analyses of the toy data and the signature verification data. (GNU zipped tar file)
Acknowledgments
The authors thank Lan Huo, Lei Huang, Huaihou Chen, Rong Jiao, and Fabian Scheipl for their assistance in implementing the methods proposed here, and the associate editor and referees for their very helpful feedback. Preliminary versions of this article were presented at the Banff International Research Station workshop “Frontiers in Functional Data Analysis” in July 2015, and at the International Workshop on Advances in Functional Data Analysis held at Universidad Carlos III de Madrid in November 2015. The authors thank the participants of both workshops for their helpful feedback. Philip Reiss, Pei-Shien Wu, and Wen-Yu Hua gratefully acknowledge the support of the U.S. National Institute of Mental Health (grant 1R01MH095836-01A1).