ABSTRACT
This paper considers variable selection in additive quantile regression based on group smoothly clipped absolute deviation (gSCAD) penalty. Although shrinkage variable selection in additive models with least-squares loss has been well studied, quantile regression is sufficiently different from mean regression to deserve a separate treatment. It is shown that the gSCAD estimator can correctly identify the significant components and at the same time maintain the usual convergence rates in estimation. Simulation studies are used to illustrate our method.
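For readers unfamiliar with the penalty named above, the following is a minimal illustrative sketch of the SCAD penalty of Fan and Li (2001) and its group version applied to the Euclidean norm of a coefficient group; it is not the paper's estimation algorithm, and the function names and default tuning constants (`lam`, `a = 3.7`) are assumptions for illustration only.

```python
import numpy as np

def scad_penalty(t, lam=1.0, a=3.7):
    """SCAD penalty p_lambda(t), evaluated at |t| (a = 3.7 is the usual choice)."""
    t = np.abs(np.asarray(t, dtype=float))
    small = lam * t                                          # |t| <= lam: linear, like lasso
    mid = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))  # lam < |t| <= a*lam: quadratic blend
    large = lam**2 * (a + 1) / 2                             # |t| > a*lam: constant, no extra shrinkage
    return np.where(t <= lam, small, np.where(t <= a * lam, mid, large))

def gscad_penalty(beta_group, lam=1.0, a=3.7):
    """Group SCAD: the SCAD penalty applied to the L2 norm of a whole group
    of coefficients, so the group enters or leaves the model jointly."""
    return float(scad_penalty(np.linalg.norm(beta_group), lam, a))
```

The constant tail for large norms is what distinguishes SCAD from the lasso: large (group) coefficients are not shrunk further, which underlies the oracle-type selection property the abstract refers to.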
Acknowledgments
We sincerely thank the associate editor and an anonymous reviewer for their insightful comments, which have improved many aspects of the manuscript. The second author thanks Dr Yebin Cheng, whose grant (National Natural Science Foundation of China Grant 11271241), on which the second author is a co-PI, supported this research.
Disclosure statement
No potential conflict of interest was reported by the authors.