Abstract
Compared with local polynomial quantile regression, k-nearest-neighbor quantile regression (KNNQR) has several advantages, such as not requiring smoothness assumptions on the underlying function. This paper summarizes existing research on KNNQR and carries out further work on the selection of k, the algorithm, and Monte Carlo simulations. The simulated functions are Blocks, Bumps, HeaviSine and Doppler, which represent jumps, volatility, abrupt slope changes and high-frequency behavior, respectively. When the function to be estimated has jump points or catastrophe points, KNNQR is superior to local linear quantile regression under the mean squared error and mean absolute error criteria. Notably, the superiority of KNNQR is also observed in the high-frequency case. A real data set is analyzed as an illustration.
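The basic KNNQR idea described above can be sketched as follows: for a query point, take the k nearest covariate values and return the sample quantile of their responses. This is a minimal illustrative sketch, not the authors' implementation; the function name `knn_quantile` and its parameters are hypothetical.

```python
import numpy as np

def knn_quantile(x0, x, y, k, tau):
    """Estimate the tau-th conditional quantile of y at x0 via k nearest neighbors.

    x, y : 1-D arrays of covariates and responses
    k    : number of neighbors (a tuning parameter, as discussed in the paper)
    tau  : quantile level in (0, 1)
    """
    # indices of the k covariate values closest to the query point x0
    idx = np.argsort(np.abs(np.asarray(x) - x0))[:k]
    # sample quantile of the corresponding responses
    return np.quantile(np.asarray(y)[idx], tau)

# Example: noiseless linear data; near x0 = 0 the three nearest responses
# are 0, 1, 2, so the estimated median is 1.0.
x = np.arange(10.0)
y = np.arange(10.0)
est = knn_quantile(0.0, x, y, k=3, tau=0.5)
```

Because the estimate is a local sample quantile rather than a local polynomial fit, it does not smooth across discontinuities, which is consistent with the advantage the abstract claims at jump points.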
Acknowledgments
The authors would like to thank the editor and anonymous referees for their careful reading and for their comments which greatly improved the paper, and also thank Maozai Tian for beneficial discussions.
Disclosure statement
No potential conflict of interest was reported by the authors.
Funding
This work was supported by the Outstanding Innovative Talents Cultivation Funded Programs 2014 of Renmin University of China and the National Social Science Foundation of China [13BTJ004].