ECONOMETRIC THEORY AND TREATMENT EFFECTS

Nonparametric Knn estimation with monotone constraints


ABSTRACT

The K-nearest-neighbor (Knn) method is known to be better suited than the kernel method (with a globally fixed smoothing parameter) for fitting nonparametrically specified curves when data are highly unevenly distributed. In this paper, we propose estimating a nonparametric regression function subject to a monotonicity restriction using the Knn method. We also propose a new convergence criterion to measure the closeness between the unconstrained and the (monotone) constrained Knn-estimated curves. This method is an alternative to the monotone kernel methods proposed by Hall and Huang (2001) and Du et al. (2013). We use a bootstrap procedure to test the validity of the monotonicity restriction. We apply our method to the "Job Market Matching" data taken from Gan and Li (2016) and find that the unconstrained/constrained Knn estimators work better than kernel estimators for this type of highly unevenly distributed data.
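To make the contrast concrete, the following is a minimal sketch of an unconstrained Knn regression estimator applied to an unevenly distributed design. The function name, the simulated data-generating process, and all parameter values are illustrative assumptions, not taken from the paper; because k nearest neighbors are used at every evaluation point, the effective smoothing window adapts automatically to the local density of the data, which is the property the abstract highlights.

```python
import numpy as np

def knn_regression(x_train, y_train, x_eval, k):
    """Estimate g(x) = E[y | x] at each evaluation point by averaging
    the y-values of the k nearest training observations (uniform weights)."""
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    estimates = []
    for x0 in np.atleast_1d(np.asarray(x_eval, dtype=float)):
        # Distance from the evaluation point to every training point.
        dist = np.abs(x_train - x0)
        # Indices of the k nearest neighbors; the implied window widens
        # automatically where the design points are sparse.
        nearest = np.argsort(dist)[:k]
        estimates.append(y_train[nearest].mean())
    return np.array(estimates)

# Illustrative unevenly distributed design: dense near 0, sparse near 1.
rng = np.random.default_rng(0)
x = rng.beta(1, 4, size=200)                       # most mass near x = 0
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=200)
grid = np.linspace(0.05, 0.95, 10)
fit = knn_regression(x, y, grid, k=15)
```

A fixed-bandwidth kernel estimator on the same design would either oversmooth the dense region or leave the sparse region with too few observations per window; the Knn estimator sidesteps this trade-off by fixing the number of neighbors rather than the window width.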

Acknowledgments

This paper was presented at the Emory University Econometrics Conference in honor of Professor Esfandiar Maasoumi in November 2014. We thank two anonymous referees and the Emory Econometrics Conference participants for their helpful comments, which greatly improved the paper. Li would also like to thank the China National Science Foundation (project # 71133001) for partial research support. Liu's research is supported by the Fundamental Research Funds for the Central Universities (project number 20720171061).

Notes

1 Although one can use an adaptive bandwidth to avoid this problem, the adaptive bandwidth method requires selecting a different bandwidth at each data evaluation point, which is computationally costly, especially in large-sample applications.

2 We thank a referee for pointing out that Du et al. (2013) do not require $\sum_{i=1}^{n} p_i = 1$; the appearance of this condition in their paper is in fact a typo.

3 The asymptotic analysis for the Knn method is more complex than that for the kernel method; see Mack and Rosenblatt (1979), Mack (1981), Ouyang et al. (2006), and Fan and Liu (2015), among others.

4 We would like to thank a referee for pointing out that local-polynomial estimates would serve better as the criterion in the curve method.

5 In the simulation section, because the null hypothesis is a monotonically increasing function, we estimate ĝ(x|p̂) and ĝ(x|p*) by selecting p = p̂ or p = p* to minimize the distance function D(p), while imposing that the derivative function is always non-negative.

6 We would like to thank Jeffrey Racine for his helpful suggestion about correcting bias in estimating test size.
