References
- Alizadeh, A. A., M. B. Eisen, R. E. Davis, C. Ma, I. S. Lossos, A. Rosenwald, J. C. Boldrick, H. Sabet, T. Tran, X. Yu, et al. 2000. Distinct types of diffuse large B-cell lymphoma identified by gene expression profiling. Nature 403 (6769):503–11.
- Bickel, P. J., Y. A. Ritov, and A. B. Tsybakov. 2009. Simultaneous analysis of Lasso and Dantzig selector. The Annals of Statistics 37 (4):1705–32. doi:https://doi.org/10.1214/08-AOS620.
- Bunea, F., A. B. Tsybakov, and M. H. Wegkamp. 2007. Aggregation for Gaussian regression. The Annals of Statistics 35 (4):1674–97. doi:https://doi.org/10.1214/009053606000001587.
- Campbell, H., and I. Rudan. 2002. Interpretation of genetic association studies in complex disease. The Pharmacogenomics Journal 2 (6):349–60. doi:https://doi.org/10.1038/sj.tpj.6500132.
- Hoerl, A. E., and R. W. Kennard. 1970. Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12 (1):55–67. doi:https://doi.org/10.2307/1267351.
- Huang, J., S. Ma, and C.-H. Zhang. 2008. Adaptive Lasso for sparse high-dimensional regression models. Statistica Sinica 18:1603–18.
- Jones, S., X. Zhang, D. W. Parsons, J. C.-H. Lin, R. J. Leary, P. Angenendt, P. Mankoo, H. Carter, H. Kamiyama, A. Jimeno, et al. 2008. Core signaling pathways in human pancreatic cancers revealed by global genomic analyses. Science (New York, N.Y.) 321 (5897):1801–6. doi:https://doi.org/10.1126/science.1164368.
- Knight, K., and W. Fu. 2000. Asymptotics for Lasso-type estimators. The Annals of Statistics 28:1356–78. doi:https://doi.org/10.1214/aos/1015957397.
- Lambert, S. A., A. Jolma, L. F. Campitelli, P. K. Das, Y. Yin, M. Albu, X. Chen, J. Taipale, T. R. Hughes, M. T. Weirauch, et al. 2018. The human transcription factors. Cell 172 (4):650–65. doi:https://doi.org/10.1016/j.cell.2018.01.029.
- Leng, C., Y. Lin, and G. Wahba. 2006. A note on the Lasso and related procedures in model selection. Statistica Sinica 16:1273–84.
- Lounici, K. 2008. Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators. Electronic Journal of Statistics 2:90–102. doi:https://doi.org/10.1214/08-EJS177.
- Meinshausen, N. 2015. Group bound: confidence intervals for groups of variables in sparse high dimensional regression without assumptions on the design. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 77 (5):923–45. doi:https://doi.org/10.1111/rssb.12094.
- Meinshausen, N., and P. Bühlmann. 2006. High-dimensional graphs and variable selection with the Lasso. The Annals of Statistics 34 (3):1436–62. doi:https://doi.org/10.1214/009053606000000281.
- Meinshausen, N., and B. Yu. 2009. Lasso-type recovery of sparse representations for high-dimensional data. The Annals of Statistics 37 (1):246–70. doi:https://doi.org/10.1214/07-AOS582.
- Rousseeuw, P. J., and A. M. Leroy. 1987. Robust regression and outlier detection. New York: Wiley.
- Schennach, S. 2013. Regressions with Berkson errors in covariates – A nonparametric approach. The Annals of Statistics 41 (3):1642–68. doi:https://doi.org/10.1214/13-AOS1122.
- Simpson, J. R., and D. C. Montgomery. 1996. A biased-robust regression technique for the combined outlier-multicollinearity problem. Journal of Statistical Computation and Simulation 56:1–22. doi:https://doi.org/10.1080/00949659608811777.
- Taketani, K., J. Kawauchi, M. Tanaka-Okamoto, H. Ishizaki, Y. Tanaka, T. Sakai, J. Miyoshi, Y. Maehara, and S. Kitajima. 2012. Key role of ATF3 in p53-dependent DR5 induction upon DNA damage of human colon cancer cells. Oncogene 31 (17):2210. doi:https://doi.org/10.1038/onc.2011.397.
- Tibshirani, R. 1996. Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society: Series B (Methodological) 58 (1):267–88. doi:https://doi.org/10.1111/j.2517-6161.1996.tb02080.x.
- Van de Geer, S. A., and P. Bühlmann. 2009. On the conditions used to prove oracle results for the Lasso. Electronic Journal of Statistics 3:1360–92. doi:https://doi.org/10.1214/09-EJS506.
- Wald, A. 1940. Fitting of straight lines if both variables are subject to error. The Annals of Mathematical Statistics 11 (3):284–300. doi:https://doi.org/10.1214/aoms/1177731868.
- Wang, H. L., P. S. Zhong, and Y. H. Cui. 2018. Empirical likelihood ratio tests for coefficients in high dimensional heteroscedastic linear models. Statistica Sinica 28:2409–33.
- Wu, Y. J., L. H. Cheng, and W. Q. Fang. 2018. Using Wald-type estimator to combat outliers and Berkson-type uncertainties with mixture distributions in linear regression models. Communications in Statistics - Theory and Methods 47 (14):3324–37. doi:https://doi.org/10.1080/03610926.2017.1353627.
- Wu, Y. J., and W. Q. Fang. 2017. Consistent estimation approach to tackling collinearity and Berkson-type measurement error in linear regression using adjusted Wald-type estimator. Communications in Statistics - Theory and Methods 46 (11):5501–16. doi:https://doi.org/10.1080/03610926.2015.1104353.
- Zhao, P., and B. Yu. 2006. On model selection consistency of Lasso. Journal of Machine Learning Research 7:2541–63.
- Zou, H. 2006. The adaptive Lasso and its oracle properties. Journal of the American Statistical Association 101 (476):1418–29. doi:https://doi.org/10.1198/016214506000000735.
- Zou, H., and T. Hastie. 2005. Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 67 (2):301–20. doi:https://doi.org/10.1111/j.1467-9868.2005.00503.x.