References
- Bühlmann, P., and van de Geer, S. (2011), Statistics for High-Dimensional Data, Berlin: Springer.
- Chen, J., Tran-Dinh, Q., Kosorok, M. R., and Liu, Y. (2021), “Identifying Heterogeneous Effect using Latent Supervised Clustering with Adaptive Fusion,” Journal of Computational and Graphical Statistics, 30, 43–54. DOI: 10.1080/10618600.2020.1763808.
- Fan, J., and Lv, J. (2008), “Sure Independence Screening for Ultrahigh Dimensional Feature Space,” Journal of the Royal Statistical Society, Series B, 70, 849–911. DOI: 10.1111/j.1467-9868.2008.00674.x.
- Fan, J., and Song, R. (2010), “Sure Independence Screening in Generalized Linear Models with NP-dimensionality,” Annals of Statistics, 38, 3567–3604.
- Green, P., and Silverman, B. (1994), Nonparametric Regression and Generalized Linear Models: A Roughness Penalty Approach, London: Chapman and Hall.
- Greenshtein, E., and Ritov, Y. (2004), “Persistence in High-Dimensional Linear Predictor Selection and the Virtue of Overparametrization,” Bernoulli, 10, 971–988. DOI: 10.3150/bj/1106314846.
- Gu, C. (2002), Smoothing Spline ANOVA Models, New York: Springer.
- Guo, F. J., Levina, E., Michailidis, G., and Zhu, J. (2010), “Pairwise Variable Selection for High-Dimensional Model-Based Clustering,” Biometrics, 66, 793–804. DOI: 10.1111/j.1541-0420.2009.01341.x.
- Hocking, T., Joulin, A., Bach, F., and Vert, J. P. (2011), “Clusterpath: An Algorithm for Clustering using Convex Fusion Penalties,” in Proceedings of the 28th International Conference on Machine Learning (ICML’11), eds. L. Getoor and T. Scheffer, pp. 745–752, New York: Omnipress.
- Jacobs, R. A., Jordan, M. I., Nowlan, S. J., and Hinton, G. E. (1991), “Adaptive Mixtures of Local Experts,” Neural Computation, 3, 79–87. DOI: 10.1162/neco.1991.3.1.79.
- Kac, M. (1959), Statistical Independence in Probability, Analysis and Number Theory, Washington DC: Mathematical Association of America.
- Lindsten, F., Ohlsson, H., and Ljung, L. (2011), “Clustering using Sum-of-Norms Regularization: With Application to Particle Filter Output Computation,” in 2011 IEEE Statistical Signal Processing Workshop (SSP), pp. 201–204. DOI: 10.1109/SSP.2011.5967659.
- Ma, S., and Huang, J. (2017), “A Concave Pairwise Fusion Approach to Subgroup Analysis,” Journal of the American Statistical Association, 112, 410–423. DOI: 10.1080/01621459.2016.1148039.
- Pan, W., and Shen, X. (2006), “Penalized Model-based Clustering with Application to Variable Selection,” Journal of Machine Learning Research, 8, 1145–1164.
- Pan, W., Shen, X., and Liu, B. (2013), “Cluster Analysis: Unsupervised Learning via Supervised Learning with a Non-convex Penalty,” Journal of Machine Learning Research, 14, 1865–1889.
- Raftery, A., and Dean, N. (2006), “Variable Selection for Model-based Clustering,” Journal of the American Statistical Association, 101, 168–178. DOI: 10.1198/016214506000000113.
- Rosenbaum, P. (1995), Observational Studies, New York: Springer.
- Sylvester, J. J. (1867), “LX. Thoughts on Inverse Orthogonal Matrices, Simultaneous Sign-successions, and Tessellated Pavements in Two or More Colours, with Applications to Newton’s Rule, Ornamental Tile-Work, and the Theory of Numbers,” The London, Edinburgh, and Dublin Philosophical Magazine, 34, 461–475. DOI: 10.1080/14786446708639914.
- Tang, X., and Qu, A. (2017), “Individualized Multi-Directional Variable Selection,” arXiv: 1709.05062.
- Tibshirani, R. (1996), “Regression Shrinkage and Selection via the Lasso,” Journal of the Royal Statistical Society, Series B, 58, 267–288. DOI: 10.1111/j.2517-6161.1996.tb02080.x.
- Wahba, G. (1990), Spline Models for Observational Data, Philadelphia: SIAM.
- Yeh, I. C., and Hsu, T. K. (2018), “Building Real Estate Valuation Models with Comparative Approach through Case-based Reasoning,” Applied Soft Computing, 65, 260–271. DOI: 10.1016/j.asoc.2018.01.029.
- Zhang, K. (2019), “BET on Independence,” Journal of the American Statistical Association, 114, 1620–1637. DOI: 10.1080/01621459.2018.1537921.
- Zhang, K., Zhao, Z., and Zhou, W. (2021), “BEAUTY Powered BEAST,” arXiv: 2103.00674.