References
- M.M. Barbieri and J.O. Berger, Optimal predictive model selection, Ann. Stat. 32 (2004), pp. 870–897.
- B. Carpenter, A. Gelman, M.D. Hoffman, D. Lee, B. Goodrich, M. Betancourt, M. Brubaker, J. Guo, P. Li, and A. Riddell, Stan: A probabilistic programming language, J. Stat. Softw. 76 (2017), pp. 1–32.
- A. Chatterjee and S.N. Lahiri, Bootstrapping lasso estimators, J. Amer. Statist. Assoc. 106 (2011), pp. 608–625.
- R-B. Chen, C-H. Chu, S. Yuan, and Y.N. Wu, Bayesian sparse group selection, J. Comput. Graph. Statist. 25 (2016), pp. 665–683.
- M.K. Cowles, Applied Bayesian Statistics: With R and OpenBUGS Examples, Vol. 98, Springer Science & Business Media, New York, 2013.
- B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression, Ann. Stat. 32 (2004), pp. 407–499.
- J. Fan and R. Li, Variable selection via nonconcave penalized likelihood and its oracle properties, J. Amer. Statist. Assoc. 96 (2001), pp. 1348–1360.
- J. Friedman, T. Hastie, and R. Tibshirani, Regularization paths for generalized linear models via coordinate descent, J. Stat. Softw. 33 (2010), pp. 1–22.
- E.I. George and R.E. McCulloch, Variable selection via Gibbs sampling, J. Amer. Statist. Assoc. 88 (1993), pp. 881–889.
- J.P. Hobert and G. Casella, The effect of improper priors on Gibbs sampling in hierarchical linear mixed models, J. Amer. Statist. Assoc. 91 (1996), pp. 1461–1473.
- K. Knight and W. Fu, Asymptotics for Lasso-type estimators, Ann. Stat. 28 (2000), pp. 1356–1378.
- L. Kuo and B. Mallick, Variable selection for regression models, Sankhyā: The Indian J. Stat., Ser. B 60 (1998), pp. 65–81.
- M. Kyung, J. Gill, M. Ghosh, and G. Casella, Penalized regression, standard errors, and Bayesian Lassos, Bayesian Anal. 5 (2010), pp. 369–411.
- D.J. Lunn, A. Thomas, N. Best, and D. Spiegelhalter, WinBUGS – A Bayesian modelling framework: Concepts, structure, and extensibility, Stat. Comput. 10 (2000), pp. 325–337.
- G. Malsiner-Walli, S. Frühwirth-Schnatter, and B. Grün, Identifying mixtures of mixtures using Bayesian estimation, J. Comput. Graph. Statist. 26 (2017), pp. 285–295.
- G. Malsiner-Walli, D. Pauger, and H. Wagner, Effect fusion using model-based clustering, Stat. Modelling 18 (2017), pp. 175–196.
- T.J. Mitchell and J.J. Beauchamp, Bayesian variable selection in linear regression, J. Amer. Statist. Assoc. 83 (1988), pp. 1023–1032.
- M. Plummer, JAGS Version 3.3.0 User Manual, International Agency for Research on Cancer, Lyon, France, 2012.
- S. Raman, T.J. Fuchs, P.J. Wild, E. Dahl, and V. Roth, The Bayesian group-lasso for analyzing contingency tables, Proceedings of the 26th Annual International Conference on Machine Learning, ACM, 2009, pp. 881–888.
- R.A. Redner and H.F. Walker, Mixture densities, maximum likelihood and the EM algorithm, SIAM Rev. 26 (1984), pp. 195–239.
- Y-S. Su and M. Yajima, R2jags: Using R to run JAGS, R package version 0.03-08, 2015. Available at http://CRAN.R-project.org/package=R2jags.
- R. Tibshirani, Regression shrinkage and selection via the lasso, J. R. Stat. Soc. Ser. B (Methodol.) 58 (1996), pp. 267–288.
- R. Tibshirani, M. Saunders, S. Rosset, J. Zhu, and K. Knight, Sparsity and smoothness via the fused Lasso, J. R. Stat. Soc. Ser. B (Stat. Methodol.) 67 (2005), pp. 91–108.
- M.J. Wainwright, Sharp thresholds for high-dimensional and noisy sparsity recovery using l1-constrained quadratic programming (Lasso), IEEE Trans. Inform. Theory 55 (2009), pp. 2183–2202.
- T.T. Wu and K. Lange, Coordinate descent algorithms for Lasso penalized regression, Ann. Appl. Stat. 2 (2008), pp. 224–244.
- X. Xu and M. Ghosh, Bayesian variable selection and estimation for group Lasso, Bayesian Anal. 10 (2015), pp. 909–936.
- M. Yuan and Y. Lin, Model selection and estimation in regression with grouped variables, J. R. Stat. Soc. Ser. B (Stat. Methodol.) 68 (2006), pp. 49–67.
- P. Zhao and B. Yu, On model selection consistency of Lasso, J. Mach. Learn. Res. 7 (2006), pp. 2541–2563.