References
- Albert, J. H., and Chib, S. (1993), “Bayesian Analysis of Binary and Polychotomous Response Data,” Journal of the American Statistical Association, 88, 669–679. DOI: https://doi.org/10.1080/01621459.1993.10476321.
- Baker, S. G. (1994), “The Multinomial-Poisson Transformation,” Journal of the Royal Statistical Society, Series D, 43, 495–504. DOI: https://doi.org/10.2307/2348134.
- Belitz, C., Brezger, A., Kneib, T., Lang, S., and Umlauf, N. (2016), “BayesX: Software for Bayesian Inference in Structured Additive Regression Models (Version 3.0.2),” available at http://www.bayesx.org.
- Berk, R., Brown, L., Buja, A., Zhang, K., and Zhao, L. (2013), “Valid Post-Selection Inference,” The Annals of Statistics, 41, 802–837. DOI: https://doi.org/10.1214/12-AOS1077.
- Bleich, J., Kapelner, A., George, E. I., and Jensen, S. T. (2014), “Variable Selection for BART: An Application to Gene Regulation,” The Annals of Applied Statistics, 8, 1750–1781. DOI: https://doi.org/10.1214/14-AOAS755.
- Burgette, L. F., Puelz, D., and Hahn, P. R. (2010), “A Symmetric Prior for Multinomial Probit Models,” Bayesian Analysis, 1, 1–18.
- Burgette, L. F., and Nordheim, E. V. (2012), “The Trace Restriction: An Alternative Identification Strategy for the Bayesian Multinomial Probit Model,” Journal of Business & Economic Statistics, 30, 404–410.
- Caron, F., and Doucet, A. (2012), “Efficient Bayesian Inference for Generalized Bradley–Terry Models,” Journal of Computational and Graphical Statistics, 21, 174–196. DOI: https://doi.org/10.1080/10618600.2012.638220.
- Chipman, H. A., George, E. I., and McCulloch, R. E. (1998), “Bayesian CART Model Search,” Journal of the American Statistical Association, 93, 935–948. DOI: https://doi.org/10.1080/01621459.1998.10473750.
- ——— (2010), “BART: Bayesian Additive Regression Trees,” The Annals of Applied Statistics, 4, 266–298.
- Denison, D. G. T., Mallick, B. K., and Smith, A. F. M. (1998), “A Bayesian CART Algorithm,” Biometrika, 85, 363–377.
- Du, J., and Linero, A. R. (2019), “Interaction Detection With Bayesian Decision Tree Ensembles,” in The 22nd International Conference on Artificial Intelligence and Statistics, pp. 108–117.
- Fernández-Delgado, M., Cernadas, E., Barro, S., and Amorim, D. (2014), “Do We Need Hundreds of Classifiers to Solve Real World Classification Problems?,” Journal of Machine Learning Research, 15, 3133–3181.
- Forster, J. J. (2010), “Bayesian Inference for Poisson and Multinomial Log-Linear Models,” Statistical Methodology, 7, 210–224. DOI: https://doi.org/10.1016/j.stamet.2009.12.004.
- Friedman, J. H. (2001), “Greedy Function Approximation: A Gradient Boosting Machine,” The Annals of Statistics, 29, 1189–1232. DOI: https://doi.org/10.1214/aos/1013203451.
- Frühwirth-Schnatter, S., and Frühwirth, R. (2010), “Data Augmentation and MCMC for Binary and Multinomial Logit Models,” in Statistical Modelling and Regression Structures, eds. T. Kneib and G. Tutz, Physica-Verlag HD, pp. 111–132.
- Gelman, A., Hwang, J., and Vehtari, A. (2014), “Understanding Predictive Information Criteria for Bayesian Models,” Statistics and Computing, 24, 997–1016. DOI: https://doi.org/10.1007/s11222-013-9416-2.
- Goldstein, A., Kapelner, A., Bleich, J., and Pitkin, E. (2015), “Peeking Inside the Black Box: Visualizing Statistical Learning With Plots of Individual Conditional Expectation,” Journal of Computational and Graphical Statistics, 24, 44–65. DOI: https://doi.org/10.1080/10618600.2014.907095.
- Green, P. J. (1995), “Reversible Jump Markov Chain Monte Carlo Computation and Bayesian Model Determination,” Biometrika, 82, 711–732. DOI: https://doi.org/10.1093/biomet/82.4.711.
- Hastie, T., and Tibshirani, R. (2000), “Bayesian Backfitting” (with comments and a rejoinder by the authors), Statistical Science, 15, 196–223.
- Hill, J., Linero, A., and Murray, J. (2020), “Bayesian Additive Regression Trees: A Review and Look Forward,” Annual Review of Statistics and Its Application, 7, 251–278. DOI: https://doi.org/10.1146/annurev-statistics-031219-041110.
- Holmes, C. C., and Held, L. (2006), “Bayesian Auxiliary Variable Models for Binary and Multinomial Regression,” Bayesian Analysis, 1, 145–168. DOI: https://doi.org/10.1214/06-BA105.
- Jaffe, A. B., and Trajtenberg, M. (1996), “Flows of Knowledge From Universities and Federal Laboratories: Modeling the Flow of Patent Citations Over Time and Across Institutional and Geographic Boundaries,” Proceedings of the National Academy of Sciences of the United States of America, 93, 12671–12677. DOI: https://doi.org/10.1073/pnas.93.23.12671.
- Kindo, B. P., Wang, H., and Peña, E. A. (2016), “Multinomial Probit Bayesian Additive Regression Trees,” Stat, 5, 119–131. DOI: https://doi.org/10.1002/sta4.110.
- Klein, N., Kneib, T., and Lang, S. (2015), “Bayesian Generalized Additive Models for Location, Scale, and Shape for Zero-Inflated and Overdispersed Count Data,” Journal of the American Statistical Association, 110, 405–419. DOI: https://doi.org/10.1080/01621459.2014.912955.
- Kuhn, M. (2008), “Building Predictive Models in R Using the caret Package,” Journal of Statistical Software, 28, 1–26. DOI: https://doi.org/10.18637/jss.v028.i05.
- ——— (2017), “caret: Classification and Regression Training,” R Package Version 6.0-76, available at https://CRAN.R-project.org/package=caret.
- Linero, A. R. (2018), “Bayesian Regression Trees for High-Dimensional Prediction and Variable Selection,” Journal of the American Statistical Association, 113, 626–636. DOI: https://doi.org/10.1080/01621459.2016.1264957.
- Linero, A. R., Sinha, D., and Lipsitz, S. R. (2020), “Semiparametric Mixed-Scale Models Using Shared Bayesian Forests,” Biometrics, 76, 131–144. DOI: https://doi.org/10.1111/biom.13107.
- Linero, A. R., and Yang, Y. (2018), “Bayesian Regression Tree Ensembles That Adapt to Smoothness and Sparsity,” Journal of the Royal Statistical Society, Series B, 80, 1087–1110. DOI: https://doi.org/10.1111/rssb.12293.
- Liu, J. S., Wong, W. H., and Kong, A. (1994), “Covariance Structure of the Gibbs Sampler With Applications to the Comparisons of Estimators and Augmentation Schemes,” Biometrika, 81, 27–40. DOI: https://doi.org/10.1093/biomet/81.1.27.
- McCulloch, R. (2015), “Nonparametric Heteroscedastic Regression Modeling, Bayesian Regression Trees and MCMC Sampling,” Presented at the 10th Conference on Bayesian Nonparametrics, Raleigh, NC.
- Nieto-Barajas, L. E., Prünster, I., and Walker, S. G. (2004), “Normalized Random Measures Driven by Increasing Additive Processes,” The Annals of Statistics, 32, 2343–2360. DOI: https://doi.org/10.1214/009053604000000625.
- Polson, N. G., Scott, J. G., and Windle, J. (2013), “Bayesian Inference for Logistic Models Using Pólya–Gamma Latent Variables,” Journal of the American Statistical Association, 108, 1339–1349. DOI: https://doi.org/10.1080/01621459.2013.829001.
- Pratola, M. T. (2016), “Efficient Metropolis–Hastings Proposal Mechanisms for Bayesian Regression Tree Models,” Bayesian Analysis, 11, 885–911. DOI: https://doi.org/10.1214/16-BA999.
- Pratola, M. T., Chipman, H., George, E., and McCulloch, R. (2020), “Heteroscedastic BART via Multiplicative Regression Trees,” Journal of Computational and Graphical Statistics, 29, 405–417. DOI: https://doi.org/10.1080/10618600.2019.1677243.
- Roberts, G. O., and Sahu, S. K. (1997), “Updating Schemes, Correlation Structure, Blocking and Parameterization for the Gibbs Sampler,” Journal of the Royal Statistical Society, Series B, 59, 291–317. DOI: https://doi.org/10.1111/1467-9868.00070.
- Ročková, V., and Saha, E. (2019), “On Theory for BART,” in The 22nd International Conference on Artificial Intelligence and Statistics, pp. 2839–2848.
- Ročková, V., and van der Pas, S. (2017), “Posterior Concentration for Bayesian Regression Trees and Their Ensembles,” arXiv no. 1708.08734.
- Sparapani, R. A., Logan, B. R., McCulloch, R. E., and Laud, P. W. (2016), “Nonparametric Survival Analysis Using Bayesian Additive Regression Trees (BART),” Statistics in Medicine, 35, 2741–2753. DOI: https://doi.org/10.1002/sim.6893.
- Starling, J. E., Aiken, C. E., Murray, J. S., Nakimuli, A., and Scott, J. G. (2019), “Monotone Function Estimation in the Presence of Extreme Data Coarsening: Analysis of Preeclampsia and Birth Weight in Urban Uganda,” arXiv no. 1912.06946.
- Starling, J. E., Murray, J. S., Carvalho, C. M., Bukowski, R. K., and Scott, J. G. (2020), “BART With Targeted Smoothing: An Analysis of Patient-Specific Stillbirth Risk,” The Annals of Applied Statistics, 14, 28–50.
- Starling, J. E., Murray, J. S., Lohr, P. A., Aiken, A. R., Carvalho, C. M., and Scott, J. G. (2019), “Targeted Smooth Bayesian Causal Forests: An Analysis of Heterogeneous Treatment Effects for Simultaneous Versus Interval Medical Abortion Regimens Over Gestation,” arXiv no. 1905.09405.
- van der Pas, S., and Ročková, V. (2017), “Bayesian Dyadic Trees and Histograms for Regression,” in Advances in Neural Information Processing Systems.
- Walker, S. G. (2011), “Posterior Sampling When the Normalizing Constant Is Unknown,” Communications in Statistics—Simulation and Computation, 40, 784–792. DOI: https://doi.org/10.1080/03610918.2011.555042.
- Watanabe, S. (2010), “Asymptotic Equivalence of Bayes Cross Validation and Widely Applicable Information Criterion in Singular Learning Theory,” Journal of Machine Learning Research, 11, 3571–3594.
- ——— (2013), “A Widely Applicable Bayesian Information Criterion,” Journal of Machine Learning Research, 14, 867–897.
- Woody, S., Carvalho, C. M., and Murray, J. S. (2020), “Model Interpretation Through Lower-Dimensional Posterior Summarization,” Journal of Computational and Graphical Statistics. DOI: https://doi.org/10.1080/10618600.2020.1796684.
- Wu, Y., Tjelmeland, H., and West, M. (2007), “Bayesian CART: Prior Specification and Posterior Simulation,” Journal of Computational and Graphical Statistics, 16, 44–66. DOI: https://doi.org/10.1198/106186007X180426.
- Zhou, M., Li, L., Dunson, D., and Carin, L. (2012), “Lognormal and Gamma Mixed Negative Binomial Regression,” in Proceedings of the International Conference on Machine Learning (Vol. 2012), NIH Public Access, p. 1343.