ABSTRACT
This paper evaluates different aspects of the Monte Carlo expectation–maximization (MCEM) algorithm for estimating heavy-tailed mixed logistic regression (MLR) models. As a novelty, it also proposes a multiple-chain Gibbs sampler to generate from the distributions of the latent variables, thus obtaining independent samples. In heavy-tailed MLR models, the analytical forms of the full conditional distributions of the random effects are unknown, so four different Metropolis–Hastings algorithms are considered to sample from them. We also discuss stopping rules that lead to more efficient algorithms for heavy-tailed MLR models. The algorithms are compared through the analysis of simulated data and Ascaris suum data.
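The sampling scheme summarized above can be illustrated with a minimal sketch of a Metropolis–Hastings step run over multiple independent chains, keeping only the final state of each chain so that the retained draws are (approximately) independent. The Bernoulli likelihood with logit link, the Student-t prior on the random effect, and all function names and parameter values below are illustrative assumptions, not the paper's exact specification.

```python
import math
import random

def log_target(b, y, x, beta, df=4.0):
    # Unnormalized log full conditional of a random intercept b:
    # logistic (Bernoulli) likelihood plus a heavy-tailed Student-t(df)
    # prior; this form is an assumption for illustration only.
    ll = 0.0
    for yi, xi in zip(y, x):
        eta = beta * xi + b
        ll += yi * eta - math.log1p(math.exp(eta))
    # log density of t(df), up to an additive constant
    ll += -0.5 * (df + 1.0) * math.log1p(b * b / df)
    return ll

def mh_chain(y, x, beta, n_iter=500, step=0.5, seed=0):
    # Random-walk Metropolis targeting the full conditional of b.
    rng = random.Random(seed)
    b = 0.0
    for _ in range(n_iter):
        prop = b + rng.gauss(0.0, step)
        log_ratio = log_target(prop, y, x, beta) - log_target(b, y, x, beta)
        if log_ratio >= 0 or rng.random() < math.exp(log_ratio):
            b = prop
    return b

def multiple_chain_sample(y, x, beta, n_chains=20):
    # Run independent chains and keep only the last state of each,
    # yielding (approximately) independent draws of the random effect.
    return [mh_chain(y, x, beta, seed=s) for s in range(n_chains)]

# Toy data for a single cluster (illustrative values).
y = [1, 0, 1, 1]
x = [0.5, -1.0, 1.5, 0.2]
draws = multiple_chain_sample(y, x, beta=1.0)
```

In an MCEM setting, draws of this kind would feed the Monte Carlo approximation of the E-step; running many short chains rather than one long chain trades serial dependence for independent (but less converged) samples.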
Acknowledgments
We would like to express our gratitude to the editors and two anonymous referees whose comments and suggestions have contributed to the improvement of the paper.
Disclosure statement
No potential conflict of interest was reported by the authors.