Abstract
Latent class analysis often aims to relate the classes to continuous external consequences (“distal outcomes”), but estimating such relationships necessitates distributional assumptions. Lanza, Tan, and Bray (2013) suggested circumventing such assumptions with their LTB approach: linear logistic regression of latent class membership on each distal outcome is performed first, after which this estimated relationship is reversed using Bayes’ rule. However, the LTB approach currently has three drawbacks, which we address in this article. First, LTB exchanges the assumption of normality for one of homoskedasticity, or, equivalently, of linearity of the logistic regression, leading to bias. Fortunately, we show that introducing higher-order terms prevents this bias. Second, we improve coverage rates by replacing approximate standard errors with resampling methods. Finally, we introduce a bias-corrected three-step version of LTB as a practical alternative to standard LTB. The improved LTB methods are validated by a simulation study, and an example application demonstrates their usefulness.
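The core of the LTB approach described above, reversing the fitted logistic regression with Bayes’ rule, can be sketched in a few lines. The sketch below is purely illustrative: the data are simulated, the two-class logistic model (including the higher-order term `z**2` recommended in the article) uses made-up coefficients rather than a real Step 1 fit, and the class-specific outcome mean is obtained as the Bayes-rule weighted average of the observed outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical distal outcome for 1000 respondents (simulated data).
z = rng.normal(loc=0.0, scale=1.0, size=1000)

# Assumed Step-1 result: a logistic model for class membership given the
# outcome, including the quadratic term z**2 that guards against bias from
# the linearity assumption. Coefficients are made up for illustration.
def p_class1_given_z(z, b0=-0.2, b1=1.1, b2=0.15):
    eta = b0 + b1 * z + b2 * z**2
    return 1.0 / (1.0 + np.exp(-eta))

p1 = p_class1_given_z(z)  # P(X = 1 | z_i)
p2 = 1.0 - p1             # P(X = 2 | z_i), two-class case

# Bayes'-rule reversal: the class-specific outcome distribution is the
# empirical distribution of z reweighted by P(X = t | z), so the
# class-specific mean is a weighted average of the observed outcomes.
mean_class1 = np.sum(z * p1) / np.sum(p1)
mean_class2 = np.sum(z * p2) / np.sum(p2)
print(mean_class1, mean_class2)
```

Because the illustrative slope on `z` is positive, the reweighting pulls the class 1 mean above the class 2 mean, which is the direction of association the fitted logistic regression encodes.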
Notes
1 It should be mentioned that in the three-step approach we assume that missingness on the distal outcome is unrelated to the magnitude of the classification errors. Because cases with missing values are removed from the analysis, the missing values are assumed to be missing completely at random, as is also the case in the simultaneous (original) LTB method.
2 Note that it may seem intuitive to bootstrap only Step 1, but this does not solve the problem at hand. The Step 3 estimates need to be bootstrapped, because it is in this step that the model relating class membership to the distal outcome is obtained; the uncertainty around this estimate can therefore be captured using bootstrap SEs.
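The resampling scheme described in this note can be sketched as a nonparametric bootstrap over respondents, rerunning the estimation pipeline in every replicate. The sketch below is a simplification under stated assumptions: the data are simulated, and `estimate_class1_mean` is a stand-in for the full Step 1-3 pipeline (in a real analysis the logistic model would be re-estimated within each replicate rather than using the fixed illustrative coefficients shown here).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical distal outcome for 500 respondents (simulated data).
z = rng.normal(size=500)

def estimate_class1_mean(z_sample):
    # Stand-in for the full Step 1-3 pipeline: in practice, refit the
    # logistic model on z_sample, then reverse it with Bayes' rule.
    # Here the refit is replaced by fixed illustrative coefficients.
    p1 = 1.0 / (1.0 + np.exp(-(1.0 * z_sample)))
    return np.sum(z_sample * p1) / np.sum(p1)

point_estimate = estimate_class1_mean(z)

# Nonparametric bootstrap: resample respondents with replacement,
# rerun the whole estimation, and use the spread of the replicate
# estimates as the standard error of the point estimate.
B = 500
n = z.size
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)
    boot[b] = estimate_class1_mean(z[idx])

bootstrap_se = boot.std(ddof=1)
print(point_estimate, bootstrap_se)
```

Resampling whole cases in this way propagates the uncertainty from every step into the final standard error, which is exactly why bootstrapping Step 1 alone is insufficient.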