
Regularized Structural Equation Modeling


REFERENCES

  • Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In B. N. Petrov & F. Csaki (Eds.), Second international symposium on information theory (pp. 267–281). Budapest, Hungary: Akademiai Kiado.
  • Boker, S., Neale, M., & Rausch, J. (2004). Latent differential equation modeling with multivariate multi-occasion indicators. In K. van Montfort et al. (Eds.), Recent developments on structural equation models (pp. 151–174). New York, NY: Springer.
  • Browne, M. W. (2000). Cross-validation methods. Journal of Mathematical Psychology, 44(1), 108–132.
  • Browne, M. W. (2001). An overview of analytic rotation in exploratory factor analysis. Multivariate Behavioral Research, 36(1), 111–150.
  • Choi, J., Zou, H., & Oehlert, G. (2010). A penalized maximum likelihood approach to sparse factor analysis. Statistics and Its Interface, 3, 429–436.
  • Chou, C. P., & Bentler, P. M. (1990). Model modification in covariance structure modeling: A comparison among likelihood ratio, Lagrange multiplier, and Wald tests. Multivariate Behavioral Research, 25(1), 115–136.
  • Chou, C. P., & Huh, J. (2012). Model modification in structural equation modeling. In R. Hoyle (Ed.), Handbook of structural equation modeling (pp. 232–246). New York, NY: Guilford.
  • Grice, J. W. (2001). Computing and evaluating factor scores. Psychological Methods, 6, 430.
  • Grimm, K. J., & McArdle, J. J. (2005). A note on the computer generation of mean and covariance expectations in latent growth curve analysis. In F. Dansereau & F. J. Yammarino (Eds.), Multi-level issues in strategy and methods (pp. 335–364). Amsterdam: Emerald.
  • Grimm, K. J., Steele, J. S., Ram, N., & Nesselroade, J. R. (2013). Exploratory latent growth models in the structural equation modeling framework. Structural Equation Modeling: A Multidisciplinary Journal, 20, 568–591.
  • Hastie, T., Tibshirani, R., & Friedman, J. (2009). The elements of statistical learning. New York, NY: Springer.
  • Hastie, T., Tibshirani, R., & Wainwright, M. (2015). Statistical learning with sparsity: The lasso and generalizations. Boca Raton, FL: CRC.
  • Hayashi, K., & Marcoulides, G. A. (2006). Teacher’s corner: Examining identification issues in factor analysis. Structural Equation Modeling, 13, 631–645.
  • Hirose, K., & Yamamoto, M. (2014a). Estimation of an oblique structure via penalized likelihood factor analysis. Computational Statistics & Data Analysis, 79, 120–132.
  • Hirose, K., & Yamamoto, M. (2014b). Sparse estimation via nonconcave penalized likelihood in factor analysis model. Statistics and Computing, 25, 1–13.
  • Hoerl, A. E., & Kennard, R. W. (1970). Ridge regression: Biased estimation for nonorthogonal problems. Technometrics, 12(1), 55–67.
  • Holzinger, K. J., & Swineford, F. (1939). A study in factor analysis: The stability of a bi-factor solution. Supplementary Educational Monographs, 48.
  • Hsu, H.-Y., Troncoso Skidmore, S., Li, Y., & Thompson, B. (2014). Forced zero cross-loading misspecifications in measurement component of structural equation models: Beware of even “small” misspecifications. Methodology: European Journal of Research Methods for the Behavioral and Social Sciences, 10, 138.
  • Jolliffe, I. T., Trendafilov, N. T., & Uddin, M. (2003). A modified principal component technique based on the lasso. Journal of Computational and Graphical Statistics, 12, 531–547.
  • Jöreskog, K. G. (1969). A general approach to confirmatory maximum likelihood factor analysis. Psychometrika, 34, 183–202. doi:10.1007/bf02289343
  • Jung, S., & Takane, Y. (2008). Regularized common factor analysis. In K. Shigemasu (Ed.), New trends in psychometrics (pp. 141–149). Tokyo: Universal Academy Press.
  • Juster, F. T., & Suzman, R. (1995). An overview of the health and retirement study. Journal of Human Resources, 30, S7–S56.
  • Kaplan, D. (1988). The impact of specification error on the estimation, testing, and improvement of structural equation models. Multivariate Behavioral Research, 23(1), 69–86.
  • Kaplan, D., & Depaoli, S. (2012). Bayesian structural equation modeling. In R. Hoyle (Ed.), Handbook of structural equation modeling (pp. 650–673). New York, NY: Guilford.
  • Lam, C., & Fan, J. (2009). Sparsistency and rates of convergence in large covariance matrix estimation. Annals of Statistics, 37(6B), 4254.
  • Lawley, D. N. (1940). VI. The estimation of factor loadings by the method of maximum likelihood. Proceedings of the Royal Society of Edinburgh, 60(1), 64–82.
  • Lee, S. Y. (2007). Structural equation modeling: A Bayesian approach (Vol. 711). New York, NY: John Wiley.
  • Leite, W. L., Huang, I.-C., & Marcoulides, G. A. (2008). Item selection for the development of short forms of scales using an ant colony optimization algorithm. Multivariate Behavioral Research, 43, 411–431.
  • Levy, R. (2011). Bayesian data-model fit assessment for structural equation modeling. Structural Equation Modeling, 18, 663–685.
  • MacCallum, R. C. (1986). Specification searches in covariance structure modeling. Psychological Bulletin, 100(1), 107.
  • MacCallum, R. C., Roznowski, M., & Necowitz, L. B. (1992). Model modifications in covariance structure analysis: The problem of capitalization on chance. Psychological Bulletin, 111, 490.
  • Magis, D., Tuerlinckx, F., & De Boeck, P. (2014). Detection of differential item functioning using the lasso approach. Journal of Educational and Behavioral Statistics, 40, 111–135. doi:10.3102/1076998614559747
  • Marcoulides, G. A., & Drezner, Z. (2001). Specification searches in structural equation modeling with a genetic algorithm. In G. A. Marcoulides & R. E. Schumacker (Eds.), New developments and techniques in structural equation modeling (pp. 247–268). Mahwah, NJ: Erlbaum.
  • Marcoulides, G. A., & Drezner, Z. (2003). Model specification searches using ant colony optimization algorithms. Structural Equation Modeling, 10(1), 154–164.
  • Marcoulides, G. A., Drezner, Z., & Schumacker, R. E. (1998). Model specification searches in structural equation modeling using tabu search. Structural Equation Modeling: A Multidisciplinary Journal, 5, 365–376.
  • Marcoulides, G. A., & Ing, M. (2012). Automated structural equation modeling strategies. In R. Hoyle (Ed.), Handbook of structural equation modeling (pp. 690–704). New York, NY: Guilford.
  • Marsh, H. W., & Hau, K.-T. (1996). Assessing goodness of fit: Is parsimony always desirable? The Journal of Experimental Education, 64, 364–390.
  • McArdle, J. J. (2005). The development of the RAM rules for latent variable structural equation modeling. In A. Maydeu-Olivares & J. J. McArdle (Eds.), Contemporary psychometrics: A festschrift for Roderick P. McDonald (pp. 225–273). Mahwah, NJ: Lawrence Erlbaum.
  • McArdle, J. J., & Epstein, D. (1987). Latent growth curves within developmental structural equation models. Child Development, 58, 110–133.
  • McArdle, J. J., & McDonald, R. P. (1984). Some algebraic properties of the reticular action model for moment structures. British Journal of Mathematical and Statistical Psychology, 37, 234–251.
  • McNeish, D. M. (2015). Using lasso for predictor selection and to assuage overfitting: A method long overlooked in behavioral sciences. Multivariate Behavioral Research, 50, 471–484.
  • Meinshausen, N. (2007). Relaxed lasso. Computational Statistics & Data Analysis, 52(1), 374–393.
  • Meredith, W., & Tisak, J. (1990). Latent curve analysis. Psychometrika, 55(1), 107–122.
  • Muthén, B., & Asparouhov, T. (2012). Bayesian structural equation modeling: A more flexible representation of substantive theory. Psychological Methods, 17, 313–335. doi:10.1037/a0026802
  • Ning, L., & Georgiou, T. T. (2011). Sparse factor analysis via likelihood and ℓ1-regularization. In 50th IEEE Conference on Decision and Control and European Control Conference (CDC-ECC), Orlando, FL, December 12–15, 2011 (pp. 5188–5192).
  • Park, T., & Casella, G. (2008). The Bayesian lasso. Journal of the American Statistical Association, 103(482), 681–686.
  • Pashler, H., & Wagenmakers, E.-J. (2012). Editors’ introduction to the special section on replicability in psychological science: A crisis of confidence? Perspectives on Psychological Science, 7, 528–530.
  • R Core Team. (2015). R: A language and environment for statistical computing [Computer software manual]. Vienna, Austria: Author.
  • Raykov, T., & Marcoulides, G. A. (1999). On desirability of parsimony in structural equation model selection. Structural Equation Modeling: A Multidisciplinary Journal, 6(3), 292–300.
  • Rish, I., & Grabarnik, G. (2014). Sparse modeling: Theory, algorithms, and applications. Boca Raton, FL: CRC.
  • Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36.
  • Schwarz, G. (1978). Estimating the dimension of a model. The Annals of Statistics, 6, 461–464.
  • Thurstone, L. L. (1935). The vectors of mind. Chicago, IL: University of Chicago Press.
  • Thurstone, L. L. (1947). Multiple factor analysis. Chicago, IL: University of Chicago Press.
  • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 58(1), 267–288.
  • Trendafilov, N. T., & Adachi, K. (2014). Sparse versus simple structure loadings. Psychometrika, 1–15.
  • Tutz, G., & Schauberger, G. (2015). A penalty approach to differential item functioning in Rasch models. Psychometrika, 21–43.
  • Yuan, K.-H., & Chan, W. (2008). Structural equation modeling with near singular covariance matrices. Computational Statistics & Data Analysis, 52, 4842–4858.
  • Yuan, K.-H., Wu, R., & Bentler, P. M. (2011). Ridge structural equation modelling with correlation matrices for ordinal and continuous data. British Journal of Mathematical and Statistical Psychology, 64(1), 107–133.
  • Zou, H., Hastie, T., & Tibshirani, R. (2006). Sparse principal component analysis. Journal of Computational and Graphical Statistics, 15, 265–286.
  • Zou, H., Hastie, T., & Tibshirani, R. (2007). On the “degrees of freedom” of the lasso. The Annals of Statistics, 35, 2173–2192. doi:10.1214/009053607000000127
