
On Nonregularized Estimation of Psychological Networks

Pages 719-750 | Published online: 08 Apr 2019

References

  • Armour, C., Fried, E. I., Deserno, M. K., Tsai, J., & Pietrzak, R. H. (2017). A network analysis of DSM-5 posttraumatic stress disorder symptoms and correlates in U.S. military veterans. Journal of Anxiety Disorders, 45, 49–59. doi:10.1016/j.janxdis.2016.11.008
  • Avagyan, V., Alonso, A. M., & Nogales, F. J. (2017). Improving the graphical Lasso estimation for the precision matrix through roots of the sample covariance matrix. Journal of Computational and Graphical Statistics, 26(4), 865–872. doi:10.1080/10618600.2017.1340890
  • Benjamini, Y., & Gavrilov, Y. (2009). A simple forward selection procedure based on false discovery rate control. The Annals of Applied Statistics, 3(1), 179–198. doi:10.1214/08-AOAS194
  • Bien, J., & Tibshirani, R. J. (2011). Sparse estimation of a covariance matrix. Biometrika, 98(4), 807–820. doi:10.1093/biomet/asr054
  • Blanchet, G., Legendre, P., & Borcard, D. (2008). Forward selection of spatial explanatory variables. Ecology, 89(9), 2623–2632. doi:10.1890/07-0986.1
  • Bollen, K. A., Harden, J. J., Ray, S., & Zavisca, J. (2014). BIC and alternative Bayesian information criteria in the selection of structural equation models. Structural Equation Modeling, 21(1), 1–19. doi:10.1080/10705511.2014.856691
  • Borsboom, D., & Cramer, A. O. (2013). Network analysis: an integrative approach to the structure of psychopathology. Annual Review of Clinical Psychology, 9(1), 91–121. doi:10.1146/annurev-clinpsy-050212-185608
  • Bühlmann, P. (2012). Statistical significance in high-dimensional linear models. Bernoulli 19(4), 1212–1242. doi:10.3150/12-BEJSP11
  • Bühlmann, P., Kalisch, M., & Meier, L. (2014). High-dimensional statistics with a view toward applications in biology. Annual Review of Statistics and its Application, 1(1), 255–278. doi:10.1146/annurev-statistics-022513-115545
  • Bühlmann, P., & Van De Geer, S. (2011). Statistics for high-dimensional data: Methods, theory and applications. Berlin, Germany: Springer Science & Business Media.
  • Burnham, K. P., & Anderson, D. R. (2004). Multimodel inference: Understanding AIC and BIC in model selection. Sociological Methods and Research, 33(2), 261–304. doi:10.1177/0049124104268644
  • Carvalho, C. M., Polson, N. G., & Scott, J. G. (2010). The horseshoe estimator for sparse signals. Biometrika, 97(2), 465–480. doi:10.1093/biomet/asq017
  • Casella, G., Girón, F. J., Martinez, M. L., & Moreno, E. (2009). Consistency of Bayesian procedures for variable selection. The Annals of Statistics, 37(3), 1207–1228. doi:10.1214/08-AOS606
  • Chand, S. (2012). On tuning parameter selection of Lasso-type methods - A Monte Carlo study. In Proceedings of 2012 9th International Bhurban Conference on Applied Sciences and Technology (pp. 120–129). Islamabad, Pakistan: IBCAST. doi:10.1109/IBCAST.2012.6177542
  • Chatterjee, A., & Lahiri, S. N. (2011). Bootstrapping Lasso estimators. Journal of the American Statistical Association, 106(494), 608–625.
  • Chen, J., & Chen, Z. (2008). Extended Bayesian information criteria for model selection with large model spaces. Biometrika, 95(3), 759–771. doi:10.1093/biomet/asn034
  • Cramer, A. O., & Borsboom, D. (2015). Problems attract problems: A network perspective on mental disorders. Emerging Trends in the Social and Behavioral Sciences, 1–15. doi:10.1002/9781118900772
  • Dalalyan, A. S., Hebiri, M., & Lederer, J. (2017). On the prediction performance of the Lasso. Bernoulli, 23(1), 552–581. doi:10.3150/15-BEJ756
  • Dalege, J., Borsboom, D., van Harreveld, F., & van der Maas, H. L. (2017). A network perspective on political attitudes: Testing the connectivity hypothesis. arXiv preprint arXiv:1705.00193.
  • Das, A., Sampson, A., Lainscsek, C., Muller, L., Lin, W., Doyle, J., … Sejnowski, T. (2017). Interpretation of the precision matrix and its application in estimating sparse brain connectivity during sleep spindles from human electrocorticography recordings. Neural Computation, 29(3), 603–642. doi:10.1162/NECO_a_00936
  • Dempster, A. (1972). Covariance selection. Biometrics, 28(1), 157–175. doi:10.2307/2528966
  • Deserno, M. K., Borsboom, D., Begeer, S., & Geurts, H. M. (2017). Multicausal systems ask for multicausal approaches: A network perspective on subjective well-being in individuals with autism spectrum disorder. Autism, 21(8), 960–971. doi:10.1177/1362361316660309
  • de Vlaming, R., & Groenen, P. J. F. (2015). The current and future use of ridge regression for prediction in quantitative genetics. BioMed Research International, 2015, 1–18. doi:10.1155/2015/143712
  • Drton, M., & Perlman, M. D. (2004). Model selection for Gaussian concentration graphs. Biometrika, 91(3), 591–602. doi:10.1093/biomet/91.3.591
  • Efron, B. (2014). Estimation and accuracy after model selection. Journal of the American Statistical Association, 109(507), 991–1007. doi:10.1080/01621459.2013.823775
  • Efron, B., & Tibshirani, R. (1994). An introduction to the bootstrap. Boca Raton, FL: CRC Press.
  • Epskamp, S. (2016). Regularized Gaussian psychological networks: Brief report on the performance of extended BIC model selection. arXiv preprint arXiv:1606.05771.
  • Epskamp, S., Borsboom, D., & Fried, E. I. (2018). Estimating psychological networks and their accuracy: A tutorial paper. Behavior Research Methods, 50(1), 195–212. doi:10.3758/s13428-017-0862-1
  • Epskamp, S., Cramer, A. O. J., Waldorp, L. J., Schmittmann, V. D., & Borsboom, D. (2012). qgraph: Network visualizations of relationships in psychometric data. Journal of Statistical Software, 48(4), 1–18. doi:10.18637/jss.v048.i04
  • Epskamp, S., & Fried, E. I. (2018). A tutorial on regularized partial correlation networks. Psychological Methods, 23(4), 617–634. doi:10.1037/met0000167
  • Epskamp, S., Kruis, J., & Marsman, M. (2017). Estimating psychopathological networks: Be careful what you wish for. PLoS One, 12(6), 1–13. doi:10.1371/journal.pone.0179891
  • Fan, J., Liao, Y., & Liu, H. (2016). An overview of the estimation of large covariance and precision matrices. The Econometrics Journal, 19(1), C1–C32. doi:10.1111/ectj.12061
  • Foygel, R., & Drton, M. (2010). Extended Bayesian information criteria for Gaussian graphical models. In Advances in Neural Information Processing Systems (pp. 604–612).
  • Friedman, J., Hastie, T., & Tibshirani, R. (2008). Sparse inverse covariance estimation with the graphical Lasso. Biostatistics, 9(3), 432–441. doi:10.1093/biostatistics/kxm045
  • Gao, X., Pu, D. Q., Wu, Y., & Xu, H. (2009). Tuning parameter selection for penalized likelihood estimation of inverse covariance matrix. arXiv. doi:10.5705/ss.2009.210
  • Ha, M. J., & Sun, W. (2014). Partial correlation matrix estimation using ridge penalty followed by thresholding and re-estimation. Biometrics, 70(3), 765–773. doi:10.1111/biom.12186
  • Harrell, F. (2001). Regression modeling strategies, with applications to linear models, survival analysis and logistic regression. New York: Springer.
  • Hartlap, J., Simon, P., & Schneider, P. (2007). Unbiased estimation of the inverse covariance matrix. Astronomy & Astrophysics, 464, 399–404. doi:10.1051/0004-6361:20066170
  • Hastie, T., Tibshirani, R., & Friedman, J. (2008). The elements of statistical learning: Data mining, inference, and prediction (2nd ed.). New York: Springer.
  • Hastie, T., Tibshirani, R., & Wainwright, M. (2015). Statistical learning with sparsity: The Lasso and generalizations. Boca Raton, FL: CRC Press. doi:10.1201/b18401-1
  • Henderson, D. A., & Denison, D. R. (1989). Stepwise regression in social and psychological research. Psychological Reports, 64(1), 251–257. doi:10.2466/pr0.1989.64.1.251
  • Hoerl, A. E., & Kennard, R. W. (1970). Ridge regression: Biased estimation for nonorthogonal problems. Technometrics, 12(1), 55–67. doi:10.1080/00401706.1970.10488634
  • Isvoranu, A. M., Van Borkulo, C. D., Boyette, L. L., Wigman, J. T., Vinkers, C. H., Borsboom, D., … Myin-Germeys, I. (2017). A network approach to psychosis: Pathways between childhood trauma and psychotic symptoms. Schizophrenia Bulletin, 43(1), 187–196. doi:10.1093/schbul/sbw055
  • Janková, J., & van de Geer, S. (2015). Confidence intervals for high-dimensional inverse covariance estimation. Electronic Journal of Statistics, 9(1), 1205–1229. doi:10.1214/15-EJS1031
  • Janková, J., & van de Geer, S. (2017). Honest confidence regions and optimality in high-dimensional precision matrix estimation. Test, 26(1), 143–162. doi:10.1007/s11749-016-0503-5
  • Javanmard, A., & Montanari, A. (2018). Debiasing the lasso: Optimal sample size for Gaussian designs. The Annals of Statistics, 46(6A), 2593–2622.
  • Khondker, Z. S., Zhu, H., Chu, H., Lin, W., & Ibrahim, J. G. (2013). The Bayesian covariance Lasso. Statistics and its Interface, 6(2), 243–259.
  • Kossakowski, J. J., Epskamp, S., Kieffer, J. M., van Borkulo, C. D., Rhemtulla, M., & Borsboom, D. (2016). The application of a network approach to Health-Related Quality of Life (HRQoL): introducing a new method for assessing HRQoL in healthy adults and cancer patients. Quality of Life Research, 25(4), 781–792. doi:10.1007/s11136-015-1127-z
  • Krämer, N., Schäfer, J., & Boulesteix, A.-L. (2009). Regularized estimation of large-scale gene association networks using graphical Gaussian models. BMC Bioinformatics, 10(1), 384. doi:10.1186/1471-2105-10-384
  • Kuismin, M., Kemppainen, J., & Sillanpää, M. (2017). Precision matrix estimation with ROPE. Journal of Computational and Graphical Statistics, 26(3), 682–694. doi:10.1080/10618600.2016.1278002
  • Kuismin, M., & Sillanpää, M. (2017). Estimation of covariance and precision matrix, network structure, and a view toward systems biology. Wiley Interdisciplinary Reviews: Computational Statistics, 9(6), 1–13. doi:10.1002/wics.1415
  • Kuismin, M., & Sillanpää, M. J. (2016). Use of Wishart prior and simple extensions for sparse precision matrix estimation. PLoS One, 11(2), e0148171. doi:10.1371/journal.pone.0148171
  • Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22(1), 79–86.
  • Kwan, C. C. (2014). A regression-based interpretation of the inverse of the sample covariance matrix. Spreadsheets in Education, 7(1), 4613.
  • Ledoit, O., & Wolf, M. (2004). Honey, I shrunk the sample covariance matrix. The Journal of Portfolio Management, 30(4), 110–119. doi:10.3905/jpm.2004.110
  • Ledoit, O., & Wolf, M. (2004). A well-conditioned estimator for large-dimensional covariance matrices. Journal of Multivariate Analysis, 88(2), 365–411. doi:10.1016/S0047-259X(03)00096-4
  • Leeb, H., Pötscher, B. M., & Ewald, K. (2015). On various confidence intervals post-model-selection. Statistical Science, 30(2), 216–227. doi:10.1214/14-STS507
  • Leppä-Aho, J., Pensar, J., Roos, T., & Corander, J. (2017). Learning Gaussian graphical models with fractional marginal pseudo-likelihood. International Journal of Approximate Reasoning, 83, 21–42. doi:10.1016/j.ijar.2017.01.001
  • Li, F-Q., & Zhang, X-S. (2017). Bayesian Lasso with neighborhood regression method for Gaussian graphical model. Acta Mathematicae Applicatae Sinica, English Series, 33(2), 485–496. doi:10.1007/s10255-017-0676-z
  • Liu, H., Roeder, K., & Wasserman, L. (2010). Stability approach to regularization selection (StARS) for high dimensional graphical models. In Advances in Neural Information Processing Systems (pp. 1432–1440).
  • Liu, K., Markovic, J., & Tibshirani, R. (2018). More powerful post-selection inference, with application to the lasso. arXiv preprint arXiv:1801.09037.
  • Liu, W. (2013). Gaussian graphical model estimation with false discovery rate control. The Annals of Statistics, 41(6), 2948–2978. doi:10.1214/13-AOS1169
  • Lysen, S. (2009). Permuted inclusion criterion: a variable selection technique. Publicly accessible Penn Dissertations, 28.
  • McElreath, R. (2016). Statistical rethinking: A Bayesian course with examples in R and Stan. Boca Raton, FL: CRC Press.
  • McLeod, A. I., & Xu, C. (2010). bestglm: Best subset GLM. http://CRAN.R-project.org/package=bestglm.
  • McNally, R. J., Robinaugh, D. J., Wu, G. W. Y., Wang, L., Deserno, M. K., & Borsboom, D. (2015). Mental disorders as causal systems. Clinical Psychological Science, 3(6), 836–849. doi:10.1177/2167702614553230
  • McNeish, D. M. (2015). Using Lasso for predictor selection and to assuage overfitting: A method long overlooked in behavioral sciences. Multivariate Behavioral Research, 50(5), 471–484. doi:10.1080/00273171.2015.1036965
  • Meinshausen, N., & Bühlmann, P. (2006). High-dimensional graphs and variable selection with the Lasso. The Annals of Statistics, 34(3), 1436–1462. doi:10.1214/009053606000000281
  • Mohammadi, A., & Wit, E. C. (2015a). Bayesian structure learning in sparse Gaussian graphical models. Bayesian Analysis, 10(1), 109–138. doi:10.1214/14-BA889
  • Mohammadi, A., & Wit, E. C. (2015b). BDgraph: An R package for Bayesian structure learning in graphical models. arXiv preprint arXiv:1501.05108.
  • Norouzi, M., Fleet, D. J., & Salakhutdinov, R. R. (2012). Hamming distance metric learning. In Advances in neural information processing systems (pp. 1061–1069).
  • Peng, J., Wang, P., Zhou, N., & Zhu, J. (2009). Partial correlation estimation by joint sparse regression models. Journal of the American Statistical Association, 104(486), 735–746. doi:10.1198/jasa.2009.0126
  • Piironen, J., & Vehtari, A. (2016). On the hyperprior choice for the global shrinkage parameter in the horseshoe prior. arXiv preprint arXiv:1610.05559.
  • Powers, D. (2011). Evaluation: From precision, recall and F-measure to ROC, informedness, markedness & correlation. Journal of Machine Learning Technologies, 2(1), 37–63.
  • R Core Team. (2018). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. https://www.R-project.org/.
  • Raftery, A. (1995). Bayesian model selection in social research. Sociological Methodology, 25, 111–164. doi:10.2307/271063
  • Ren, Z., Sun, T., Zhang, C.-H., & Zhou, H. H. (2015). Asymptotic normality and optimalities in estimation of large Gaussian graphical models. The Annals of Statistics, 43(3), 991–1026. doi:10.1214/14-AOS1286
  • Rhemtulla, M., Brosseau-Liard, P. E., & Savalei, V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, 17(3), 354–373. doi:10.1037/a0029315
  • Rhemtulla, M., Fried, E. I., Aggen, S. H., Tuerlinckx, F., Kendler, K. S., & Borsboom, D. (2016). Network analysis of substance abuse and dependence symptoms. Drug and Alcohol Dependence, 161, 230–237. doi:10.1016/j.drugalcdep.2016.02.005
  • Schäfer, J., & Strimmer, K. (2005a). A shrinkage approach to large-scale covariance matrix estimation and implications for functional genomics. Statistical Applications in Genetics and Molecular Biology, 4(1). doi:10.2202/1544-6115.1175
  • Schäfer, J., & Strimmer, K. (2005b). An empirical Bayes approach to inferring large-scale gene association networks. Bioinformatics, 21(6), 754–764. doi:10.1093/bioinformatics/bti062
  • Shao, J. (1997). An asymptotic theory for linear model selection. Statistica Sinica, 7, 221–264.
  • Spiller, T. R., Schick, M., Schnyder, U., Bryant, R. A., Nickerson, A., & Morina, N. (2017). Symptoms of posttraumatic stress disorder in a clinical sample of refugees: a network analysis. European Journal of Psychotraumatology, 8(sup3), 1318032. doi:10.1080/20008198.2017.1318032
  • Stephens, G. (1998). On the inverse of the covariance matrix in portfolio analysis. The Journal of Finance, 53(5), 1821–1827.
  • Tibshirani, R. (2011). Regression shrinkage and selection via the Lasso: A retrospective. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 73(3), 273–282. doi:10.1111/j.1467-9868.2011.00771.x
  • Van Borkulo, C. D., Borsboom, D., Epskamp, S., Blanken, T. F., Boschloo, L., Schoevers, R. A., & Waldorp, L. J. (2014). A new method for constructing networks from binary data. Scientific Reports, 4, 1–10. doi:10.1038/srep05918
  • van Rooijen, G., Isvoranu, A. M., Meijer, C. J., van Borkulo, C. D., Ruhé, H. G., & de Haan, L. (2017). A symptom network structure of the psychosis spectrum. Schizophrenia Research, 189, 75–83. doi:10.1016/j.schres.2017.02.018
  • Van Wieringen, W. N., & Peeters, C. F. (2016). Ridge estimation of inverse covariance matrices from high-dimensional data. Computational Statistics and Data Analysis, 103, 284–303. doi:10.1016/j.csda.2016.05.012
  • Wagenmakers, E. J. (2007). A practical solution to the pervasive problems of p values. Psychonomic Bulletin & Review, 14(5), 779–804. doi:10.3758/BF03194105
  • Waldorp, L., Marsman, M., & Maris, G. (2018). Logistic regression and Ising networks: Prediction and estimation when violating Lasso assumptions. Behaviormetrika. doi:10.1007/s41237-018-0061-0
  • Wang, H. (2012). Bayesian graphical Lasso models and efficient posterior computation. Bayesian Analysis, 7(4), 867–886. doi:10.1214/12-BA729
  • Wang, T., Ren, Z., Ding, Y., Fang, Z., Sun, Z., MacDonald, M. L., … Chen, W. (2016). FastGGM: An efficient algorithm for the inference of Gaussian graphical model in biological networks. PLoS Computational Biology, 12(2), 1–16. doi:10.1371/journal.pcbi.1004755
  • Wang, Y. R., & Huang, H. (2014). Review on statistical methods for gene network reconstruction using expression data. Journal of Theoretical Biology, 362, 53–61. doi:10.1016/j.jtbi.2014.03.040
  • Wasserman, L., & Roeder, K. (2009). High-dimensional variable selection. The Annals of Statistics, 37(5), 2178–2201. doi:10.1214/08-AOS646
  • Williams, D. R. (2018). Bayesian inference for Gaussian graphical models: Structure learning, explanation, and prediction. doi:10.31234/osf.io/x8dpr
  • Williams, D. R., & Mulder, J. (2019). Bayesian hypothesis testing for Gaussian graphical models: Conditional independence and order constraints. doi:10.31234/osf.io/ypxd8
  • Williams, D. R., Piironen, J., Vehtari, A., & Rast, P. (2018). Bayesian estimation of Gaussian graphical models with projection predictive selection. arXiv preprint arXiv:1801.05725.
  • Won, J-H., Lim, J., & Kim, S.-J. (2009). Maximum likelihood covariance estimation. Asilomar Conference on Signals, Systems, and Computers (pp. 1445–1449).
  • Wong, B. F., Carter, C. K., & Kohn, R. (2003). Efficient estimation of covariance selection models. Biometrika, 90(4), 809–830. doi:10.1093/biomet/90.4.809
  • Yang, Y., Etesami, J., & Kiyavash, N. (2015). Efficient neighborhood selection for Gaussian graphical models. arXiv preprint arXiv:1509.06449.
  • Yarkoni, T., & Westfall, J. (2017). Choosing prediction over explanation in psychology: Lessons from machine learning. Perspectives on Psychological Science, 12(6), 1100–1122. doi:10.1177/1745691617693393
  • Zhang, Y., & Yang, Y. (2015). Cross-validation for selecting a model selection procedure. Journal of Econometrics, 187(1), 95–112.
  • Zhao, P., & Yu, B. (2006). On model selection consistency of Lasso. The Journal of Machine Learning Research, 7, 2541–2563.
  • Zhao, T., Liu, H., Roeder, K., Lafferty, J., & Wasserman, L. (2012). The huge package for high-dimensional undirected graph estimation in R. Journal of Machine Learning Research, 13, 1059–1062.
  • Zou, H., & Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67(2), 301–320. doi:10.1111/j.1467-9868.2005.00527.x
