
Building the foundations for measuring learning gain in higher education: a conceptual framework and measurement instrument

Pages 266-301 | Received 29 Nov 2017, Accepted 31 May 2018, Published online: 06 Sep 2018
