Development and Validation of an Assessment Instrument for Course Experience in a General Education Integrated Science Course

Pages 435-454 | Received 08 Apr 2017, Accepted 07 Aug 2017, Published online: 31 Jan 2018

REFERENCES

  • Angelo, T.A., and Cross, K.P. 1993. Classroom assessment techniques: A handbook for college teachers. San Francisco, CA: Jossey-Bass.
  • Barnette, J.J. 2000. Effects of stem and Likert response option reversals on survey internal consistency: If you feel the need, there is a better alternative to using those negatively worded stems. Educational and Psychological Measurement , 60 (3):361–370.
  • Bauer, C.F. 2008. Attitude toward chemistry: A semantic differential instrument for assessing curriculum impacts. Journal of Chemical Education , 85 (10):1440–1445.
  • Baum, E.J. 2013. Augmenting guided-inquiry learning with a blended classroom approach. Journal of College Science Teaching , 42 (6):27–33.
  • Black, P., and Wiliam, D. 1998. Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice , 5 (1):7–74.
  • Broomfield, D., and Bligh, J. 1998. An evaluation of the short form course experience questionnaire with medical students. Medical Education , 32 (4):367–369.
  • Browne, M.W., and Cudeck, R. 1992. Alternative ways of assessing model fit. Sociological Methods and Research , 21 (2):230–258.
  • Byrne, M., and Flood, B. 2003. Assessing the teaching quality of accounting programmes: An evaluation of the Course Experience Questionnaire. Assessment and Evaluation in Higher Education , 28 (2):135–145.
  • Cole, D.A. 1987. Utility of confirmatory factor analysis in test validation research. Journal of Consulting and Clinical Psychology , 55 (4):584–594.
  • Cortina, J.M. 1993. What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology , 78 (1):98.
  • Costello, A.B., and Osborne, J.W. 2005. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation , 10 (7):1–9.
  • Creswell, J.W. 2009. Research design: Qualitative, quantitative, and mixed methods approaches (3rd ed.). Thousand Oaks, CA: Sage.
  • Creswell, J.W. 2014. Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Thousand Oaks, CA: Sage.
  • Cronbach, L.J., and Meehl, P.E. 1955. Construct validity in psychological tests. Psychological Bulletin , 52 (4):281–302.
  • Dancy, M., Hora, M., Ferrare, J., Iverson, E., Lattuca, L., and Turns, J. 2014. Describing and measuring undergraduate STEM teaching practice: A report from a national meeting on the measurement of undergraduate science, technology, engineering and mathematics (STEM) teaching. American Association for the Advancement of Science. 57 p.
  • de Winter, J.D., Dodou, D., and Wieringa, P.A. 2009. Exploratory factor analysis with small sample sizes. Multivariate Behavioral Research , 44 (2):147–181.
  • Diseth, Å. 2007. Approaches to learning, course experience and examination grade among undergraduate psychology students: Testing of mediator effects and construct validity. Studies in Higher Education , 32 (3):373–388.
  • Diseth, Å., Pallesen, S., Hovland, A., and Larsen, S. 2006. Course experience, approaches to learning and academic achievement. Education + Training , 48 (2-3):156–169.
  • Dorman, J.P. 2014. Classroom psychosocial environment and course experiences in pre-service teacher education courses at an Australian university. Studies in Higher Education , 39 (1):34–47.
  • Durbin, J.M. 2002. The benefits of combining computer technology and traditional teaching methods in large enrollment geoscience classes. Journal of Geoscience Education , 50 (1):56–63.
  • Eagan, K. 2016. Becoming more student-centered? An examination of faculty teaching practices across STEM and non-STEM disciplines between 2004 and 2014. A report prepared for the Alfred P. Sloan Foundation.
  • Entwistle, N., and Tait, H. 1990. Approaches to learning, evaluations of teaching, and preferences for contrasting academic environments. Higher Education , 19 (2):169–194.
  • Entwistle, N., Tait, H., and McCune, V. 2000. Patterns of response to an approaches to studying inventory across contrasting groups and contexts. European Journal of Psychology of Education , 15 (1):33–48.
  • Ewell, P. 2009. Assessment, accountability, and improvement: Revisiting the tension. National Institute for Learning Outcomes Assessment.
  • Fan, X. 2003. Using commonly available software for bootstrapping in both substantive and measurement analyses. Educational and Psychological Measurement , 63 (1):24–50.
  • Fink, L.D. 2013. Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass.
  • Frankfort-Nachmias, C., and Leon-Guerrero, A. 2010. Social statistics for a diverse society. Thousand Oaks, CA: Sage.
  • Fraser, B.J., Giddings, G.J., and McRobbie, C.J. 1992. Assessment of the psychosocial environment of university science laboratory classrooms: A cross-national study. Higher Education , 24 (4):431–451.
  • Freeman, S., Eddy, S., McDonough, M., Smith, M., Okoroafor, N., Jordt, H., and Wenderoth, M.P. 2014. Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, USA , 111 (23):8410–8415. doi: https://doi.org/10.1073/pnas.1319030111
  • Fryer, L.K., Ginns, P., Walker, R.A., and Nakao, K. 2012. The adaptation and validation of the CEQ and the R-SPQ-2F to the Japanese tertiary environment. British Journal of Educational Psychology , 82 (4):549–563.
  • Gall, M.D., Gall, J.P., and Borg, W.R. 2002. Educational research: An introduction (7th ed.). Boston, MA: Allyn & Bacon.
  • Grace, D., Weaven, S., Bodey, K., Ross, M., and Weaven, K. 2012. Putting student evaluations into perspective: The course experience quality and satisfaction model (CEQS). Studies in Educational Evaluation , 38 (2):35–43.
  • Grove, K. 2002. Using online homework to engage students in a geoscience course for general education. Journal of Geoscience Education , 50 (5):566–574.
  • Hall-Wallace, M.K., and McAuliffe, C.M. 2002. Design, implementation, and evaluation of GIS-based learning materials in an introductory geoscience course. Journal of Geoscience Education , 50 (1):5–14.
  • Harris, C., and Kloubec, J. 2014. Assessment of student experience in a problem-based learning course using the course experience questionnaire. Journal of Nutrition Education and Behavior , 46 (4):315–319.
  • Joint Information Systems Committee (JISC). 2016. Changing the learning landscape. Available at https://www.jisc.ac.uk/rd/projects/changing-the-learning-landscape (accessed 13 August 2017).
  • Jolley, A., Lane, E., Kennedy, B., and Frappé-Sénéclauze, T.P. 2012. SPESS: A new instrument for measuring student perceptions in Earth and ocean science. Journal of Geoscience Education , 60 (1):83–91.
  • Laird, T.F.N., Niskode-Dossett, A.S., and Kuh, G.D. 2009. What general education courses contribute to essential learning outcomes. Journal of General Education , 58 (2):65–84.
  • Law, D.C., and Meyer, J.H. 2011. Adaptation and validation of the course experience questionnaire in the context of post-secondary education in Hong Kong. Quality Assurance in Education , 19 (1):50–66.
  • Libarkin, J.C. 2001. Development of an assessment of student conception of the nature of science. Journal of Geoscience Education , 49 (5):435–442.
  • Lizzio, A., Wilson, K., and Simons, R. 2002. University students' perceptions of the learning environment and academic outcomes: Implications for theory and practice. Studies in Higher Education , 27 (1):27–52.
  • Lo, C.C., Johnson, E.L., and Tenorio, K.A. 2012. Promoting student learning by having college students participate in an online environment. Journal of the Scholarship of Teaching and Learning , 11 (2):1–15.
  • Lyon, P.M., and Hendry, G.D. 2002. The use of the Course Experience Questionnaire as a monitoring evaluation tool in a problem-based medical programme. Assessment and Evaluation in Higher Education , 27 (4):339–352.
  • MacCallum, R.C., Browne, M.W., and Sugawara, H.M. 1996. Power analysis and determination of sample size for covariance structure modeling. Psychological Methods , 1 (2):130–149.
  • MacCallum, R.C., Widaman, K.F., Preacher, K.J., and Hong, S. 2001. Sample size in factor analysis: The role of model error. Multivariate Behavioral Research , 36 (4):611–637.
  • Marsh, H.W., and Hocevar, D. 1991. The multidimensionality of students' evaluations of teaching effectiveness: The generality of factor structures across academic discipline, instructor level, and course level. Teaching and Teacher Education , 7 (1):9–18.
  • McConnell, D.A., Steer, D.N., Owens, K.D., Knott, J.R., Van Horn, S., Borowski, W., Dick, J., Foos, A., Malone, M., McGrew, H., Greer, L., and Heaney, P. 2006. Using concept tests to assess and improve student conceptual understanding in introductory geoscience courses. Journal of Geoscience Education , 54 (1):61–68.
  • McConnell, D.A., and van Der Hoeven Kraft, K.J. 2011. Affective domain and student learning in the geosciences. Journal of Geoscience Education , 59 (3):106–110.
  • Meyers, L.S., Gamst, G., and Guarino, A. J. 2006. Applied multivariate research: Design and interpretation. Thousand Oaks, CA: Sage.
  • Montgomery, H. 2005. Constructivist pedagogical preferences of undergraduate students in geoscience service courses at the University of Texas at Dallas. Texas Science Teacher , 34 (1):25–30.
  • Mulaik, S.A., James, L.R., Van Alstine, J., Bennett, N., Lind, S., and Stilwell, C.D. 1989. Evaluation of goodness-of-fit indices for structural equation models. Psychological Bulletin , 105 (3):430–445.
  • Mundfrom, D.J., Shaw, D.G. and Ke, T.L. 2005. Minimum sample size recommendations for conducting factor analyses. International Journal of Testing , 5 (2):159–168.
  • National Survey of Student Engagement (NSSE), Faculty Survey of Student Engagement (FSSE), and Beginning College Survey of Student Engagement (BCSSE). 2016. National survey of student engagement. Available at http://nsse.indiana.edu/ (accessed 13 August 2017).
  • Nevo, B. 1985. Face validity revisited. Journal of Educational Measurement , 22 (4):287–293.
  • New Media Consortium. 2016. Horizon report: 2016 higher education edition. Available at http://www.nmc.org/publication/nmc-horizon-report-2016-higher-education-edition/ (accessed 13 August 2017).
  • Pearson, R.H., and Mundfrom, D.J. 2010. Recommended sample size for conducting exploratory factor analysis on dichotomous data. Journal of Modern Applied Statistical Methods , 9 (2):359–368.
  • Pintrich, P.R., Smith, D.A., García, T., and McKeachie, W.J. 1993. Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement , 53 (3):801–813.
  • President's Council of Advisors on Science and Technology (PCAST). 2012. Report to the President: Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Available at https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf (accessed 21 August 2017).
  • Ramsden, P. 1991. A performance indicator of teaching quality in higher education: The Course Experience Questionnaire. Studies in Higher Education , 16 (2):129–150.
  • Renshaw, C.E. 2014. Design and assessment of a skills-based geoscience curriculum. Journal of Geoscience Education , 62 (4):668–678. doi: https://doi.org/10.5408/13-100.1
  • Richardson, J.T. 1994. A British evaluation of the course experience questionnaire. Studies in Higher Education , 19 (1):59–68.
  • Richardson, J.T., and Price, L. 2003. Approaches to studying and perceptions of academic quality in electronically delivered courses. British Journal of Educational Technology , 34 (1):45–56.
  • Rubio, D.M., Berg-Weger, M., Tebb, S.S., Lee, E.S., and Rauch, S. 2003. Objectifying content validity: Conducting a content validity study in social work research. Social Work Research , 27 (2):94–104.
  • Steele, G.A., West, S.A., and Simeon, D.T. 2003. Using a modified Course Experience Questionnaire (CEQ) to evaluate the innovative teaching of medical communication skills. Education for Health: Change in Learning and Practice , 16 (2):133–144.
  • St. John, K., and McNeal, K. 2017. The strength of evidence pyramid: One approach for characterizing the strength of evidence of geoscience education research (GER) community claims. Journal of Geoscience Education , 65 (4):363–372.
  • St. John, K., McNeal, K., Macdonald, H., and Kastens, K. 2016. Webinar: Results from the geoscience education research (GER) survey—Data for setting priorities in the GER community. Available at: http://nagt.org/nagt/profdev/workshops/GER_community/GER_webinar.html (accessed 13 August 2017).
  • Stergiou, D.P., and Airey, D. 2012. Using the Course Experience Questionnaire for evaluating undergraduate tourism management courses in Greece. Journal of Hospitality, Leisure, Sport and Tourism Education , 11 (1):41–49.
  • Streiner, D.L. 2003. Starting at the beginning: An introduction to coefficient alpha and internal consistency. Journal of Personality Assessment , 80 (1):99–103.
  • Suskie, L. 2010. Assessing student learning: A common sense guide. New York: John Wiley & Sons.
  • Swap, R.J., and Walter, J.A. 2015. An approach to engaging students in a large-enrollment, introductory STEM college course. Journal of the Scholarship of Teaching and Learning , 15 (5):1–21.
  • Talukdar, J., Aspland, T., and Datta, P. 2013. Australian higher education and the Course Experience Questionnaire: Insights, implications, and recommendations. Australian Universities' Review , 55 (1):27–35.
  • Taylor, P.C., Fraser, B.J., and Fisher, D.L. 1997. Monitoring constructivist classroom learning environments. International Journal of Educational Research , 27 (4):293–302.
  • Thien, L.M., and Ong, M.Y. 2016. The applicability of course experience questionnaire for a Malaysian university context. Quality Assurance in Education , 24 (1):41–55.
  • Thompson, B. 1993. The use of statistical significance tests in research: Bootstrap and other alternatives. The Journal of Experimental Education , 61 (4):361–377.
  • Wenner, J.M., Burn, H.E., and Baer, E.M. 2011. The math you need, when you need it: Online modules that remediate mathematical skills in introductory geoscience courses. Journal of College Science Teaching , 41 (1):16–24.
  • Wiggins, G.P. 1998. Educative assessment: Designing assessments to inform and improve student performance (vol. 1). San Francisco, CA: Jossey-Bass.
  • Wilson, K.L., Lizzio, A., and Ramsden, P. 1997. The development, validation, and application of the Course Experience Questionnaire. Studies in Higher Education , 22 (1):33–53.
  • Winchester, H.P. 2001. The relationship between teaching and research in Australian geography. Journal of Geography in Higher Education , 25 (1):117–120.
  • Yin, H., and Wang, W. 2015. Assessing and improving the quality of undergraduate teaching in China: The Course Experience Questionnaire. Assessment and Evaluation in Higher Education , 40 (8):1032–1049.
