Original Articles

Development of a Student Self-Reported Instrument to Assess Course Reform


REFERENCES

  • Arum, R., & Roksa, J. (2010). Academically adrift: Limited learning on college campuses. Chicago, IL: University of Chicago Press.
  • Astin, A. W. (1971). Two approaches to measuring students' perceptions of their college environment. Journal of College Student Personnel, 12, 169–172.
  • Astin, A. W. (1977). Four critical years: Effects of college on beliefs, attitudes, and knowledge. San Francisco, CA: Jossey-Bass.
  • Astin, A. W., & Astin, H. S. (1992). Undergraduate science education: The impact of different college environments on the educational pipeline in the sciences (Final report).
  • Bell, C. A., Gitomer, D. H., McCaffrey, D. F., Hamre, B. K., Pianta, R. C., & Qi, Y. (2012). An argument approach to observation protocol validity. Educational Assessment, 17, 62–87.
  • Blackie, M. A. L., Case, J. M., & Jawitz, J. (2010). Student-centredness: The link between transforming students and transforming ourselves. Teaching in Higher Education, 15, 637–646.
  • Blickenstaff, J. C. (2005). Women and science careers: Leaky pipeline or gender filter? Gender and Education, 17, 369–386.
  • Borrego, M., Streveler, R. A., Miller, R. L., & Smith, K. A. (2008). A new paradigm for a new field: Communicating representations of engineering education research. Journal of Engineering Education, 97, 147–162.
  • Brown, N. J. S., Furtak, E. M., Timms, M., Nagashima, S. O., & Wilson, M. (2010). The evidence-based reasoning framework: Assessing scientific reasoning. Educational Assessment, 15, 123–141.
  • Carnell, E. (2007). Conceptions of effective teaching in higher education: extending the boundaries. Teaching in Higher Education, 12, 25–40.
  • Centra, J. A. (1993). Reflective faculty evaluation: Enhancing teaching and determining faculty effectiveness. San Francisco, CA: Jossey-Bass.
  • Clare, L., & Aschbacher, P. R. (2001). Exploring the technical quality of using assignments and student work as indicators of classroom practice. Educational Assessment, 7, 39–59.
  • Cornelius-White, J. (2007). Learner-centered teacher-student relationships are effective: A meta-analysis. Review of Educational Research, 77, 113–143.
  • De Welde, K., Laursen, S., & Thiry, H. (2007). Women in science, technology, engineering and math (STEM). ADVANCE Library Collection.
  • Dean, D. J., & Fleckenstein, A. (2007). Keys to success for women in science. In R. J. Burke & M. C. Mattis (Eds.), Women and minorities in science, technology, engineering and mathematics: Upping the numbers (pp. 28–44). Northampton, MA: Edward Elgar.
  • Denton, D. D. (1998). Engineering education for the 21st century: Challenges and opportunities. Journal of Engineering Education, 87, 19–22.
  • Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61, 550–558.
  • Fraser, B. J. (2012). Classroom learning environments: Retrospect, context and prospect. In B. J. Fraser, K. Tobin, & C. J. McRobbie (Eds.), Second international handbook of science education (Vol. 24, pp. 1191–1239). Dordrecht, The Netherlands: Springer.
  • Fraser, B. J., & Treagust, D. F. (1986). Validity and use of an instrument for assessing classroom psychosocial environment in higher education. Higher Education, 15, 37–57.
  • Freire, P. (2000). Pedagogy of the oppressed (30th anniversary ed.). New York, NY: Bloomsbury Academic.
  • Grossman, P., Loeb, S., Cohen, J., Hammerness, K., Wyckoff, J., Boyd, D., & Lankford, H. (2010). Measure for measure: The relationship between measures of instructional practice in middle school English language arts and teachers' value-added scores. Cambridge, MA: National Bureau of Economic Research.
  • Gwet, K. L. (2012). Handbook of inter-rater reliability: The definitive guide to measuring the extent of agreement among multiple raters (3rd ed.). Gaithersburg, MD: Advanced Analytics, LLC.
  • Haber, M., Barnhart, H. X., Song, J., & Gruden, J. (2005). Observer variability: A new approach in evaluating interobserver agreement. Journal of Data Science, 3, 69–83.
  • Haghighi, K. (2005). Systematic and sustainable reform in engineering education. Journal of Environmental Engineering, 131, 501–502.
  • Hallgren, K. A. (2012). Computing inter-rater reliability for observational data: An overview and tutorial. Tutorials in Quantitative Methods for Psychology, 8, 23–34.
  • Harper, S. R., & Quaye, S. J. (2010). Student engagement in higher education: Theoretical perspectives and practical approaches for diverse populations. New York, NY: Taylor & Francis.
  • Harper, V. B., Jr. (2009). Virginia's value added: A diverse system perspective. Assessment Update, 21(4), 1–2.
  • Hill, H. C., Charalambous, C. Y., Blazar, D., McGinn, D., Kraft, M. A., Beisiegel, M., Lynch, K., … (2012). Validating arguments for observational instruments: Attending to multiple sources of variation. Educational Assessment, 17, 88–106.
  • Hughes, G. (2007). Using blended learning to increase learner support and improve retention. Teaching in Higher Education, 12, 349–363.
  • Isaacson, R. L., McKeachie, W. J., Milholland, J. E., Lin, Y. G., Hofeller, M., & Zinn, K. L. (1964). Dimensions of student evaluations of teaching. Journal of Educational Psychology, 55, 344.
  • Jenny, H. H. (1996). Cost accounting in higher education: Simplified macro- and micro-costing techniques. ERIC.
  • Katehi, L., Banks, K., Diefes-Dux, H., Follman, D., Gaunt, J., Haghighi, K., … Oakes, W. (2004). A new framework for academic reform in engineering education. Paper presented at the American Society for Engineering Education Conference, Salt Lake City, UT.
  • King, A. (1989). Effects of self-questioning training on college students' comprehension of lectures. Contemporary Educational Psychology, 14, 366–381.
  • King, A. (1990). Enhancing peer interaction and learning in the classroom through reciprocal questioning. American Educational Research Journal, 27, 664–687.
  • King, A. (1993). From sage on the stage to guide on the side. College Teaching, 41, 30–35.
  • Kuh, G. D. (2001). The national survey of student engagement: Conceptual framework and overview of psychometric properties (pp. 1–26). Bloomington, IN: Indiana University Center for Postsecondary Research.
  • Kuh, G. D. (2003). What we're learning about student engagement from NSSE: Benchmarks for effective educational practices. Change: The Magazine of Higher Learning, 35(2), 24–32.
  • McCombs, B. L. (2001). What do we know about learners and learning? The learner-centered framework: Bringing the educational system into balance. Educational Horizons, 79, 182–193.
  • McCray, R. A., DeHaan, R. L., & Schuck, J. A. (Eds.). (2003). Improving undergraduate instruction in science, technology, engineering, and mathematics: Report of a workshop. Washington, DC: National Academies Press.
  • Meyer, J. P., Cash, A. H., & Mashburn, A. (2011). Occasions and the reliability of classroom observations: Alternative conceptualizations and methods of analysis. Educational Assessment, 16, 227–243.
  • Miller, J. E., & Groccia, J. E. (2011). To improve the academy: Resources for faculty, instructional, and organizational development. Wiley.
  • The National Center for Academic Transformation: Who We Are. (2013). Retrieved from http://www.thencat.org/whoweare.html
  • NSF. (2012). Science and engineering indicators 2012. Retrieved from http://www.nsf.gov/statistics/seind12/c2/c2s2.htm
  • Office of Management and Budget. (2013). Fiscal Year 2014: Budget of the U.S. Government. Washington, DC: Government Printing Office.
  • Pace, C. R. (1984). Measuring the quality of college student experiences. An account of the development and use of the College Student Experiences Questionnaire.
  • Pace, C. R. (1985). The credibility of student self-reports.
  • Piburn, M., Sawada, D., Turley, J., Falconer, K., Benford, R., Bloom, I., & Judson, E. (2000). Reformed teaching observation protocol (RTOP) reference manual. Tempe, AZ: Arizona Collaborative for Excellence in the Preparation of Teachers.
  • Pike, G. R. (2011). Using college students' self-reported learning outcomes in scholarly research. New Directions for Institutional Research, 2011(150), 41–58.
  • Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53, 801–813.
  • Potočnik, J. (2009). Women in science and technology: Creating sustainable careers. Luxembourg: Office for Official Publications of the European Communities.
  • Provezis, S. (2010). Regional accreditation and student learning outcomes: Mapping the territory (Occasional Paper No. 6). National Institute for Learning Outcomes Assessment.
  • Sawada, D., Piburn, M., Turley, J., Falconer, K., Benford, R., Bloom, I., & Judson, E. (2000). Reformed teaching observation protocol (RTOP) training guide (ACEPT IN-002). Arizona Board of Regents. Retrieved from https://mathed.asu.edu/instruments/rtop/Training_Guide_Mar2000.pdf
  • Sawada, D., Piburn, M. D., & Judson, E. (2002). Measuring reform practices in science and mathematics classrooms: The reformed teaching observation protocol. School Science & Mathematics, 102, 245–253.
  • Schunk, D. H., & Meece, J. L. (1992). Student perceptions in the classroom. Hillsdale, NJ: Erlbaum.
  • Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 86, 420–428.
  • Simpson, P. M., & Siguaw, J.A. (2000). Student evaluations of teaching: An exploratory study of the faculty response. Journal of Marketing Education, 22, 199–213.
  • Slavin, R. E. (1995). Cooperative learning: Theory, research, and practice (2nd ed.). Boston, MA: Allyn and Bacon.
  • Smith, M. K., Jones, F. H. M., Gilbert, S. L., & Wieman, C. E. (2013). The classroom observation protocol for undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE-Life Sciences Education, 12, 618–627.
  • Sojka, J., Gupta, A. K., & Deeter-Schmelz, D. R. (2002). Student and faculty perceptions of student evaluations of teaching: A study of similarities and differences. College Teaching, 50, 44–49.
  • Streveler, R. A., Litzinger, T. A., Miller, R. L., & Steif, P. S. (2008). Learning conceptual knowledge in the engineering sciences: Overview and future research directions. Journal of Engineering Education, 97, 279–294.
  • Swing, R. L., & Coogan, C. S. (2010). Valuing assessment: Cost-benefit considerations. Champaign, IL: National Institute for Learning Outcomes Assessment.
  • Taylor, P. C., Fraser, B. J., & Fisher, D. L. (1997). Monitoring constructivist classroom learning environments. International Journal of Educational Research, 27, 293–302.
  • Turner, P. M. (2009). Next generation: Course redesign. Change: The Magazine of Higher Learning, 41(6), 10–16.
  • Turner, P. M., & Carriveau, R. S. (2010). Next generation course redesign. New York, NY: Peter Lang.
  • Twigg, C. A. (2006). Improving learning and reducing costs: Project outcomes and lessons learned from the Roadmap to Redesign (Program in Course Redesign, Round I). Saratoga Springs, NY: National Center for Academic Transformation.
  • Watson, K., & Froyd, J. (2007). Diversifying the US engineering workforce: A new model. Journal of Engineering Education, 96, 19–32.
  • Wolters, C. A., & Taylor, D. J. (2012). A self-regulated learning perspective on student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 635–651). New York, NY: Springer.
  • Wu, C.-H. (2007). An empirical study on the transformation of Likert-scale data to numerical scores. Applied Mathematical Sciences, 1, 2851–2862.
  • Zegers, F. E. (1991). Coefficients for interrater agreement. Applied Psychological Measurement, 15, 321–333.
  • Zimpher, N. L. (2009). The leaky pipeline: IT can help. EDUCAUSE Review, 3, 4–5.