
Editorial


It is our pleasure to introduce the special issue on the theme Assessment, Benchmarking, Impact, and Evaluation, which emerged as an important focus among papers presented at the Research in Engineering Education Symposium (REES), held in Bogotá, Colombia, from 6 to 8 July 2017. Assessment, benchmarking, impact, and evaluation tools and methods are used by educators, researchers, employers, funding agencies, and regulators to study change, to give feedback, and to inform decisions on modifications and improvements, some of which involve high stakes. The papers in this issue investigate new approaches to assessment and present valuable insights that will inform future research. The response to the call for papers for the special issue was so strong that the papers will be published in two issues. This first issue collects six research articles.

This special issue reflects the international nature of REES. The Guest Editors are based in Australia, the United States, and Brazil; the Co-Guest Editor, Bruce Kloot, is based in South Africa. The six papers are from Colombia, Germany, and the United States. While representing geographic diversity, the articles in this issue also highlight the broad meanings of assessment, benchmarking, impact, and evaluation, encompassing ways of telling how education research impacts engineering education, ways of appraising teaching effectiveness, ways of improving classroom assessment methods, and ways of monitoring student study behaviours.

Eunsil Lee, Adam Carberry, Heidi Diefes-Dux, Sara Atwood, and Matt Siniawski observe, in ‘Faculty perception before, during, and after implementation of standards-based grading’, that standards-based assessment methods were perceived to support students’ achievements in engineering education; however, faculty require support to implement a new grading system. The findings are likely to transfer widely.

Impact can be achieved by developing learning resources for students. Yet it is often difficult to judge the extent to which students use the resources available to them. Elizabeth Wirtz, Amy Dunford, Edward Berger, Elizabeth Briody, Gireesh Guruprasad, and Ryan Senkpeil studied student help-seeking behaviours and report, in ‘Resource usage and usefulness: academic help-seeking behaviours of undergraduate engineering students’, that students were most likely to use the resources they perceived to be available at the right time and in the right place. The researchers advise educators to consider convenience of access for students when developing courses and learning resources.

In ‘The discriminative learning gain: a two-parameter quantification of the difference in learning success between courses’, Julie Direnga, Dion Timmermann, Ferdinand Kieckhäfer, Andrea Brose, and Christian Kautz demonstrate an innovative approach to measuring the discriminative effects of instruction with respect to initial performance. Evaluators and researchers will find their approach and findings valuable in cases where it is not possible to use the same pre- and post-tests to study impact.

In universities, students’ course evaluations are often collected via surveys and used to inform instructors’ performance reviews and course improvements. Engelberth Soto-Estrada, Ann Wellens, and Jairo Gómez-Lizarazo, in ‘Student course evaluation: a process-based approach’, highlight the limitations of relying solely on student course evaluations and demonstrate a much-needed alternative approach, which frames teaching as a process and assesses multiple aspects of it.

In ‘Video coding of classroom observations for research and instructional support in an innovative learning environment’, David Evenhouse, Austin Zadoks, Claudio Cesar Silva de Freitas, Nimit Patel, Rohit Kandakatla, Nick Stites, Taylor Prebel, Edward Berger, Charles Krousgrill, Jeffrey Rhoads, and Jennifer DeBoer present a protocol they developed for coding instructional support in video-recorded classroom meetings. Their protocol will be valuable for researchers, and for benchmarking and improving instructional support.

Research funding bodies increasingly expect evidence of impact. Elizabeth Altamirano, Jeremi London, Kevin Lau, Narongrit Waree, and Samantha Cruz investigated the relationship between research engagement practices and the impact of engineering education research. In ‘Understanding research engagement practices among engineering education stakeholders to promote the impact of research on practice’, they report seven ways in which stakeholders engage with research, and four forms of engineering education research impact.

Assessment, benchmarking, impact, and evaluation are critical components of education. We have no doubt that readers will find, in this issue, papers addressing pressing and significant issues that are present in their educational contexts or relevant to their research studies.
