Research Article

Citing as an online learning support tool for student-generated assessment

Pages 165-186 | Received 14 Jul 2021, Accepted 04 Oct 2023, Published online: 27 Jan 2024

References

  • Ahn, J. N., Hu, D., & Vega, M. (2020). Do as I do, not as I say: Using social learning theory to unpack the impact of role models on students’ outcomes in education. Social & Personality Psychology Compass, 14(2), Article e12517. https://doi.org/10.1111/spc3.12517
  • Aiken, L. R., & Groth-Marnat, G. (2006). Psychological testing and assessment (12th ed.). Allyn and Bacon.
  • Akay, H., & Boz, N. (2009). Prospective teachers’ views about problem-posing activities. Procedia-Social and Behavioral Sciences, 1(1), 1192–1198. https://doi.org/10.1016/j.sbspro.2009.01.215
  • Akem, J. A., & Agbe, N. N. (2003). Rudiments of measurement and evaluation in education psychology. The Return Press.
  • Arruabarrena, R., Sánchez, A., Blanco, J. M., Vadillo, J. A., & Usandizaga, I. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education, 16(1), Article 10. https://doi.org/10.1186/s41239-019-0140-7
  • Baerheim, A., & Meland, E. (2003). Medical students proposing questions for their own written final examination: Evaluation of an educational project. Medical Education, 37(9), 734–738. https://doi.org/10.1046/j.1365-2923.2003.01578.x
  • Bakla, A. (2018). Learner-generated materials in a flipped pronunciation class: A sequential explanatory mixed-methods study. Computers & Education, 125, 14–38. https://doi.org/10.1016/j.compedu.2018.05.017
  • Bandura, A. (1963). Social learning and personality development. Holt, Rinehart, and Winston.
  • Bandura, A. (1986). Social cognitive theory. Holt, Rinehart and Winston.
  • Bates, S. P., Galloway, R. K., Riise, J., & Homer, D. (2014). Assessing the quality of a student-generated question repository. Physical Review Special Topics – Physics Education Research, 10(2), 020105. https://doi.org/10.1103/PhysRevSTPER.10.020105
  • Bishay, P. L. (2020). Teaching the finite element method fundamentals to undergraduate students through truss builder and truss analyzer computational tools and student‐generated assignments mini-projects. Computer Applications in Engineering Education, 28(4), 1007–1027. https://doi.org/10.1002/cae.22281
  • Brown, S. I., & Walter, M. I. (2005). The art of problem posing (3rd ed.). Lawrence Erlbaum Associates.
  • Calabrese, J. E., Capraro, M. M., & Thompson, C. G. (2022). The relationship between problem posing and problem solving: A systematic review. International Education Studies, 15(4), 1–8. https://doi.org/10.5539/ies.v15n4p1
  • Carless, D., Chan, K. K. H., To, J., Lo, M., & Barrett, M. (2018). Developing students’ capacities for evaluative judgement through analysing exemplars. In D. Boud, R. Ajjawi, P. Dawson, & J. Tai (Eds.), Developing evaluative judgement in higher education: Assessment for knowing and producing quality work (pp. 108–116). Routledge.
  • Caspari-Sadeghi, S., Forster-Heinlein, B., Maegdefrau, J., & Bachl, L. (2021). Student-generated questions: Developing mathematical competence through online assessment. International Journal for the Scholarship of Teaching and Learning, 15(1), Article 8. https://digitalcommons.georgiasouthern.edu/ij-sotl
  • Chang, K.-E., Wu, L.-J., Weng, S.-E., & Sung, Y.-T. (2012). Embedding game-based problem-solving phase into problem-posing system for mathematics learning. Computers & Education, 58(2), 775–786. https://doi.org/10.1016/j.compedu.2011.10.002
  • Chen, O., Castro-Alonso, J., Paas, F., & Sweller, J. (2018). Extending cognitive load theory to incorporate working memory resource depletion: Evidence from the spacing effect. Educational Psychology Review, 30(2), 483–501. https://doi.org/10.1007/s10648-017-9426-2
  • Chin, C., Brown, D. E., & Bruce, B. C. (2002). Student-generated questions: A meaningful aspect of learning in science. International Journal of Science Education, 24(5), 521–549. https://doi.org/10.1080/09500690110095249
  • Chin, C., & Kayalvizhi, G. (2005). What do pupils think of open science investigations? A study of Singaporean primary 6 pupils. Educational Research, 47(1), 107–126. https://doi.org/10.1080/0013188042000337596
  • Clark, R., Nguyen, F., & Sweller, J. (2006). Efficiency in learning: Evidence-based guidelines to manage cognitive load. Pfeiffer.
  • Cohen, M., & Riel, M. (1989). The effect of distant audiences on students’ writing. American Educational Research Journal, 26(2), 143–159. https://doi.org/10.3102/00028312026002143
  • Collis, B., & Moonen, J. (2001). Flexible learning in a digital world: Experiences and expectations. Kogan Page.
  • Collis, B., & Moonen, J. (2006). The contributing student: Learners as co-developers of learning resources for reuse in web environments. In D. Hung & M. S. Khine (Eds.), Engaged learning with emerging technologies (pp. 49–67). Springer Dordrecht.
  • Crespo, S., & Sinclair, N. (2008). What makes a problem mathematically interesting? Inviting prospective teachers to pose better problems. Journal of Mathematics Teacher Education, 11(5), 395–415. https://doi.org/10.1007/s10857-008-9081-0
  • Cross, A., Bayyapunedi, M., Ravindran, D., Cutrell, E., & Thies, W. V. (2014). Enabling the crowd to improve the legibility of online educational videos. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, Baltimore, MD, USA (pp. 1167–1175). Association for Computing Machinery.
  • Denny, P., Luxton-Reilly, A., & Simon, B. (2009). Quality of student contributed questions using PeerWise. In M. Hamilton & T. Clear (Eds.), Proceedings of the Eleventh Australasian Conference on Computing Education (Vol. 95), Wellington, New Zealand (pp. 55–63). Australian Computer Society.
  • Double, K., McGrane, J., & Hopfenbeck, T. N. (2020). The impact of peer assessment on academic performance: A meta-analysis of control group studies. Educational Psychology Review, 32(1), 481–509. https://doi.org/10.1007/s10648-019-09510-3
  • Doyle, E., & Buckley, P. (2022). The impact of co-creation: An analysis of the effectiveness of student authored multiple choice questions on achievement of learning outcomes. Interactive Learning Environments, 30(9), 1726–1735. https://doi.org/10.1080/10494820.2020.1777166
  • Falkner, K., & Falkner, N. J. (2012). Supporting and structuring “contributing student pedagogy” in computer science curricula. Computer Science Education, 22(4), 413–443. https://doi.org/10.1080/08993408.2012.727713
  • Foos, P. W. (1989). Effects of student-written questions on student test performance. Teaching of Psychology, 16(2), 77–78. https://doi.org/10.1207/s15328023top1602_10
  • Garcia-Loro, F., Martin, S., Ruipérez-Valiente, J. A., Sancristobal, E., & Castro, M. (2020). Reviewing and analyzing peer review inter-rater reliability in a MOOC platform. Computers & Education, 154, 103894. https://doi.org/10.1016/j.compedu.2020.103894
  • George, D., & Mallery, P. (2003). SPSS for Windows step by step: A simple guide and reference. 11.0 update (4th ed.). Allyn & Bacon.
  • Groenendijk, T., Janssen, T., Rijlaarsdam, G., & van den Bergh, H. (2013). Learning to be creative: The effects of observational learning on students’ design products and processes. Learning and Instruction, 28, 35–47. https://doi.org/10.1016/j.learninstruc.2013.05.001
  • Hain, S., & Back, A. (2008). Personal learning journal: Course design for using weblogs in higher education. Electronic Journal of E-Learning, 6(3), 189–196.
  • Hamer, J., Cutts, Q., Jackova, J., Luxton-Reilly, A., McCartney, R., Purchase, H., Riedesel, C., Saeli, M., Sanders, K., & Sheard, J. (2008). Contributing student pedagogy. ACM SIGCSE Bulletin Archive, 40(4), 194–212. https://doi.org/10.1145/1473195.1473242
  • Hamer, J., Sheard, J., Purchase, H., & Luxton-Reilly, A. (2012). Contributing student pedagogy. Computer Science Education, 22(4), 315–318. https://doi.org/10.1080/08993408.2012.727709
  • Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Advances in Psychology, 52, 139–183. https://doi.org/10.1016/S0166-4115(08)62386-9
  • Huang, T. K., Cui, J., Cortese, C., & Pepper, M. (2015). Internet based peer assisted learning: Current models, future applications, and potential. In Y. Zhang (Ed.), Handbook of mobile teaching and learning (pp. 1–13). Springer.
  • Hwang, G.-J., Zou, D., & Lin, J. (2020). Effects of a multi-level concept mapping-based question-posing approach on students’ ubiquitous learning performance and perceptions. Computers & Education, 149, 103815. https://doi.org/10.1016/j.compedu.2020.103815
  • Kember, D., Biggs, J., & Leung, D. Y. P. (2004). Examining the multidimensionality of approaches to learning through the development of a revised version of the learning process questionnaire. British Journal of Educational Psychology, 74(2), 261–280. https://doi.org/10.1348/000709904773839879
  • Khashaba, A. S. (2020). Evaluation of the effectiveness of online peer-based formative assessments (PeerWise) to enhance student learning in physiology: A systematic review using PRISMA guidelines. International Journal of Research in Education & Science, 6(4), 613–628. https://doi.org/10.46328/ijres.v6i4.1216
  • Kim, J. (2015). Learnersourcing: Improving learning with collective learner activity [Unpublished doctoral dissertation]. Massachusetts Institute of Technology.
  • Kirschner, P. A. (2002). Cognitive load theory: Implications of cognitive load theory on the design of learning. Learning and Instruction, 12(1), 1–10. https://doi.org/10.1016/S0959-4752(01)00014-7
  • Klepsch, M., Schmitz, F., & Seufert, T. (2017). Development and validation of two instruments measuring intrinsic, extraneous, and germane cognitive load. Frontiers in Psychology, 8. https://doi.org/10.3389/fpsyg.2017.01997
  • Kopparla, M., Bicer, A., Vela, K., Lee, Y., Bevan, D., Kwon, H., Caldwell, C., Capraro, M. M., & Capraro, R. M. (2019). The effects of problem-posing intervention types on elementary students’ problem-solving. Educational Studies, 45(6), 708–725. https://doi.org/10.1080/03055698.2018.1509785
  • Kul, Ü., & Çelik, S. (2020). A meta-analysis of the impact of problem posing strategies on students’ learning of mathematics. Romanian Journal for Multidimensional Education, 12(3), 341–368. https://doi.org/10.18662/rrem/12.3/325
  • Laaser, W., & Toloza, E. A. (2017). The changing role of the educational video in higher distance education. International Review of Research in Open & Distributed Learning, 18(2), 264–276. https://doi.org/10.19173/irrodl.v18i2.3067
  • Lam, R. (2014). Can student-generated test materials support learning? Studies in Educational Evaluation, 43, 95–108. https://doi.org/10.1016/j.stueduc.2014.02.003
  • Lee, E. (2011). Facilitating student-generated content using web 2.0 technologies. Educational Technology, 51(4), 36–40. https://www.jstor.org/stable/44429930
  • Leung, S. S. (1997). On the role of creative thinking in problem posing. ZDM: The International Journal on Mathematics Education, 29(3), 81–85. https://doi.org/10.1007/s11858-997-0004-9
  • Liu, C. C., Chen, W. C., Lin, H. M., & Huang, Y. Y. (2017). A remix-oriented approach to promoting student engagement in a long-term participatory learning program. Computers & Education, 110, 1–15. https://doi.org/10.1016/j.compedu.2017.03.002
  • Luxton-Reilly, A., & Denny, P. (2010). Constructive evaluation: A pedagogy of student-contributed assessment. Computer Science Education, 20(2), 145–167. https://doi.org/10.1080/08993408.2010.486275
  • Ma, Q. (2020). Examining the role of inter-group peer online feedback on wiki writing in an EAP context. Computer Assisted Language Learning, 33(3), 197–216. https://doi.org/10.1080/09588221.2018.1556703
  • Maia, M. C. O., Eliane, C. A., Figueiredo, J., & Serey, D. (2020). Student engagement through creation of new activities: An empirical study on contributing student pedagogy. In IX Congresso Brasileiro de Informática na Educação 2020. Simpósio Brasileiro de Informática na Educação. https://doi.org/10.5753/cbie.sbie.2020.1693
  • McDonald, P. A., & Smith, J. M. (2020). Improving mathematical learning in Scotland’s curriculum for excellence through problem posing: An integrative review. The Curriculum Journal, 31(3), 398–435. https://doi.org/10.1002/curj.15
  • Mehrens, W. A., & Lehmann, I. J. (1991). Measurement and evaluation in education and psychology (2nd ed.). Houghton Mifflin Company.
  • Murray, D., McGill, T., Thompson, N., & Toohey, D. (2017). Can learners become teachers? Evaluating the merits of student generated content and peer assessment. Issues in Informing Science and Information Technology, 14, 21–33. https://doi.org/10.28945/3698
  • Öhrstedt, M., & Lindfors, P. (2019). First-semester students’ capacity to predict academic achievement as related to approaches to learning. Journal of Further & Higher Education, 43(10), 1420–1432. https://doi.org/10.1080/0309877X.2018.1490950
  • Olson, M. H., & Hergenhahn, B. R. (2009). An introduction to theories of learning (8th ed.). Pearson/Prentice Hall.
  • Olson, G. M., & Olson, J. S. (2001). Distance matters. In J. M. Carroll (Ed.), Human-computer interaction in the new millennium (pp. 397–417). ACM Press.
  • O’Reilly, T. (2005, September 30). What is Web 2.0 – Design patterns and business models for the next generation of software. http://oreilly.com/web2/archive/what-is-web-20.html
  • Paas, F., Renkl, A., & Sweller, J. (2004). Cognitive load theory: Instructional implications of the interaction between information structures and cognitive architecture. Instructional Science, 32(1/2), 1–8. https://doi.org/10.1023/B:TRUC.0000021806.17516.d0
  • Papinczak, T., Peterson, R., Babri, A. S., Ward, K., Kippers, V., & Wilkinson, D. (2012). Using student-generated questions for student-centred assessment. Assessment & Evaluation in Higher Education, 37(4), 439–452. https://doi.org/10.1080/02602938.2010.538666
  • Persada, S. F., Ivanovski, J., Miraja, B. A., Nadlifatin, R., Mufidah, I., Chin, J., & Redi, A. A. N. P. (2020). Investigating generation Z’ intention to use learners’ generated content for learning activity: A theory of planned behavior approach. International Journal of Emerging Technologies in Learning (iJET), 15(4), 179–194. https://doi.org/10.3991/ijet.v15i04.11665
  • Pittenger, A. L., & Lounsbery, J. L. (2011). Student-generated questions to assess learning in an online orientation to pharmacy course. American Journal of Pharmaceutical Education, 75(5), Article 94. https://doi.org/10.5688/ajpe75594
  • Redd-Boyd, T. M., & Slater, W. H. (1989). The effects of audience specification on undergraduates’ attitudes, strategies, and writing. Research in the Teaching of English, 23(1), 77–108. https://www.jstor.org/stable/40171289
  • Rosenshine, B., Meister, C., & Chapman, S. (1996). Teaching students to generate questions: A review of the intervention studies. Review of Educational Research, 66(2), 181–221. https://doi.org/10.3102/00346543066002181
  • Rosli, R., Capraro, M. M., & Capraro, R. M. (2014). The effects of problem posing on student mathematical learning: A meta-analysis. International Education Studies, 7(13), 227–241. https://doi.org/10.5539/ies.v7n13p227
  • Seifert, T., & Feliks, O. (2019). Online self-assessment and peer-assessment as a tool to enhance student-teachers’ assessment skills. Assessment & Evaluation in Higher Education, 44(2), 169–185. https://doi.org/10.1080/02602938.2018.1487023
  • Sepp, S., Howard, S. J., Tindall-Ford, S., Agostinho, S., & Paas, F. (2019). Cognitive load theory and human movement: Towards an integrated model of working memory. Educational Psychology Review, 31(2), 293–317. https://doi.org/10.1007/s10648-019-09461-9
  • Shakurnia, A., Aslami, M., & Bijanzadeh, M. (2018). The effect of question generation activity on students’ learning and perception. Journal of Advances in Medical Education & Professionalism, 6(2), 70–77. PMID: 29607334; PMCID: PMC5856907.
  • Silver, E. A. (1994). On mathematical problem posing. For the Learning of Mathematics, 14, 19–28. https://www.jstor.org/stable/40248099
  • Silver, E. A. (1997). Fostering creativity through instruction rich in mathematical problem solving and problem posing. Zentralblatt für Didaktik der Mathematik, 29(3), 75–80. https://doi.org/10.1007/s11858-997-0003-x
  • Snowball, J. D., & McKenna, S. (2017). Student-generated content: An approach to harnessing the power of diversity in higher education. Teaching in Higher Education, 22(5), 604–618. https://doi.org/10.1080/13562517.2016.1273205
  • Song, D. (2016). Student-generated questioning and quality questions: A literature review. Research Journal of Educational Studies and Review, 2(5), 58–70. http://pearlresearchjournals.org/journals/rjesr/index.html
  • Stoyanova, E., & Ellerton, N. F. (1996). A framework for research into students’ problem posing in school mathematics. In P. Clarkson (Ed.), Technology in mathematics education (pp. 518–525). Mathematics Education Research Group of Australasia.
  • Sweller, J., Van Merriënboer, J. J. G., & Paas, F. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296. https://doi.org/10.1023/A:1022193728205
  • Torrance, E. P. (1974). Torrance tests of creative thinking. Scholastic Testing Service.
  • Touissi, Y., Hjiej, G., Hajjioui, A., Ibrahimi, A., & Fourtassi, M. (2022). Does developing multiple-choice questions improve medical students’ learning? A systematic review. Medical Education Online, 27(1), 1–13. https://doi.org/10.1080/10872981.2021.2005505
  • Tuma, F. (2022). Educational benefits of writing multiple-choice questions (MCQs) with evidence-based explanation. Postgraduate Medical Journal, 98(1156), 77–78. https://doi.org/10.1136/postgradmedj-2021-139876
  • van Dijk, A. M., & Lazonder, A. W. (2016). Scaffolding students’ use of learner-generated content in a technology-enhanced inquiry learning environment. Interactive Learning Environments, 24(1), 194–204. https://doi.org/10.1080/10494820.2013.834828
  • Wang, M., Walkington, C., & Rouse, A. (2022). A meta-analysis on the effects of problem-posing in mathematics education on performance and dispositions. Investigations in Mathematics Learning, 14(4), 265–287. https://doi.org/10.1080/19477503.2022.2105104
  • Ward, M. (2009). Squaring the learning circle: Cross-classroom collaborations and the impact of audience on student outcomes in professional writing. Journal of Business and Technical Communication, 23(1), 161–182. https://doi.org/10.1177/1050651908324381
  • Yelon, S. L. (1996). Powerful principles of instruction. Allyn & Bacon.
  • Yu, F. Y. (2009). Scaffolding student-generated questions: Design and development of a customizable online learning system. Computers in Human Behavior, 25(5), 1129–1138. https://doi.org/10.1016/j.chb.2009.05.002
  • Yu, F. Y. (2015). Online student-constructed tests with citing capability: Perceived uses, usage and considerations. In T. Kojiri, T. Supnithi, Y. Wang, Y.-T. Wu, H. Ogata, W. Q. Chen, S. C. Kong, & F. Y. Qiu (Eds.), Workshop proceedings of the 23rd International Conference on Computers in Education, Hangzhou, China (pp. 534–538). Asia-Pacific Society for Computers in Education.
  • Yu, F. Y. (2019). The learning potential of online student-constructed tests with citing peer-generated questions. Interactive Learning Environments, 27(2), 226–241. https://doi.org/10.1080/10494820.2018.1458040
  • Yu, F. Y., & Kuo, C.-W. (in press). A systematic review of published student question-generation systems: Supporting functionalities and design features. Journal of Research on Technology in Education. https://doi.org/10.1080/15391523.2022.2119448
  • Yu, F. Y., & Liu, Y. H. (2005). Potential values of incorporating multiple-choice question-construction for physics experimentation instruction. International Journal of Science Education, 27(11), 1319–1335. https://doi.org/10.1080/09500690500102854
  • Yu, F. Y., & Pan, K.-J. (2014). Effects of student question-generation with online prompts on learning. Educational Technology and Society, 17(3), 267–279. https://www.jstor.org/stable/jeductechsoci
  • Yu, F. Y., & Su, C.-L. (2015). A student-constructed test learning system: The design, development and evaluation of its pedagogical potential. Australasian Journal of Educational Technology, 31(6), 685–698. https://doi.org/10.14742/ajet.2190
  • Yu, F. Y., & Wu, C. P. (2016). The effects of an online student-constructed test strategy on knowledge construction. Computers & Education, 94, 89–101. https://doi.org/10.1016/j.compedu.2015.11.005
  • Yu, F. Y., Wu, C. P., & Hung, C.-C. (2014). Are there any joint effects of online student question generation and cooperative learning? The Asia-Pacific Education Researcher, 23(3), 367–378. https://doi.org/10.1007/s40299-013-0112-y
  • Zheng, L., Zhang, X., & Cui, P. (2020). The role of technology-facilitated peer assessment and supporting strategies: A meta-analysis. Assessment & Evaluation in Higher Education, 45(3), 372–386. https://doi.org/10.1080/02602938.2019.1644603
  • Zuya, H. E. (2017). The benefits of problem posing in the learning of mathematics: A systematic review. International Journal of Advanced Research, 5(3), 853–860. https://doi.org/10.21474/IJAR01/3581
