Editorial

Methodological choices for research into interactive learning

How do researchers choose their research methodologies? If we turn to the standard research methods textbooks, we find guidance which broadly asks students to consider a qualitative, quantitative or mixed-methods route based on the context and discipline of their study. These books focus on simple dichotomous decisions to help students navigate what can be a whole new domain of terminology, when all they wanted to do was ask people a few questions.

We can ask the same question of academic researchers: how did you choose your methodology? The answers are more likely to be well-framed rationales based on evidence of prior studies in their domain, simply because that is a general requirement of doctoral and post-doctoral research. It is the justification of choice which comes to the fore in the doctoral viva, where examiners seek common ground on which to base debates with the candidate. But are these rationales developed post hoc? Did the process begin with a thorough grounding in relevant disciplinary literature, searching for precedent methodologies, research designs and models which could be adapted or adopted with fervour since the precedents were published authorities? Or, like many a student, did the process begin with simpler ideas: who knows what I need to know? How might I find them? What kind of answers would be helpful? How much information do I need to convince my examiners or my publishers?

This posits a much more personal process than the rational one written up in the final paper: a process involving anxiety, vulnerability and a search for reassurance that one's methods and approach stack up to professional requirements. This may not be such a problem in the physical sciences, where the experimental method has long held sway, providing statistics for analysis and interpretation. The "gold standard" of the randomised controlled trial provides a measure of reassurance in these disciplines, even though questions may be asked about the assumptions and moral choices made in that process when results are applied to individual cases. However, in the social sciences, or where research questions blur the boundaries of provable quantitative evidence with opinion, behaviour, emotion and other un-auditable phenomena, the choice of research methodology becomes a potential trip hazard. Should self-reported, opinion-based data be treated as something which can provide confident statistical outcomes? Should qualitative analysis such as content, discourse or narrative analysis be subject to any triangulation beyond a survey or a view from the literature? Should theory or models developed from such analysis be tested before publication? Should case studies be subject to any particular criteria to be acceptable?

We might suggest that educational research is often reduced to the binary thinking found in undergraduate research methods courses: do we favour this theory or that theory? Do we test the theory that exists or invent a new one? Do we show our research effort to the world by offering statistical analysis, or by presenting a new model, an adapted model or a confirmed model? In the domain of digital technologies for learning, we find many research papers built on self-report surveys, often student satisfaction surveys. What exactly does that tell us? Surely our knowledge is deepened and enriched more by combinations of methods, by looking at performance data, satisfaction data and student learning over a longer period than a single semester, and by building methodologies which fit the digital world. Five years ago Anderson and Shattuck (2012) published a review of articles using the Design-Based Research methodology, one closely associated with Action Research in the educational domain. This methodology focuses on educational interventions, mixed methods for which there is no specific rule, multiple iterations, and collaboration between practitioner and researcher. Its purpose is to advance both theory and practice (Barab 2014), a vital driver in interactive learning research. That is not to say this is a single preferred methodology, but that methodological choice should consider the broader goals behind the research, the extent of reflexive thinking undertaken and the reach of the outcomes presented.

We look then in this journal for that range of original contributions which moves away from binary thinking: proving an existing model in one new time-limited context, or offering a new model with little testing beyond one group of students. Researchers are encouraged to use multiple methods in their search for ways to explore the digital learning environment, producing not just one more case study but practical solutions to educational issues through technology, which can also cause us to rethink underlying theoretical ideas. Small-scale studies, which are often reported in this journal, are rarely linked or offered with public-domain data. Perhaps without such sharing, or collaboration across research groups, the opportunities for genuine theory development as well as ground-breaking educational change are limited. The choice of methodology will always depend on the judgement and methodological standpoint of the researcher and their relationship with practice and the practitioner, and there are clearly no simple rules to follow in order to achieve educational breakthroughs. This issue alone offers a wide range of methodologies, including the randomised controlled trial, and the rationales for these methodologies are sound. Educational interventions are complex, multifactorial and culturally grounded. A research methodology which cannot account for this complexity may not be a wise choice.

References

  • Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25. doi:10.3102/0013189X11428813
  • Barab, S. (2014). Design-based research: A methodological toolkit for engineering change. In The Cambridge handbook of the learning sciences (2nd ed., pp. 151–170). Cambridge University Press. doi:10.1017/CBO9781139519526.011
