Articles

Exploring the potential of using undergraduates’ knowledge, skills and experience in research methods as a proxy for capturing learning gain

Pages 222-248 | Received 09 Aug 2017, Accepted 18 Feb 2018, Published online: 23 Mar 2018

Abstract

Learning gain is a politicised concept within contemporary HE and, as such, has been aligned with agendas of teaching excellence and learning outcomes, but the extent to which it captures actual learning has yet to be clarified. Here, we report the outcomes of a learning gain study which examines how students’ knowledge, skills and experience as researchers develop throughout their studies. We examine data from a self-reporting survey administered across a university and several college-based HE providers during students’ second year of undergraduate study. The data highlight disciplinary differences in student engagement with research methods and the significance of the perceived relevance of research methods to students’ learning. These findings have a bearing on the development of measures of learning gain, as they demonstrate the complexity of capturing student learning across disciplines. Our findings can be employed to develop a method of capturing learning gain that can be integrated into undergraduates’ research methods education.

Introduction

Learning gain is an emerging concept in higher education (HE), discussed in terms of the distance travelled by a student. Some (e.g. Coates, 2009; Schleicher, 2016) draw connections between learning gain and value added, but unlike value added, which centres on comparing ‘performance predicted at the outset of studies and actual performance achieved’ (McGrath, Guerin, Harte, Frearson, & Manville, 2015, p. xi), learning gain represents ‘the difference between the skills, competencies, content knowledge and personal development demonstrated by students at two points in time’ (McGrath et al., 2015, p. xi). Internationally, three drivers underpin current rhetoric around learning gain: accountability, teaching enhancement and student learning (Varsavsky, Matthews, & Hodgson, 2014). Existing measures of learning gain such as the Collegiate Learning Assessment and the Assessment of Higher Education Learning Outcomes have attempted to address the first two of these drivers (Coates, 2009; Klein, Benjamin, Shavelson, & Bolus, 2007). The extent to which these serve to promote student learning has been questioned (e.g. Douglass, Thomson, & Zhao, 2012). Nevertheless, through careful framing and explicit integration of a discourse of learning gain into undergraduate study, there is considerable potential for measures of learning gain to stimulate student learning (Varsavsky et al., 2014).

This paper reports interim findings of a Higher Education Funding Council for England (HEFCE) pilot project funded to evaluate different approaches to measuring students’ learning gain. There are many challenges to measuring learning gain, not least a lack of uniformity in entry-level qualifications and non-comparable entry and exit tests at university (McGrath et al., 2015; Varsavsky et al., 2014). This project focused on examining how students develop knowledge, skills and expertise in research methods as they progress through their degrees, exploring the potential for students’ research experiences to serve as a meaningful way to measure learning gain. The paper explores the rationale for using research methods, introduces the methods used to capture students’ development in this area and reports on the outcomes generated from a bespoke self-reporting survey. We conclude by examining the potential of using research methods as a proxy for learning gain and discuss the development of an innovative research methods toolkit to stimulate dialogue around learning gain, direct future learning and capture student performance.

Research methods and their role in undergraduate education

Developing the research capacity of undergraduates provides them with the abilities to generate, apply and adapt new knowledge; these skills are integral not only to their success as undergraduates but also to the knowledge economy (Davis, Evans, & Hickey, 2006). The knowledge economy relies on graduates having a flexible knowledge base, underpinned by their ability to seek out and identify relevant knowledge, and to reconfigure and evaluate it in response to the demands or challenges they face (Green, Hammer, & Star, 2009). As Jenkins, Healey, and Zetter (2008, p. 3) commented: ‘teaching students to be enquiring or research-based in their approach is not just a throwback to quaint notions of enlightenment or liberal education but central to the hard-nosed skills required of the future graduate workforce’. These themes resonate with the wider discourse around learning gain. In developing an agenda to measure and capture learning gain, the UK government wants to provide employers with clearer information about the knowledge, skills and experience graduates possess (Schleicher, 2016).

Rhetoric relating to research methods education has come to the fore in HE policy and practice internationally, with policy recommendations triggering investment in infrastructure and development activities in UK universities (Healey & Jenkins, 2009; Levy & Petrulis, 2012). This has led to the emergence of a body of work on the pedagogy of research methods education, which highlights the gains to students of early exposure to research activities (Levy & Petrulis, 2012) and the role of active pedagogies that involve students practising and rehearsing essential research skills (Benson & Blackman, 2003; Earley, 2014; Wagner, Garner, & Kawulich, 2011; Winn, 1995). But the focus on research-based curricula is not a recent innovation; equipping students with the skills and abilities to think as researchers underpins many undergraduate programmes (Earley, 2014), as evidenced by the extensive use of the final year dissertation (Ashwin, Abbas, & McLean, 2017; Todd, Bannister, & Clegg, 2004). More widely, students’ development as researchers is framed by credit level descriptors used by UK HE providers to ‘define the level of complexity, relative demand and autonomy expected of a learner’ (SEEC, 2016). Credit level descriptors, such as those of the Southern England Education Consortium for Credit Accumulation and Transfer (SEEC), which are applied in UK HE contexts and map to the European Qualifications Framework (European Commission, n.d.), provide a generic overview of the knowledge and understanding students should develop as they progress through their studies, as well as indicating the cognitive, intellectual, practical and transferable skills they should gain (SEEC, 2016). These descriptors increase in complexity over the duration of an undergraduate’s study, as shown in Table 1, and can be used by HE providers to inform curriculum and assessment design, as well as to communicate expectations to learners (SEEC, 2016).

Table 1. Undergraduates developing knowledge, skills and experience in research as categorised through SEEC level descriptors.

The progression in knowledge, skills and experience that HE providers can expect of undergraduates as they move through their studies is clearly reflected in the UK level descriptors and the broader European Qualifications Framework (Table 1). However, developing students’ skills and abilities as researchers is challenging (Deem & Lucas, 2006; Shaw, Holbrook, & Bourke, 2013). Many students, particularly early in their studies, have a weak understanding of the epistemological foundations of research, which often manifests as a fixed conception of knowledge and the acceptance of facts as truth, frequently compounded by limited experience or knowledge of critical reflection and evaluation practices (Murtonen, 2005; Schommer, 1990). This fixed conception of knowledge needs to be transformed through their studies in order for students to appreciate how knowledge evolves and changes, and to equip them with the transferable skills of critical thinking, analysis and evaluation (Murtonen, 2015; Schommer, 1990). This transformation can be achieved via pedagogies such as enquiry or problem-based learning, and through early exposure to research (Brew, 2013). More widely, students need to gain a comprehensive understanding of research methodologies in order to function rigorously and effectively as researchers, reflect on their strengths and articulate these to future employers (Davis et al., 2006). The role of research methods within SEEC level descriptors is implicit, experienced through the increasing complexity of the knowledge and tasks students engage with through their studies (SEEC, 2016; Table 1).

The term ‘research methods’ is often taken narrowly to mean methods of data collection or analysis (Murtonen, 2015). However, research methods represent a complex domain of knowledge encompassing the general principles of science, research paradigms, research approaches and methods, as well as issues relating to the theoretical framing and philosophical underpinnings of knowledge (Murtonen, 2015). Research methods thus combine knowledge domains and practices, and students need to develop an understanding of these, as well as proficiency in specific practical research skills, in order to conduct research effectively (Earley, 2014). The challenges students face when engaging with this subject are often reported in terms of statistical anxiety (Chamberlain, Hillier, & Signoretta, 2015; Sizemore & Lewandowski, 2009), negative dispositions towards research (Murtonen, 2015), or students failing to recognise the relevance of research methods courses (Deem & Lucas, 2006; Papanastasiou & Zembylas, 2008). These issues can be exacerbated by the curriculum marginalisation of research methods training, which can undermine students’ engagement with, and development as, researchers (MacInnes, 2012; Williams et al., 2016). These challenges have attracted considerable attention, with researchers attempting to identify solutions in order to promote or better understand students’ learning about research methods (e.g. Earley, 2014; Howard & Brady, 2015; Kilburn, Nind, & Wiles, 2014; Nind, Kilburn, & Wiles, 2015).

One solution has been to consider the role of self-efficacy in research methods education (Forester, Kahn, & Hesson-McInnis, 2004; Seymour, Wiese, Hunter, & Daffinrud, 2000; Shaw et al., 2013). The concept of self-efficacy is widely applied in work on vocational theory (e.g. social cognitive theory) and career development, as it has been identified as shaping individuals’ career choices and persistence in a chosen profession (van Dinther, Dochy, & Segers, 2011; Zimmerman, Bandura, & Martinez-Pons, 1992). It has been suggested that researcher self-efficacy relates to students developing confidence in performing research tasks, which in the long term promotes their learning and engagement with research methods (Forester et al., 2004; Shaw et al., 2013). Self-efficacy also determines students’ motivation and attainment in HE, which in turn influence students’ self-regulation and the strategies they employ to achieve their own learning goals (van Dinther et al., 2011; Zimmerman et al., 1992). Self-efficacy is therefore highly relevant to research methods education, as students who exhibit high levels of self-efficacy are more likely to achieve the skill and knowledge goals they set for themselves (van Dinther et al., 2011; Shaw et al., 2013). Indeed, Forester et al. (2004) identified self-efficacy in research tasks as effective in predicting students’ interest in conducting research, and found that confidence measures provided a reliable measure of researcher self-efficacy; similar conclusions were reached in related work by Shaw et al. (2013). As self-efficacy shapes an individual’s attitudes, beliefs, thoughts and feelings, it has direct relevance to a measure of student learning: a student with a strong sense of self-efficacy can demonstrate higher levels of effort, persistence and resilience (van Dinther et al., 2011). Progression in a student’s sense of self-efficacy is achieved through mastery of specific skills or knowledge, as this provides evidence of success. Students then draw on this sense of success to develop their capacity to perform tasks in the future, with more challenging tasks fostering a stronger sense of self-efficacy as they signal the benefit of maintained effort and persistence (Bandura, 1997; Palmer, 2006).

Self-efficacy has also been applied in a number of learning gain studies (e.g. Arico, 2016; Cox & Lemon, 2016; Lim, Hosack, & Vogt, 2012) which have used students’ self-assessment of their learning, knowledge and/or skills and demonstrated that such self-assessment can provide a robust measure of learning. Ongoing application of these measures in classroom settings (as explored, for example, by Arico, 2016) has demonstrated learning gain over time. These studies are founded on the principles of self-efficacy, and given the connections demonstrated by Forester et al. (2004) and Shaw et al. (2013) between self-efficacy and researcher development, this study centred on piloting a measure of learning gain based on researcher self-efficacy to explore new methods of capturing learning gain in HE.

Developing connections between learning gain and research methods

Researchers have indicated that learning gain is a complex and context-dependent construct, influenced by a range of factors such as student attitudes and entry profiles (Kandiko Howson, 2016). Attempts to capture learning gain have been shaped by a variety of definitions and methodological approaches (e.g. Corlu & Aydin, 2016; Cox & Lemon, 2016; Lim et al., 2012; Varsavsky et al., 2014), which make comparative work in this area challenging (McGrath et al., 2015).

The idea of learning gain has been widely critiqued by academics and students (Gourlay & Stevenson, 2017; TSEP, 2016). Capturing learning gain through metrics such as employability, student satisfaction or student performance data, as in some contemporary work, threatens to oversimplify HE: it overlooks HE’s transformative power and the need for higher level study to stimulate risk taking, to challenge students and, in some cases, to allow students to learn from failure (Gourlay & Stevenson, 2017). There is a concern that focusing on learning gain, as part of the wider discourse of teaching excellence, could lead to a measure of mediocrity (Gourlay & Stevenson, 2017). Wood and Su (2017) state that any outcome measure should be based on an ethical and relational concept that promotes growth and development, as advocated by Nixon (2007). This would allow institutional context to be accommodated, as well as student need, and would rely on the pedagogic relationship underpinning learning being considered. These writers have shaped our work and, as we demonstrate, capturing learning gain through the development of pedagogic relationships is integral to this study.

Based on the literature, we have identified the following as ‘success criteria’ which need to be addressed in order for a measure of learning gain to be useful. Any measure should:

evidence its validity as a measure of student learning and recognition of other student characteristics and environmental factors which might influence its outcomes;

generate student data which is representative and comparable at different stages of HE;

allow for meaningful comparison across institutions, while also recognising the diversity in the goals of HE institutions.

The extensive body of work centred on student learning through engagement with research and research-related activities suggests that research methods, and the way a student’s knowledge of this subject develops and changes over their studies, provide a useful starting point for developing a valid and useful measure of learning gain. We present such a measure here, drawing on data from the first phase of this study.

As demonstrated by the SEEC level descriptors, all undergraduate programmes include an element of research training, and whilst the specific focus of this training may vary by discipline, it will cover key elements of the definition of research methods presented above from Murtonen (2015). This broad application of research methods, together with academic progression (again, as framed by the SEEC level descriptors), creates the potential for a measure of learning gain that can be used across a number of academic levels, thereby capturing the temporal element of the McGrath et al. (2015) definition of learning gain, and also creates the potential for this measure to be applied across a range of disciplines. On this basis, the following research questions were developed:

How do undergraduates’ attitudes towards, and confidence in, research methods change through their undergraduate study?

What pedagogic approaches are used to develop undergraduates’ knowledge, skills and expertise in research methods?

Can research methods be used as an effective measure of learning gain and what measures can be used to assess this?

Research design

Overall, the study employs a longitudinal research design to capture students’ knowledge, skills and experience of research methods as they progress through their undergraduate studies. This design would potentially account for phenomena such as the ‘sophomore slump’, a term used to describe the underperformance of second-year undergraduates (Lemons & Richmond, 1987). Our work is ongoing, and here we present the self-reported survey data from the first phases of fieldwork. Due to sample size and coverage, cross-sectional analyses are presented. The study gained ethical approval from the University of Plymouth, Faculty of Business Research Ethics Committee.

A self-reporting survey represented the primary mechanism for data collection. Such surveys are widely used in research methods studies (Forester et al., 2004; Sizemore & Lewandowski, 2009; Williams, Payne, Hodgkinson, & Poade, 2008; Williams et al., 2016) and by those examining learning gain (e.g. McGrath et al., 2015; Seymour et al., 2000; Varsavsky et al., 2014; Vogt, Atwong, & Fuller, 2005).

The research team have previously examined the role of research methods teaching in college-based HE. Here, we adapted a survey developed by Williams et al. (2008, 2016) to explore research methods teaching in Sociology programmes. These two surveys provided the basis of a data collection instrument, incorporating a range of confidence measures and attitudinal scales that had been extensively tested in college and university-based HE. Confidence measures and attitudinal scales have both been applied in studies examining research methods education (e.g. Williams et al., 2016) and self-efficacy (Shaw et al., 2013), as both have a significant impact on students’ development. Drawing on work by Jenkins et al. (2008) and Levy and Petrulis (2012), we then integrated further measures to capture students’ experiences of, and approaches to, learning research methods, paying particular attention to pedagogic practices (i.e. whether students were learning about research methods through structured or lecturer-led activities, practising them in a simulated task or research activity instigated by a lecturer, or conducting independent, student-led research). These descriptors were derived from discussions within the literature on effective pedagogic practice in research methods education (e.g. Levy & Petrulis, 2012; Williams et al., 2016). Confidence scales were used to capture students’ self-efficacy across a range of research approaches, practices and skills. We also captured data on students’ definitions and perceptions of research methods to generate evidence of their epistemological and methodological development as researchers, paying heed to the work of Murtonen (2015), who observed that studies of research methods education need to look beyond skill development. Essential demographic information was captured and individual identifiers were allocated to track individual students. Given that prior educational experiences have been identified as influential (Shaw et al., 2013), data on students’ previous academic qualifications were also collected. Consequently, the survey primarily captured quantitative data, with students responding to a series of statements or questions presented as Likert scales.

Despite the widespread use of self-reporting surveys, some commentators have noted limitations in their application (e.g. Porter, 2013). Concerns have been raised about the extent to which they capture actual learning and the ease with which students can access the experiences they need to draw upon to accurately rate their learning. To mitigate any potential effects of this, the survey was prefaced with definitions of research, research methods and other key terminology, based on the recommendations of Porter (2013) and Tourangeau, Rips, and Rasinski (2000) regarding the use of self-reporting surveys to capture learning gain. We also did not want to cause undue anxiety amongst respondents about a lack of knowledge of aspects of research methods they had yet to study; it was therefore explained to participants that some areas of research methods may only relate to specific disciplines or stages of study.

The first iteration of the learning gain self-reporting survey underwent extensive development through consultation with programme teams, to ensure it was accessible and applicable to a range of disciplines, and was then piloted with a group of level 5 Business & Social Science students. The students’ feedback on the survey design, question style and ease of completion informed final revisions before the survey was administered.

Participant recruitment

A novel aspect of this project is that it spans a range of UK HE providers, including a university and further education colleges delivering higher education courses. The University has an established network of further education (FE) colleges that deliver HE (in the form of foundation and full honours degrees) validated by the University. College-based HE echoes the community college model found in the United States (Dougherty, 2009; Gray, 2016). FE colleges have long been involved in the delivery of HE; however, provision expanded considerably as part of moves to widen participation and promote engagement for underrepresented groups in the late 1990s (Gray, 2016). Students on a college-based HE course can either graduate following completion of a foundation degree, which represents the first two years of a full degree, or progress to honours level either within the college, if provision is available locally, or at the validating partner (Gray, 2016). Foundation degrees are required to conform to the Foundation Degree Characteristics statement (QAA, 2015a) in providing vocationally focused HE at levels 4 and 5. Predominantly (but not exclusively) delivered by FE colleges, these short-cycle degree programmes provide a work-focused emphasis whilst still meeting the other requirements of any undergraduate subject at these levels (such as research methods training). Programmes based at the University and across its FE college network were therefore selected to participate in the project in order to provide a measure of learning gain tested across different HE contexts.

As research methods feature in all undergraduate programmes, the resulting measure of learning gain has the potential to be applied across disciplines. A range of programmes were therefore purposefully selected for this study. Sixteen programmes across one university and six FE colleges were chosen to represent the following disciplinary areas: Arts & Humanities (n = 6; university = 2, college-based HE = 4), Business & Social Sciences (n = 8; university = 2, college-based HE = 6) and Science (n = 2; university = 1, college-based HE = 1). More college-based programmes were selected to compensate for smaller class sizes in colleges. The subject groupings of the Higher Education Academy were used to identify and align the disciplinary programmes, and programmes were selected for which there was mirrored provision at the University and in the FE college network; this was considered essential to facilitate comparisons. Each programme in our sample draws on the SEEC level descriptors to shape its curriculum, an integral aspect of the quality assurance protocols of the validating university, hence their use in framing the research methods provision and the survey. To facilitate analysis, data are reported on a disciplinary basis only; future analysis will centre on disciplinary groupings and educational contexts.

Survey administration

There is debate about the most effective way to administer surveys, with observable variations in the profile of survey respondents according to whether surveys are administered online or on paper (Nulty, 2008). Another challenge that persists, regardless of the method of administration, is response rates (Nulty, 2008). As the data resulting from this study would inform a learning gain toolkit, the research team were keen to maximise the response rate and therefore decided to administer paper-based surveys in class time. Three administration points were identified: ‘survey one’ in autumn 2015, where level 5 students would be prompted to reflect on their knowledge, experience and confidence in research methods following level 4; ‘survey two’ at the end of level 5; and ‘survey three’ at the end of level 6. Most research methods teaching is concentrated in level 5 to support students planning their level 6 dissertation, so administering the surveys at these points would potentially capture resulting gains in learning. Here, we focus on data from surveys one and two.

Data analysis

Following each period of data collection, data were input into SPSS by two members of the research team. Following standard recommendations (Bryman, 2008), independent data checking was undertaken on 10% of the sample. Descriptive analyses of item statements were undertaken by level of study and by discipline group, with a focus on baseline measures and mean differences.
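As an illustration only, the kind of baseline-and-mean-difference summary described above can be sketched in a few lines of code. The study's analysis was conducted in SPSS; the data frame, column names and values below are hypothetical, not the study's data:

```python
import pandas as pd

# Hypothetical tidy survey data: one row per survey response.
df = pd.DataFrame({
    "discipline": ["Science", "Science", "Arts", "Arts"],
    "level":      [4, 5, 4, 5],
    "confidence": [3.2, 3.8, 2.9, 3.4],  # mean Likert rating per respondent
})

# Mean confidence per discipline at each level (level 4 = baseline),
# plus the level-4-to-level-5 mean difference.
means = df.groupby(["discipline", "level"])["confidence"].mean().unstack("level")
means["difference"] = means[5] - means[4]
print(means)
```

With real survey data there would be many respondents per cell, and the mean difference column would correspond to the cohort-level change reported across survey points.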

Reliability analysis

The survey was subjected to reliability checks using level 4 data from the first sampling point. Item pools, with illustrative example statements and Cronbach’s α values, are shown in Table 2. Since each item pool represented a different dimension, separate reliability analyses were undertaken on five of the identified pools; the item pool on engagement activities collected categorical multiple responses and was consequently omitted from reliability analysis. The ‘self-reported confidence’ and ‘self-efficacy: research skills’ pools evidenced high levels of internal consistency (α > .90). ‘Research orientation: emotion/feelings’ (α = .664) and ‘learning motivations: frequency of studying’ (α = .674) showed levels of internal consistency below the accepted .7 cut-off point; since we are reporting data mid-way through the study, we will review these scales when the data from all three collection points are available. ‘Research orientations: perceptions of research methods’ had a lower level of internal consistency (α = .445), with corrected item-total correlations of less than .3, suggesting that this subscale lacks internal consistency (Field, 2013, p. 709). Preliminary principal component analysis (PCA, varimax rotation) was undertaken on five of the item pools, with factors retained where eigenvalues were >1. The Kaiser–Meyer–Olkin measure of sampling adequacy was found to be mediocre for three item pools and meritorious for two. Investigation of the individual items clustering on the factors suggested some identifiable subscales; however, given the size of the sample, and that the measurement tool was designed to reflect learning gain in research methods across levels 4, 5 and 6, the research team took the conservative decision to complete the remaining data collection for levels 5 and 6 prior to any future latent variable analysis.

Table 2. Item pools, reliability analysis and preliminary principal component analysis.
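For readers unfamiliar with the statistic, Cronbach’s α (the internal-consistency measure reported above) can be computed directly from an item-response matrix. This is a minimal sketch under our own naming; the function and the synthetic responses are illustrative and not part of the study’s SPSS analysis:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the pool
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic 5-point Likert responses: four respondents, three items.
responses = np.array([
    [4, 5, 4],
    [3, 3, 3],
    [5, 5, 4],
    [2, 3, 2],
])
alpha = cronbach_alpha(responses)  # values above .7 are conventionally acceptable
```

The formula rewards item pools whose items co-vary strongly with the total scale score, which is why pools measuring a single dimension (such as the confidence scales above) tend to score higher than heterogeneous pools.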

Findings

Here, we present an overview of the sample profile and report data relating to respondents at survey points one and two across one university and six college-based HE providers. Cross-sectional results from the first two surveys are presented. As we report ongoing work, we have yet to undertake longitudinal analyses involving matched cases and academic performance data, though this is a long-term ambition as these data become available to the research team. We also intend to control for institutional context; in the meantime, disciplinary trends are considered where relevant. These future analyses will facilitate a more detailed examination of the complexity of measuring learning gain through measures of pedagogic engagement, self-efficacy and performance outcomes at the level of the individual, and provide clear insights into the long-term viability of using research methods as a proxy for learning gain. The analysis presented here allows us to begin to assess these factors at a cohort/disciplinary level and provides a valuable basis on which to shape subsequent data collection.

Sample profile

To date, 312 surveys have been collected from 219 students (74 male, 131 female, 14 prefer not to say/unspecified); see Table 3 for a breakdown of respondents across survey points and disciplinary areas. At level 4 this represents a response rate of 48.0%, and at level 5, 33.9%. Participant engagement with the study has been variable across disciplinary areas (Table 3), despite the best efforts of the research team. Attrition is an ongoing challenge in learning gain work (Kandiko Howson, 2016; Porter, 2013) and has led many studies to adopt cross-sectional designs or undertake cross-sectional analyses (McGrath et al., 2015). Indeed, accessing Arts and Education participants for the second administration point was initially unsuccessful. As data collection is ongoing, we are monitoring respondents’ engagement and exploring strategies to maintain and promote participation (e.g. extending data collection beyond the current academic year and recruiting additional cohorts in disciplines where attrition is concerning). Once this work has been completed, a profile analysis of non-participation can be undertaken to provide insights into non-administrative barriers to engagement, and the potential impact on representativeness considered (De Vaus, 2014). Furthermore, we plan to consider response profiles alongside summative academic performance. The research team also intend to undertake an in-depth analysis of all data once all surveys have been administered, allowing matched cases to be analysed.

Table 3. Number of survey responses across each data collection point by disciplinary area.

The participants’ ages ranged from 18 to 50 years with a mean of 23.5 (SD = 7.14). The majority of respondents (69.8%) identified A-levels as their most recent qualification, 9.3% had entered based on their professional/vocational qualifications, 6.0% had completed Access to HE courses and 5.5% held GCSEs. This varied entry profile reflects the breadth of educational contexts included in this study, as college-based HE is recognised as attracting vocationally orientated students or those entering with non-traditional qualifications (Lea & Simmons, Citation2012). However, the prominence of A-level qualifications indicates that some respondents from this group are likely to have completed these before commencing HE study, even though they are based within an FE college where these qualifications are not generally perceived to be the norm (Lea & Simmons, Citation2012).

Research orientations: student attitudes and perceptions of research methods

Based on their level 4 experiences, 92.9% of respondents stated that they thought studying research methods and developing research skills was an essential part of their course, dropping slightly to 88.4% at level 5. This is a notable finding; a major issue for research methods educators is students failing to perceive the relevance of research methods provision, which impacts on their motivation to learn the subject and their level of interest (Earley, Citation2014). This may reflect changing attitudes to research methods more widely – a consequence of the focus placed upon this area by policy-makers and universities (Kilburn et al., Citation2014; Nind et al., Citation2015). This trend is mirrored across discipline groups (Table ). The level 4 to level 5 reduction was not unanticipated; students enter HE with preconceived ideas of research (Murtonen, Citation2005). Those with prior family experience of HE have been observed to possess more realistic understandings of research (Bangera & Brownell, Citation2014). However, even these individuals experience a change as they begin to engage with research and research-related activities. First years often perceive activities such as literature searching as research (Sizemore & Lewandowski, Citation2009). Engaging with research methods courses fosters realistic understandings of research. Earley (Citation2014) suggests that students adjust their expectations according to the aspects of research methods they engage with, although this can depend on the pedagogic approaches used. Integral to this is the extent to which students formulate an awareness of the wider applicability of research methods; for example, whether they are taught research skills in isolation or shown their wider applicability to the discipline (Murtonen, Citation2015). The impact of this is evident as we examine learning motivations drawing on respondents’ pedagogic experiences and practising of research methods (Table ).

Table 4. Studying and developing research skills is essential by level and discipline group.

Table 5. Pedagogical learning experiences at Level 4 and Level 5 (%).

Pedagogic engagement with research methods education

Students were asked to identify how they had studied and practised essential aspects of the research process (Table ). Consistent with the move from level 4 to level 5, and as students accrue and recognise their gain in knowledge and experience of different learning activities in research methods training, the survey data demonstrate a move towards more practice-based and applied learning activities. For example, at level 4 ‘considering research design and methodology/critical approaches and critical theories’ was more likely to be studied through structured learning activities (61.9%) than a practice-based activity (37.0%) or applied to independent research (19.3%) (Table ). By level 5, there is a shift towards higher proportions of practice (44.2%) and applied (45.7%) activities. Active engagement with research methods is essential to fostering positive attitudes to research methods. Deem and Lucas (Citation2006) identified a relationship between how students are taught about and engage with research methods and perceived relevance. In their early encounters with research skills, especially when encountered through didactic approaches, students can perceive research methods as abstract; their appreciation only matures through active engagement with research (Murtonen, Citation2015). Active engagement with research methods can also uncover the limits of students’ knowledge, and this is where attention should be focused to promote future learning and to avoid a lack of confidence in one aspect of research methods undermining future engagement – an ongoing challenge for quantitative research methods educators in particular (Chamberlain et al., Citation2015).

Some research areas are reported as having less coverage in the curriculum (e.g. ‘constructing a research question’ and ‘reporting research’; Table ). It is likely that as students progress through their studies (and equally this project), and as they complete their dissertation, their familiarity with these topics may increase. At levels 4 and 5, students’ ‘reporting’ of research is driven by assessment guidelines that focus their attention on specific activities (e.g. report writing) and provide prescriptive templates that seem removed from how they may commonly encounter primary research in journals. Undergraduate research conferences, though growing in prominence, represent a recent innovation (Walkington, Hill, & Kneale, Citation2017), which may explain why reporting research remains an underdeveloped aspect of research methods education.

Disciplinary differences were evident in the pedagogical approaches used, which to some extent may be expected, as they reflect the research traditions of disciplines. There is a clear rationale for teaching research methods within the context of a substantive discipline to promote relevance and engagement amongst students (Bridges, Gilmore, Pershing, & Bates, Citation1998). For example, level 5 respondents in Science reported higher rates of practising and applying activities that involved ‘considering research design and methodology/critical approaches and critical theories’ than those in Business & Social Science and Arts & Education (Table ). Likewise, engagement with methods of qualitative data analysis of text, images and videos was greatest in Arts & Education, with application of these skills reported (Table ). Business & Social Science students tend to encounter their research methods through structured activities, with limited opportunities to practise or apply research skills reported (Table ). This could potentially indicate a connection between the pedagogic framing of research methods within the discipline and student engagement.

Table 6. Pedagogical learning experiences at Level 4 and Level 5 (%) around specific aspect of research methods/discipline.

These trends reflect the positioning of research methods in QAA Subject Benchmark Statements. Research methods are foregrounded in the Benchmark Statements for Arts (QAA, Citation2015c), Education (QAA, Citation2014a) and Science (QAA, Citation2014b). Research methods feature in the description of each discipline, reflecting its characteristics; e.g. Science students are expected to engage with ‘quantitative and qualitative approaches to acquiring and interpreting data’ (QAA, Citation2014b, p. 7). These are reinforced by the intellectual and practical skills graduates in these disciplines are expected to demonstrate. Interestingly, the role of research and research methods in the Business Subject Benchmark is less explicit. Here, the emphasis is on students being able to ‘analyse and evaluate business data’ and ‘evidence informed decision making’ (QAA, Citation2015b, p. 8). These skills naturally lead towards more passive engagement with research methods, as students are less likely to be involved in research design and reviewing methods of data collection, instead working with existing data-sets. This is evidenced through the extent to which many key aspects of research methods are engaged with through structured, lecturer-led activities and the limited application of research methods (Table ). This contrasts with the student experience in Arts & Education and Science, where there is an overall move towards the application of research methods. This may explain why the Business students report a reduction in their perception of ‘studying and developing research as essential’ (Table ), as the absence of active engagement with research methods means it remains an abstract skill. The absence of a pedagogical practice around research methods teaching in Business & Social Science disciplines may relate to a wider issue recognised by Wagner et al. (Citation2011) regarding the absence of a pedagogical culture surrounding research methods teaching in the social sciences. Collectively, these factors are likely to impact significantly on students’ motivation and willingness to engage with research methods.

Table 7. Studying or practising research methods by discipline (very often/often%) at Level 4 and 5.

Students were asked to report how often they engaged with a range of learning activities (e.g. pre-course reading, accessing support from peers/lecturers and collaborative working; Table ). Again, there are disciplinary trends. For example, Science students increasingly report reaching conclusions based on their own analysis of numerical information, as do Business & Social Science students (Table ). The limited change across some of the categories may indicate the increasing sophistication of the research skills being learnt and the extent to which students are mastering those skills (e.g. limited progress is made across each disciplinary area with respect to the survey items ‘during research training, connected ideas to your prior knowledge and experiences’ and ‘combine ideas from different modules when engaged in research-related tasks’). However, this may also be influenced by the foundation on which they are building. In some areas, the Business & Social Science students appear to begin from a lower base of engagement compared to their peers in other disciplines (Table ).

Student confidence as researchers: self-efficacy

Studies that have examined how students’ sense of self-efficacy in research methods changes identify that as students become more competent as researchers they grow in confidence (Forester et al., Citation2004; Shaw et al., Citation2013). Consequently, students were asked to rate their confidence on a five-point scale across a range of research-related activities. Mean confidence across the statements varied between 2.44 and 3.09 (level 4) and between 2.73 and 3.42 (level 5). Positive mean differences were found across all areas, and the greatest confidence gains are in activities such as applying research techniques to new areas of study (+.39), organising, planning and managing a research process (+.37), correctly citing existing published research (+.36), reading and understanding research articles (+.29), developing research proposals (+.27) and presenting research (+.27).
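The mean differences reported above can be sketched as a simple calculation: for each survey item, the mean of the five-point ratings at level 5 minus the mean at level 4. The sketch below uses entirely hypothetical ratings for illustration (the study's actual item-level data appear in Tables 8 and 9); the function name and sample values are our own assumptions, not part of the study instrument.

```python
# Illustrative sketch (hypothetical data): mean change in confidence for one
# survey item between two administration points, on a five-point Likert scale.

def mean_change(level4_ratings, level5_ratings):
    """Mean level-5 rating minus mean level-4 rating, rounded to 2 d.p."""
    m4 = sum(level4_ratings) / len(level4_ratings)
    m5 = sum(level5_ratings) / len(level5_ratings)
    return round(m5 - m4, 2)

# Hypothetical cohort responses for one item, e.g.
# 'organising, planning and managing a research process'
level4 = [2, 3, 2, 3, 4, 2, 3]
level5 = [3, 3, 4, 3, 4, 3, 4]

print(mean_change(level4, level5))  # → 0.71
```

A negative value from the same calculation corresponds to the confidence reductions discussed later for some Arts & Education items. Note that this cohort-level comparison does not require matched cases, which is consistent with the cross-sectional analysis reported here.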

Gains in areas such as developing research proposals may reflect recent engagement with a module to prepare them for their level 6 dissertation. However, growing confidence in planning research and applying research techniques to new areas of study is notable, and resonates with the findings of Forester et al. (Citation2004), who compared measures of undergraduate research self-efficacy. This potentially indicates the emergence of more sophisticated understandings of knowledge and research and, potentially also, a sense of themselves as researchers – a move towards being ‘research prepared’. Making this transition is challenging for many undergraduates, and where it does not occur it can lead to the persistence of negative attitudes towards research methods (Murtonen, Citation2015; Shaw et al., Citation2013). It is often dependent on how research methods are framed within a degree programme, i.e. whether they are taught within the confines of a standalone module or integrated and embedded across the whole curriculum (Williams et al., Citation2016; Table ).

Table 8. Confidence in research methods (Level 4 to Level 5).

Once again, there were disciplinary trends (Table ) evident through changes in mean confidence from level 4 to 5. Skills students reported studying or practising ‘very often/often’ at level 4 (Table ) show lower levels of change in mean confidence from level 4 to 5. This is interesting; it might be anticipated that as students become familiar with research activities their confidence increases. However, this could also relate to the academic complexity of these skills, in that students have yet to gain a sense of ‘mastery’. Business & Social Science students report considerable gains in confidence across numerous items (e.g. +.64 ‘apply research techniques studied in new areas of your discipline’, and +.45 ‘read and understand research findings in academic journals’). As noted above, research methods appear to be less explicit in the Business curriculum, and this impacts on student engagement and confidence in these areas. The Science students are actively exposed to research methods from level 4; therefore, in some areas only small gains in confidence are reported (e.g. no change in confidence to ‘form a new idea or understanding from various pieces of information’), whereas activities related to developing a dissertation proposal are again positively impacting confidence (e.g. +.38 ‘develop a research proposal for your own or your team’s research’ and +.16 ‘apply facts, theories or methods to practical problems or new situations’). Science students reported a negative confidence gain in ‘undertaking collaborative research and working cooperatively as a team’ (−.21), which may reflect wider student concerns about group work and summative assessment (Allan, Citation2017). Arts & Education students show reductions in confidence in some areas (Table ) despite reporting the application and practising of many aspects of research methods (Table ).
This perhaps indicates that students previously possessed incorrect or limited understandings of items such as ‘debate whether a proposed research study is ethical’ (−.10), ‘apply facts, theories or methods to practical problems or new situations’ (−.06) and ‘critically evaluate research articles for sound methodology, data analysis and interpretation’ (−.06) (Table ). Students may previously have overestimated their confidence; once they practised these skills and realised what they actually entailed, they reassessed their level of confidence. This is not unexpected, partly due to the issue of ‘sophomore slump’ but also because it is a finding reported in related work on research methods education (e.g. Chamberlain et al., Citation2015; Sizemore & Lewandowski, Citation2009).

Table 9. Mean change in confidence in research methods by discipline group: Level 4 to Level 5.

Measures of self-efficacy are valuable in directing future learning and encouraging goal setting, which is why many studies of motivations and attitudes to research methods examine it (e.g. Shaw et al., Citation2013). Students who exhibit high levels of self-efficacy are usually self-regulated learners who are good at identifying future learning needs (Zimmerman et al., Citation1992). However, not all students, particularly those with diverse entry profiles, have the expertise or capacity to identify their learning goals (Hsieh, Sullivan, & Guerra, Citation2007). Encouraging students to assess their confidence or expertise may highlight the focus of future learning and, within a supportive or directive framework, may motivate students to achieve (Hsieh et al., Citation2007).

Conclusions

In this paper, we report findings of a HEFCE Learning Gain pilot to capture data on undergraduates’ research methods education to develop a measure of learning gain. Given the challenges students face with research methods, it may not appear an obvious area on which to develop a learning gain measure. Indeed, whilst we do identify positive learning gains, these need to be considered in relation to the disciplinary framing and pedagogic practice underpinning research methods education. Evidence from our data demonstrates clear disciplinary differences in the extent to which students are making judgements on the validity of existing knowledge and engaging in the generation of new knowledge through the application of research methods. However, as is increasingly reported, learning gain is a complex and context-dependent construct (Kandiko Howson, Citation2016). Therefore, it is likely that any resulting measure will need to pay heed to student profile, disciplinary practice and institutional focus.

Our learning gain study is founded on the concept of self-efficacy, which is increasingly being applied in learning gain studies. Lim et al. (Citation2012) identified value in students’ self-assessing through a measure of learning gain, as it can encourage students to connect with the material being ‘measured’. This can have two benefits, as well as overcoming some of the limitations associated with other learning gain measures (e.g. lack of student engagement with standardised tests). Firstly, if the measure connects with a student’s future learning (i.e. they gain feedback on current performance and receive guidance on how to direct their future learning to maximise performance through, for example, targeted resources, recommendations on learning strategies or readings), they should be able to set future learning goals (Hsieh et al., Citation2007). Such an approach resonates with work from Arico (Citation2016), who explores how in-class self-assessment and reflection activities can, over the course of a module, promote and capture learning gain. It would also provide a measure of learning gain that goes beyond institutional accountability and teaching enhancement, to consider the process of student learning itself (Varsavsky et al., Citation2014). For this to be successful, it requires explicit integration of a learning gain measure into the curriculum and for students to engage in a dialogue around their growing knowledge, skills and experience in the area on which the measure is centred, which in this case would be research methods. The extent to which this self-assessment, reflection and dialogue needs scaffolding requires consideration, particularly for those in environments where research may not be an explicit feature of institutional practice (e.g. in college-based HE). However, this does create opportunities for relationships to be fostered to promote learning. Recent work (e.g. 
HEA, Citation2011; Hagenauer & Volet, Citation2014) identified the integral role of pedagogic relationships (between and across students and educators) in creating safe spaces that encourage risk taking and experimentation and, in the long term, student engagement and learning. Good pedagogical relationships are identified by Wood and Su (Citation2017) as a sign of ‘excellence’. Here we propose, therefore, that self-assessment, reflection and dialogue around research methods education could foster relationships that scaffold and capture learning.

Additionally, encouraging students to self-assess, reflect and discuss their current/future learning of research methods would highlight the role of research for disciplinary and academic development. This may overcome the basic challenges faced by many research methods educators of students struggling to perceive the relevance of the subject, or failing to recognise the progress they are making (e.g. Murtonen, Citation2015; Shaw et al., Citation2013). Also, participating in a dialogue around their engagement with, and application of, research methods within their studies may better prepare students for applying such expertise in their professional lives. This could also work towards addressing the concerns of research methods educators, employers and policy-makers regarding undergraduate preparation for working within the knowledge economy (Nind et al., Citation2015). Indeed, though our findings demonstrate that disciplinary groups may differentiate the focus of the research methods content and learning, there is a commonality of purpose in underlying skills across disciplines. A more explicit examination of the applicability of skills in employment/vocational contexts would be a welcome addition to the curriculum.

Our approach would allow programme teams and institutions to build up longitudinal data relating to the learning gain students make in research methods. Indeed, these are the premises on which the Toolkit is based, and they form the intended outcome of this project. In developing a measure of learning gain centred on research methods, we intend to create a ‘learning toolkit’ which allows students, with the support of research methods educators, to self-assess, reflect on and direct their future learning. Though the focus here is upon research methods, given that many HE providers are integrating research-based pedagogies across the curriculum (Earley, Citation2014; Healey, Jenkins, & Lea, Citation2014), which may sit alongside or be integrated into research methods provision, it may prompt wider reflections on learning and development. The basis of this resource, and the focus of ongoing work, is self-assessment of current levels of knowledge, skills and experience in research methods; reflection on the outcomes with a tutor; signposting to relevant resources; and consideration of methods of learning/pedagogic engagement to stimulate future learning gain.

The need for institutions to evidence students’ skills acquisition for employers is a significant factor underpinning the development of measures of learning gain (McGrath et al., Citation2015). However, in developing such measures, institutions and programme teams should be cautious. It is unlikely that there will be a single solution, and institutions will need to adapt and contextualise any learning gain measure they employ. This is evident in the presentation of data obtained in our project. The differential framing of research methods in disciplines shapes students’ pedagogic engagement with research methods, which equally impacts on their research orientations and sense of self-efficacy. This is before factors such as entry profile are controlled for. Overlooking the significance of context, and also student need, could undermine the potential of a learning gain measure to promote actual student learning. To some, student learning has been overtaken by a discourse of excellence and accountability. Maintaining the central role of student learning in studies of learning gain is essential and will support the development and application of measures that have credibility across the sector.

Funding

This work was supported by the Higher Education Funding Council for England to pilot and evaluate measures of learning gain [grant number G0177/IPLY02/LG1516].

Disclosure statement

No potential conflict of interest was reported by the authors.

References

  • Allan, E.G. (2017). ‘I hate group work!’: Addressing students’ concerns about small-group learning. Insight. A Journal of Scholarly Teaching , 11 , 81–89.
  • Arico, F. (2016). Promoting active learning through peer-instruction and self-assessment: A toolkit to design, support and evaluate teaching. Educational Developments , 17 (1), 15–18.
  • Ashwin, P. , Abbas, A. , & McLean, M. (2017). How does completing a dissertation transform undergraduate students’ understandings of disciplinary knowledge? Assessment & Evaluation in Higher Education , 42 (4), 517–530. doi:10.1080/02602938.2016.1154501
  • Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
  • Bangera, G. , & Brownell, S.E. (2014). Course-based undergraduate research experiences can make scientific research more inclusive. CBE Life Science Education , 13 (4), 602–606. doi:10.1187/cbe.14-06-0099
  • Benson, A. , & Blackmore, D. (2003). Can research methods ever be interesting? Active Learning in Higher Education , 4 (1), 39–55.
  • Brew, A. (2013). Understanding the scope of undergraduate research: A framework for curricular and pedagogical decision-making. Higher Education , 66 , 603–618. doi:10.1007/s10734-013-9624-x
  • Bridges, G.S. , Gilmore, G.M. , Pershing, J.L. , & Bates, K.A. (1998). Teaching quantitative research methods: A quasi-experimental analysis. Teaching Sociology , 26 (1), 14–28. doi:10.2307/1318676
  • Bryman, A. (2008). Social research methods . Oxford: Oxford University Press.
  • Chamberlain, J.M. , Hillier, J. , & Signoretta, P. (2015). Counting better? An examination of the impact of quantitative method teaching on statistical anxiety and confidence. Active Learning in Higher Education , 16 (1), 51–66. doi:10.1177/1469787414558983
  • Coates, H. (2009). What's the difference? A model for measuring the value added by higher education in Australia. Higher Education Management and Policy , 21 (1), 1–20. doi:10.1787/hemp-v21-art5-en
  • Corlu, M.A. , & Aydin, E. (2016). Evaluation of learning gains through integrated STEM projects. International Journal of Education in Mathematics, Science and Technology , 4 (1), 20–29. doi:10.18404/ijemst.35021
  • Cox, T.D. , & Lemon, M.A. (2016). A curricular intervention for teaching and learning: Measurement of gains of first-year college student learning. Journal of the Scholarship of Teaching and Learning , 16 (3), 1–10. doi:10.14434/josotl.v16i3.19268
  • Davis, H. , Evans, T. , & Hickey, C. (2006). A knowledge-based economy landscape: Implications for tertiary education and research training in Australia. Journal of Higher Education Policy and Management , 28 (3), 231–244. doi:10.1080/13600800600979983
  • De Vaus, D.A. (2014). Surveys in social research . (6th ed.). Oxford: Routledge.
  • Deem, R. , & Lucas, L. (2006). Learning about research: Exploring the learning and teaching/research relationship amongst educational practitioners studying in higher education. Teaching in Higher Education , 11 (1), 1–18. doi:10.1080/13562510500400040
  • Dougherty, K.J. (2009). English further education through American eyes. Higher Education Quarterly , 63 (4), 343–355. doi:10.1111/j.1468-2273.2009.00437.x
  • Douglass, J.A. , Thomson, G. , & Zhoo, C.M. (2012). The learning outcomes race: The value of self-reported gains in large research universities. Higher Education , 64 , 317–335. doi:10.1007/s10734-011-9496-x
  • Earley, M.A. (2014). A synthesis of the literature on research methods education. Teaching in Higher Education , 19 (3), 242–253. doi:10.1080/13562517.2013.860105
  • European Commission . (n.d.). Learning Opportunities and Qualifications in Europe . Retrieved from: https://ec.europa.eu/ploteus/en
  • Field, A. (2013). Discovering statistics using IBM SPSS statistics . London: Sage.
  • Forester, M. , Kahn, J.H. , & Hesson-McInnis, M.S. (2004). Factor structures of three measures of research self-efficacy. Journal of Career Assessment , 12 (1), 3–16. doi:10.1177/1069072703257719
  • Gourlay, L. , & Stevenson, J. (2017). Teaching excellence in higher education: Critical perspectives. Teaching in Higher Education , 22 (4), 391–395. doi:10.1080/13562517.2017.1304632
  • Gray, C. (2016). Implementing English further/higher education partnerships: The street level perspective. Higher Education Quarterly , 70 (1), 43–58. doi:10.1111/hequ.12078
  • Green, W. , Hammer, S. , & Star, C. (2009). Facing up to the challenge: Why is it so hard to develop graduate attributes? Higher Education Research and Development , 28 (1), 17–29. doi:10.1080/07294360802444339
  • Hagenauer, G. , & Volet, S.E. (2014). Teacher-student relationships at university: An important yet under-researched field. Oxford Review of Education , 40 (3), 370–388. doi:10.1080/03054985.2014.921613
  • HEA . (2011). A good practice guide to learning relationships in higher education . York: Higher Education Academy. Retrieved from: https://www.heacademy.ac.uk/system/files/northumbria_guide_to_relationships_dec_11_1.pdf
  • Healey, M. , & Jenkins, A. (2009). Developing undergraduate research and inquiry . York: Higher Education Academy.
  • Healey, M. , Jenkins, A. , & Lea, J. (2014). Developing research-based curricular in college-based higher education . Retrieved from: https://www.heacademy.ac.uk/system/files/resources/developing_research-based_curricula_in_cbhe_14.pdf
  • Howard, C. , & Brady, M. (2015). Teaching social research methods after the critical turn: Challenges and benefits of a constructivist pedagogy. International Journal of Social Research Methodology , 18 (5), 511–525. doi:10.1080/13645579.2015.1062625
  • Hsieh, P. , Sullivan, J.R. , & Guerra, N.S. (2007). A closer look at college students: Self-efficacy and goal orientation. Journal of Advanced Academics , 18 (3), 454–476. doi:10.4219/jaa-2007-500
  • Jenkins, A. , Healey, M. , & Zetter, R. (2008). Linking teaching and research in disciplines and departments . York: Higher Education Academy.
  • Kandiko Howson, C. (2016, December). Measuring learning gain . Paper presented at the SRHE International Annual Research Conference, Newport in South Wales, UK.
  • Kilburn, D. , Nind, M. , & Wiles, R. (2014). Learning as researchers and teachers: The development of a pedagogical culture for social science research methods? British Journal of Educational Studies , 62 (2), 191–207. doi:10.1080/00071005.2014.918576
  • Klein, S. , Benjamin, R. , Shavelson, R. , & Bolus, R. (2007). The collegiate learning assessment facts and fantasies. Evaluation Review , 31 (5), 415–439. doi:10.1177/0193841X07303318
  • Lea, J. , & Simmons, J. (2012). Higher education in further education: Capturing and promoting HEness. Research in Post-Compulsory Education , 17 (2), 179–193. doi:10.1080/13596748.2012.673888
  • Lemons, L.J. , & Richmond, D.R. (1987). A developmental perspective of sophomore slump. NASPA Journal , 24 (3), 15–19.
  • Levy, P. , & Petrulis, R. (2012). How do first-year university students experience inquiry and research, and what are the implications for the practice of inquiry-based learning? Studies in Higher Education , 37 (1), 85–101. doi:10.1080/03075079.2010.499166
  • Lim, B. , Hosack, B. , & Vogt, P. (2012). A framework for measuring student learning gains and engagement in an introductory computing course: A preliminary report of findings. Electronic Journal of e-Learning , 10 (4), 428–440.
  • MacInnes, J. (2012). Quantitative methods teaching in UK higher education: The state of the field and how it might be improved. Social Sciences Teaching and Learning Summit: Teaching Research Methods. University of Warwick.
  • McGrath, C.H. , Guerin, B. , Harte, E. , Frearson, M. , & Manville, C. (2015). Learning gain in higher education . Cambridge: Rand/HEFCE.
  • Murtonen, M. (2005). University students’ research orientations: Do negative attitudes exist toward quantitative methods? Scandinavian Journal of Educational Research , 49 (3), 263–280. doi:10.1080/00313830500109568
  • Murtonen, M. (2015). University students’ understanding of the concepts empirical, theoretical, qualitative and quantitative research. Teaching in Higher Education , 20 (7), 684–698. doi:10.1080/13562517.2015.1072152
  • Nind, M. , Kilburn, D. , & Wiles, R. (2015). Using video and dialogue to generate pedagogic knowledge: Teachers, learners and researchers reflecting together on the pedagogy of social research methods. International Journal of Social Research Methodology , 18 (5), 561–576. doi:10.1080/13645579.2015.1062628
  • Nixon, J. (2007). Excellence and the good society. In A. Skelton (Ed.), International perspectives on teaching excellence in higher education: Improving knowledge and practice (pp. 15–31). Abingdon: Routledge.
  • Nulty, D.D. (2008). The adequacy of response rates to online and paper surveys: What can be done? Assessment and Evaluation in Higher Education , 33 (3), 301–314. doi:10.1080/02602930701293231
  • Palmer, D.H. (2006). Sources of self-efficacy in science methods courses for primary teacher education students. Research in Science Education , 36 , 337–353. doi:10.1007/s11165-005-9007-0
  • Papanastasiou, E.C. , & Zembylas, M. (2008). Anxiety in undergraduate research methods courses: Its nature and implications. International Journal of Research & Method in Education , 31 (2), 155–167. doi:10.1080/17437270802124616
  • Porter, S.R. (2013). Self-reported learning gains: A theory and test of college student survey response. Research in Higher Education , 54 (2), 201–226. doi:10.1007/s11162-012-9277-0
  • QAA . (2014a). Subject benchmark statement: Early childhood studies . (2nd ed.). Gloucester: The Quality Assurance Agency for Higher Education . Retrieved from http://www.qaa.ac.uk/en/Publications/Documents/SBS-early-childhood-studies-14.pdf
  • QAA . (2014b). Subject benchmark statement: Earth sciences, environmental sciences and environmental studies . (3rd ed.). Gloucester: The Quality Assurance Agency for Higher Education . Retrieved from http://www.qaa.ac.uk/en/Publications/Documents/SBS-earth-sciences-14.pdf
  • QAA . (2015a). Characteristics statement: Foundation degree . Gloucester: The Quality Assurance Agency for Higher Education . Retrieved from http://www.qaa.ac.uk/publications/information-and-guidance/publication?PubID=2976#.WPYlf9LyuUk
  • QAA . (2015b). Subject benchmark statement: Business and management . Gloucester: The Quality Assurance Agency for Higher Education . Retrieved from http://www.qaa.ac.uk/en/Publications/Documents/SBS-business-management-15.pdf
  • QAA . (2015c). Subject benchmark statement: English . Gloucester: The Quality Assurance Agency for Higher Education . Retrieved from http://www.qaa.ac.uk/en/Publications/Documents/SBS-English-15.pdf
  • Schleicher, A. (2016). Value-added: How do you measure whether universities are delivering for their students? (HEPI Report 82). London: HEPI 2015 Annual Lecture. Retrieved from http://www.hepi.ac.uk/wp-content/uploads/2016/01/Andreas-Schleicher-lecture1.pdf
  • Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology , 82 (3), 498–504. doi:10.1037/0022-0663.82.3.498
  • SEEC . (2016). Credit level descriptors for higher education 2016 . Luton: Southern England Consortium for Credit Accumulation and Transfer. Retrieved from http://www.seec.org.uk/wp-content/uploads/2016/07/SEEC-descriptors-2016.pdf
  • Seymour, E. , Wiese, D.J. , Hunter, A. , & Daffinrud, S.M. (2000, March). Creating a better mousetrap: On-line student assessment of their learning gains . Paper presented at the National Meeting of the American Chemical Society, San Francisco, CA. Retrieved from http://t.salgsite.org/docs/SALGPaperPresentationAtACS.pdf
  • Shaw, K. , Holbrook, A. , & Bourke, S. (2013). Student experience of final-year undergraduate research projects: An exploration of ‘research preparedness’. Studies in Higher Education , 38 (5), 711–727. doi:10.1080/03075079.2011.592937
  • Sizemore, O.J. , & Lewandowski, G.W., Jr (2009). Learning might not equal liking: Research methods course changes knowledge but not attitudes. Teaching of Psychology , 36 (2), 90–95. doi:10.1080/00986280902739727
  • Todd, M. , Bannister, P. , & Clegg, S. (2004). Independent inquiry and the undergraduate dissertation: Perceptions and experiences of final-year social science students. Assessment & Evaluation in Higher Education , 29 , 335–355. doi:10.1080/0260293042000188285
  • Tourangeau, R. , Rips, L.J. , & Rasinski, K. (2000). The psychology of survey response . Cambridge: Cambridge University Press. doi:10.1017/CBO9780511819322
  • TSEP . (2016). Learning gain and student engagement: Opportunity or threat? Retrieved from http://tsep.org.uk/learning-gain-and-student-engagement-opportunity-or-threat/
  • van Dinther, M. , Dochy, F. , & Segers, M. (2011). Factors affecting students’ self-efficacy in higher education. Educational Research Review , 6 (1), 95–108. doi:10.1016/j.edurev.2010.10.003
  • Varsavsky, C. , Matthews, K.E. , & Hodgson, Y. (2014). Perceptions of science graduating students on their learning gain. International Journal of Science Education , 36 (6), 929–951. doi:10.1080/09500693.2013.830795
  • Vogt, G. , Atwong, C. , & Fuller, J. (2005). Student assessment of learning gains (SALGains). Business and Professional Communication , 68 (1), 36–43.
  • Wagner, C. , Garner, M. , & Kawulich, B. (2011). The state of the art of teaching research methods in the social sciences: Towards a pedagogical culture. Studies in Higher Education , 36 (1), 75–88. doi:10.1080/03075070903452594
  • Walkington, H. , Hill, J. , & Kneale, P.E. (2017). Reciprocal elucidation: A student-led pedagogy in multidisciplinary undergraduate research conferences. Higher Education Research & Development , 36 (2), 416–429. doi:10.1080/07294360.2016.1208155
  • Williams, M. , Payne, G. , Hodgkinson, L. , & Poade, D. (2008). Does British sociology count? Sociology students’ attitudes toward quantitative methods. Sociology , 42 (5), 1003–1021. doi:10.1177/0038038508094576
  • Williams, M. , Sloan, L. , Cheung, S.Y. , Sutton, C. , Stevens, S. , & Runham, L. (2016). Can’t count or won’t count? Embedding quantitative methods in substantive sociology curricula: A quasi-experiment. Sociology , 50 (3), 435–452. doi:10.1177/0038038515587652
  • Winn, S. (1995). Learning by doing: Teaching research methods through student participation in a commissioned research project. Studies in Higher Education , 20 (2), 203–214.
  • Wood, M. , & Su, F. (2017). What makes an excellent lecturer? Academics’ perspectives on the discourse of ‘teaching excellence’ in higher education. Teaching in Higher Education , 22 (4), 1–16. doi:10.1080/13562517.2017.1301911
  • Zimmerman, B.J. , Bandura, A. , & Martinez-Pons, M. (1992). Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting. American Educational Research Journal , 29 (3), 663–676. doi:10.3102/00028312029003663