
Linguistic predictors of academic achievement amongst international students and home students in higher education: introduction

Pages 1453-1457 | Received 05 Nov 2020, Accepted 08 Nov 2020, Published online: 16 Oct 2021

Academic language proficiency and academic achievement

The last 20 years have seen a massive expansion and diversification of post-compulsory education worldwide. Two forces in particular have fuelled this trend: the internationalisation of higher education, and widening participation amongst home students. The number of international students rose from 2 million in 2000 to 5.3 million in 2017 (UNESCO 2019). This growth is evident not only in English-speaking countries, such as Australia, the UK and the US, but also in countries where English is used as an additional or foreign language and where local universities offer programmes in which the language of instruction is English. These two phenomena have considerably increased the range in academic language and literacy skills of incoming students. As a result, many universities face serious challenges in supporting a linguistically diverse population of both home and international students (Snow and Uccelli 2009; Uccelli et al. 2015).

All students who enrol in higher education need to develop the academic abilities and skills required to produce and understand oral and written texts typical of a particular subject area. Academic language is a special register that has to be learned by everyone; it is characterised by interpersonal stance, information density, specific lexical choices (abstract/technical concepts), and complex noun phrases (Biber 2006; Biber and Gray 2011; Hulstijn 2015; Van Dijk 2015). International and home students may thus have many problems in common. Indeed, although most previous research has focused on individual differences in the language proficiency of international students who speak the language of instruction as a foreign language (L2), studies that investigated home students suggest that weaker language skills lead to poorer academic outcomes even in students who speak the language of instruction as their first language (L1) (e.g. Milton and Treffers-Daller 2013).

Studies that directly compare the language and literacy skills of L1 and L2 university students are still scarce (e.g. Elder, Bright, and Bennett 2007; Trenkic and Warmington 2019), but they reveal that, in addition to individual differences, there are very large differences in linguistic abilities at the group level too. Although many international students rank among the top performers, studies based on national statistics for higher education outcomes demonstrate that, as a group, they do not enjoy the same level of academic success as home students (Iannelli and Huang 2014; Morrison et al. 2005) and are at greater risk of failure (Paton 2007). The reasons for this differential attainment are undoubtedly complex, but in contexts in which linguistically disparate groups compete academically, it is essential to understand how academic language and literacy skills develop, and how they affect learning and academic achievement.

Academic language proficiency is just one factor amongst many that influence academic success (e.g. Davies 2007). Yet, precisely because academic success depends on such a wide array of factors, even a small correlation with an individual variable can indicate a meaningful relationship (Ginther and Yan 2018). This is why, in the context of higher education, we need a much better understanding of how individual and group differences in academic language proficiency affect content learning and the academic success of both home and international student populations.

This special issue aims to fill the gaps in our knowledge regarding (i) the identification of students at risk and the use of remedial programmes for students with insufficient linguistic abilities; (ii) screening tests and measures of academic language proficiency with high predictive validity in relation to study success; and (iii) the specific aspects of academic language that are important for academic achievement. Addressing these questions is crucial not just for detecting students at risk and providing practical remediation, but also for a full understanding of how language learning develops at advanced levels.

This special issue

This special issue contains six papers that address the three topics listed above from different angles. Firstly, the identification of students at risk and the impact of a remedial programme are discussed in two contributions, by Heeren, Speelman and De Wachter, and by Kuiken and Vedder, respectively.

Heeren, Speelman and De Wachter investigate the predictive validity of a low-stakes, web-based academic reading and vocabulary screening test, administered in a university in Flanders to all incoming students (n = 15,032). Study achievement was operationalised in terms of credit completion rate. Additionally, the study examined the predictive value of the screening test when demographic background variables (gender, age, SES, nationality, home language) and educational background variables (educational track and average high school score) were controlled for. The results confirmed prior research showing that academic language proficiency is a small but meaningful predictor of study achievement, alongside other individual factors such as intelligence, personality, and learning attitude. The study showed that the screening test made it possible to identify at-risk candidates by screening large groups of incoming students in a quick and practical way. Unlike the demographic background variables, which did not seem to affect the predictive validity of the screening test, the educational background variables were found to have a considerable influence. To reduce the effect of test speed (a time constraint of 30 minutes) on the internal consistency of the test, the authors conclude that it may be preferable to develop a longer version of the screening test.
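
To give a concrete sense of the kind of analysis involved, the sketch below shows how the incremental predictive validity of such a screening test over background variables could be examined with a two-step regression. It is a minimal illustration only: the data file, variable names and model specification are assumptions for the sake of the example, not the authors' actual analysis.

```python
# A minimal sketch, not the authors' analysis: the data file, variable names and
# model specification below are assumptions made purely for illustration.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset with one row per incoming student.
df = pd.read_csv("incoming_students.csv")

# Step 1: demographic and educational background variables only.
background = (
    "credit_completion_rate ~ C(gender) + age + ses + C(nationality)"
    " + C(home_language) + C(educational_track) + high_school_score"
)
base = smf.ols(background, data=df).fit()

# Step 2: the same model plus the screening-test score.
full = smf.ols(background + " + screening_score", data=df).fit()

# The gain in R-squared indicates how much variance in credit completion the
# screening test explains over and above the background variables.
print(f"R2, background only:     {base.rsquared:.3f}")
print(f"R2, with screening test: {full.rsquared:.3f}")
print(f"Incremental R2:          {full.rsquared - base.rsquared:.3f}")
```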

In the same vein, Kuiken and Vedder report on a diagnostic writing test used in a Dutch university for L1 and L2 students (n = 5810). The main objective was to explore the relationship between written language proficiency and academic achievement, operationalised in terms of grade point average and study credits obtained. An additional in-depth analysis was conducted among 300 students in order to investigate the impact of the diagnostic writing test alongside other variables that may influence academic achievement (i.e. former training and time investment). The outcomes showed that 20% of the students failed the diagnostic writing test, including some who had obtained a (largely) sufficient exam mark for Dutch at high school. The second aim of the study was to explore the effects of a remedial writing programme set up for students who had failed the diagnostic test. Results on the diagnostic writing test and former training both appeared to be related to grade point average. Time investment, not surprisingly, was associated with the number of study credits and grade point average. An important finding was that remediation appeared to be successful, as 75% of the participants who had followed the remedial programme were able to convert a ‘fail’ on the writing test into a ‘pass’.

Secondly, three papers, by Hu and Trenkic, by Daller, Müller and Wang-Taylor, and by Clark and Yu, address the predictive validity of different tests in relation to academic achievement. Hu and Trenkic explore whether the test preparation industry undermines the English language proficiency qualifications of incoming students. The study investigates to what extent IELTS preparation programmes and repeated test taking may threaten the concurrent and predictive validity of IELTS. In the study, the English proficiency of 153 Chinese students in the UK was tested with the Duolingo English Test and a C-test. Prior to the study, all participants had met the language entry requirements by sitting an IELTS test. Students who had attended IELTS coaching programmes before taking IELTS, and to a lesser extent those who had repeated IELTS several times, scored lower on the two alternative tests than participants who met the entry requirements without such preparation courses. Despite this, the IELTS scores were predictive of academic success. The results of the study attested to the robustness of IELTS as a measure of the ability to study in English, and confirmed that well-developed language and literacy skills are critical for success in higher education. But they also showed that coaching, and to some extent repeated test taking, may boost IELTS scores without generalising to other proficiency measures. This suggests that many students arrive with qualifications indicating stronger English proficiency than they can actually demonstrate, and that more extensive measures therefore need to be in place to support their learning.

In their paper, Daller, Müller and Wang-Taylor discuss the construct validity of the C-test and present an overview of a number of studies in which the C-test has proven to have high predictive validity, not only for academic success but also for achievement in professional training in an L2. The rationale underlying their research is that a timed C-test, combining the assessment of processing speed (which is related to working memory), general language proficiency (particularly vocabulary knowledge), and in-depth conceptual knowledge (‘crystallised intelligence’), may be a good predictor of study success. The authors zoom in on the results of two studies that they conducted among 134 international students in the UK (study 1) and 89 first-year students of English language and culture (native or near-native speakers of English) in the Netherlands (study 2). The findings confirmed that students’ C-test scores correlate highly with the marks obtained at the end of the academic year (studies 1 and 2), and that this also holds across a variety of linguistic and literature topics and test formats (study 2).

The case study by Clark and Yu forms part of a larger research project focussing on IELTS test preparation in China and Japan. They explore the challenges that international Master’s students encounter when learning to write in academic English after having achieved the required IELTS entry score. Interviews with six Chinese and Japanese students at different stages of their Master’s studies showed that, although the participants were largely managing their writing assignments, they experienced recurrent difficulties with critical thinking, source-based writing and referencing. The study showed that although IELTS had provided an important first step in helping students develop basic writing skills, meeting university assessment standards necessitated considerable further progress. A possible modification to IELTS, requiring candidates to demonstrate certain essential academic writing skills, might include the introduction of an integrated reading and writing component. Critical thinking skills might then be developed at an earlier stage, and learners would be able to draw on skills they had already encountered rather than having to acquire them during an already challenging postgraduate course.

Thirdly, the paper by Szabo, Stickler and Adinolfi investigates a specific aspect of academic language and learning at university level: the relationship between vocabulary knowledge in the L2 and L3 and academic achievement, measured in terms of grade point average (GPA). The participants of the study were Hungarian (L1) and Romanian (L2) consecutive bilinguals, members of a linguistic minority living in Romania. All students were enrolled in a Bachelor’s programme in English (L3) language and literature. Receptive vocabulary tests were employed to examine the learners’ vocabulary sizes in both Romanian and English. The results showed significant correlations between vocabulary knowledge in the two tested languages and academic achievement. Regression analysis indicated that English vocabulary scores were the best predictor, explaining 30% of the variance. This suggests that in the case of typologically closer languages (Romanian and English), vocabulary tests are likely to measure the same underlying traits. Vocabulary size is thus an important explanatory factor in academic achievement, also in the case of multilinguals.
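
The kind of regression reported here can be illustrated with a short sketch. The data file and variable names below are assumptions introduced purely for illustration, not the authors' materials; the script simply shows how the proportion of variance in GPA explained by each vocabulary score might be computed.

```python
# A minimal sketch under assumed variable names (not the authors' script):
# regress GPA on each receptive vocabulary score and report variance explained.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vocab_gpa.csv")  # hypothetical file: one row per student

for predictor in ["romanian_vocab_size", "english_vocab_size"]:
    model = smf.ols(f"gpa ~ {predictor}", data=df).fit()
    # model.rsquared is the proportion of variance in GPA accounted for by the
    # predictor; a value of 0.30 corresponds to the 30% reported for English.
    print(f"{predictor}: R2 = {model.rsquared:.2f}")
```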

Perspectives

All six papers in this special issue contribute, each from a specific angle, to the theoretical discussion on academic language proficiency and academic achievement, and indicate avenues towards practical solutions. However, a number of issues need to be addressed in future research. The main focus of the studies presented here is on the relationship between academic language proficiency, as assessed by various screening tests and measures, and the academic achievement of mostly first-year Bachelor’s students or students on one-year Master’s degrees. A next step would be to examine the effects of academic language proficiency on academic success over longer periods of study. Another crucial issue for further investigation concerns the comparison of different screening tests for identifying students at risk, in terms of validity as well as efficiency, practicality and cost. Future research should furthermore establish how group-level differences relate to the individual learning trajectories of both L1 and L2 students. The interaction of language proficiency with other factors affecting academic success should be investigated too, for example personal traits (e.g. aptitude, attitude, motivation, self-confidence), educational background (e.g. learning trajectory, general mark at high school) and demographic factors (e.g. socio-economic status, home languages, L1 or L2 status, age, sex). A final issue is remediation and tutoring: exploring the short- and long-term effects of different remediation courses, academic writing support and working with a writing mentor or tutor.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Biber, D. 2006. University Language: A Corpus-based Study of Spoken and Written Registers. Amsterdam: John Benjamins. doi:https://doi.org/10.1075/scl.23.
  • Biber, D., and B. Gray. 2011. “The Historical Shift of Scientific Academic Prose in English. Towards Less Explicit Styles of Expression.” In Researching Specialized Languages, edited by V. Bhatia, P. Sánchez Hernández, and P. Pérez-Paredes, 11–24. Amsterdam: John Benjamins. doi:https://doi.org/10.1075/scl.47.04bib.
  • Davies, A. 2007. “Assessing Academic English Proficiency: 40+ Years of U.K. Language Tests.” In Language Testing Reconsidered, edited by J. Fox, M. Wesche, D. Bayliss, L. Cheng, C. E. Turner, and C. Doe, 73–88. Ottawa: University of Ottawa Press. doi:https://doi.org/10.2307/j.ctt1ckpccf.10.
  • Elder, C., C. Bright, and S. Bennett. 2007. “The Role of Language Proficiency in Academic Success: Perspectives from a New Zealand University.” Melbourne Papers in Language Testing 12: 24–58.
  • Ginther, A., and X. Yan. 2018. “Interpreting the Relationship Between TOEFL iBT Scores and GPA: Language Proficiency, Policy, and Profiles.” Language Testing 35: 271–295. doi:https://doi.org/10.1177/0265532217704010.
  • Hulstijn, J. H. 2015. Language Proficiency in Native and Non-Native Speakers: Theory and Research. Amsterdam: John Benjamins.
  • Iannelli, C., and J. Huang. 2014. “Trends in Participation and Attainment of Chinese Students in UK Higher Education.” Studies in Higher Education 39: 805–822.
  • Milton, J., and J. Treffers-Daller. 2013. “Vocabulary Size Revisited: The Link Between Vocabulary Size and Academic Achievement.” Applied Linguistics Review 4: 151–172. doi:https://doi.org/10.1515/applirev-2013-0007.
  • Morrison, J., B. Merrick, S. Higgs, and J. Le Métais. 2005. “Researching the Performance of International Students in the UK.” Studies in Higher Education 30: 327–337.
  • Paton, M. J. 2007. “Why International Students Are at Greater Risk of Failure.” International Journal of Diversity 6: 101–111.
  • Snow, C. E., and P. Uccelli. 2009. “The Challenge of Academic Language.” In The Cambridge Handbook of Literacy, edited by D. R. Olson and N. Torrance, 112–133. Cambridge: Cambridge University Press.
  • Trenkic, D., and M. Warmington. 2019. “Language and Literacy Skills of Home and International University Students: How Different are They, and Does it Matter?” Bilingualism: Language and Cognition 22: 349–365.
  • Uccelli, P., E. P. Galloway, C. D. Barr, A. Meneses, and C. L. Dobbs. 2015. “Beyond Vocabulary: Exploring Cross-Disciplinary Academic-Language Proficiency and its Association with Reading Comprehension.” Reading Research Quarterly 50: 337–356. doi:https://doi.org/10.1002/rrq.104.
  • UNESCO. 2019. “UIS Education Data Release: September 2019.” Information Paper No. 59, September 2019. UIS/2019/ED/IP/59.
  • Van Dijk, T. 2015. “ʻTried and Tested’: Academic Literacy Tests as Predictors of Academic Success.” Tijdschrift voor Taalbeheersing 37: 159–186. doi:https://doi.org/10.2117/tvt2015.2.vand.