Mean machines? Sociotechnical (r)evolution and human labour in the translation and interpreting industry


1. Of translators and machines

This special issue originates from a panel held during the 2021 edition of the annual Science and Technology Studies Conference at the University of Technology in Graz, Austria. Entitled ‘Digitalization as a Transformational Force for Transcultural Communication,’ the panel was attended by numerous contributors to this issue. Why, then, do we now refer to mean machines in the main title? Little did we anticipate that just a few years later, we would find ourselves amidst an AI revolution. The advent of generative AI, showcased by ChatGPT’s debut in November 2022, took the translation industry and translation studies by storm. Advanced neural machine translation (NMT) and AI-driven translation are now on their way to becoming the major performers in interlingual written translation, with AI-driven speech translation expected to follow suit soon. Machines cannot be mean, of course, but technological advancement remains shrouded in opacity and secrecy, not least owing to political power play and capitalist competition.

Why sociotechnical (r)evolution? At the start of the project, the process of digitalization in all walks of life already was – and still is – in full swing, a process which can be described as evolutionary in the sense of the natural evolution of flora and fauna, precisely because of its networked complexity and the absence of a distinct primary driver steering its course. Against the backdrop of intensifying automation, datafication and platformization within almost all professional fields, and inspired by an accelerated post-Covid techno-scientific push, it is not far-fetched to identify a parallel drive towards machinization in the world of translation, a drive which is revolutionary in its far-reaching consequences for the corporate-controlled translation industry and those who need to sell their labour power in the global digital economy.

And why sociotechnical (r)evolution? In recent years, the humanities and social sciences have seen an explosion in research on the interface of technology, culture and society (e.g. Ramsay, 2011; Gunkel, 2012; Roberge & Castelle, 2021). As human-machine interaction continues to evolve, intelligent machines themselves are becoming increasingly autonomous. In this sociotechnical context, Yolgörmez (2021, pp. 144–145) proposes a ‘relational approach to the sociology of AI’, emphasizing the importance of an epistemology of interactive practices:

In order to start thinking about AI as an integral part of a social interaction, and not just a mechanical tool that is the extension of already established structures, it is appropriate to focus on the very dynamism that underlies the operations of these intelligences. What separates some genres of AI from other machinic entities and straightforward computational processes and makes it a potential sociological being, is its capacity for interaction, which in turn takes its force from uncertainty.

Among the most dynamic ‘genres of AI’ are the new self-learning translation machines which are progressively integrated into modern translation workflows. By now, translation memory systems are a well-known phenomenon of human-machine interactivity, whereas NMT systems have only really begun to find their way into professional translation practice.

This is one of the reasons why translation scholarship struggles to incorporate what David J. Gunkel (2012, p. 8) calls ‘the machinic other’ into its theoretical and conceptual repertoire. While the ‘translation machine’ has undoubtedly become a fixture in our landscape, it remains imperative for the academic health and intellectual vitality of our interdiscipline to overcome a certain anthropocentric bias. Doing so allows for a more dynamic and sophisticated discourse that accounts for the cultural, social, economic and ethical dimensions and interrelations of the ongoing techno-scientific revolution in intercultural and transcultural communication. This special issue, therefore, aims to take a first modest step towards a systematic epistemological and theoretical–conceptual incursion into three sociotechnical dimensions that impinge on the translation industry, the translator’s interactions with the ‘machinic other’ and the question of human labour. After providing a brief historical and disciplinary context for translation technologies, the discussion proceeds along three sociotechnical domains: practical labour relations, structural socioeconomic dynamics and what we could call underlying techno-philosophical premises. The latter is particularly essential for sustaining momentum in critical research on the interconnections among translation, technology and culture. John Steinbeck’s famous novella Of Mice and Men speaks of an inherent human drive for power and subjugation. One hopes, therefore, that the future story Of Translators and Machines does not (r)evolve along a similar trajectory.

2. The elephant in the room of translation studies

The impact of technology on translation has been profound and transformational. The technological revolution took off properly in the 1980s with the widespread availability of personal computers, while the adoption of PCs by translators coincided with the first expansion of the software industry and the gradual digitization of work processes. By the early 1990s, the integration of CAT-tools into translation workflows came into full swing, with translation memories and terminology management tools allowing translators to free themselves from the laborious tasks of scouring existing source and target texts for previously translated phrases and specialized terms (Sin-Wai, 2015). In ever shorter bursts of time, robust computing technologies and advances in software facilitated the extensive accumulation and management of translation data in the form of aligned translation units. As a significant technological innovation at the time, translation memory tools signify a pivotal moment in the translation profession. After an initially sceptical uptake by the translation community, CAT-tools have now matured into crucial components of a professional translator’s toolkit (Doherty, 2016).

The new digital gadgetry proved useful for offering terminological consistency and higher efficiency in translation work and time management, whilst enhancing collaborative work on large-scale projects. In large measure, it was also translation memory software which introduced cost-effectiveness for clients, as previously translated digital segments reduced the volume of new, billable text. Despite their drawbacks, such as the blurring of context caused by text segmentation or the depersonalization of the translation process, by now CAT-tools seem to have been embraced by most translators. The positive contemporary reception can also be attributed to the recognition that CAT-tools enhance translation quality while still preserving labour autonomy (Pietrzak & Kornacki, 2021). In view of today’s widespread application of translation memories, translation in the twenty-first century surely happens in the mode of human–computer interaction (HCI) (cf. O’Brien, 2012).

Meanwhile, up until the late 1990s, machine translation (MT) had only a minimal impact on the translation profession and remained far from becoming today’s widespread consumer product. Despite the initial excitement of the 1950s and 1960s, the development of fully automatic high-quality machine translation fell short of the originally high expectations (Hutchins & Somers, 1992). It was not until the shift from rule-based to corpus-based methods in the 1990s that MT gradually became applicable to real-life scenarios and the translation industry, setting the stage for further advancements in the new millennium (Hutchins, 2015). The early 2000s witnessed a significant leap forward with the advent of statistical machine translation (SMT), especially owing to the availability of a vast reservoir of bilingual text data on the internet. The algorithms of SMT systems steadily improved as they incorporated more and more data, thus significantly surpassing the limitations of rule-based systems (Qun & Xiaojun, 2015). With the exponential growth of online content, the advent of the collaborative internet and the introduction of free online translation services, MT systems became increasingly accessible to the general public. Despite such remarkable advancements, scholars long overlooked MT as a socially and culturally relevant subject of enquiry. Rozmysłowicz (2014) attributes this neglect to epistemological and theoretical inclinations within the discipline that focused on meta-concepts such as power and ideology, identity and culture, as well as agency and intentionality. In translation studies, the 1990s ‘cultural turn’ and the subsequent turn to sociological empiricism favoured human-centred and actor-oriented approaches that felt – and still feel – theoretically and conceptually challenged by MT in its functionality as a non-human actor.
Unlike the taken-for-granted indispensability and ethical necessity of human agency for the theory and practice of translation, MT continues to be regarded as an anti-concept for translation research. The current ‘actor-oriented paradigm’ prevents an integration of MT as a significant meta-concept into the predominant epistemological, theoretical and methodological orientations of the field (Rozmysłowicz, 2014, p. 148).

This overall lack of conceptual engagement with MT within translation studies resulted in a rather muted academic discourse on the impact and implications of automatic translation, leading to a gap in critical engagement. It is in fact not unreasonable to argue that the field may have been underprepared for the impending technological ‘tsunami’ bearing down on the translation profession and industry, with MT becoming increasingly sophisticated and integrated into professional workflows (Tieber, 2022). When NMT systems eventually entered the scene with even more significant quality improvements (cf. Bentivogli et al., 2016), it became almost impossible to ignore the proverbial elephant in the room. It is against this (r)evolutionary backdrop that we are now trying to come to grips with the massive transformative potential of NMT and its AI-driven offshoots, both as technological artefacts and as socially relevant actors in their own right. The combined technology-induced transformations in the translation industry and beyond have prompted scholars to rediscover and reconceptualize both MT and translation technology in general. Not only has attention towards translation technology intensified, but there has also been a notable expansion and specialization of the sociotechnological spectrum inside our interdiscipline.

3. Translation technology in society: a three-pronged roadmap

This special issue reflects the diverse strands and perspectives within translation studies that examine the professional, industrial and socioeconomic as well as paradigmatic implications of technology in translation. This section therefore organizes and categorizes existing research by laying out a broadly conceived three-pronged roadmap for a still nascent sociotechnical approach towards translation technology. First, we explore hands-on implementations of translation technology, examining its direct effects on attitudes, translation processes and workflows. Second, we focus on the broader implications of translation automation, delving into its transformative effects on the socioeconomic landscape of the profession and industry. Third, we discuss research dedicated to examining the paradigms, theories, concepts and methodologies required for understanding the digital revolution within translation theory and practice. In the following, and in close connection with the contributions to this special issue, each of the three sociotechnical research dimensions is examined in more detail.

3.1. Technological impact on labour

Translation studies has been engaging in practical terms with translation technology by focusing on the integration and impact of digital artefacts in translation processes and workflows. This involves analysing how translators interact with technology, the effectiveness of MT tools in aiding translators, and the challenges for professional workflows. More than ten years ago, O’Brien (2012, p. 101) characterized translation in terms of the evolving relationship between translators and technology as ‘translator-computer interaction’ (TCI), including the use of translation memory tools, terminology databanks, MT systems, translation management programmes and various types of software. Technology has profoundly transformed translation in terms of quality standards, speed and cost, while translation technologies keep reshaping the cognitive and ergonomic aspects of translating. Significant challenges for TCI remain: a perceived dehumanization and devaluation of translation work, a perceived loss of creativity and a lack of usability of some translation tools (O’Brien, 2012, p. 118ff.). A key focus within this sociotechnical dimension is machine translation post-editing (MTPE), which has been explored from multiple angles. Translation studies has delved into MTPE by evaluating its efficiency and quality in comparison to traditional human translation (Läubli et al., 2019; Terribile, 2023), investigating its cognitive impact on translators in terms of usability (Moorkens & O’Brien, 2017), and considering the viewpoints and attitudes of various stakeholders such as clients, project managers and translators (Sakamoto & Yamada, 2020). A significant issue for MTPE is the hesitancy or outright resistance among translators to undertake MTPE tasks. This reluctance is primarily attributed to lower remuneration and the perceived relegation of the translator’s role to the mere correction of MT output (Guerberof-Arenas, 2013; Cadwell et al., 2018). Additional challenges for translation processes and workflows include the changing skill sets of translators in the digital age (Pym, 2013; Mellinger, 2017), the corresponding adaptations in translation curricula (Kenny, 2019; Besznyák et al., 2020), and the emergence of crowdsourcing and online collaborative translation (OCT) (Jiménez-Crespo, 2017). Notably, the latter also carries broader consequences beyond the scope of processes and workflows, as OCT significantly influences the public perception of translation work, inducing further transformations in professional roles which become manifest in the growing involvement of non-professional translators in the field (Zwischenberger, 2022).

The first two papers in this special issue focus on this practical human element in translation technology, exploring the intricate relationship between translation, humans and technological artefacts. They probe the social and cognitive effects of technology on translation practices and challenge deterministic perspectives on technology. Both papers advocate more nuanced, human-centred and socially conscious approaches to understanding and integrating technology in translation. In her contribution Human-Centered Augmented Translation: Against Antagonistic Dualisms, Sharon O’Brien explores various perspectives on and definitions of augmentation in relation to past, present and possible future scenarios concerning its relevance for and implementation in translation practices and processes. By highlighting the concept of ‘antagonistic dualism’, which frames the relationship between humans and machines as an adversarial ‘us versus them’ scenario, O’Brien imagines an alternative perspective for our interaction with machines. She suggests perceiving machines as extensions of human capabilities, not as adversaries. Leaning on Shneiderman’s (2020) notion of human-centered artificial intelligence (HCAI), O’Brien characterizes augmented translation as an empowering tool for translators, envisioning a symbiotic collaboration between humans and machines, where the machine may enhance and augment human skills without supplanting them. This approach goes beyond the mere pursuit of enhancing translators’ productivity and efficiency, as it encompasses the utilization of AI and sensor technologies to bolster human cognitive capabilities while ensuring that translators retain authority over the translation process.

In the second paper of this special issue, Paola Ruffo uses the social construction of technology (SCOT) framework (Pinch & Bijker, 1984) to analyse literary translators’ perceptions of and interactions with technology. Her paper Exploring Literary Translators’ Relationship with Technology: SCOT as a Proactive and Flexible Approach argues that literary translators, a group often marginalized in technology development, are faced with tools primarily designed for boosting speed and efficiency, tools which entirely neglect the creativity and intuition required for literary translation. By means of a questionnaire involving 150 respondents, she reveals a connection between literary translators’ self-perception and their relationship with technology. While a substantial portion of the respondents indicate a positive attitude towards technology, there is notable scepticism towards digital translation tools, as these rarely align with the specific needs of literary translators. Ruffo therefore highlights the necessity for greater involvement on the part of literary translators in the development of translation technology to ensure that these tools adequately reflect and support the professional identities, specific requirements and especially the practical needs of this group of MT users.

3.2. Socioeconomic dimensions

The increasing digitization of translation has a significant socioeconomic impact on the translation market and its practitioners. Scholars have recognized this trend and begun to explore the sociotechnical dimensions of translation in the global digital economy. Not least owing to a growing trend towards the platformization of translation services, the industry is under pressure from falling prices and globalized competition. In the face of increasing automation and the continuous accumulation of vast translation data by large corporate players, the sustainability and livelihood of the translation profession has also been questioned by leading thinkers in the translation industry (e.g. van der Meer, 2021). In the light of this narrative, do Carmo (2020) explored the relationship between time, money and the value of professional translation. He critically examined the translation industry’s narratives about MTPE by contrasting their framings with the reality of translation work and the – often neglected – complexity and value of translation as an expert activity. In the context of the translation industry and market, Nunes Vieira (2020) highlighted translators’ perspectives on the impact of technology, employing the term ‘automation anxiety’ to show how MT affects their job security and pay. An analysis of translator blogs revealed various nuanced attitudes towards technological change, which led him to the conclusion that ‘most criticism of MT concerned primarily not a fear of being outperformed by MT systems or an intrinsic aversion to the technology, but rather MT’s current limitations and some of the business practices that surround its use’ (Nunes Vieira, 2020, p. 16). The organizational side of translation labour – the integration of Taylorist principles such as task segmentation and efficiency maximization on digital platforms within the translation industry – has also come under increased scrutiny (Moorkens, 2020). This trend towards monitoring and optimizing every aspect of the translation process offers potential benefits in terms of speed and cost for clients and agencies, but it also raises concerns about reduced job satisfaction, limited work autonomy and a diminished sense of professional fulfilment for translators.

This special issue features four contributions that specifically focus on the ways in which technological advancements, gig economy models and algorithmic processing are affecting the working conditions, pricing structures, and professional identity of translators and interpreters. In particular, digital translation platforms have significantly transformed translation work through the introduction of new business models and labour practices. In Translators in the Platform Economy: A Decent Work Perspective, Gökhan Fırat, Joanna Gough and Joss Moorkens assess the working conditions of Turkish translators engaged on translation platforms. The authors conducted a quantitative survey involving 48 translation workers based in Turkey, aiming to obtain deeper insights into the commercial and socioeconomic underbelly of translation platforms. The questionnaire was designed in alignment with six fundamental decent work standards as established by the International Labour Organization (ILO). The findings suggest that while digital platforms offer certain advantages, they pose significant challenges to the decent work conditions of translators, especially in terms of work-life balance, decision-making influence, collective representation, and financial security. The authors stress the need for a deeper understanding of the diverse conditions and challenges faced by translators in different regions and under varying business models, with the goal of developing more equitable and sustainable practices in the translation industry.

The platform economy not only affects labour conditions, power dynamics and worker autonomy in translation but also significantly reshapes the field of interpreting. In her paper ‘You Can Book an Interpreter the same Way you Order your Uber’: (Re)interpreting Work and Digital Labour Platforms, Deborah Giustini delves into the effects of the gig economy and of platformization on interpreting services. Utilizing the framework of labour process theory, a Marxist sociological perspective, she investigates the power dynamics between clients, platforms and interpreting workers in the interpreting industry. Giustini conducted a case study encompassing three platforms based in Northwestern Europe that provide interpreting services, including an analysis of the platforms’ key operational aspects and interviews with managers and interpreters. The findings highlight the complexities and challenges faced by interpreters in the gig economy, particularly in terms of platform control, income diversification, and the impact of digital presence and reputation scores on work opportunities. They also underline the urgency of a deeper comprehension of unequal structures and unethical practices within the gig economy for translation and interpreting, as well as the urgent need to begin challenging the unsatisfactory current situation.

Akiko Sakamoto and Sarah Bawa Mason’s investigation of pricing practices provides further evidence of oligarchic relations in the language industry at large. Their study In Search of a Fair MTPE Pricing Model: LSPs’ Reflections and the Implications for Translators is grounded empirically in a focus group with representatives from eight multinational, yet small and medium-sized, language service providers. Theoretically, the study intertwines practice theory with organisational psychology, facilitating an analysis of the industry’s three major pricing models – hourly, per-word and edit-distance rating. This makes their findings compelling for follow-up studies and future deliberation by industry stakeholders. The fact that all eight focus group members were interested in participating mainly because ‘they wanted to know what was happening in the industry’ illustrates not only the opacity of finance discourse but also the undemocratic and thus illegitimate power wielded by the largest market players. Corporate industry leaders shape and dictate industrial standards, professional frameworks, work processes and pricing models, and ultimately the livelihoods of translation workers worldwide.
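For readers unfamiliar with the third of these pricing models, edit-distance rating can be made concrete with a small sketch. The following Python example is purely illustrative and not drawn from Sakamoto and Bawa Mason’s study: a hypothetical full per-word rate is scaled by the normalized Levenshtein distance between the raw MT output and the post-edited text, so segments that required little correction are billed at a fraction of the full rate. The function names, rates and scaling rule are all assumptions for the sake of illustration.

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits (insert, delete,
    substitute) required to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def mtpe_rate(mt_output: str, post_edited: str, full_rate: float) -> float:
    """Scale a hypothetical full rate by the share of the segment the
    post-editor actually changed (character-level edit ratio)."""
    if not post_edited:
        return 0.0
    edit_ratio = levenshtein(mt_output, post_edited) / max(
        len(mt_output), len(post_edited))
    return round(full_rate * min(1.0, edit_ratio), 4)
```

A production model would normally count word-level edits and apply minimum-payment floors; this character-level version merely illustrates the principle that remuneration is tied to a machine-computed distance measure rather than to the translator’s actual effort.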

Joss Moorkens’ paper ‘I am not a number’: On Quantification and Algorithmic Norms in Translation critically confronts the dehumanising effects of today’s digitally automated – and thus largely opaque – translation workflows. Moorkens’ contribution offers a refreshing critical look at the ethics of digital translation from the viewpoints of mathematics and critical philosophy. Particularly noteworthy is the author’s engagement with Toury’s and Chesterman’s norm theories in the context of today’s technologized translation workplace. This is a physical and virtual space that demands the translator’s subjection to the dictates – and thus norms – of new regimes of ‘algorithmic management’. Given that the translation industry depends heavily on quantified and quantifiable production and output measures, it has become ‘colonised’ by a spirit of instrumental thinking (cf. Dizdar, 2014), in particular on the part of industrial decision-makers. Meanwhile, the rise of opaque performance evaluations has forced translators on platforms to adhere to novel standards of correctness – ‘algorithmic norms’ – eroding their agency and work autonomy and thus undermining their social and economic value as transcultural professionals.

3.3. Techno-philosophical premises

A sociotechnical approach towards translation technology and MT remains to be integrated into broader onto-epistemological and axiological discourses within the field of translation studies. This lag has been ascribed to the way in which translation automation contests the traditionally human-centric epistemologies underpinning the field (cf. Rozmysłowicz, 2014, 2019). A pioneering figure in theorizing the digitization of translation is Michael Cronin (2012), who examines the profound sociocultural and professional impact of digital technology and the internet on the global translation landscape. He characterizes the ongoing techno-industrial shift as having significant and far-reaching consequences for human languages, cultures and societies, underscoring the urgency of new perspectives and dialogues that recognize the dramatic changes of the digital era. Cronin (2019) also stresses the importance of considering the environmental ramifications of extractivist industries and their impact on the language service industry, particularly how resource-intensive translation technologies such as MT ultimately contribute to climate change through their extensive consumption of power. He urges us to examine critically which translation technologies and practices can be justified in light of their environmental impact.

O’Thomas (2017) also takes a culture-sensitive approach to translation technology, placing the evolving role of human translators and the concept of translation in the context of technological advancement. By engaging with the emerging metaphysical paradigm of posthumanism and its ideological offshoot transhumanism, he argues for a still pending redefinition both of the translator’s role and of what we actually mean by translation in the digital age. A need for theoretical and conceptual innovation arises in the wake of continuously transforming roles, processes and MT technologies, and in view of the unstoppable rise of AI. As the lines between human and machine translation increasingly blur, the role of human translators may dramatically change or even diminish. This opens up the potential for a ‘posthuman translation theory’, given that ‘the survival of translation studies will be contingent on the survival of translation itself and its ability to question its own subjective posthuman self’ (O’Thomas, 2017, p. 284).

To navigate these existential transformations induced by AI and technology effectively, we need a broadening of epistemological perspectives and theoretical frameworks in connection with an evolution of methods (cf. McDonough Dolmaya, 2024). In this context, Olohan (2017, p. 280) contends that ‘translation studies can expand its repertoire of applicable social theories to account for the hitherto rather neglected technological and material dimensions’. To this end, Olohan suggests incorporating concepts from science and technology studies (STS) into translation studies, which would contest deterministic views on technology and supplant them with a clearer perspective on the social and ideological dimensions of translation technology. By pairing culture-sensitive epistemologies with a non-deterministic perspective on technology, scholars can more convincingly articulate the numerous ways in which translation technologies intersect with power dynamics and ideological values. These technologies, with their capacity to reinforce unquestioned hegemonic norms of thought and behaviour, are imbricated with underlying belief systems. The remaining two papers in this special issue take such a broad epistemological outlook by engaging with new paradigmatic and ideological challenges faced by the translation studies community and by the translation industry at large.

Tomasz Rozmysłowicz’s contribution The Politics of Machine Translation. Reprogramming Translation Studies highlights some of the many theoretical and conceptual gridlocks that our interdiscipline encounters in the face of ongoing sociotechnical transformation. Leaning on Gaston Bachelard’s notion of the ‘epistemological obstacle’, Rozmysłowicz presents a compelling argument about the current state of scholarship on translation, machines and humans. On the one hand, a human-centric bias in translation research remains an ‘obstacle’ to the inception of a sophisticated sociotechnical discourse. On the other, a persistent ontological – and thus instinctive – temptation to differentiate clearly between ‘us’ humans and ‘them’ machines remains part of what the author refers to as the ‘anthropolitics’ of translation studies. To overcome this epistemological hurdle, Rozmysłowicz argues for a ‘translation-sociological attitude’ along empirical lines of enquiry which might help to denaturalize our tenacious – frequently subconscious – adherence to the human-machine dichotomy. Such an empirically inspired manoeuvre may also foster greater self-reflexivity and critical engagement with the field’s ideological entanglements in times of translation automation, multilingual artificial intelligence and digital hyperconnectivity.

The concluding piece by Stefan Baumgarten and Carole Bourgadel, entitled Digitalisation, neo-Taylorism and Translation in the 2020s, offers a critical-philosophical examination of the global translation industry and its treatment of the translation workforce. Seen from a broad critical perspective, language service providers are subject to the neoliberal system of free market capitalism, with all its accompanying pressures to compete successfully in a digitally interconnected translation industry destined to generate profits and shareholder value. While the owners of translation production resources compete for market influence and capital – increasingly through the aggregation of ‘big translation data’ – the majority of translation workers are left to compete for largely underpaid translation commissions on a global market. Viewed through a philosophical lens, the digital economy of translation favours production values such as speed, efficiency, quantification and cost savings in its pursuit of commercial profit. It remains doubtful, however, whether the industry’s relentless drive towards ever-enhanced productivity and translation perfection – characterized by its unyielding ‘technological progressivism’ – is the right path towards the utopia of ethically sustainable work environments on a global scale.

4. Summary and outlook

The eight articles compiled in this special issue centre on a common theme: the social and cultural ramifications of translation technologies. There is a growing body of research on labour relations, economic implications, and the philosophical premises necessary for critically informed research on translation technologies. It is our hope that the research presented here will serve as a springboard for critical follow-up enquiries into the practical, structural and epistemological dimensions of translation work in the global digital economy. First, from a practical labour point of view, significant scholarly, organisational and institutional efforts are needed to level the playing field – to evoke Bourdieusian field theory – for the majority of salaried translation workers. Second, from a structural sociological perspective, today’s globalized system of production lacks a clearly defined locus of power, not least given its decentred and digitally interconnected nodes of influence and interest; the power of this networked system resides less with individuals and groups than with its machines. Third, and for these reasons, any future research on the sociotechnical dimensions of translation technology needs a robust onto-epistemological foundation aligned with an ethically sound axiology.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes on contributors

Michael Tieber

Michael Tieber holds a PhD in translation studies and is a postdoctoral researcher at the University of Graz, Austria. His research focuses on translation in the context of globalization and digitalization. In his doctoral thesis, he investigated the concept of translation in computational linguistics. His current project examines divergent perceptions and framings of machine translation across different fields and groups.

Stefan Baumgarten

Stefan Baumgarten is head of the Department of Translation Studies at the University of Graz, where he also leads the research area ‘Translation, Ethics and Digital Transformation’. His research centres on the social impact of translation technologies, (critical) translation theories, and the role of translation as an ideological practice. He is co-editor (with Michael Tieber) of the forthcoming Routledge Handbook of Translation Technology and Society.

References

  • Bentivogli, L., Bisazza, A., Cettolo, M., & Federico, M. (2016). Neural versus phrase-based machine translation quality: A case study. Paper presented at the Conference on Empirical Methods in Natural Language Processing (EMNLP), November 1–5, 2016, Austin, Texas, USA. https://arxiv.org/pdf/1608.04631.pdf.
  • Besznyák, R., Fischer, M., Szabó, C., & Arevalillo, J. J. (2020). Fit-for-market translator and interpreter training in a digital age. Vernon Press.
  • Cadwell, P., O’Brien, S., & Teixeira, C. (2018). Resistance and accommodation: Factors for the (non-) adoption of machine translation among professional translators. Perspectives, 26(3), 301–321. https://doi.org/10.1080/0907676X.2017.1337210
  • Cronin, M. (2012). The translation age: Translation, technology and the new instrumentalism. In L. Venuti (Ed.), The translation studies reader (pp. 469–483). Routledge.
  • Cronin, M. (2019). Translation, technology and climate change. In M. O’Hagan (Ed.), The Routledge handbook of translation and technology (pp. 516–530). Routledge.
  • Dizdar, D. (2014). Instrumental thinking in translation studies. Target. International Journal of Translation Studies, 26(2), 206–223. https://doi.org/10.1075/target.26.2.03diz
  • do Carmo, F. (2020). ‘Time is money’ and the value of translation. Translation Spaces, 9(1), 35–57. https://doi.org/10.1075/ts.00020.car
  • Doherty, S. (2016). The impact of translation technologies on the process and product of translation. International Journal of Communication, 10, 947–969. https://ijoc.org/index.php/ijoc/article/viewFile/3499/1573.
  • Guerberof-Arenas, A. (2013). What do professional translators think about post-editing? The Journal of Specialised Translation, 19, 75–95. https://aclanthology.org/www.mt-archive.info/10/JOST-2013-Guerberof.pdf.
  • Gunkel, D. J. (2012). The machine question – Critical perspectives on AI, robots, and ethics. MIT Press.
  • Hutchins, W. J. (2015). Machine translation: History of research and applications. In C. Sin-Wai (Ed.), The Routledge encyclopedia of translation technology (pp. 120–136). Routledge.
  • Hutchins, W. J., & Somers, H. L. (1992). An introduction to machine translation. Academic Press.
  • Jiménez-Crespo, M. A. (2017). Crowdsourcing and online collaborative translations. Expanding the limits of translation studies. John Benjamins.
  • Kenny, D. (2019). Technology and translator training. In M. O’Hagan (Ed.), The Routledge handbook of translation and technology (pp. 498–515). Routledge.
  • Läubli, S., Amrhein, C., Düggelin, P., Gonzalez, B., Zwahlen, A., & Volk, M. (2019). Post-editing productivity with neural machine translation: An empirical assessment of speed and quality in the banking and finance domain. In Proceedings of Machine Translation Summit XVII Volume 1: Research Track (pp. 267–272). August 19–23, 2019, Dublin, Ireland. European Association for Machine Translation. https://www.aclweb.org/anthology/W19-6626
  • McDonough Dolmaya, J. (2024). Digital research methods for translation studies. Routledge.
  • Mellinger, C. (2017). Translators and machine translation: Knowledge and skills gaps in translator pedagogy. The Interpreter and Translator Trainer, 11(4), 280–293. https://doi.org/10.1080/1750399X.2017.1359760
  • Moorkens, J. (2020). ‘A tiny cog in a large machine’: Digital Taylorism in the translation industry. Translation Spaces, 9(1), 12–34. https://doi.org/10.1075/ts.00019.moo
  • Moorkens, J., & O’Brien, S. (2017). Assessing user interface needs of post-editors of machine translation. In D. Kenny (Ed.), Human issues in translation technology (pp. 109–130). Routledge.
  • Nunes Vieira, L. (2020). Automation anxiety and translators. Translation Studies, 13(1), 1–21. https://doi.org/10.1080/14781700.2018.1543613
  • O’Brien, S. (2012). Translation as human-computer interaction. Translation Spaces, 1, 101–122. https://doi.org/10.1075/ts.1.05obr
  • Olohan, M. (2017). Technology, translation and society: A constructivist, critical theory approach. Target. International Journal of Translation Studies, 29(2), 264–283. https://doi.org/10.1075/target.29.2.04olo
  • O’Thomas, M. (2017). Humanum ex machina: Translation in the post-global, posthuman world. Target. International Journal of Translation Studies, 29(2), 284–300. https://doi.org/10.1075/target.29.2.05oth
  • Pietrzak, P., & Kornacki, M. (2021). Using CAT tools in freelance translation. Insights from a case study. Routledge.
  • Pinch, T. J., & Bijker, W. E. (1984). The social construction of facts and artefacts: Or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science, 14(3), 399–441. https://doi.org/10.1177/030631284014003004
  • Pym, A. (2013). Translation skill-sets in a machine-translation age. Meta, 58(3), 487–503. https://doi.org/10.7202/1025047ar
  • Qun, L., & Xiaojun, Z. (2015). Machine translation: General. In C. Sin-wai (Ed.), Routledge encyclopedia of translation technology (pp. 105–119). Routledge.
  • Ramsay, S. (2011). Reading machines – toward an algorithmic criticism. University of Illinois Press.
  • Roberge, J., & Castelle, M. (Eds.). (2021). The cultural life of machine learning – An incursion into critical AI studies. Palgrave Macmillan.
  • Rozmysłowicz, T. (2014). Machine translation: A problem for translation theory. New Voices in Translation Studies, 11, 145–163. https://newvoices.arts.chula.ac.th/index.php/en/article/view/257.
  • Rozmysłowicz, T. (2019). Die Geschichtlichkeit der Translation(swissenschaft). Zur paradigmatischen Relevanz der maschinellen Übersetzung. Chronotopos, 2(1), 17–41. https://doi.org/10.25365/cts-2019-1-2-3
  • Sakamoto, A., & Yamada, M. (2020). Fair MT. Translation Spaces, 9(1), 78–97. https://doi.org/10.1075/ts.00022.sak
  • Shneiderman, B. (2020). Human-centered artificial intelligence: Three fresh ideas. AIS Transactions on Human-Computer Interaction, 12(3), 109–124. https://doi.org/10.17705/1thci.00131
  • Sin-Wai, C. (2015). The development of translation technology. In C. Sin-Wai (Ed.), The Routledge encyclopedia of translation technology (pp. 3–31). Routledge.
  • Terribile, S. (2023). Is post-editing really faster than human translation? Translation Spaces. https://doi.org/10.1075/ts.22044.ter
  • Tieber, M. (2022). Investigating translation concepts in machine translation: A case for translaboration. In C. Zwischenberger, & A. Alfer (Eds.), Translaboration in analogue and digital practice – Labour, power, ethics (pp. 109–134). Frank & Timme.
  • van der Meer, J. (2021). Translation economics of the 2020s: A journey into the future of the translation industry in eight episodes. Multilingual Magazine, July/August 2021. https://multilingual.com/issues/july-august-2021/translation-economics-of-the-2020s/.
  • Yolgörmez, C. (2021). Machinic encounters: A relational approach to the sociology of AI. In J. Roberge, & M. Castelle (Eds.), The cultural life of machine learning – An incursion into critical AI studies (pp. 143–166). Palgrave Macmillan.
  • Zwischenberger, C. (2022). Online collaborative translation: Its ethical, social, and conceptual conditions and consequences. Perspectives, 30(1), 1–18. https://doi.org/10.1080/0907676X.2021.1872662
