Man or machine? Comparing the difficulty of human translation versus neural machine translation post-editing

Yanfang Jia & Sanjun Sun
Pages 950-968 | Received 04 Jan 2021, Accepted 19 Sep 2022, Published online: 24 Oct 2022

ABSTRACT

This study compares neural machine translation (NMT) post-editing and human translation in terms of task difficulty while taking into account source text (ST) complexity and machine translation (MT) quality, two factors that have rarely been examined in previous comparison studies. Data on perceived and objective task difficulty and on task performance were obtained from 60 trainee translators who completed post-editing and human translation tasks. It was found that (1) the difficulty of the NMT post-editing task, relative to human translation, was significantly influenced by both NMT quality and ST complexity: post-editing was significantly less difficult than human translation only when high-quality NMT output was paired with a complex ST, while the results were mixed for the other combinations of NMT quality and ST complexity; and (2) no strong correlations were found between the participants’ perceived difficulty and the measures of objective difficulty and task performance for either the post-editing or the human translation task. Practical and research implications are discussed.

Acknowledgments

The authors would like to thank the editor, the anonymous reviewers, and Dr. Su Steward for their comments and feedback on the earlier drafts of this article. Many thanks also go to Dr. Jiajun Qian for his constructive advice on the statistical analyses in this study.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 The STs and MT outputs used in this study can be found at https://github.com/translationdata/post-editing

3 This study is part of a larger project, where both fixation duration and fixation count were collected, but only fixation count was analysed for this study.

4 The R scripts used in this study can be found at https://github.com/translationdata/post-editing

5 Fixed and random effects for the eight linear mixed-effects (lme) models: STcomplexity * Tasktype + (1|participant)
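For illustration only, the following is a minimal sketch of how a model with the fixed and random effects in Note 5 might be fitted in R using the lme4/lmerTest packages; it is not the authors’ published script (see Note 4 for the actual R code), and the data frame, variable names, and toy data below are hypothetical.

# Toy data standing in for the real data set (hypothetical; for illustration only)
set.seed(1)
dat <- expand.grid(participant  = factor(1:10),
                   STcomplexity = c("simple", "complex"),
                   Tasktype     = c("HT", "PE"))
dat$fixation_count <- rpois(nrow(dat), lambda = 100)

library(lmerTest)  # extends lme4::lmer() with significance tests for fixed effects

# One linear mixed-effects model: ST complexity, task type, and their interaction
# as fixed effects, with a by-participant random intercept, as in Note 5.
m <- lmer(fixation_count ~ STcomplexity * Tasktype + (1 | participant), data = dat)

summary(m)  # fixed-effect estimates
anova(m)    # tests for the two main effects and the interaction

In the study, one such model would be fitted per dependent measure (hence "the eight lme models"), with the measure on the left-hand side of the formula replaced accordingly.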

Additional information

Funding

This work was supported by Scientific Research Fund of Hunan Provincial Education Department: [Grant Number 20B370]; Fundamental Research Funds for the Central Universities: [Grant Number 2020JS001]; Scientific Research Fund of the Research Institute of Languages and Cultures at Hunan Normal University: [Grant Number 2020QNP01].

Notes on contributors

Yanfang Jia

Dr. Yanfang Jia holds a PhD in Translation Studies from Hunan University. She is a lecturer in Translation Studies at Hunan Normal University, China. Her current research interests are related to the cognitive aspects of human–computer interaction, machine translation post-editing, computer-aided translation, and translation technology pedagogy.

Sanjun Sun

Sanjun Sun is Associate Professor in Translation Studies at Beijing Foreign Studies University. He holds a PhD in Translation Studies from Kent State University and is the editor of Fanyi Jie (Translation Horizons), a Chinese journal on translation studies. His research interests include cognitive translation studies, research methods, and translation technology. Email: [email protected]
