Research Article

Machine Discriminating: Automated Speech Recognition Biases in Refugee Interviews

Published online: 21 May 2024
Abstract

This study scrutinizes Automated Speech Recognition (ASR) software, a powerful instrument for expediting the translation process, and its unintended bias against Arabic speakers, particularly refugees. We propose that pre-existing biases in ASR training data reflect societal prejudices, leading to orientalist and Islamophobic misrepresentations. We used four ASR tools to transcribe interviews with Arabic-speaking refugee women, employing ideological textual analysis to detect biases. Our findings indicate that ASR algorithms may inadvertently associate Arabic speakers with conflict, war, and religion, and reduce Arab identities to Islam-centric representations. Acknowledging these biases is essential for fostering a more equitable and culturally sensitive technological environment.

Disclosure statement

No potential conflict of interest was reported by the author(s).

