Research Articles

Perceived naturalness of emotional voice morphs

Christine Nussbaum, Manuel Pöhlmann, Helene Kreysa & Stefan R. Schweinberger
Pages 731-747 | Received 22 Jun 2022, Accepted 05 Apr 2023, Published online: 27 Apr 2023

ABSTRACT

Research into voice perception benefits from manipulation software to gain experimental control over acoustic expression of social signals such as vocal emotions. Today, parameter-specific voice morphing allows precise control of the emotional quality expressed by single vocal parameters, such as fundamental frequency (F0) and timbre. However, potential side effects, in particular reduced naturalness, could limit the ecological validity of speech stimuli. To address this for the domain of emotion perception, we collected ratings of perceived naturalness and emotionality on voice morphs expressing different emotions either through F0 or timbre only. In two experiments, we compared two different morphing approaches, using either neutral voices or emotional averages as emotionally non-informative reference stimuli. As expected, parameter-specific voice morphing reduced perceived naturalness. However, the perceived naturalness of F0 and timbre morphs was comparable when averaged emotions served as the reference, potentially making this approach more suitable for future research. Crucially, there was no relationship between ratings of emotionality and naturalness, suggesting that the perception of emotion was not substantially affected by a reduction of voice naturalness. We hold that while these findings advocate parameter-specific voice morphing as a suitable tool for research on vocal emotion perception, great care should be taken in producing ecologically valid stimuli.
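The core idea behind parameter-specific morphing, as described in the abstract, is to interpolate a single acoustic parameter between an emotionally non-informative reference and an emotional utterance while leaving other parameters at the reference. The paper itself uses dedicated morphing software; the sketch below is only a conceptual illustration of the F0 case, assuming time-aligned F0 contours and linear interpolation on a logarithmic (perceptual) pitch scale. All function and variable names here are hypothetical, not from the study.

```python
import numpy as np

def morph_f0(f0_reference, f0_emotional, weight):
    """Interpolate two time-aligned F0 contours (Hz), frame by frame.

    weight = 0.0 reproduces the reference contour, 1.0 the emotional one.
    Interpolation is done on a log2 scale because pitch is perceived
    roughly logarithmically; the result is converted back to Hz.
    """
    log_ref = np.log2(f0_reference)
    log_emo = np.log2(f0_emotional)
    return 2.0 ** ((1.0 - weight) * log_ref + weight * log_emo)

# Toy contours: a flat "neutral" reference vs. a rising contour
# (purely illustrative values, not stimuli from the study)
neutral = np.full(5, 200.0)                               # 200 Hz throughout
emotional = np.array([200.0, 220.0, 240.0, 260.0, 280.0])

half_morph = morph_f0(neutral, emotional, 0.5)
```

On the log scale, a 50% morph lands at the geometric mean of the two contours in each frame, which matches how listeners judge pitch intervals better than a plain average in Hz would.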

Acknowledgements

The original voice recordings that served as a basis for creating our stimulus material were provided by Sascha Frühholz. We are grateful to all participants of the study. We thank Laura Luther, Sven Kachel, and Annett Schirmer for helpful contributions to this manuscript, and Andrea E. Kowallik for contributing to the validation of voice stimuli used in this paper.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability

Supplemental figures and tables, analysis scripts, and preprocessed data can be found on the associated OSF repository (https://osf.io/jzn63/).

Credit author statement

Christine Nussbaum – Conceptualisation, Methodology, Software, Visualisation, Formal analysis, Writing – Original Draft, Supervision.

Manuel Pöhlmann – Conceptualisation, Methodology, Formal analysis, Visualisation, Writing – Review & Editing.

Helene Kreysa – Methodology, Supervision, Writing – Review & Editing.

Stefan R. Schweinberger – Conceptualisation, Writing – Review & Editing, Supervision.

Notes

1 Note that the number of participants is slightly unequal for each pseudoword (16/18/17). We therefore ran a second analysis in which we randomly excluded three participants to equalise group sizes; this yielded an identical pattern of effects.

Additional information

Funding

CN has been supported by the German National Academic Foundation (“Studienstiftung des Deutschen Volkes”). Experiment 1 was conducted by MP in partial fulfilment of a Bachelor degree. Experiment 2 was conducted by Laura Luther in partial fulfilment of a Bachelor degree.
