Externalizing the Private Experience of Pain: A Role for Co-Speech Gestures in Pain Communication?

Pages 70-80 | Published online: 31 Jan 2014
 

Abstract

Despite the importance of effective pain communication, talking about pain represents a major challenge for patients and clinicians because pain is a private and subjective experience. Focusing primarily on acute pain, this article considers the limitations of current methods of obtaining information about the sensory characteristics of pain and suggests that spontaneously produced “co-speech hand gestures” may constitute an important source of information here. Although this is a relatively new area of research, we present recent empirical evidence that reveals that co-speech gestures contain important information about pain that can both add to and clarify speech. Following this, we discuss how these findings might eventually lead to a greater understanding of the sensory characteristics of pain, and to improvements in treatment and support for pain sufferers. We hope that this article will stimulate further research and discussion of this previously overlooked dimension of pain communication.

ACKNOWLEDGMENTS

We thank the three anonymous reviewers for their comments on an earlier draft of this article.

Notes

1 Other studies have considered whether the form of the gesture (e.g., a point to the chest versus a palm laid flat on the chest) can be used to distinguish between myocardial infarction (MI) and non-MI chest pain within an emergency setting (Albarran et al., 2000; Marcus et al., 2007). Although patients did use their hands to indicate the location of the pain, the form of the gesture was not a reliable indicator of outcome, suggesting that these gestures are not useful for diagnosis in this context. However, the researchers did not consider whether gestures can tell us anything about the sensory experience of pain, and as patients were asked to “show” their pain, the focus was not on the spontaneous co-speech gestures that we consider in this article.

2 Of course, in this example the speech also contains information that is not depicted in gestures, e.g., about the type of pain (“headache”) and to some degree the intensity (“really bad”), but the current argument is concerned with what gestures can add to speech. It is accepted that speech will also contain information that is not in gesture, hence our suggestion that both modalities should be attended to as they interact in the representation of meaning.

