
Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search

Pages 781-795 | Received 27 Sep 2014, Accepted 15 Sep 2015, Published online: 20 Nov 2015

Abstract

In joint tasks, adjusting to the actions of others is critical for success. For joint visual search tasks, research has shown that when search partners receive visual information about each other’s gaze, they use this information to adjust to each other’s actions, resulting in faster search performance. The present study used a visual, a tactile and an auditory display, respectively, to provide search partners with information about each other’s gaze. Results showed that search partners performed faster when the gaze information was received via a tactile or auditory display than when it was received via a visual display or when no gaze information was available. These findings demonstrate the effectiveness of tactile and auditory displays for receiving task-relevant information in joint tasks and are applicable to circumstances in which little or no visual information is available, or in which the visual modality is already taxed with a demanding task such as air-traffic control.

Practitioner Summary: The present study demonstrates that tactile and auditory displays are effective for receiving information about the actions of others in joint tasks. The findings are applicable to circumstances in which little or no visual information is available or in which the visual modality is already taxed with a demanding task.

Acknowledgements

We want to thank Alexandra Bidler, Anna-Lena Lumma, Aleksey Lytochkin, Vikash Murthy Peesapati, Ricardo Ramos Gameiro and Niklas Wilming for their help in designing and conducting this study. In addition, we want to thank Laura Schmitz for her valuable feedback on the manuscript.

Notes

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Cognition and Neuroergonomics/Collaborative Technology Alliance under [grant number W911NF-10-2-0022]; by ERC-2010-AdG [grant number 269716, ‘MULTISENSE’]; and by H2020 [grant number H2020-FETPROACT-2014 641321, ‘socSMCs’].
