Research Articles

Keep It Brief: Videoconferencing Frequency and Duration as Predictors of Visual and Body Discomfort

Taylor A. Doty, Lauren E. Knox, Alexander X. Krause, Sara R. Berzenski, Jacob W. Hinkel-Lipsker, & Stefanie A. Drew
Pages 1150-1161 | Received 20 Mar 2022, Accepted 22 Sep 2022, Published online: 31 Oct 2022
 

Abstract

The recent COVID-19 pandemic has led to a drastic increase in the frequency of videoconferencing used for work, school, and socialization. To date, the user experience impact of this increased screen time is unknown. We surveyed 489 participants (mean age = 24.19 years, range = 18–72) to determine which factors best predict visual and body discomfort. Along with gender, screen time, and level of subjective meeting fatigue, meeting duration significantly predicted visual discomfort. In contrast, meeting frequency (along with level of meeting engagement, subjective meeting fatigue, and other covariates) significantly predicted bodily discomfort. These results highlight the need for greater ergonomic evaluation of work-from-home setups and point to a need for shorter, fewer, and more engaging video meetings for the average person working from home.

Acknowledgments

The authors would like to thank the members of the Visual Information Sciences and Neuroscience (VISN) lab for their contributions to this work.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the California State University, Northridge College of Social and Behavioral Sciences Summer Research Award.

Notes on contributors

Taylor A. Doty

Taylor A. Doty is a third-year student in the Cognitive Psychology and Human–Computer Interaction PhD programs at Iowa State University. She primarily works in the Navigation Lab under Dr. Jonathan Kelly. Her focus is on user experience research with an emphasis on technology and virtual reality.

Lauren E. Knox

Lauren E. Knox is a graduate student in the Psychology department at the University of California, Santa Cruz. Her interests are in cognitive and social neuroscience. Currently, Lauren is working on a project investigating the relationships between mindfulness and cognitive, social, and affective factors.

Alexander X. Krause

Alexander X. Krause is a senior at California State University, Northridge, studying Kinesiology with a concentration in exercise science in the Move-Learn lab. He is particularly interested in the mechanisms of movement and how they relate to the mind. Alex is currently researching how augmented reality can improve motor function.

Sara R. Berzenski

Sara R. Berzenski is an associate professor at California State University, Northridge in the Psychology department. She leads the Researching Emotion Across Childhood (REACH) Program lab where her primary research interests are understanding the development of emotional competence in both typical and atypical (i.e., adverse) contexts.

Jacob W. Hinkel-Lipsker

Jacob W. Hinkel-Lipsker is an assistant professor at California State University, Northridge in the Kinesiology department where he co-leads the Move-Learn lab. He has a wide range of research interests, all of which can be considered neuromechanics—or how an individual’s biomechanical movements are shaped by sensory information.

Stefanie A. Drew

Stefanie A. Drew is a full professor at California State University, Northridge where she leads the Visual Information Sciences and Neuroscience (VISN) Lab. She is also the co-director of the Psychological Science Graduate Program. Her research interests are in virtual reality, Zoom Fatigue, accommodation, asthenopia, reading, and ADHD.
