
Parallel or Cross? Effects of Two Collaborative Modes on Augmented Reality Co-located Operations

Pages 3896-3907 | Received 14 Dec 2022, Accepted 03 Apr 2023, Published online: 04 May 2023
 

Abstract

Augmented reality (AR) can bring a new interactive experience to collaboration between users. When users are co-located, there are two modes of jointly operating on the same object: parallel-work (PW) and cross-work (CW). In PW, the two users perform their own tasks independently, while in CW they assist each other. To investigate how collaborating with PW or CW in an AR environment affects users, we developed a two-person co-located collaboration system, LoCol. We designed and conducted a user study in which participants adjusted a virtual assembly model and added missing boundaries to the model. The results showed that CW led to a higher sense of social coexistence while reducing workload. In terms of task completion time and accuracy, CW and PW each had advantages. We found that users generally want to reduce unnecessary repetitive operations and frequent movement by working with others, which is likely an important criterion for deciding which mode is better suited to a particular task.

Author contributions

Shuo Feng and Yizhe Liu wrote the main manuscript text. Weiping He provided inspiration for the ideas and innovations. Qianrui Zhang and Xiaotian Zhang gave suggestions on the writing. Shuxia Wang and Mark Billinghurst gave suggestions on the writing and experimentation.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the National Key R&D Program of China (Grant No. 2020YFB1712503) and the National Natural Science Foundation of China (Grant No. 52275513).

Notes on contributors

Shuo Feng

Shuo Feng is a PhD student in the School of Mechanical Engineering at Northwestern Polytechnical University. He received his BE degree from the School of Mechanical Engineering at Northwestern Polytechnical University. His research interests include computer graphics, augmented/virtual reality, and collaborative assembly.

Yizhe Liu

Yizhe Liu is a master's student in the School of Mechanical Engineering at Northwestern Polytechnical University. He received his BE degree from the School of Mechanical Engineering at Hangzhou Dianzi University. His research interests include AR/VR-based human–computer interaction.

Qianrui Zhang

Qianrui Zhang is a technician at the First Aircraft Institute of AVIC. She received her ME degree from the School of Mechanical Engineering at Northwestern Polytechnical University. Her research interests are precision parts processing and assembly.

Weiping He

Weiping He is a professor in the School of Mechanical Engineering at Northwestern Polytechnical University. He is interested in geometrical modeling, 2D bar-code recognition, direct part marking and automatic identification, augmented reality, and mixed reality.

Xiaotian Zhang

Xiaotian Zhang is a PhD student in the School of Mechanical Engineering at Northwestern Polytechnical University. He received his BE degree from the School of Mechanical Engineering at Northwestern Polytechnical University. His research interests include augmented/virtual reality and remote collaboration.

Shuxia Wang

Shuxia Wang is a professor at Northwestern Polytechnical University. She is interested in VR/AR/XR-based intelligent assembly, digital twin-based intelligent manufacturing, natural human–computer interaction based on augmented reality, virtual training, and assessment based on cognitive fatigue.

Mark Billinghurst

Mark Billinghurst is a professor of Human–Computer Interaction at the University of South Australia in Adelaide, Australia, and a professor at the Bio-Engineering Institute at the University of Auckland in New Zealand. He has previously researched wearable computing, mobile interfaces, virtual reality, and collaborative systems, and is currently exploring empathic computing.
