Research Article

A multiplayer VR showdown game for people with visual impairment

Hojun Aan, Sangsun Han & Kibum Kim
Received 08 Feb 2022, Accepted 08 Apr 2024, Published online: 18 Apr 2024

ABSTRACT

Researchers have developed virtual reality (VR) systems for people with visual impairment using various audio feedback techniques. However, few studies have investigated collaborative VR systems in which people with and without visual impairment can participate together. We developed VR Showdown, modeled on the real Showdown game, in which players with and without visual impairment can play together in a remote virtual environment. We incorporated auditory distance perception using a head-related transfer function based on the spatial position of the Showdown ball in VR. We developed two modes of VR Showdown: a player vs. AI agent (PvA) mode, in which visually impaired people can play alone, and a player vs. player (PvP) mode, in which visually impaired people can play with another player over a network. The user study results showed that people with and without visual impairment enjoyed playing well-matched VR Showdown games.

1. Introduction

Virtual reality (VR) can support people with visual impairment, even though VR systems rely mainly on visual feedback. One possibility is to play audio sports like “Showdown” virtually, without the bulky table, racket, and ball. Showdown is a fast-moving sport, similar to tabletop air hockey, originally designed for people with visual impairment, although anyone can play. Researchers have begun investigating virtual versions of Showdown for people with visual impairment: Wedoff et al. (Citation2019) developed Virtual Showdown, in which players hit the incoming ball guided by verbal/vibration scaffolds. In their game, the system detects the player’s gestures and provides spatial audio and verbal feedback so that the player can follow the ball’s movement. They used a head-related transfer function (HRTF) with a Microsoft audio spatializer to present a localization sound cue of the moving ball to people with visual impairment. In addition, the system used a Microsoft Kinect camera and a Nintendo Joy-Con controller to track the player’s body position and hand movement, and provided vibrations via the Joy-Con’s HD Rumble to signal collisions between the Joy-Con and the virtual ball. However, their Virtual Showdown was a single-player game in which one player hit a one-off incoming ball; moreover, unlike real Showdown, rallies were not supported.

Collaborative VR applications exist in several fields. In education, VR teachers can remotely teach students how to assemble aircraft engine combustion chambers (Oda et al., Citation2015). In real estate, the Dollhouse VR system (Ibayashi et al., Citation2015) facilitates collaboration between designers and consumers: designers arrange furniture in a VR environment using a top-down view, whereas consumers explore virtual homes from a first-person perspective. In the medical field, therapists use collaborative VR systems to help children with neurodevelopmental disabilities improve their emotional recognition and communication skills (Loiacono et al., Citation2018).

We developed a multiplayer VR Showdown game using 3D spatial audio and haptic feedback to explore VR opportunities for the visually impaired. The VR Showdown game has two modes: a single-player mode called PvA (Player vs. artificial intelligence (AI) Agent) and a multiplayer mode called PvP (Player vs. Player). The PvA mode allows people with visual impairment to play independently by competing against an agent. The PvP mode allows people with visual impairment to play remotely with people without visual impairment through Photon Unity Networking (PUN). In PvP mode, two remotely located players share audiovisual scenes of the Showdown game in real time using head-mounted displays (HMDs).

The research goal of this study is to design a VR game that allows people with and without visual impairment to play together and to verify that the game is enjoyable for all participants. We conducted two user studies with 27 visually impaired individuals. In the first user study, the Showdown game was played between visually impaired individuals and an AI agent. In the second user study, the Showdown game was played remotely, in real time, between people with and without visual impairment. The contributions of this study are as follows:

  • We developed a multiplayer VR game that people with and without visual impairment could play together.

  • We demonstrated that people with visual impairment could competitively play a state-of-the-art VR game against an AI Agent.

  • We demonstrated that people with visual impairment and those without visual impairment (but blindfolded) could play well-matched VR Showdown games together.

2. Related works

This section introduces related work on the themes of VR for people with visual impairment, body tracking in VR games for people with visual impairment, and multiplayer serious games in VR.

2.1. VR for people with visual impairment

Researchers have developed various VR systems for people with visual impairment. One of the main features of these VR systems is the provision of spatial audio rendered with head-related transfer functions (HRTFs). An HRTF describes how sound waves interact with the listener’s torso, shoulders, head, and, particularly, the outer ear (LaViola et al., Citation2017). One of the main properties of HRTFs is that they can convey considerable spatial information, as in real-world listening environments. This technical characteristic has facilitated the development of various VR applications for people with visual impairment. These applications can be divided into three categories depending on how the HRTF-rendered sound is generated.

In the first, sound is generated at a position where the system responds to the user’s behavior. For example, in a VR walking system for people with visual impairment, a virtual cane sound is generated at the spatial point where the cane collides with another virtual object (Siu et al., Citation2020; Zhao et al., Citation2018). In the second, sound is generated at a particular location regardless of the user’s behavior. The sound indicates the location at which the user needs to arrive (Kreimeier & Götzelmann, Citation2019) or the location of the object they want to find (Eckert et al., Citation2018). The HRTF conveys a sense of a specific location via sound; however, the reflected sound differs depending on the surrounding environment.

In the third, the location where the sound is generated changes in real time with a moving object. Thevin et al. (Citation2020) applied the HRTF to the sound of passing cars so that visually impaired people could recognize a passing car in a virtual environment just as they would on a real street. Simões and Cavaco (Citation2014) developed an audio game in which people with visual impairment detected moving “aliens.” In a recent study, researchers developed VR applications whereby people with visual impairment could explore biomolecular structural models. When rendering the molecular structure of a protein in sound, the HRTF generates the sounds of atoms at the spatial positions along the protein’s backbone (Arce & McMullen, Citation2017). Using this method, researchers have noted that people with visual impairment can identify the spatial position of each atom and create a mental map of the biomolecular structure. These studies show that many VR systems have been developed for people with visual impairment. However, few studies have been conducted on VR systems that allow people with and without visual impairment to play together. In this paper, we present a multiplayer VR game that enables people with visual impairment to play remotely, in real time, with people without visual impairment.

2.2. Body tracking in VR games for people with visual impairment

Tracking devices, such as the Nintendo Wii™ or Microsoft Kinect™, have been used in VR games to support people with visual impairment. In VI-Bowling and VI-Tennis, the player rolls (Morelli, Foley, & Folmer, Citation2010) and smashes the ball (Morelli, Foley, Columna, et al., Citation2010), respectively, by swinging a Wii controller. In Pet-N-Punch, the player swings a Wii controller from top to bottom, mimicking hitting a mole with a hammer (Morelli et al., Citation2011). In virtual track and field games, the Kinect camera captures the running motions of a player and maps them to the athlete avatar’s running (Morelli & Folmer, Citation2011).

In a recent study similar to our work, Wedoff et al. (Citation2019) used a Kinect camera to track the bodies of people with visual impairment and used the Joy-Con™ (the improved version of the Wii controller) to track the x- and z-axis position of the hand. However, their system did not fully reflect the swing of the bat, the real-time position and motion of the wrist, or hand-introduced jitter; thus, the virtual racket could only turn discretely at 45° intervals.

Overall, there is a lack of VR games using immersive VR devices, such as the HTC Vive™ or Oculus Rift™, for people with visual impairment, although such studies exist for people without visual impairment (Koulouris et al., Citation2020; Michael & Lutteroth, Citation2020; Yoo et al., Citation2017, Citation2020). In our study, we developed an immersive VR game for people with visual impairment, using an HMD and controllers to render the directional sound of the ball based on the user’s head position and to provide haptic feedback based on the user’s hand position. We expected that people with visual impairment would have a more immersive and kinetic experience in our study than that reported in related studies based on the Nintendo Wii™ and Microsoft Kinect™ (Wedoff et al., Citation2019).

2.3. Multiplayer VR serious games

Several researchers have developed multiplayer VR serious games; Wegner et al. (Citation2017) developed a game in which one player played the role of a senior paramedic helping a junior paramedic using a virtual tablet PC, and the other player played the role of a junior paramedic treating the patient using various virtual tools at his waist. Ha et al. (Citation2016) developed a game in which several players put out a fire that blocked their way to escape from the building. Social MatchUp is a smartphone VR game in which children with neurodevelopmental disorders play activities with a therapist to improve their ability to recognize and verbalize emotions (Loiacono et al., Citation2018).

Although research has been conducted on multiplayer VR games for people with disabilities (Hong et al., Citation2017; Loiacono et al., Citation2018; Millen et al., Citation2014), research on multiplayer VR interactions for people with visual impairment is lacking. One benefit of multiplayer VR games for people with disabilities is that they can communicate in a virtual environment and learn to engage with others (Loiacono et al., Citation2018). According to Grabski et al. (Citation2016), multiplayer VR games between people with and without visual impairment are expected to help people with visual impairment be more widely included in society. However, the asymmetrical roles of people with and without visual impairment present challenges in the awareness of their counterparts (Gonçalves et al., Citation2021). In this paper, we present a user study involving people with and without visual impairment playing symmetrical roles through a newly developed multiplayer virtual reality game, VR Showdown.

3. VR showdown implementation

This section explains the implementation of the VR Showdown with respect to body tracking, racket positioning, spatial sounds, haptic feedback, and game design.

3.1. Body tracking and racket positioning

We used the HTC Vive Pro™ because it provides accurate head tracking and creates more precise and realistic spatial sounds through the HMD’s headphones than the Kinect- or Nintendo Wii-based setups of previous work (Wedoff et al., Citation2019). The Vive Pro™ HMD can play lossless audio in a high-quality sound format. In addition, the Vive Pro™ HMD can detect the user’s head position, angle, and rotation to match the movement of the user’s head with that of the virtual head in the VR environment. The virtual head transformation consists of the position, orientation, and scale. This transformation is used to calculate the direction and distance between the user’s head position and the origin of the sound to produce 3D spatial sounds. A Vive Pro™ controller was used to track the user’s hand movements to precisely represent the user’s swing in the virtual environment. Figure 1 shows the VR Showdown system used in the experiments.
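As an illustration of this computation, the following minimal Unity C# sketch (our assumption, not the authors’ code; the head and ball fields are hypothetical) derives the distance and listener-local direction that a spatializer needs:

```csharp
using UnityEngine;

// Sketch: compute the two quantities 3D spatialization needs, the distance
// from the tracked head to the ball and the ball's direction expressed in
// the listener's local frame (so head rotation changes perceived azimuth).
public class HeadRelativeBallTracker : MonoBehaviour
{
    public Transform head;  // hypothetical: HMD-driven head transform
    public Transform ball;  // hypothetical: the Showdown ball

    void Update()
    {
        Vector3 toBall = ball.position - head.position;
        float distance = toBall.magnitude;
        Vector3 localDirection = head.InverseTransformDirection(toBall.normalized);
        Debug.Log($"distance = {distance:F2} m, local direction = {localDirection}");
    }
}
```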

Figure 1. Overall view of the VR Showdown.


3.2. Spatial sound system

We used a head-related transfer function (HRTF) to provide players with spatial audio for the Showdown ball. Auditory distance recognition relies on binaural cues such as acoustic parallax (Zahorik et al., Citation2005), which arises from differences in the relative angle of each ear to the virtual audio source. As an alternative to HRTF, a stereo panning system creates spatial audio by adjusting the sound level between the left and right channels. However, stereo panning cannot provide users with sufficient acoustic parallax. Larsen et al. (Citation2013) compared a panning system with the HRTF for localizing a virtual audio source and found that the HRTF was superior in both localization accuracy and search speed. Therefore, in our VR Showdown, we used an HRTF to provide spatial audio feedback so that players could detect the precise position of the Showdown ball.

Various Software Development Kits (SDKs) are available for implementing HRTF, including Oculus Spatializer 1.22.0, Google’s Resonance Audio 1.2.0, and Steam Audio 2.0 Beta 13. We used Google’s Resonance Audio SDK because it delivers accurate spatial sounds for identifying the direction and location of nearby sound sources in real time (Gould, Citation2018). Its features include a reflection function that calculates sound waves bouncing through the air, a direct-sound function that attenuates the sound with increasing distance, and a function that simulates environmental occlusion effects on low- and high-frequency components. These functions allowed us to develop a system that enables players to precisely locate the ball by listening to sounds.
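A hedged configuration sketch follows (assuming the Resonance Audio Unity SDK is imported and selected as the project’s spatializer plugin; the ResonanceAudioSource component name follows that SDK, but this is our reconstruction rather than the authors’ code):

```csharp
using UnityEngine;

// Sketch: wire a standard Unity AudioSource through the spatializer plugin
// so the ball's sound is rendered with HRTF-based 3D cues.
public class SpatializedBallAudio : MonoBehaviour
{
    void Awake()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.spatialize = true;   // route through the configured spatializer plugin
        source.spatialBlend = 1f;   // fully 3D: direction and distance attenuation apply
        source.loop = true;         // the rolling-ball clip loops while the ball moves
        gameObject.AddComponent<ResonanceAudioSource>(); // Resonance per-source settings
    }
}
```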

To properly port Showdown into the virtual environment, a Resonance Audio listener component was assigned to the object corresponding to the player’s head position. For the various conditions, such as the rolling ball, the ball crashing into the wall, and the racket colliding with the ball, we recorded audio clips from the original Showdown game and combined them with the HRTF. When the ball’s speed dropped below 2.8 m/s, the sound volume was reduced to 9/10 for every 0.28 m/s decrease in speed, and when the ball stopped, no sound was played. In addition, the start and end waveforms of each clip were aligned so that looping the recordings produced continuous, natural sound.
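A minimal sketch of this speed-to-volume rule as we read it (the 2.8 m/s and 0.28 m/s values follow the text; the exact stepwise mapping is our assumption):

```csharp
using UnityEngine;

// Sketch: attenuate the rolling sound as the ball slows, and silence it when
// the ball stops, per the rule described above.
public class BallRollingVolume : MonoBehaviour
{
    public Rigidbody ballBody;        // hypothetical reference to the ball
    public AudioSource rollingSound;  // the looping rolling-ball clip

    const float FullVolumeSpeed = 2.8f; // m/s: full volume at or above this speed
    const float SpeedStep = 0.28f;      // m/s per attenuation step (assumed mapping)

    void Update()
    {
        float speed = ballBody.velocity.magnitude;
        if (speed < 0.01f)
            rollingSound.volume = 0f;   // a stopped ball makes no sound
        else if (speed >= FullVolumeSpeed)
            rollingSound.volume = 1f;
        else
        {
            int steps = Mathf.FloorToInt((FullVolumeSpeed - speed) / SpeedStep);
            rollingSound.volume = Mathf.Pow(0.9f, steps);  // reduced to 9/10 per step
        }
    }
}
```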

3.3. Haptic system

We implemented vibrations on the Vive Pro™ controllers for when the racket hits the ball and when the player uses the holding function. A detailed explanation of the holding function is provided in Section 3.4.2. The Vive Pro™ controller has an internal function called “Pulse,” which momentarily vibrates the controller. We used “Pulse” to generate vibrations at 70 Hz for strong feedback and 30 Hz for weak feedback. A short, strong vibration is generated when the racket hits the ball. In contrast, in the holding situation (when the racket drags the ball), a weak, continuous vibration is generated.
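A sketch of these two patterns follows (TriggerPulse() is a placeholder for the controller plugin’s momentary pulse call, e.g., SteamVR’s haptic pulse; the frame-based scheduling is our assumption):

```csharp
using UnityEngine;

// Sketch: schedule momentary controller pulses at a target frequency to
// produce the strong (70 Hz) hit vibration and the weak (30 Hz) holding hum.
public class RacketHaptics : MonoBehaviour
{
    float pulseHz;         // current pulse frequency; 0 means idle
    float nextPulseTime;
    float patternEndTime;

    public void PlayHitVibration()  { StartPattern(70f, 0.1f); }                   // short, strong
    public void PlayHoldVibration() { StartPattern(30f, float.PositiveInfinity); } // continuous, weak
    public void StopVibration()     { pulseHz = 0f; }

    void StartPattern(float hz, float duration)
    {
        pulseHz = hz;
        nextPulseTime = Time.time;
        patternEndTime = Time.time + duration;
    }

    void Update()
    {
        if (pulseHz <= 0f || Time.time > patternEndTime) return;
        if (Time.time >= nextPulseTime)
        {
            TriggerPulse();                 // placeholder: forward to the VR plugin's pulse call
            nextPulseTime += 1f / pulseHz;  // space pulses at the target frequency
        }
    }

    void TriggerPulse()
    {
        // Hypothetical wrapper: the actual call depends on the VR input plugin in use.
    }
}
```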

3.4. VR showdown game design

Our system replicated the original Showdown sport as closely as possible, as described below.

3.4.1. Basic rules for VR showdown

We designed the basic rules of VR Showdown to match the real-world “Showdown” sport in Paralympic games, as follows (IBSA, Citation2022).

  • Before the game begins, all the players are blindfolded.

  • The game begins with serving the ball: The system automatically serves the ball by rolling it from the center of the table to one of the players.

We chose this service method because it is difficult for the player to smash a stopped ball, which does not produce a sound. The sound of the ball rolling from the center of the Showdown table toward the player signals the start of the game. Each player receives a service twice for each turn.

  • Scoring: A player scores two points if they put the ball into the opponent’s goal pocket. Upon scoring, the game system announces (in the moderator’s voice) that the player has scored two points. The game system starts the next service sequence 5 s after the goal announcement.

  • Dead ball: When the ball lies “dead” (motionless and soundless), the serve sequence is re-executed by the last player to serve, with no penalty.

  • Victory: The player wins the game if they score 12 points first. The player wins the match if they win two of the three games.

3.4.2. New functions of VR showdown

A new function, “holding,” has been added to the VR Showdown gameplay. In the original Showdown game, players “hold” the ball with the racket by gently pushing it against the right or left sidewall. Once the ball stops, the player can shoot it accurately. In reality, friction stops the ball when it is pushed against a wall with the racket. In the Unity engine, however, such actions cause repeated collisions among the ball, walls, and racket. If these collisions occur in rapid succession, the limits of the physics engine cause the ball to pass through the racket or over-amplify the collision response, making the ball bounce at an abnormal speed. We implemented “holding” in our game to address this problem. If the ball is in contact with the racket and the player simultaneously presses the trigger button of the Vive Pro™ controller, holding is activated, and the racket drags the ball. The ball’s coordinates are recalculated every 1/60th of a second, and its position is updated to follow the racket at a fixed distance from the racket’s end while the trigger button is pressed. During holding, the controller provides vibration feedback. The player can release the button at his/her preferred spot and moment and then strike the ball toward the opponent.
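A minimal Unity C# sketch of this mechanic (our reconstruction; the field names and the 0.05 m offset are assumptions, and the trigger/contact flags would come from the controller input and a collision callback):

```csharp
using UnityEngine;

// Sketch: while the trigger is pressed and the ball touches the racket, move
// the ball kinematically to a fixed offset from the racket tip each physics
// step, bypassing the collision resolution that causes tunneling and
// abnormal bounces.
public class HoldingFunction : MonoBehaviour
{
    public Transform racketTip;         // hypothetical: empty child at the racket's end
    public Rigidbody ball;
    public float holdDistance = 0.05f;  // assumed offset (m) from the racket tip

    bool holding;

    public void SetHolding(bool triggerPressed, bool ballTouchingRacket)
    {
        holding = triggerPressed && ballTouchingRacket;
        ball.isKinematic = holding;  // suspend physics while the racket drags the ball
    }

    void FixedUpdate()  // fixed-rate physics step (configurable, e.g., 60 Hz)
    {
        if (!holding) return;
        ball.MovePosition(racketTip.position + racketTip.forward * holdDistance);
    }
}
```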

3.4.3. PvA game mode

For the PvA game mode, we implemented an AI Agent as the opponent, as shown in Figure 2. The AI Agent counter-hits with the following success rates; a minimal sketch of this policy follows the list.

Figure 2. PvA (Player vs. AI Agent) play. The left image shows the participant playing the VR Showdown, using a green screen. The right image shows the AI Agent and participant’s virtual racket.

  1. The agent returns the ball in the first approach with a 100% success rate.

  2. After hitting the ball once, the probability of the agent counter-hitting the ball is reduced to 70%.

  3. After hitting the ball twice, the probability of the agent counter-hitting the ball is reduced to 40%.

  4. After hitting the ball thrice, the agent randomly moves to either the right or left area of the goal pocket and counter-hits the ball.
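The following C# sketch restates this policy (a minimal reconstruction; the paper gives no new success rate after the third return, so the sketch keeps 40% there as an assumption, and MoveToRandomGoalSide() is a hypothetical helper):

```csharp
using UnityEngine;

// Sketch: the agent's counter-hit success probability decays with consecutive
// returns; from the fourth return onward it also repositions randomly.
public class AgentReturnPolicy : MonoBehaviour
{
    int consecutiveHits;  // returns the agent has made in the current rally

    public bool TryCounterHit()
    {
        float p = consecutiveHits switch
        {
            0 => 1.0f,  // first approach: always returned
            1 => 0.7f,
            2 => 0.4f,
            _ => 0.4f,  // assumed: the paper changes movement, not the rate, after three hits
        };

        if (consecutiveHits >= 3)
            MoveToRandomGoalSide();  // hypothetical: move to the left or right goal area

        bool success = Random.value < p;
        consecutiveHits = success ? consecutiveHits + 1 : 0;  // reset when the rally ends
        return success;
    }

    void MoveToRandomGoalSide()
    {
        // Placeholder for the agent's repositioning logic.
    }
}
```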

The average number of rallies between the visually impaired player and the AI agent before a point was scored was 4.03, with the longest rally being 13. The AI Agent had difficulty blocking players’ strong shots. Strong shots were identified by an internal index of the ball’s velocity; an index value of 11, the approximate criterion we used, corresponds to a shot faster than 10 m/s in the real world.

3.4.4. PvP game mode

The multiplayer mode between people with and without visual impairment was developed by implementing network communication through the Photon engine from the Unity Asset Store (Unity3D, Citation2020), as shown in Figure 3. We developed our program so that each user could estimate the distance and direction of the ball from the same, but mirrored, spatial sound information based on each user’s position. The ball information (e.g., position, speed, and direction) was transmitted to each player in real time through the network server and conveyed as auditory feedback. Each player tries to strike the ball into the opponent’s goal pocket to score. The players could hold vocal conversations while playing.
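For illustration, a hedged PUN 2 sketch of how the ball state could be streamed to the remote player (the paper names PUN but shows no code, so this is our reconstruction, not the authors’ implementation):

```csharp
using Photon.Pun;
using UnityEngine;

// Sketch: the owning client streams the ball's position and velocity; the
// remote client applies them, so both players hear the same (mirrored)
// spatialized ball.
public class NetworkedBall : MonoBehaviourPun, IPunObservable
{
    Rigidbody body;

    void Awake() { body = GetComponent<Rigidbody>(); }

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)  // this client owns the ball: send its state
        {
            stream.SendNext(body.position);
            stream.SendNext(body.velocity);
        }
        else                   // remote copy: apply the received state
        {
            body.position = (Vector3)stream.ReceiveNext();
            body.velocity = (Vector3)stream.ReceiveNext();
        }
    }
}
```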

Figure 3. PvP (Player vs. Player) play. The upper player (a) hits the ball on his left (b). The lower player (c) anticipates the ball on the other player’s side (d) by hearing. Each player estimates the distance and direction to the ball with the same but mirrored spatial sound information based on each user’s position.


4. Preliminary study: evaluation of HRTF system in VR showdown

Prior to the main user experiments, we investigated whether participants with and without visual impairment could localize the moving Showdown ball using HRTF sounds without visual cues, following the approach of Wedoff et al. (Citation2019).

4.1. Participants and apparatus

Twenty-six participants with and without visual impairment were recruited for this study. However, one participant with visual impairment was excluded because of deafness, and another without visual impairment was excluded because of a data recording error. Thus, the HRTF system was evaluated in the context of VR Showdown with 24 people with and without visual impairment. The mean ages of the 12 participants with and the 12 without visual impairment were 59.67 years (47 to 74) and 22.41 years (20 to 27), respectively. In both groups, nine participants were male and three were female.

4.2. Tasks and measurement

For the main task, the participants were asked about the start and end positions of the moving ball. The ball could occupy three areas in front of the participant (left, middle, and right) and two areas far from the participant (left and right). Thus, there were six departure routes from the three areas on the near side to the two on the opponent’s side, and six arrival routes that mirrored the departure routes in reverse. Each route was played three times per participant, so each participant answered 36 questions about the routes of ball movement (i.e., (six departure + six arrival routes) × three repetitions). The order of ball trajectories was randomized using a Latin square design. For the quantitative evaluation, we measured the accuracy with which participants detected the correct route.
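For illustration, one common realization is a cyclic Latin square over the 12 routes; the sketch below is our assumption of its form (the paper does not specify the construction):

```csharp
using System.Collections.Generic;

// Sketch: build a participant's trial order from a cyclic Latin-square row,
// repeated three times to yield the 36 trials.
public static class TrialOrder
{
    public static List<int> ForParticipant(int participantIndex,
                                           int numRoutes = 12, int repetitions = 3)
    {
        var trials = new List<int>();
        for (int rep = 0; rep < repetitions; rep++)
            for (int j = 0; j < numRoutes; j++)
                trials.Add((participantIndex + j) % numRoutes);  // cyclic Latin-square row
        return trials;
    }
}
```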

4.3. Procedure

Each participant provided written informed consent and completed a demographic questionnaire. Subsequently, they participated in a training session: participants stood at the Showdown table, as shown in Figure 4, and listened to the sound of the Showdown ball’s movement. During training, each route was announced in advance, and the six departure and six arrival routes of the ball were played repeatedly until participants became familiar with the sound of the moving ball. For the main task after training, all participants wore eye patches covering both eyes and answered 36 questions about the ball’s movement based solely on auditory cues. Participants were allowed a 5-min break after answering 18 questions.

Figure 4. Setting for HRTF Evaluation.


4.4. Results for preliminary study

We measured the accuracy with which participants detected the correct route of the moving ball. The results show over 90% average accuracy in detecting the correct trajectory for participants both with and without visual impairment (92.7% and 91.9%, respectively). Table 1 summarizes the accuracy rates for detecting the various trajectories of outgoing and incoming balls.

Table 1. Results of the preliminary study.

5. Experiment 1: PvA mode

To evaluate the VR Showdown, we conducted two user experiments. The first experiment tested the PvA mode and was specifically designed to assess whether visually impaired people can competently play a game against an AI agent. Participants played a best-of-three series of 12-point games, in which the goal was to detect the position of the Showdown ball and hit it into the opponent’s goal.

5.1. Participants and apparatus

Experiment 1 was conducted with the same group of participants as the preliminary study. All participants had previous experience in VR environments but no previous experience with Showdown. The PvA mode, detailed in Section 3.4.3, was used in the experiment. Details of the participants with visual impairment are presented in Table 2.

Table 2. Demographic information of participants with visual impairment in the preliminary study and Experiment 1.

5.2. Tasks and measurement

Participants played three sets of games against an AI Agent. We motivated participants to do their best by offering a cash reward for winning the match in two straight games. During gameplay, we systematically gathered game log data (see Table 3). Unity’s trigger feature collects the log data and automatically records them in a CSV file for subsequent analysis. We analyzed these log data to compare the game performance of participants with and without visual impairment.
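A minimal sketch of how such trigger-based CSV logging could look in Unity (our assumption of its shape; the file path and column layout are illustrative, not the authors’ format):

```csharp
using System.IO;
using UnityEngine;

// Sketch: append one timestamped CSV row per trigger event (e.g., ball
// entering a goal pocket or touching a racket). Attach to an object with a
// trigger collider.
public class GameLogger : MonoBehaviour
{
    string logPath;

    void Awake()
    {
        logPath = Path.Combine(Application.persistentDataPath, "showdown_log.csv");
        if (!File.Exists(logPath))
            File.AppendAllText(logPath, "time,event,x,z\n");  // header row
    }

    void OnTriggerEnter(Collider other)  // Unity callback on trigger entry
    {
        Vector3 p = other.transform.position;
        File.AppendAllText(logPath, $"{Time.time:F3},{other.name},{p.x:F3},{p.z:F3}\n");
    }
}
```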

Table 3. Systematically measured game data automatically logged by the system in Experiments 1 and 2.

5.3. Procedure

All participants began the study by signing a consent form and completing a demographic survey. Participants with visual impairment had an assistant sign the consent form. Survey questions were read aloud to visually impaired participants, and an assistant or research facilitator recorded their responses.

All participants were required to wear blindfolds while playing the VR Showdown game. Additionally, all participants underwent a training session to familiarize themselves with the game: during training, participants played a 3-min practice game against an AI agent to learn the game rules.

After training, participants played three 12-point games against the AI opponent. The third game was not played if the participant or the AI agent won the first two. The experimental procedure is illustrated in Figure 5.

Figure 5. Experimental procedure.


6. Results for experiment 1

For the statistical analysis of the game data, we used a chi-squared test for the Match and Game Win data and an independent t-test for the other data. Overall, nine of 12 people with visual impairment and eight of 12 people without visual impairment won the match against the AI agent in the PvA mode. The cross-tabulations of the Match and Game Win data are presented in Tables 4 and 5, respectively. Participants with visual impairment had a significantly higher rate of “shots on target” than people without visual impairment (U = 10, p < .001). The remaining measures showed no significant differences between the two groups. Figure 6 shows the game data for the two groups in Experiment 1.

Figure 6. Results of game data of the two participant groups in Experiment 1 (*p < .001).


Table 4. Cross table of Match Win in Experiment 1 (χ²(1, N = 24) = 0.202, p = .653).

Table 5. Cross table of Game Win in Experiment 1 (χ²(1, N = 55) = 0.041, p = .840).

In the interviews, participants with visual impairment mentioned that experiencing the spatial collisions and movement of the virtual ball in VR felt similar to a real Showdown game. Participants who liked the game also mentioned having fun and wanting to recommend it to others with visual impairment. Additionally, they liked the holding function, which made it easy to control the ball. However, some participants had difficulty playing because they found it hard to gauge the size of the virtual racket: as the virtual racket differs in size from the controller, participants had to estimate the racket’s length.

7. Experiment 2: the PvP mode

In the second experiment, we tested the PvP mode to determine whether visually impaired people could play a Showdown game with people without visual impairment. In the PvP mode, ball movements are less predictable and significantly faster than in the PvA mode. All studies were approved by the ethics committee, and a special COVID-19 protocol was applied to minimize health risks to the participants.

7.1. Participants and apparatus

Nineteen visually impaired participants were recruited for Experiment 2, none of whom had participated in Experiment 1. Four participants with real-life Showdown experience were excluded from the analysis for a fair comparison. The remaining participants with visual impairment ranged in age from 35 to 68 years (mean: 56.28). Further information is presented in Table 6.

Table 6. Demographic information of participants with visual impairment in Experiment 2.

Nineteen participants without visual impairment were also recruited for Experiment 2. Four were excluded from the analysis because they had played against the excluded visually impaired participants. The remaining participants without visual impairment ranged in age from 37 to 72 years (mean: 56.05) and consisted of three males and 12 females.

All participants wore blindfolds during the experimental tasks regardless of their vision status. All participants were inexperienced with Showdown or VR. For a detailed explanation of the PvP mode, please refer to Section 3.4.4.

7.2. Tasks and measurement

Participants aimed to win two games against their opponents. The first participant to win two games in each set received an extra cash reward of approximately four dollars. We measured the same game data as in Experiment 1. In addition, we administered the Simulator Sickness Questionnaire (SSQ; Kennedy et al., Citation1993) and the Slater-Usoh-Steed presence questionnaire (SUS; Usoh et al., Citation2000). The SSQ assessed participants’ motion sickness, whereas the SUS evaluated the degree of immersion and sense of presence within the virtual Showdown game, including the sense of being in another person’s company. Interviews were conducted with each participant after the questionnaires were completed. The interview questions were open-ended to encourage participants to share a wide range of opinions about their VR experiences.

7.3. Procedure

The procedure used in this experiment was the same as that in Experiment 1. After training, all participants wore eye patches on both eyes while playing so that they would focus only on sound, regardless of their visual condition (Imbriani et al., Citation2018). For the experiment, participants played a best-of-three series of 12-point games. All pairs played three games except for one in which a participant won the first two. Players were encouraged to communicate with each other while playing; researchers used filler phrases, such as “Wow, good job!,” to prompt conversation if it did not start on its own. After playing, the participants completed the questionnaires and were interviewed by an experimenter. Figure 5 shows a flowchart of the experimental procedure.

8. Results for experiment 2

For the statistical analysis of the game data, we used a chi-squared test for the Match and Game Win data and an independent t-test for the other data. Overall, ten of 15 people with visual impairment won their matches against their opponents. By game, people with visual impairment won 20 of 35 games, a 57.1% win rate. The Match Win and Game Win rates showed no statistically significant differences between people with and without visual impairment. Tables 7 and 8 show the cross tables of Match and Game Wins, respectively.

Table 7. Cross table of Match Wins (χ²(1, N = 30) = 3.333, p = .068).

Table 8. Cross table of Game Wins (χ²(1, N = 70) = 1.429, p = .232).

A comparison of the other game data between the two groups revealed significant differences in all hit rates. Participants with visual impairment had a significantly higher Hit Rate (t(15.798) = 5.308, p < .001), Left Hit Rate (t(14.375) = 4.121, p = .001), and Right Hit Rate (t(19.080) = 4.993, p < .001) than those without. In contrast, participants without visual impairment had a significantly higher Middle Hit Rate than participants with visual impairment (t(28) = −7.024, p < .001). The remaining measures showed no significant differences between the two groups. Figure 7 shows the game data for the two groups in Experiment 2.

Figure 7. Results of game data of the two participant groups in Experiment 2. (*p < .01, **p < .001).


Regarding the questionnaire results, we used the t-test for the SUS and the Mann-Whitney U test for the SSQ. For the SSQ, we followed the formula introduced by Kennedy et al. (Citation1993): the raw subscale sums were multiplied by 9.54 for nausea (N), 7.58 for oculomotor disturbance (O), and 13.92 for disorientation (D), and the combined raw score by 3.74 for total simulator sickness (TS). We found a significant difference in the SSQ-TS scores between the groups (U = 12.5, p < .05). We also found significant differences in the SSQ-N (U = 31.5, p < .05) and SSQ-O (U = 22, p < .01) scores using the Mann-Whitney U test (see Table 9). We did not find any significant differences for the other questionnaires. People with visual impairment experienced less motion sickness in our VR game than those without. A likely reason is that people with visual impairment are accustomed to an environment without vision, whereas people without visual impairment are less familiar with such an environment and therefore experience more motion sickness.
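For reference, this scoring can be restated compactly (per Kennedy et al. (Citation1993), each item is rated 0–3 and summed per subscale before weighting):

```latex
\begin{aligned}
\mathrm{N}  &= 9.54  \times \textstyle\sum(\text{nausea items}) \\
\mathrm{O}  &= 7.58  \times \textstyle\sum(\text{oculomotor items}) \\
\mathrm{D}  &= 13.92 \times \textstyle\sum(\text{disorientation items}) \\
\mathrm{TS} &= 3.74  \times \Bigl(\textstyle\sum(\text{nausea items}) + \sum(\text{oculomotor items}) + \sum(\text{disorientation items})\Bigr)
\end{aligned}
```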

Table 9. Mean (and standard error) of the questionnaire scales for the two groups in Experiment 2 (*p < .05, **p < .01). The SUS uses a 7-point Likert scale, and SSQ items are rated 0–3. All SSQ values are weighted following Bimberg et al. (Citation2020).

Conversation data were collected by video-recording the experiment. We conducted a conversation analysis between people with and without visual impairment to examine their emotional reactions to the game; the main objective was to understand the different emotions experienced in conversation while playing. After the experiment, we transcribed all participant statements from the recorded videos. Two researchers rated the emotions of the sentences in the transcription based on the valence-arousal dimensions originally presented by Russell (Camras, Citation1980; Russell, Citation1980). A representative emotion was selected for each category expressed in the interviews. Cohen’s Kappa was used to determine inter-rater reliability. Table 10 presents the agreement and Cohen’s Kappa for each category. Cohen’s Kappa was above 0.7 for every category, indicating good agreement between the two researchers’ ratings. After rating, the researchers reconciled the codes. Based on this analysis, all participants reported experiencing enjoyment. In addition, the high-arousal, positive-valence category, which includes the emotion of enjoyment, had the highest proportion of all categories except neutral. Figure 8 illustrates the proportion of each emotion observed in the conversation analysis.

Figure 8. Percentage of coding results for each emotion.


Table 10. Table of code category titles, emotions included in the category, corresponding agreements (percentages), and Cohen’s Kappa.

This outcome indicates that people with and without visual impairment enjoyed the game.

8.1. Findings from experiment 2

Through conversation analysis, we evaluated participants’ reactions to the game based on valence-arousal dimensions (Russell, Citation1980).

  • Enjoy (High-arousal, Positive-valence): All participants mentioned that the game was enjoyable.

  • Tension (High-arousal, Negative-valence): Nine of 15 with visual impairment and eight of 15 without visual impairment talked about tension during the game, mainly when they anticipated that the ball would approach them.

  • Relaxed (Low-arousal, Positive-valence): Three of 15 with visual impairment and one of 15 people without visual impairment reported being calm while playing the game.

  • Bored (Low-arousal, Negative-valence): Five out of 15 with visual impairment and five out of 15 without visual impairment were bored, especially when the ball stopped in the middle of the table while playing the game.

During the postgame interview, we asked the participants follow-up questions about the meaning of their comments. All participants with visual impairment reported enjoying the VR Showdown. A common response was that the sound feedback of the Showdown ball and the haptic response when hitting the ball were “fun.” PVI 2-12 said it was fun to notice, through sound and vibration, when the rolling ball was hit. PVI 2-9 found the system’s audio narration interesting: it says “reset” when the ball returns to its original position and announces the score when a participant scores, like a judge’s announcement in the original Showdown game. PVI 2-14 felt excited because the narration updated the score as it increased after hitting the ball. With this narration, PVI 2-11 gained confidence, as it became clear whether the ball was hit well.

Most people without visual impairment also reported enjoying the VR Showdown because playing a VR game while blindfolded was a new experience for them. They initially felt frustrated by the visionless environment, but it helped them focus on the sound of the ball. PNV 2-5 mentioned that he initially felt cramped by the visionless environment, but this feeling disappeared once he noticed that people with visual impairment could play the game competitively in that environment. Furthermore, PNV 2-11, 2-10, and 2-12 mentioned that covering their eyes helped them understand the visionless environment of people with visual impairment. In addition, PNV 2-12 mentioned the need to understand and consider the difficulties of people with visual impairment, including the participants they played with. Many participants concentrated on learning a sense of the distance to the ball: PNV 2-7 focused on the sound of the ball to become accustomed to its distance, and PVI 2-8 and PNV 2-8 mentioned that they could learn the distance to the ball through the controller’s vibrations. Participants recognized the relationship between the sound and position of the ball through the controller’s vibration when they hit the ball.

PNV (People without visual impairment) 2–7: It is necessary to get the sense of the distance of the ball. So, I was focused on how much the volume of the ball’s sound got louder when hitting the ball.

PVI (People with visual impairment) 2-8: I keep moving the controller to see whether it vibrates. After hitting the ball, I felt the vibration and got a sense of the distance of the ball.

Through the VR Showdown, people with and without visual impairment became aware of and could communicate with each other. PNV 2-11 and PVI 2-5 recognized that the opponent was hitting the ball by attending to the sounds from the opponent’s area. PVI 2-3 sensed the opponent’s strong desire to win from the scratching sound of the racket on the Showdown table as the opponent moved the controller to hit the ball.

PNV 2-11: I recognized the opponent hitting the ball by the sound of hitting the ball from far away.

PVI 2-5: I recognized the opponent starting to hit the ball, and I noticed the sound of the ball movement was different from the first. This allowed me to know that the opponent was playing hard in the game.

PVI2-3: I kept hearing the scratching sound from the table because the opponent had a strong desire to win.

People without visual impairment thought positively about people with visual impairment who eagerly participated in games. PNV 2–6 noted that the VR Showdown game would be helpful for other people with visual impairment.

PNV 2-6: There is nothing much to put energy into for people with visual impairment, so I liked seeing my partner player pour her energy into this game. From my experience, people with visual impairment are usually passive in playing a game, and I think it would be great to have more activities (like this game) to put their energy into.

Many people with visual impairment unconsciously made a sound like “wak” when they tried to hit the ball with force. These sounds made participants without visual impairment feel that those with visual impairment were enjoying the game.

PVI 2-14: The reason for the sound was to hit the ball and score a goal with a strong hit.

PVI 2-2: In the original Showdown game, players should be quiet and not make a loud sound to hear the ball sound. But since it doesn’t interfere with the sound of the ball, we can make a sound in this version of the game.

PNV 2-2: I noticed that people with visual impairment who made a noise were enjoying the game. This sound comes out involuntarily. It is only triggered when they are engaged in the game.

Participants encouraged each other by saying, “Let’s play hard!” Participants also said, “Hit it!” when the opponent was about to hit the ball. In one case, they cheered each other’s scoring.

PNV 2-1: (On the timing to hit for PVI) “Hit it!”

PVI 2-1: (When PNV 2-1 scored) “Oh! You got the goal!”

PVI 2-1: (When PNV scored a goal) “You got the goal in one shot!”

PVI 2-8: “Let’s play hard!”/PNV 2-8: “OK!”

After the game, participants with and without visual impairment had conversations about their game experience. PVI 2–9 and PNV 2–9 discussed holding the controller during the game. After the first match, PVI 2–10 and PNV 2–10 discussed how to hit the ball.

PVI 2-9: Did you have difficulty holding the controller?

PNV 2-9: I needed to hold a controller horizontally not vertically. That’s why I couldn’t block your ball. I only thought of scoring a goal, not of defending the ball.

PVI 2-10: Should we hit the ball when it comes closer?

PNV 2-10: As the ball comes closer, you will feel the vibration.

PVI 2-10: When do I feel the vibration?

PNV 2-10: When the ball comes, you feel the vibration.

PVI 2-10: I got it.

Most participants agreed that there were limited activities that people with and without visual impairment could play together. PVI 2–15 appreciated the opportunity to participate in inclusive games.

PVI 2-15: It’s great for people with visual impairment and those with normal vision.

9. Discussion

We introduced a new opportunity for people with visual impairment by developing a collaborative VR game and conducting experiments involving interactions between people with and without visual impairment. Our VR Showdown used an HMD to track the position and orientation of the player’s head and provided proper spatial audio feedback using HRTF technology. Compared with the previous single-event Virtual Showdown, which did not allow rallies (Wedoff et al., Citation2019), our PvA mode supports full rallies against an AI Agent. We also demonstrated that people with and without visual impairment could play the VR Showdown together and have rewarding social interactions. This is an important difference from previous studies that focused on the individual interactions of people with visual impairment (Siu et al., Citation2020; Wedoff et al., Citation2019; Zhao et al., Citation2018). We found that visually impaired individuals expressed and communicated their emotions to other players during games, and participants with visual impairment continued conversing with other players even after the game. In previous studies, playing games with others evoked meta-gaming (Mueller et al., Citation2009) and provided rich resources for talking with other players (Segal, Citation1994). Likewise, in our case, the experience of playing the VR Showdown allowed people with and without visual impairment to talk and engage in rewarding social interactions.

People with and without visual impairment enjoyed the VR Showdown remotely through network connections. The participants affected each other’s actions in real time by sending the shared object, the Showdown ball, into the other player’s area and forcing them to block it. In our experiment, participants developed strategies to score by reading the other person’s movement. One strategy was to send the ball to an empty space the opponent was not defending: if the opponent could be heard scratching the table with the racket, moving it randomly regardless of the ball’s position even with the ball in front of them, this signaled an opportunity for the player to score, as mentioned in the interviews.

This study provided insights into VR systems for people with visual impairment.

- Non-isomorphic VR interactions expand the VR experience of people with visual impairment. Two approaches have been considered for designing VR interfaces: isomorphic and non-isomorphic interactions (Knight, Citation1987). Isomorphic interactions closely mirror real-life interactions and thus support natural interaction. However, they are limited by the input devices and by human capabilities themselves (LaViola et al., Citation2017). For example, if the input device has a fixed tracking area, users cannot move beyond it, and humans cannot select objects located beyond arm’s reach. Non-isomorphic interaction methods were developed to overcome these limitations. Non-isomorphic interactions are not natural but often offer better usability and performance than isomorphic interactions (Bowman & Hodges, Citation1997; Poupyrev et al., Citation1998). For example, the Go-Go technique allows users to grasp a distant object by stretching their virtual hand without walking toward it (Poupyrev et al., Citation1996). For the VR Showdown, we developed a non-isomorphic interaction (i.e., holding) that is unavailable in real-world Showdown. The holding feature allows the player to attach the ball to the racket and drag it as intended while holding the controller’s trigger button. During the interviews, participants mentioned that the holding function was easy to understand and efficient for controlling the ball, allowing them to use various strategies to win the game. This non-isomorphic feature gives people with visual impairment a satisfying VR experience.

- Collaborative systems should give players a sense of synchronicity. De Kort and Ijsselsteijn (Citation2008) indicated that social presence is built on awareness of others through synchronicity. During the VR Showdown, ball positions were presented to both players synchronously, in real time, and the system provided narration to share the current status of the game. This synchronicity allowed people with and without visual impairment to experience co-existing in the same time and space, even though their real spaces were separate. A collaborative game should give participants a feeling of synchronicity by sharing game information in real time.

- By sharing the VR experience of people with visual impairment, people without visual impairment come to understand them better. Previous collaborative games for people with visual impairment have led players to focus on their own abilities. Kinaptics allows people with visual impairment to play a competitive game using multimodal feedback (e.g., sound, haptics, and wind), while people without visual impairment play using their vision (Grabski et al., Citation2016). Gonçalves et al. (Citation2021) created a collaborative game in which people with and without visual impairment played asymmetric roles: people with visual impairment relied on auditory feedback, whereas people without visual impairment used their visual capability. However, the researchers noted that it was difficult for participants to understand each other empirically because they could not experience each other’s tasks (Gonçalves et al., Citation2021). By contrast, our study provided the same virtual environment for both groups of participants. People without visual impairment could experience an environment in which sight was excluded and they had to depend on auditory information. In the interviews, many people without visual impairment mentioned that the VR Showdown helped them understand the sightless, sound-only environment that people with visual impairment experience, and that they could connect better with people with visual impairment by playing together in a virtual environment where sight was excluded.

10. Limitations

In Experiment 1, we found age differences between participants with and without visual impairment, which could affect their motor skills and be a confounding variable. To mitigate this, we matched the age groups used in Experiment 2 for a more controlled comparison. Additionally, we could not assess the users’ sense of presence and motion sickness in Experiment 1 because we did not conduct related questionnaires. We administered the SUS and SSQ in Experiment 2 to address this issue.

In Experiment 2, we blindfolded all participants, following the rules of a real-world Showdown game. However, participants with visual impairment had a wide spectrum of impairment and different prior visual experiences. Having them play the VR Showdown without blindfolds would have provided more accurate data by matching their experiences of playing other VR games. In our future work, an alternative design will be considered to provide well-matched daily experiences for people with and without visual impairment rather than blindfolding all participants.

11. Conclusions and future work

We developed a collaborative VR Showdown program accessible to the visually impaired. We used HRTF so that people with visual impairment could track a rapidly moving Showdown ball through auditory feedback, and we used the HTC Vive Pro™ to track participants’ head and hand positions and movements in real time. We tested the system with individuals with and without visual impairment. People with visual impairment could detect the position of the Showdown ball with an accuracy of over 90%, and two-thirds of the participants with visual impairment won their matches against people without visual impairment. In the PvP mode, all participants reported having fun and said they would recommend the game to other users. This study provides a potential avenue for further research on VR applications accessible to people with visual impairment. Future research should focus on enriching the interactive experience and expanding the range of e-sports for people with visual impairment by extending the system to other Paralympic games.

Acknowledgments

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (No. 2018R1D1A1A02085645); This work was also supported by a grant of the Korea Health Technology R&D Project through the Korea Health Industry Development Institute (KHIDI), funded by the Ministry of Health & Welfare, Republic of Korea (grant number: HI22C0619).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Korea Creative Content Agency [R2021040093]; Korea Medical Device Development Fund [NTIS9991006786, KMDF_PR_20200901_0113]; Ministry of Culture, Sports and Tourism and Korea Creative Content Agency [R2021040093]; Korea Health Industry Development Institute (KHIDI) [HI22C0619]; National Research Foundation of Korea (NRF) [2018R1D1A1A02085645].

Notes on contributors

Hojun Aan

Hojun Aan is currently a PhD candidate in the Department of Human-Computer Interaction at Hanyang University in Seoul, South Korea. His research interests include human-computer interaction (HCI), virtual, augmented, and mixed reality (VR/AR/MR), and gamification.

Sangsun Han

Sangsun Han is currently a postdoctoral researcher at the Bionics Research Center of the Korea Institute of Science and Technology. He is interested in digital therapies based on XR (eXtended Reality).

Kibum Kim

Kibum Kim is a full professor in the Department of Human-Computer Interaction at Hanyang University, South Korea. He received a bachelor’s degree from Korea University, a master’s degree from the University of Illinois at Urbana–Champaign, and a Ph.D. from Virginia Tech, all in computer science. His research focuses on HCI, CSCW, CSCL, VR/AR/MR, and computer education.

References

  • Arce, T. R., & McMullen, K. A. (2017). Hearing biochemical structures: Molecular visualization with spatial audio. ACM SIGACCESS Accessibility and Computing, 117(117), 9–13. https://doi.org/10.1145/3051519.3051521
  • Bimberg, P., Weissker, T., & Kulik, A. (2020). On the usage of the simulator sickness questionnaire for virtual reality research. 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA (pp. 464–467). https://doi.org/10.1109/VRW50115.2020.00098
  • Bowman, D. A., & Hodges, L. F. (1997). An evaluation of techniques for grabbing and manipulating remote objects in immersive virtual environments. Proceedings of the 1997 symposium on Interactive 3D graphics, Providence, Rhode Island, USA. https://doi.org/10.1145/253284.253301
  • Camras, L. (1980). The American Journal of Psychology, 93(4), 751–753. https://doi.org/10.2307/1422394
  • De Kort, Y. A., & Ijsselsteijn, W. A. (2008). People, places, and play: Player experience in a socio-spatial context. Computers in Entertainment (CIE), 6(2), 1–11. https://doi.org/10.1145/1371216.1371221
  • Eckert, M., Blex, M., & Friedrich, C. M. (2018). Object detection featuring 3D audio localization for Microsoft HoloLens - A Deep Learning based Sensor Substitution Approach for the Blind. Proceeding of 11th International Joint Conference on Biomedical Engineering Systems and Technologies, Funchal, Madeira, Portugal. https://api.semanticscholar.org/CorpusID:4308820
  • Gonçalves, D., Rodrigues, A., Richardson, M. L., Sousa, A. A. D., Proulx, M. J., & Guerreiro, T. (2021). Exploring asymmetric roles in mixed-ability gaming. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan. https://doi.org/10.1145/3411764.3445494
  • Gould, R. (2018). Let’s Test: 3D Audio Spatialization Plugins. Retrieved 09.17 from. https://designingsound.org/2018/03/29/lets-test-3d-audio-spatialization-plugins/
  • Grabski, A., Toni, T., Zigrand, T., Weller, R., & Zachmann, G. (2016). Kinaptic - techniques and insights for creating competitive accessible 3D games for sighted and visually impaired users. 2016 IEEE Haptics Symposium (HAPTICS), Philadelphia, PA, USA (pp. 325–331). https://doi.org/10.1109/HAPTICS.2016.7463198
  • Ha, G., Lee, H., Lee, S., Cha, J., & Kim, S. (2016). A VR serious game for fire evacuation drill with synchronized tele-collaboration among users. Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, Munich, Germany. https://doi.org/10.1145/2993369.2996306
  • Hong, Y., Bruniaux, P., Zeng, X., Liu, K., Chen, Y., & Dong, M. (2017). Virtual reality-based collaborative design method for designing customized garment for disabled people with scoliosis. International Journal of Clothing Science and Technology, 29(2), 226–237. https://doi.org/10.1108/IJCST-07-2016-0077
  • Ibayashi, H., Sugiura, Y., Sakamoto, D., Miyata, N., Tada, M., Okuma, T., Kurata, T., Mochimaru, M., & Igarashi, T. (2015). Dollhouse VR: A multi-view, multi-user collaborative design workspace with VR technology. In SIGGRAPH Asia 2015 Posters. http://doi.org/10.1145/2820926.2820948
  • IBSA. (2022). SHOWDOWN RULES 2022-2025. https://ibsasport.org/sports/Showdown/overview/
  • Imbriani, L., Mariani, I., & Bertolo, M. (2018). WaTa fight! How situated multiplayer competitive gaming can facilitate the inclusion of low vision and blind players. Game Journal - The Italian Journal of Game Studies, 7(1), 1–15. https://org/73f0dd3505034c6d89101131a530a7dc
  • Kennedy, R. S., Lane, N. E., Berbaum, K. S., & Lilienthal, M. G. (1993). Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. The International Journal of Aviation Psychology, 3(3), 203–220. https://doi.org/10.1207/s15327108ijap0303_3
  • Knight, J. (1987). Manual control and tracking. In G. Salvendy (Ed.), Handbook of human factors. John Wiley & Sons.
  • Koulouris, J., Jeffery, Z., Best, J., O’Neill, E., & Lutteroth, C. (2020). Me vs. Super(wo)man: Effects of customization and identification in a VR exergame. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA. https://doi.org/10.1145/3313831.3376661
  • Kreimeier, J., & Götzelmann, T. (2019). First steps towards walk-in-place locomotion and haptic feedback in virtual reality for visually impaired. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK. https://doi.org/10.1145/3290607.3312944
  • Larsen, C. H., Lauritsen, D. S., Larsen, J. J., Pilgaard, M., & Madsen, J. B. (2013). Differences in human audio localization performance between a HRTF- and a non-HRTF audio system. Proceedings of the 8th Audio Mostly Conference, Piteå, Sweden. https://doi.org/10.1145/2544114.2544118
  • LaViola, J. J., Jr., Kruijff, E., McMahan, R. P., Bowman, D., & Poupyrev, I. P. (2017). 3D user interfaces: Theory and practice. Addison-Wesley.
  • Loiacono, T., Trabucchi, M., Messina, N., Matarazzo, V., Garzotto, F., & Beccaluva, E. A. (2018). Social MatchUP: A memory-like virtual reality game for the enhancement of social skills in children with neurodevelopmental disorders. Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada. https://doi.org/10.1145/3170427.3188525
  • Michael, A., & Lutteroth, C. (2020). Race Yourselves: A longitudinal exploration of self-competition between past, present, and future performances in a VR exergame. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA. https://doi.org/10.1145/3313831.3376256
  • Millen, L., Cobb, S., Patel, H., & Glover, T. (2014). A collaborative virtual environment for conducting design sessions with students with autism spectrum disorder. International Journal of Child Health and Human Development, 7(4), 367–376. https://www.proquest.com/scholarly-journals/collaborative-virtual-environment-conducting/docview/1655287782/se-2
  • Morelli, T., Foley, J., Columna, L., Lieberman, L., & Folmer, E. (2010). VI-Tennis: A vibrotactile/audio exergame for players who are visually impaired. Proceedings of the Fifth International Conference on the Foundations of Digital Games, Monterey, California. https://doi.org/10.1145/1822348.1822368
  • Morelli, T., Foley, J., & Folmer, E. (2010). Vi-bowling: A tactile spatial exergame for individuals with visual impairment. Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, Orlando, Florida, USA. https://doi.org/10.1145/1878803.1878836
  • Morelli, T., Foley, J., Lieberman, L., & Folmer, E. (2011). Pet-N-Punch: Upper body tactile/audio exergame to engage children with visual impairment into physical activity. Proceedings of Graphics Interface 2011, St. John’s, Newfoundland, Canada (pp. 223–230).
  • Morelli, T., & Folmer, E. (2011). Real-time sensory substitution to enable players who are blind to play video games using whole body gestures. Proceedings of the 6th International Conference on Foundations of Digital Games, Bordeaux, France. https://doi.org/10.1145/2159365.2159385
  • Mueller, F. F., Gibbs, M. R., & Vetere, F. (2009). Design influence on social play in distributed exertion games. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA. https://doi.org/10.1145/1518701.1518938
  • Oda, O., Elvezio, C., Sukan, M., Feiner, S., & Tversky, B. (2015). Virtual replicas for remote assistance in virtual and augmented reality. Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, Charlotte, NC, USA. https://doi.org/10.1145/2807442.2807497
  • Poupyrev, I., Billinghurst, M., Weghorst, S., & Ichikawa, T. (1996). The go-go interaction technique: Non-linear mapping for direct manipulation in VR. Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, Seattle, Washington, USA. https://doi.org/10.1145/237091.237102
  • Poupyrev, I., Weghorst, S., Billinghurst, M., & Ichikawa, T. (1998). A study of techniques for selecting and positioning objects in immersive VEs: Effects of distance, size, and visual feedback. Proceedings of the ACM Conference on Human Factors in Computing Systems, Los Angeles, CA, USA.
  • Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161–1178. https://doi.org/10.1037/h0077714
  • Segal, L. D. (1994). Actions speak louder than words: How pilots use nonverbal information for crew communications. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 38(1), 21–40. https://doi.org/10.1177/154193129403800106
  • Simões, D., & Cavaco, S. (2014). An orientation game with 3D spatialized audio for visually impaired children. Proceedings of the ACM 11th Conference on Advances in Computer Entertainment Technology, Funchal, Portugal (pp. 1–4). https://doi.org/10.1145/2663806.2663868
  • Siu, A. F., Sinclair, M., Kovacs, R., Ofek, E., Holz, C., & Cutrell, E. (2020). Virtual reality without vision: A haptic and auditory white cane to navigate complex virtual worlds. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA. https://doi.org/10.1145/3313831.3376353
  • Thevin, L., Briant, C., & Brock, A. M. (2020). X-Road: Virtual reality glasses for orientation and mobility training of people with visual impairment. ACM Transactions on Accessible Computing, 13(2), Article 7, 1–47. https://doi.org/10.1145/3377879
  • Unity3D. (2020). Unity Asset Store. https://assetstore.unity.com/
  • Usoh, M., Catena, E., Arman, S., & Slater, M. (2000). Using presence questionnaires in reality. Presence: Teleoperators and Virtual Environments, 9(5), 497–503. https://doi.org/10.1162/105474600566989
  • Wedoff, R., Ball, L., Wang, A., Khoo, Y. X., Lieberman, L., & Rector, K. (2019). Virtual showdown: An accessible virtual reality game with scaffolds for youth with visual impairment. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK. https://doi.org/10.1145/3290605.3300371
  • Wegner, K., Seele, S., Buhler, H., Misztal, S., Herpers, R., & Schild, J. (2017). Comparison of two inventory design concepts in a collaborative virtual reality serious game. Extended Abstracts Publication of the Annual Symposium on Computer-Human Interaction in Play, Amsterdam, The Netherlands. https://doi.org/10.1145/3130859.3131300
  • Yoo, S., Ackad, C., Heywood, T., & Kay, J. (2017). Evaluating the actual and perceived exertion provided by virtual reality games. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, Denver, Colorado, USA. https://doi.org/10.1145/3027063.3053203
  • Yoo, S., Gough, P., & Kay, J. (2020). Embedding a VR game studio in a sedentary workplace: Use, experience and exercise benefits. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA. https://doi.org/10.1145/3313831.3376371
  • Zahorik, P., Brungart, D. S., & Bronkhorst, A. W. (2005). Auditory distance perception in humans: A summary of past and present research. Acta Acustica United with Acustica, 91(3), 409–420. https://www.ingentaconnect.com/content/dav/aaua/2005/00000091/00000003/art00003
  • Zhao, Y., Bennett, C. L., Benko, H., Cutrell, E., Holz, C., Morris, M. R., & Sinclair, M. (2018). Enabling people with visual impairment to navigate virtual reality with a haptic and auditory cane simulation. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada. https://doi.org/10.1145/3173574.3173690