ABSTRACT
Immersive virtual reality (IVR) takes advantage of exponential growth in our technological abilities to offer an array of new forms of entertainment, learning opportunities, and even psychological interventions and assessments. The field of creativity is a driving force in both large-scale innovations and everyday progress, and embedding creativity assessment in IVR programs has important practical implications for future research and interventions in this field. Creativity assessment, however, tends to rely either on traditional concepts or on newer, yet cumbersome, methods. Can creativity be measured within IVR? This study introduces the VIVA, a new IVR-based visual arts creativity assessment paradigm in which users create 3D drawings in response to a prompt. Productions are then rated with modern extensions of a classic product-based approach to creativity assessment. A sample of 67 adults completed the VIVA, and their productions were scored using item-response modeling. Results demonstrated the strong psychometric properties of the VIVA, including its structural validity, internal reliability, and criterion validity with relevant external measures. Together, these findings establish a solid proof of concept for the feasibility of measuring creativity in IVR. We conclude by discussing directions for future studies and the broader importance and impact of this line of work for the fields of creativity and virtual reality.
Acknowledgments
The preparation of this article was supported by grant RFP-15-05 to Baptiste Barbot from the Imagination Institute (www.imagination-institute.org), funded by the John Templeton Foundation. The opinions expressed in this publication are those of the authors and do not necessarily reflect the view of the Imagination Institute or the John Templeton Foundation. We thank Alexandra Blanchard, Brianna Heuser, Rosemary Leo, Catalina Mourgues, Danielle Saliani, and Mansi Shah for help with data collection.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Availability of data and material
Available from the corresponding author upon reasonable request.
Notes
1. In contrast to the experimental setting, the preview featured only a seated experience using the Oculus Rift.
2. The order of both alternate forms was counterbalanced.
3. Across all the ratings, there was only 1 missing rating (of aesthetic quality) for 1 product. This observation was excluded from investigations of model fit and reliability.
4. Rules of thumb and simulations in the SEM literature (e.g., Wolf et al., 2013) suggest that such simple models do not require more than 60 participants for stable estimation.
5. In fact, Tilt Brush allows exporting room-scale creations in a range of formats (e.g., .fbx, .usd, .json), which could let raters evaluate these productions as 3D objects, whether in IVR or not.