
Playful sport design: A game changer?

Pages 45-74 | Received 28 Mar 2022, Accepted 24 Feb 2023, Published online: 08 Mar 2023

Abstract

The aim of this paper is twofold. First, the concept of Playful Sport Design (PSD) is introduced, based on play and proactivity theories. PSD is defined as a proactive cognitive-behavioral orientation that makes athletes incorporate fun and self-oriented challenges into training sessions. Second, we develop and test an instrument to assess PSD. In phase 1 (N = 562), the PSD instrument is tested for its reliability, factorial validity, and construct validity. In phase 2 (N = 131), the test-retest reliability and predictive validity of the PSD instrument are examined. Additionally, the nomological network of PSD is expanded. In phase 3 (N = 212), the predictive validity of PSD is assessed further. As hypothesized, the results of factor analyses show that PSD is best represented by two dimensions: designing fun and designing competition. The psychometric properties of the scale were satisfactory. Providing evidence for convergent validity, PSD was positively related to playfulness, competitiveness, fantasy proneness, personal initiative, openness to experience, achievement striving, and fun seeking. Specifically, designing fun showed more robust relations with fun-focused personality traits, while designing competition showed stronger relations with competition-focused traits. Providing evidence for divergent validity, PSD did not share variance with negative affect, procrastination, and perfectionism. Finally, in support of predictive validity, athletes who playfully designed their training sessions reported better subjective and objective sports performance.

Lay summary: We introduce the concept Playful Sport Design (PSD), which refers to athletes’ proactive addition of play elements to their sports training. We present a scale that reliably and validly measures PSD. Further, we show that PSD is positively associated with immersion in sports training and sports performance.

    IMPLICATIONS FOR PRACTICE

  • The findings of this paper may raise awareness among adult athletes that play during sports training has beneficial outcomes.

  • Athletes can use PSD to increase flow experiences during training sessions and enhance sports performance.

  • The results of this paper open the door for practitioners to develop PSD interventions, through which athletes are taught how they can approach their training sessions in a more playful manner.

Frequent practice makes athletes more proficient in the execution of their sport. Nevertheless, time spent practicing is not the only decisive element in enhancing athletic skills—the design of the training sessions likewise affects training outcomes (Ericsson, 2020). Although it is generally accepted that athletes’ coaches influence the form and effectiveness of training (see Horn, 2008), athletes themselves may also design their training sessions and thereby affect their effectiveness. The latter has not yet been studied as extensively. According to the cognitive-behavioral approach, individuals’ thoughts and perceptions are central in determining effective behavior (Mace, 1990), suggesting that athletes can significantly impact their training sessions. This paper introduces the idea that one of the ways through which athletes shape sports practice is by playfully framing their training activities. We propose that those who playfully design their training sessions are more likely to create an optimal training experience and achieve higher levels of sports performance.

More specifically, the present study has two central aims. First, we introduce a new concept, “Playful Sport Design” (PSD), which we define as a proactive cognitive-behavioral orientation that makes adult athletes use ludic (i.e., amusement) and agonistic (i.e., challenge) play elements during their sports training. In doing so, we aim to extend previous research in the following ways. In contrast to earlier research that mainly focused on top-down approaches of including play in sports training (i.e., introduced and designed by a trainer, teacher, or coach; e.g., Arufe-Giráldez et al., 2022; Barreiro & Howard, 2017; Erickson et al., 2017; Launder & Piltz, 2013; Reed, 2020), we propose a proactive bottom-up approach in which athletes purposefully integrate play into their sports training. As play is a voluntary pursuit, we suggest that self-imposed play is most effective. Additionally, we build on previous work showing that the playful design of work tasks has beneficial outcomes (e.g., Bakker et al., 2020; Scharp et al., 2019). We show that playfully designing tasks is also relevant in another performance domain, namely the sports domain. Furthermore, we complement earlier research that established the benefits of several playful sports training forms during childhood (e.g., Côté et al., 2007; Erickson et al., 2017; Memmert et al., 2010) by showing that play during sports training can also be beneficial during adulthood. Moreover, by defining PSD as consisting of ludic and agonistic play elements, we integrate aspects of previous playful sports training forms that often focused primarily on one of these approaches.

The following sections offer a theoretical basis for PSD, drawing from the playful design of work tasks, proactivity, and (sport-specific) play theories and earlier empirical findings on play. PSD is then operationalized and placed within a conceptual framework of already existing athletic play-related and cognitive-behavioral concepts. Second, we propose a psychometrically sound and generic instrument to assess PSD. As such, we answer recent calls to develop contextualized play measures for adults (see Van Vleet & Feeney, 2015). The current study has three phases. In phase 1, we developed the PSD scale and started testing its validity and reliability (i.e., its factorial validity, reliability, and construct validity). We used a sample of adult athletes of various levels practicing a wide range of sports. In phase 2, this sample was approached again half a year later, and we evaluated additional validity and reliability indicators (i.e., test-retest reliability, construct validity, and predictive validity) with longitudinal data. Finally, in phase 3, we set up a study among runners in which the predictive validity of PSD was assessed further. Additionally, the factorial validity was tested once more for runners specifically.

Theoretical background

The definition of play

Research has defined play as a voluntary pursuit (Caillois, 1961; Huizinga, 1949), which is oriented toward amusement, enjoyment, or fun (Van Vleet & Feeney, 2015). Play has an artificial quality (Burke, 1971) and is structured through relevant rules (Huizinga, 1949; Suits, 1977) or other game elements (Caillois, 1961; Huizinga, 1949). It has been argued that playing is a universally enjoyable experience and a critical part of human nature (Huizinga, 1949). Similarly, Csikszentmihalyi (1975) proposed that play initiatives are driven not merely by future benefits but rather by the process of reaching self-set goals. In other words: the play experience is autotelic (Csikszentmihalyi, 1990). Play can take two different forms: ludic play, related to amusement and fun, and agonistic play, linked with challenges and competition (e.g., Huizinga, 1949; Scharp et al., 2022). The former relates to childlike fun and the latter form is more serious and rational.

Play in sport

Play is essential in sport and has been studied for years. That is, it has been suggested that the defining aspects of play and sport partly overlap, although play is not a necessary condition for sport (Meier, 1988; Suits, 1988). Sports can be placed on a continuum of play, ranging from very few to many play features. Sports that contain more play elements are often described using play language (e.g., “I am playing softball”; Mareš & Ryall, 2021), while sports that contain fewer play elements are often not referred to in this way (e.g., “I am running”). Certain attitudes may also stimulate play in sport. It has been argued that, to realize play, athletes should adopt a lusory attitude, in which they consciously know and accept that they are playing a game and tacitly accept its rules (Suits, 1988). Games refer to a subset of play in which one plays sport in an organized environment (see Stenros, 2017). Play during sport may also be stimulated by athletes adopting a playful attitude that is intentionally and voluntarily open and responsive to the play situation (see Mareš & Ryall, 2021).

Play is known to be essential in children’s sports training. In general, the importance of (coaches) integrating fun, exploratory behaviors, and challenges into children’s sports training to increase learning and prolonged sport participation has been established (e.g., Chow et al., 2019; Reed, 2020; Visek et al., 2015). More specifically, children may engage in several forms of play during sports training. Children may engage in unstructured forms in which they intend to have fun and, to some extent, decide how they play. An example is deliberate play, which refers to developmental physical activities that are intrinsically motivating and provide gratification and enjoyment (e.g., Côté et al., 2007). Another example is free play, which is entirely driven by the youth and takes place outside organized sports training (e.g., playing football on the streets; Barreiro & Howard, 2017). These unstructured play forms often include various ludic (i.e., fun) play elements. Children may also engage in structured forms of play, in which they play with a specific learning objective and have less freedom in how they play, such as structured play (e.g., Kingston et al., 2020), play practice (e.g., Launder & Piltz, 2013), gamification (e.g., Arufe-Giráldez et al., 2022), and teaching games for understanding (e.g., Griffin & Butler, 2005). These play forms are often characterized by agonistic (i.e., competitive) play elements. Play during and outside children’s sports training has been shown to predict positive outcomes such as children’s intrinsic motivation for sport, learning, sports performance, and the acquisition of sports expertise in later life (Barreiro & Howard, 2017; Côté et al., 2007; Erickson et al., 2017; Kingston et al., 2020; Memmert et al., 2010). Over one’s lifetime, a decline in play and fun in sports training is often observed: adults engage in less playful sports training than children and adolescents (Erickson et al., 2017).

Play in adulthood

Although a decline in playful forms of sports training from childhood to adulthood is common, it has been suggested that playing is not merely limited to children in its functionality and serves important purposes in adult life (e.g., Bakker et al., 2020; Proyer, 2012a; Van Vleet & Feeney, 2015). Play can be a way to discharge a surplus of energy, alleviate stress, spark feelings of enjoyment and excitement, enhance flexible thinking and problem-solving skills, increase motivation, and enhance social interaction and learning (Deterding, 2016; Van Vleet & Feeney, 2015). In line with these suggestions, work and organizational psychologists have recently studied the benefits of play during work (Bakker et al., 2020; Petelczyc et al., 2018). The concept of Playful Work Design (PWD) was introduced, which refers to employees who proactively and playfully design their work tasks. It was shown that on days employees playfully designed their work tasks, they were more creative, performed better, and had more positive work experiences (Bakker et al., 2020; Scharp et al., 2019).

Building on this recent work on playful work design and earlier studies on the benefits of play during children’s sports training, we suggest that playfully approaching sports training may also benefit athletes in adulthood. This suggestion is in line with earlier propositions that findings within the work context may advance our understanding of athletic performance (Balk et al., 2019; DeFreese & Smith, 2013; Wagstaff, 2019). Although sport is inherently more playful, fun, autotelic, and voluntary than work, work and sport also overlap. Both employees and athletes engage in goal-directed behavior, spend time and effort on improving their performance, operate in a complex social and organizational environment, and face demands and resources (Balk et al., 2019). Given the commonalities between work and sport, we suggest that the idea of playfully and proactively designing activities and tasks fits an athletic context too.

The development of the concept playful sport design

Building on the literature on playful design of work tasks, proactivity and (sport-specific) play theories, and earlier empirical findings on play, this paper introduces the concept of Playful Sport Design (PSD). PSD is defined as a proactive cognitive-behavioral orientation that makes athletes incorporate ludic and agonistic play elements into their athletic training sessions. We argue that introducing (additional) play elements into athletic training sessions may positively impact the effectiveness of athletic training.

We suggest that PSD has three critical features. First, an athlete proactively takes charge of the design of the training session. This makes PSD a bottom-up orientation, referring to a situation in which the actor, in this case the athlete, designs the training to fit their needs and skills (Parker et al., 2010). Proactive refers to self-initiated efforts to bring about change in the situation or oneself (Parker et al., 2010). Accordingly, the term design in PSD was chosen, since the verb design is described as “to create, fashion, execute, or construct according to plan” (Merriam-Webster, n.d., Definition 1), which reflects this proactive element (see Bakker et al., 2020; Scharp et al., 2022). PSD contrasts with top-down initiatives, during which someone other than the actor, for example the athlete’s coach, shapes or facilitates the training experience of the athlete (e.g., Barreiro & Howard, 2017; Côté et al., 2007; Erickson et al., 2017; Launder & Piltz, 2013; Reed, 2020). Motives to design one’s environment can be derived from self-expansion theory (Aron & Aron, 1996), which states that people aspire to develop themselves by acquiring new skills and resources. Additionally, as play is defined as a voluntary pursuit (e.g., Caillois, 1961; Huizinga, 1949), forcing it upon the actor may cause an experience of “mandatory fun,” reducing the effectiveness of play (Mollick & Rothbard, 2014).

Second, in line with the so-called play-as-approach stream in the play literature (Mainemelis & Ronson, 2006; Scharp et al., 2022), we contend that PSD is an approach toward performing an activity. That is, any type of activity, even the most tedious task, can be reframed into play. This means that PSD embodies an orientation through which athletes frame all types of training activities in a playful way. Consequently, PSD can best be described as a cognitive-behavioral orientation. During PSD, athletes approach training tasks using a playful cognitive frame, allowing for new thoughts and perceptions, ultimately resulting in behavioral change. For instance, an athlete may deliberately create a cognitive play element (e.g., “Each runner that I pass earns me one point and each runner that passes me costs me one point.”) and change their behavior during training sessions accordingly (e.g., increase running pace when other runners are around). In this example, the athlete uses fantasy, takes up a challenge, and sets a goal by trying to win more points than they lose, which fits PSD’s definition.

Third, based on assertions that play can take two different forms (e.g., Huizinga, 1949; Scharp et al., 2022), we suggest that PSD can be described through both ludic play elements, related to amusement and fun, and agonistic play elements, linked with challenges and competition. Accordingly, we argue that playfully designing training tasks can be classified into one of two variations: designing fun (i.e., ludic play) and designing competition (i.e., agonistic play). Designing fun involves a lighthearted, playful mindset that revolves around amusement, imagination, creativity, and fantasy. For example, one could frame a training session in a playful way by imagining that one is participating in a world-renowned competition (e.g., the Olympics). Designing competition comprises a playful frame that fosters enjoyment by extending one’s abilities, for example, creating self-oriented challenges (i.e., focused on outperforming oneself rather than others), keeping track of performance, and aiming to set time records. An example of designing competition is creating playful challenges, such as maintaining a heart rate of 130 BPM during a training task.

We suggest that these three features make PSD fundamentally distinct from existing play-related sport concepts. That is, the bottom-up approach of PSD contrasts with other top-down approaches of including play in sports training, such as structured play, gamification, and teaching games for understanding, in which playful tasks are introduced and designed by a trainer, teacher, or coach (Arufe-Giráldez et al., 2022; Barreiro & Howard, 2017; Côté et al., 2007; Deterding et al., 2011; Launder & Piltz, 2013; Reed, 2020). PSD also deviates from free play (play driven entirely by the youth; Barreiro & Howard, 2017) and deliberate play (Erickson et al., 2017). Free play and deliberate play are primarily bottom-up approaches, just like PSD. However, free play is spontaneous rather than proactive. Furthermore, free and deliberate play often take place outside organized settings, while PSD can take place in both organized and unorganized settings. Further, these play forms are believed to only work with children. PSD, on the other hand, is employed by adult athletes. Moreover, PSD consists of fun and competition elements, while other play forms primarily focus on one of these elements. PSD is also unique compared to a playful or lusory attitude in sport, since these attitudes refer to mental frames for approaching the sport activity (Mareš & Ryall, 2021; Suits, 1988) rather than the cognitions and behaviors that follow from such mental frames. PSD also differs from other athletic cognitive-behavioral strategies, such as goal-setting. Although goal-setting can be part of PSD, it is neither a necessary nor a defining condition of PSD. Furthermore, one creates goals during PSD to make practice more play-like—not solely to enhance performance, as is argued to be the case for goal-setting (Locke & Latham, 1990).

Phase 1: validity and reliability

Factorial validity of PSD

In line with the aforementioned duality of play (Huizinga, 1949; Scharp et al., 2022), we expect PSD to be best represented by two distinct factors—designing fun and designing competition (Hypothesis 1).

Construct validity of PSD

Phase 1 aims to establish convergent validity between PSD and four personality characteristics: playfulness, fantasy proneness, competitiveness, and personal initiative. Convergent validity is achieved when theoretically congruent, but unique, constructs are related to PSD (Campbell & Fiske, 1959). Playfulness is a disposition to reframe everyday situations such that one experiences them as entertaining, intellectually stimulating, or personally interesting (Proyer, 2017). While playfulness relates to a trait or behavioral tendency (Proyer, 2017; Van Vleet & Feeney, 2015), PSD relates to the cognitions and subsequent acts of playing. Although PSD and playfulness are distinct, there should arguably be shared variance due to their theoretical similarities of playfully approaching tasks (Hypothesis 2). In a similar vein, fantasy proneness captures one’s general aptitude to use imagination and daydream in various situations (Costa & McCrae, 1992). PSD is context-specific and encompasses the succeeding cognitions and behaviors related to fantasy during athletic training sessions. Accordingly, we hypothesize that PSD shares variance with fantasy proneness due to conceptual commonalities (Hypothesis 3). Further, competitiveness is defined as the desire to win in interpersonal situations (Helmreich & Spence, 1978). Individuals with high trait competitiveness are driven to achieve superior performance, set higher performance goals, and generally approach situations more competitively (Hinsz & Jundt, 2005). Trait competitiveness usually manifests itself through improving performance relative to others, whereas during PSD athletes focus on outperforming themselves. Notwithstanding, trait competitiveness and PSD show conceptual commonalities and are therefore expected to share variance (Hypothesis 4). Finally, personal initiative is a tendency toward proactive behavior, exemplified by active undertakings, self-starting behavior, going beyond what is required, and actively shaping one’s environment (Frese et al., 1997). As one of the defining features of PSD is its bottom-up nature, we expect PSD to share variance with personal initiative (Hypothesis 5). Nevertheless, PSD is context-specific to sport and more oriented toward proactive integration of play elements compared to personal initiative.

Furthermore, phase 1 intends to establish divergent validity of the PSD instrument. Divergent validity is found when theoretically dissimilar constructs share little or no variance (Campbell & Fiske, 1959). That is, if the PSD instrument accurately assesses what it intends to measure, PSD scores should not be related to theoretically distinct concepts. In our study, we included negative affect to test divergent validity of the newly developed instrument. Negative affect reflects a propensity to experience a high frequency of negative emotions (Thompson, 2007). PSD should not capture a predisposition toward emotional states but should merely reflect the frequency with which athletes proactively use cognitive-behavioral strategies to integrate play elements into training sessions. Therefore, we anticipate that these concepts are not, or only marginally, related (Hypothesis 6)—demonstrating the instrument’s ability to purposefully capture PSD only.

Method

Participants and procedure

The study was pre-approved by and complied with guidelines of a Dutch University’s ethical committee (application number 20-071). We aimed to include adult athletes of any level of experience who train regularly. Consequently, individuals were eligible to participate when they were 18 years or older and trained at least once a week. We followed an inclusivist sport perspective (see Kobiela, 2018), allowing for the participation of those who practiced physical skill sports (i.e., sprint and endurance sports), bodily control skill sports (i.e., agility sports), and cognitive skill sports (i.e., mind sports). Athletes were recruited to complete an online questionnaire through social media and local sports associations. A total of 594 individuals participated in the study after providing informed consent. However, 32 participants were excluded from further analysis due to being younger than 18 (n = 19) or not training at least once per week (n = 13).

The final sample (N = 562) allowed us to detect small to medium effect sizes (.15) with a power of .95 and an alpha of .05 (Faul et al., 2007). The sample included slightly more athletes who identified as male (57.5%) than female (42.3%). The mean age was 41.46 (SD = 16.78), ranging between 18 and 78. On average, participants in this sample trained 2.97 (SD = 1.61) days a week. Moreover, the sample included athletes who participated in professional sports leagues on a regional (12.5%), national (15.5%), or international (4.8%) level. Numerous different types of sports were practiced by the athletes, for example: ball sports (19.9%), watersports (15.7%), mind sports (8.7%), athletics (8.7%), agility sports (6.6%), cycle sports (5.0%), combat sports (5.3%), gymnastics (3.9%), weightlifting (3.7%), racket sports (1.4%), winter sports (0.4%), and equestrian sports (0.4%). On average, the athletes in this sample practiced the same sport for 18.67 (SD = 15.28) years, ranging between 1 and 67 years of experience.
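For readers who want to reproduce this kind of sensitivity calculation, the minimal sketch below approximates the smallest detectable correlation for a given sample size via the Fisher r-to-z approximation. The authors report using G*Power (Faul et al., 2007), so the values produced here are an approximation of, not a substitute for, their procedure.

```python
# Approximate the minimum detectable correlation (two-tailed) for a given N,
# alpha, and power, using the Fisher r-to-z approximation.
from math import sqrt, tanh

from scipy.stats import norm


def min_detectable_r(n: int, alpha: float = 0.05, power: float = 0.95) -> float:
    """Smallest |r| detectable with a two-tailed test at the given power."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    # Solve atanh(r) * sqrt(n - 3) = z_alpha + z_power for r.
    return tanh((z_alpha + z_power) / sqrt(n - 3))


if __name__ == "__main__":
    for n in (562, 131, 212):  # sample sizes of phases 1-3
        print(f"N = {n}: minimum detectable r ≈ {min_detectable_r(n):.2f}")
```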

PSD instrument development

Our instrument is based on the Playful Work Design (PWD) instrument, which measures the proactive design of work tasks (Scharp et al., 2019). Due to the conceptual commonalities of the two concepts, an adapted form of the already validated PWD instrument was created to assess PSD (see Scharp et al., 2022). The PWD instrument included 12 items and two dimensions: designing fun and designing competition. To evaluate whether the PWD items fitted an athletic context and were generic enough to cover all sports, (elite) athletes and sports psychologists were consulted. As a result, the wording of the 12 PWD items was slightly adjusted to fit an athletic setting. The terms “work,” “work tasks,” and “job” were replaced by “training sessions” or “training activities,” and the item “I look for humor in the work tasks I need to do” was replaced by “I look for fantasy in the things I need to do during training sessions” (see Table 1 for item formulation). The general instruction of the items was: “The following statements refer to self-initiated activities you may engage in during your sports training. Please indicate to what extent you show each of the following behaviors.” Athletes were asked to report the frequency of PSD behavior on a seven-point scale since this number of response options yields reliable item scores and is regarded as easy to use (Preston & Colman, 2000). The response options were all verbally anchored and ranged from never to always.

Table 1. Results of the analysis of item parameters, EFA, and CFA phase 1.

Materials and measures

Participants were asked to report the extent to which they agreed with each statement on a seven-point scale (1 = totally disagree, 7 = totally agree), unless indicated otherwise.

PSD was assessed with the newly developed 12-item Playful Sport Design Instrument (see Table 1 for item formulation). Participants were asked to indicate how often they engaged in PSD on a seven-point frequency scale (1 = never, 7 = always).

Playfulness was assessed using the Short Measure for Adult Playfulness (SMAP; Proyer, 2012b; α = .90). The instrument consisted of five items, for example: “It does not take much for me to change from a serious to a playful frame of mind.”

Competitiveness was assessed using the competitive subscale of the Work and Family Orientation Questionnaire (Helmreich & Spence, 1978; α = .69). The instrument is composed of six items (e.g., “I accept challenging tasks”).

Fantasy Proneness was assessed through the six-item Capacity for Fantasy and Imagination Scale (Costa & McCrae, 1992; α = .87). An example item is: “I feel like my imagination can run wild.”

Personal Initiative was operationalized through a seven-item measure developed by Frese et al. (1997; α = .84). An example item is: “I actively attack problems.”

Negative Affect was operationalized through the five-item negative affect instrument of the brief Positive and Negative Affect Schedule (PANAS; Thompson, 2007; α = .81). An example item is: “How often do you feel upset?” Participants answered the questions using a seven-point frequency scale (1 = never, 7 = always).
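As a concrete illustration of how the scale scores described above could be computed from raw responses, the sketch below averages hypothetical item columns into the two PSD subscale scores and a total score. The column names and their assignment to dimensions are assumptions made for illustration only; the actual item order is given in Table 1 of the article.

```python
import pandas as pd

# Hypothetical column names; which items load on which dimension is assumed
# here purely for illustration and does not reproduce the published item order.
FUN_ITEMS = [f"psd_{i}" for i in range(1, 7)]           # designing fun
COMPETITION_ITEMS = [f"psd_{i}" for i in range(7, 13)]  # designing competition


def score_psd(responses: pd.DataFrame) -> pd.DataFrame:
    """Add subscale and total scores (means of the 1-7 frequency ratings)."""
    out = responses.copy()
    out["designing_fun"] = out[FUN_ITEMS].mean(axis=1)
    out["designing_competition"] = out[COMPETITION_ITEMS].mean(axis=1)
    out["psd_total"] = out[FUN_ITEMS + COMPETITION_ITEMS].mean(axis=1)
    return out
```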

Strategy of analysis

Hypothesis 1 (factorial validity) was tested using Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). To avoid overlap of the sample for the EFA and the CFA, the sample was split into two random groups, containing 281 cases each. Chi-square tests and a MANOVA showed that there were no significant differences between the two sample groups on the aforementioned participant characteristics and PSD mean scores, indicating that the random allocation of participants into two groups was successful. The sample size provided enough statistical power for factor analyses (Comrey & Lee, 1992). The EFA was carried out in IBM SPSS Statistics 25 and the CFA was conducted in the IBM SPSS AMOS 26 Graphics software package. Model fit to the data was assessed through three fit indices for the CFA: Comparative Fit Index (CFI), Tucker-Lewis Index (TLI), and Root Mean Square Error of Approximation (RMSEA). The general rule of thumb was applied for CFI and TLI, for which ≥.90 indicates a reasonable fit and ≥.95 indicates a good fit (Hoyle, 1995). RMSEA is acceptable at ≤.10, although ≤.05 indicates a closer fit (Lai & Green, 2016). In the statistical software package IBM SPSS Statistics 25, we calculated Pearson correlations between PSD and several personality traits to evaluate construct validity (Hypotheses 2-6). Additionally, we calculated z-scores to explore whether a PSD dimension (designing fun versus designing competition) was differently related to these personality traits. Further, reliability was assessed using Cronbach’s alpha.
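The analyses were run in SPSS and AMOS. As an illustration of the described EFA steps (Bartlett's test of sphericity, the KMO measure, maximum-likelihood extraction, promax rotation), a minimal open-source sketch could look as follows, assuming a data frame `items` that holds the 12 PSD item responses; this is not the authors' actual syntax.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)


def run_efa(items: pd.DataFrame, n_factors: int = 2):
    """EFA mirroring the reported steps: Bartlett's test, KMO,
    maximum-likelihood extraction, and promax (oblique) rotation."""
    chi_square, p_value = calculate_bartlett_sphericity(items)
    _, kmo_total = calculate_kmo(items)
    print(f"Bartlett chi2 = {chi_square:.2f} (p = {p_value:.3f}), KMO = {kmo_total:.2f}")

    efa = FactorAnalyzer(n_factors=n_factors, method="ml", rotation="promax")
    efa.fit(items)
    loadings = pd.DataFrame(efa.loadings_, index=items.columns)
    variance = efa.get_factor_variance()  # variance, proportion, cumulative
    return loadings, variance
```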

Results

Factorial validity

Exploratory factor analysis

The results of Bartlett’s test of sphericity (χ2 = 1688.08, df = 66, p < .001) and the Kaiser-Meyer-Olkin measure of sampling adequacy (.84) provided solid justification for factor analysis. The skewness and kurtosis z values for the items indicated that the data were approximately normally distributed. Maximum likelihood estimation was used for the EFA, as it has been suggested to be the best extraction method when data are normally distributed (Costello & Osborne, 2005). Since the proposed dimensions of PSD were expected to be correlated, oblique rotation (promax) was employed. To strengthen the argument for the number of factors, a principal components parallel analysis was performed. The final solution consisted of two factors and 12 items (see Table 1). All items loaded on the intended factor (higher than .30) and did not show cross-loadings, so no items were removed (Costello & Osborne, 2005). The two-factor solution explained 58.34% of the variance. The first factor explained 43.97% of the variance (Eigenvalue = 5.28). Items related to using creativity, fantasy, imagination, and making training activities more fun loaded on this factor. The second factor explained 14.37% of the variance (Eigenvalue = 1.73). Items that loaded on this factor related to approaching a training session as an exciting challenge, competing with oneself, pushing oneself to do better, and trying to set time records. Therefore, we considered factor one to represent designing fun (six items) and factor two to represent designing competition (six items). The factor loadings ranged between .53 and .93 for designing fun and between .42 and .90 for designing competition (see Table 1). The two factors were positively interrelated (r = .61, p < .001).
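The principal components parallel analysis mentioned above can be sketched as follows. This is a generic implementation (comparing observed eigenvalues against the mean eigenvalues of random normal data of the same dimensions), not the authors' exact routine.

```python
import numpy as np


def parallel_analysis(items: np.ndarray, n_iter: int = 1000, seed: int = 1):
    """Principal-components parallel analysis: retain components whose
    observed eigenvalues exceed the mean eigenvalues of random data."""
    rng = np.random.default_rng(seed)
    n_obs, n_vars = items.shape
    observed = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]

    random_eigs = np.empty((n_iter, n_vars))
    for i in range(n_iter):
        random_data = rng.standard_normal((n_obs, n_vars))
        corr = np.corrcoef(random_data, rowvar=False)
        random_eigs[i] = np.linalg.eigvalsh(corr)[::-1]

    threshold = random_eigs.mean(axis=0)
    n_retain = int(np.sum(observed > threshold))
    return n_retain, observed, threshold
```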

Confirmatory factor analysis

Models with one and two latent factors were tested through maximum likelihood estimation. The latent variables were allowed to covary. In the initial analysis of the CFA, both uncorrected Model 1 and Model 2 showed fit issues (see Table 2). To construct the corrected models, three covariances were added between error terms for items belonging to the same factor, namely items 1 and 2 (MI = 30.45, p < .001), items 6 and 10 (MI = 20.85, p < .001), and items 11 and 12 (MI = 58.66, p < .001). These pairs of items showed considerable overlap in item formulation, providing justification for adding covariances between error terms rather than removing the items (cf. Jöreskog & Sörbom, 1993). Additionally, we considered it best to keep the items and add covariances, because these items theoretically align with and thus are true reflections of PSD. Furthermore, the items correlated considerably with their associated dimension score, as well as the PSD total score (see Table 1). The two-factor model provided a better fit than the one-factor model for both the corrected (Δχ2 (1) = 227.78, p < .001) and uncorrected models (Δχ2 (1) = 262.03, p < .001). Hence, Hypothesis 1 was supported: a two-factor model accurately represented the observed data. Factor loadings ranged between .57 and .88 for designing fun and between .37 and .85 for designing competition (see Table 1). The two factors were positively related (r = .70, p < .001).
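The corrected two-factor model, including the three residual covariances described above, might be specified as follows in an open-source SEM package (the authors used AMOS). The item names are hypothetical, and the chi-square difference function mirrors the reported nested model comparison.

```python
import pandas as pd
import semopy
from scipy.stats import chi2

# Hypothetical item names; the three residual covariances correspond to the
# item pairs flagged by the modification indices reported in the text
# (items 1-2, 6-10, and 11-12).
TWO_FACTOR_CORRECTED = """
designing_fun =~ psd_1 + psd_2 + psd_3 + psd_4 + psd_5 + psd_6
designing_competition =~ psd_7 + psd_8 + psd_9 + psd_10 + psd_11 + psd_12
designing_fun ~~ designing_competition
psd_1 ~~ psd_2
psd_6 ~~ psd_10
psd_11 ~~ psd_12
"""


def fit_cfa(data: pd.DataFrame, description: str) -> pd.DataFrame:
    """Fit a CFA by maximum likelihood and return fit statistics
    (chi-square, CFI, TLI, RMSEA, among others)."""
    model = semopy.Model(description)
    model.fit(data)
    return semopy.calc_stats(model)


def chi_square_difference(chi2_restricted: float, df_restricted: int,
                          chi2_full: float, df_full: int):
    """Nested model comparison, e.g., one-factor versus two-factor model."""
    d_chi2 = chi2_restricted - chi2_full
    d_df = df_restricted - df_full
    return d_chi2, d_df, chi2.sf(d_chi2, d_df)
```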

Table 2. Model fit results of CFA phase 1.

Reliability

Internal consistency

The internal consistency of scores generated by the instrument for designing fun and designing competition were α = .88 and α = .83, respectively. The internal consistency of the overall PSD score generated by the instrument was α = .88. The scores yielded by the PSD instrument were therefore found to be reliable for a heterogeneous sample in a sports context.
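Cronbach's alpha, as reported here and in the later phases, can be computed directly from the item scores; the sketch below is a standard textbook implementation, not the authors' SPSS output.

```python
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```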

Validity

Construct validity

The results of the bivariate correlation analysis, means, and standard deviations can be found in Table 3. PSD was positively related to playfulness, supporting Hypothesis 2. Additionally, we explored the relations between the two dimensions of PSD and playfulness. Designing fun was more strongly related to playfulness than designing competition (z = 5.09, p < .001), although both dimensions showed a positive relation to this trait. Further, PSD was positively related to fantasy proneness. Consequently, Hypothesis 3 was supported. Both dimensions of PSD were positively related to fantasy proneness. Designing fun’s relation to fantasy proneness was significantly stronger than designing competition’s (z = 2.63, p = .009). Hypothesis 4 was also supported, as PSD was positively related to competitiveness. Designing fun and designing competition both had a positive relation with competitiveness, although designing competition showed a more robust correlation (z = 5.99, p < .001). In line with our expectations, PSD showed a positive relationship with personal initiative, supporting Hypothesis 5. Designing competition had a stronger relationship with personal initiative than designing fun (z = 2.61, p = .009), although both dimensions were positively related to personal initiative. Finally, PSD lacked a significant relationship with negative affect, indicating that divergent validity between PSD and negative affect was indeed present. Thus, Hypothesis 6 was supported as well. Designing fun turned out to be unrelated to negative affect (r = .02, p = .469), and designing competition was weakly and positively related to negative affect (r = .08, p = .048).
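The paper does not specify which z-test was used to compare the dimension-specific correlations. One common choice for comparing two correlations that share a criterion variable is the Meng, Rosenthal, and Rubin (1992) test, sketched below as an assumption about the procedure rather than a reproduction of it.

```python
from math import atanh, sqrt

from scipy.stats import norm


def compare_dependent_correlations(r1: float, r2: float, r12: float, n: int):
    """Meng, Rosenthal, and Rubin (1992) z-test for two dependent correlations
    r1 = r(x1, y) and r2 = r(x2, y) that share the criterion y, where r12 is
    the correlation between x1 and x2 and n is the sample size."""
    rbar_sq = (r1 ** 2 + r2 ** 2) / 2
    f = min((1 - r12) / (2 * (1 - rbar_sq)), 1.0)
    h = (1 - f * rbar_sq) / (1 - rbar_sq)
    z = (atanh(r1) - atanh(r2)) * sqrt((n - 3) / (2 * (1 - r12) * h))
    p = 2 * norm.sf(abs(z))
    return z, p
```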

Table 3. Results of the bivariate correlation analysis Phase 1 and 2.

Summary of results phase 1

In this first phase, we developed the PSD scale based on a recent scale that measured playful design of tasks in a work context (PWD; Scharp et al., 2019). We showed that PSD, like PWD, consists of two dimensions, designing fun (i.e., reflecting ludic play elements such as proactive use of imagination, creativity, and fantasy during sports training) and designing competition (i.e., reflecting agonistic play elements such as proactively creating self-oriented challenges and deliberately trying to set time records). As regards convergent validity, PSD was indeed related to measures it should theoretically relate to (competitiveness, fantasy proneness, personal initiative, and playfulness). Finally, we found that the PSD instrument does not unintentionally capture variance related to negative affect.

Phase 2: further insight into psychometric properties

Further assessment of convergent validity

In phase 2, we aimed to deepen our understanding of PSD’s psychometric properties (i.e., the test-retest reliability, construct validity, and predictive validity). To do so, we approached phase 1’s sample again half a year later, which also enabled us to use longitudinal data to assess test-retest reliability and predictive validity.

To increase our understanding of PSD’s convergent validity, in phase 2, we considered additional personality traits. As PSD consists of two dimensions, designing fun and designing competition, we argue that PSD should share variance with fun-seeking and achievement striving. Fun-seeking individuals crave fun sensations and are willing to try new things if they think it will be amusing (Carver & White, 1994). Unlike trait fun-seeking, PSD reflects cognitions and behaviors related to integrating fun elements into athletic training sessions. Although conceptually distinct, PSD and trait fun-seeking are expected to share variance due to their theoretical similarities (Hypothesis 7). On the other hand, achievement striving refers to a predisposition to do more than is expected, demand high quality, and turn plans into action (Costa & McCrae, 1992). Along similar lines, PSD captures cognitive-behavioral strategies that integrate play elements, such as aiming to improve upon one’s previous performance. Therefore, we expect PSD and achievement striving to share variance (Hypothesis 8), albeit that PSD is conceptualized specifically for sport and is more autotelic than achievement striving. Additionally, qualities of PSD should allow for establishing convergent validity with openness to experience. Those who are open to experience indulge in fantasy, prefer novelty, and are more able to come up with new ideas that challenge the status quo (McCrae & Costa, 1997). Likewise, PSD captures cognitions and behaviors of athletes who unconventionally and playfully approach training tasks. Due to the congruence between these concepts, PSD and openness to experience should share variance (Hypothesis 9).

The divergent validity of the PSD instrument is also further considered. We examine relations between PSD, perfectionism, and procrastination. Perfectionism (i.e., self-critical perfectionism) refers to striving for flawlessness and being afraid of making mistakes (Feher et al., 2020). In contrast, PSD should not capture impeccable execution of training tasks but should rather relate to improving oneself for the sake of playing. Therefore, these constructs should not, or only marginally, be related (Hypothesis 10). Procrastination is defined as a type of behavior that makes individuals put off the execution of tasks (Tuckman, 1991). PSD, in turn, is not conceptualized around directly executing or postponing tasks and should, therefore, not share variance with procrastination (Hypothesis 11).

Predictive validity

In addition to deepening our understanding of PSD’s convergent validity, phase 2 aims to establish predictive validity between PSD, flow, and athletic performance. Flow is a peak experience of concentration and is a highly desirable state for athletes (Swann, 2016) because it allows for the effortless execution of tasks (Ravizza, 1977) and personal growth (Csikszentmihalyi, 1988). Flow in sport can be accomplished through a balance in challenge and skills, setting clear objectives, receiving immediate feedback, feeling a sense of control, and intrinsic motivation (cf. Diana et al., 2015)—all of which can arguably be impacted by PSD. First, PSD allows athletes to initiate tailor-made challenges, making it more feasible to realize the equilibrium of challenge and skill. Second, PSD’s bottom-up nature goes hand in hand with athletes’ feeling in control over the design of their training sessions. Third, PSD can stimulate flow when athletes actively set objectives, monitor their performance, and provide clear feedback cues (Csikszentmihalyi, 1975). Finally, play is characterized by intrinsic motivation, and intrinsic motivation fuels flow (e.g., Csikszentmihalyi, 2014; Huizinga, 1949). In support of this line of reasoning, it has been found that enjoying practice is a facilitator of flow (Csikszentmihalyi, 1990; Jackson, 1992). In sum, we suggest that PSD is positively related to flow during training tasks (Hypothesis 12).

Further, we suggest that PSD can increase performance—which is a function of ability and motivation (Locke, 1965)—via many pathways. Since it has been found that play enhances energy levels, fun, and intrinsic motivation (e.g., Barreiro & Howard, 2017; Fluegge-Wolf, 2014; Visek et al., 2017), it follows that play during training sessions may have a positive impact on sports performance. More specifically, Côté et al. (2003) proposed that play-like activities increase the quantity of practice via intrinsic motivation, ultimately resulting in improved performance. Furthermore, the quality of practice is likely affected by PSD too. By engaging in play, one trains for the unexpected, which may advance functioning in uncertain performance landscapes (Furley, 2021). Moreover, playful training strategies can help athletes engage in diversified experiences (Côté et al., 2007; Erickson et al., 2017; Furley, 2021), which helps to avoid training monotony and boredom that would otherwise negatively impact athletic performance (Velasco & Jorda, 2020). These diversified experiences may also increase sport-specific creativity, efficient problem-solving, and decision-making—all contributing to excellent sports performance (Barreiro & Howard, 2017; Côté et al., 2007; Erickson et al., 2017; Furley, 2021; Kingston et al., 2020; McCarthy, 2011). In addition, it has been argued that play stimulates (implicit) learning of, for instance, sport-specific tactics and skills, which eventually benefits performance (Griffin & Butler, 2005; Launder & Piltz, 2013). By designing training sessions to be more challenging (i.e., adding goals that are increasingly complex and difficult), athletes may enhance their performance during competition (Griffin & Butler, 2005; Launder & Piltz, 2013). Setting time records could serve as intermediate subgoals linked to increased effort and excellent performance (Hall & Byrne, 1988; Locke & Latham, 1990). In addition, trying to keep score could serve as performance feedback, which may also improve performance (Lord et al., 2010). Taken together, there is reason to believe that PSD positively relates to athletic performance, both subjective (Hypothesis 13) and objective (Hypothesis 14) performance.

Method

Participants and procedure

This study was pre-approved by, and complied with the guidelines of, the ethical committee of a Dutch University (application number 20-071). Participants who took part in phase 1 (T1) were asked to participate in a follow-up study six months later (T2). The second questionnaire was completed by 136 participants. However, some participants were excluded from further analysis due to not using the same ID as during phase 1 (n = 5), leaving us unable to link the data of the two measurements. The participant group was representative of the general sample in phase 1. The sample size (N = 131) allowed us to detect medium effect sizes (.30) with a power of .95 and an alpha of .05 (Faul et al., 2007).

Materials and measures

Participants were asked to report the extent to which they agreed with a statement on a seven-point Likert scale (1 = totally disagree, 7 = totally agree) unless indicated otherwise.

Playful Sport Design was assessed with the same instrument as was used in Phase 1.

Fun Seeking was assessed through the Fun Seeking Behavior subscale of the Behavioral Inhibition/Behavioral Activation (BIS/BAS) instrument (Carver & White, 1994; α = .75). This scale consisted of four items (e.g., “I’m always willing to try something new if I think it will be fun”).

Achievement Striving was assessed through a ten-item subscale of conscientiousness from the NEO Personality Inventory (Costa & McCrae, 1992; α = .83). An example item is: “I plunge into tasks with all my heart.”

Openness to Experience was assessed using the ten-item Openness to Experience subscale of the Integrative Big Five Trait Taxonomy (John et al., 2008; α = .83). An example item is: “I am curious about many things.”

Perfectionism was assessed through the Critical Perfectionism subscale of the Big Three Perfectionism Scale Short Form (BTPS-SF; Feher et al., 2020; α = .87). An example item is: “The idea of making a mistake frightens me.”

Procrastination was assessed using the Procrastination scale (Tuckman, 1991; α = .90). An abbreviated version of the scale was created by selecting the items with the highest factor loadings. The final scale consisted of eight items, including: “When I have a deadline, I wait till the last minute.”

Flow was operationalized through the nine-item Core Flow scale by Martin and Jackson (2008; α = .76). An example item is: “I do things spontaneously and automatically whilst playing sports, without having to think.”

Subjective Performance was assessed with a general self-report of sports performance, which is less susceptible to factors such as one’s gender, type of sport, and age. Self-reported performance was assessed through the Relative Mastery Measurement Scale (George et al., 2004; α = .77). Specifically, we used four Satisfaction items (e.g., “I am very pleased with my performance in this sport”) and four Effectiveness items (e.g., “I did not produce the results I expected within this sport”).

Objective Performance was assessed by the frequency of personal bests, offering a more ‘objective’ measure of performance that reflects one’s relative improvement. The number of personal bests was assessed through the question: “How many times did you improve on your personal best during the past six months?” (1 = 0 times, 7 = more than 5 times).

Strategy of analysis

All phase 2 analyses were conducted in the statistical software package IBM SPSS Statistics 25. We correlated PSD at T1 and T2 to analyze test-retest reliability. Similar to phase 1, we calculated bivariate Pearson correlations between PSD (measured at T2) and several personality traits (measured at T2) to further evaluate PSD’s construct validity (Hypotheses 7-11). Again, we calculated z-scores to explore whether a PSD dimension (designing fun versus designing competition) was differently related to these personality traits. Lastly, predictive validity (Hypotheses 12-14) was analyzed by calculating bivariate Pearson correlations between PSD (T1) and flow and performance (T2). Different time points were chosen in an effort to minimize Common Method Variance (CMV).
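A minimal sketch of the test-retest step, assuming two data frames keyed by the self-chosen participant ID and hypothetical score columns, could look as follows.

```python
import pandas as pd
from scipy.stats import pearsonr


def test_retest(t1: pd.DataFrame, t2: pd.DataFrame, score: str = "psd_total"):
    """Link the two waves on the participant ID and correlate the T1 and T2
    scale scores (column names are hypothetical)."""
    merged = t1.merge(t2, on="participant_id", suffixes=("_t1", "_t2"))
    merged = merged.dropna(subset=[f"{score}_t1", f"{score}_t2"])
    r, p = pearsonr(merged[f"{score}_t1"], merged[f"{score}_t2"])
    return r, p, len(merged)
```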

Results

Reliability

Test-retest reliability

The test-retest reliability of the scores generated by the instrument was r = .65 (p < .001) for designing fun and r = .64 (p < .001) for designing competition. The test-retest reliability of the overall PSD score generated by the instrument was r = .66 (p < .001). The scores yielded by the PSD instrument were therefore found to be reasonably stable over time for a heterogeneous sample in a sports context.

Validity

Convergent validity

Results of the bivariate correlation analysis, means, and standard deviations are shown in Table 3. In line with Hypothesis 7, PSD was positively related to fun-seeking. The two PSD dimensions did not differ significantly in the strength of their relations with fun-seeking (z = .17, p = .886). The relation between PSD and achievement striving was positive too, supporting Hypothesis 8. Both PSD dimensions were positively related to achievement striving. The difference in the strength of the relations was nonsignificant (z = −1.76, p = .078). Hypothesis 9 was also supported: PSD and openness to experience were found to be positively related. Designing fun had a stronger relation with openness to experience than designing competition (z = 3.03, p = .002), but both dimensions showed a positive relation. As theorized, PSD and perfectionism were not related. Therefore, Hypothesis 10 was supported. However, of the two dimensions, designing fun showed a negative relationship with perfectionism, while designing competition was unrelated to perfectionism. Finally, in support of Hypothesis 11, procrastination was unrelated to PSD, as were both dimensions of PSD.

Predictive validity

The average score on PSD was 4.18 (SD = 1.06). The mean scores on designing fun and designing competition were 3.91 (SD = 1.27) and 4.46 (SD = 1.27), respectively. On average, participants scored 5.75 (SD = 0.65) on flow, 5.06 (SD = 0.82) on self-reported performance, and 1.35 (SD = 1.85) on the frequency of achieving a personal best. In line with our expectations, PSD was positively related to flow (r = .55, p < .001). Therefore, Hypothesis 12 was supported. Designing fun and designing competition both showed a positive relationship with flow (r = .40, p < .001 and r = .58, p < .001, respectively). PSD was significantly related to self-reported performance (r = .19, p = .029). Hence, Hypothesis 13 was supported. Upon further inspection, designing fun was found to have a positive relation with self-reported performance (r = .18, p = .042), while designing competition was unrelated to self-reported performance (r = .16, p = .078). As regards Hypothesis 14, PSD was unrelated to improving a personal best (r = .11, p = .201), which means that this hypothesis was not supported. Designing fun lacked a relationship with frequency of personal bests (r = .03, p = .774), while designing competition had a positive relation with the frequency of personal bests (r = .18, p = .040). Consequently, it seemed that designing competition showed more robust relations to relative improvement of performance.

Summary of results phase 2

In phase 2, the test-retest reliability of the PSD instrument was satisfactory. Further, we found that individuals who reported more PSD scored higher on fun seeking, achievement striving, and openness to experience. As expected, the PSD instrument did not unintentionally capture procrastination behaviors or perfectionism, although the designing fun dimension correlated negatively with perfectionism. Lastly, PSD predicted flow and subjective sports performance, and the designing competition dimension predicted objective performance (the frequency of personal bests), showing the relevance of PSD for athletes.

Phase 3: factorial and predictive validity of PSD among runners

Predictive validity

The main objectives of phase 3 were to replicate PSD’s factorial validity and predictive validity. As regards the first objective, by investigating a more specific sports context (i.e., running), we examined whether PSD’s two-factor structure found in phase 1 also fitted a sample of runners. As regards the latter, we sought to enhance insight into the relationship between PSD and running performance. In line with the aforementioned reasoning in phase 2, we postulated that PSD enhances performance. As this general theoretical framework should also apply to a more specific sports context, we hypothesize that PSD relates to better running performance, both subjective (Hypothesis 15) and objective (Hypothesis 16) performance. By limiting the heterogeneity of the sample and the associated broad interpretation of what athletic achievement is, the relationship between PSD and performance should become clearer. Running is studied in this phase, as this sport offers uniform and specified performance indicators (i.e., self-rated goal attainment on 10,000 m and time on 10,000 m). In addition to testing our hypotheses regarding the relationship between PSD and performance for runners, the reliability and the factor structure of the PSD instrument were tested once more. Due to the possible effects of heterogeneity on variance in observed item scores, the reliability scores found in phase 1 may have been inflated (Thompson, 2002).

Method

Participants and procedure

This study was pre-approved by, and complied with the guidelines of, an ethical committee of a Dutch university (application number 21-051). Participants were recruited to partake in a 10,000-m run via social media runner groups. A virtual run was organized because strict COVID-19 restrictions at the time of the study did not allow a sports event to take place. Individuals of all experience levels were eligible to participate but had to be at least 18 years old. Before signing up for the 10,000-m run, participants were asked to provide a declaration of good health and give informed consent. A total of 334 athletes signed up for the virtual 10,000-m run. Eight athletes were excluded due to not providing consent for their data to be analyzed (n = 3) or not providing a declaration of good health (n = 5). These athletes did not receive further information regarding the 10,000-m run, nor did they receive the questionnaires. All other participants received an email with instructions, since the researchers were unable to directly oversee the timed run due to COVID-19. Runners were asked to establish a route for the run beforehand, track the time in hours and minutes by using an app or stopwatch, and run 10,000 m as fast as possible. The start and finish had to be within 1 km of each other to limit the influence of elevation on the route. To ensure that all participants completed exactly 10,000 m, they were provided with an online tool to determine their track beforehand. All participants completed the 10,000-m run during the same weekend at different locations throughout the country. Immediately after the run, participants completed a second questionnaire.
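Purely as an illustration of the start/finish rule described above, the distance between two self-chosen coordinates can be checked with a haversine calculation; whether the authors or participants performed such a check programmatically is not stated.

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0


def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two latitude/longitude points in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))


def start_finish_ok(start: tuple, finish: tuple, max_km: float = 1.0) -> bool:
    """Check the rule that start and finish lie within 1 km of each other."""
    return haversine_km(*start, *finish) <= max_km
```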

The second questionnaire was completed by a total of 218 athletes. Six participants were excluded due to not fully completing the run (n = 4) or not using the same ID as the first questionnaire (n = 2). The final sample (N = 212) allowed us to detect small to medium effect sizes (.20) with a power of .95 and an alpha of .05 (Faul et al., 2007). The sample consisted of a roughly equal number of athletes who identified as male (50.9%) and female (49.1%). The mean age was 41.81 (SD = 12.91) and ranged between 18 and 77. On average, participants in this sample trained 2.74 (SD = 1.10) days and 3.42 hours (SD = 2.98) a week. The athletes in this sample practiced running for an average of 10.07 (SD = 9.93) years, ranging between 0 and 55 years of experience.

Materials and measures

Playful Sport Design was assessed prior to the 10,000-m run, using the same measurement tool as in phases 1 and 2. We asked participants to report PSD during their training sessions over the previous month on a seven-point frequency scale (1 = never, 7 = always).

Subjective Performance was assessed after the 10,000-m run. Self-reported goal attainment was assessed through the 12-item Attainment of Sport Achievement Goals Scale (A-SAGS; α = .87) as described in Amiot et al. (2004). This instrument captured three types of athletic goal attainment: mastery, self-referent, and normative goal attainment. The wording of the items was specified to contextualize the self-report to the 10,000-m run (e.g., “During the 10,000-m run, I did better than my usual performances”).

Objective Performance was assessed by asking the time that was needed to complete the 10,000-m run. Participants were asked to report their own time (i.e., “What was your time on the 10,000-m run? Report the number of minutes and seconds (e.g., 49.30)”), as we could not directly track the time of our participants from a distance for privacy reasons. Participants were reminded that it was of great importance to report the exact time on 10,000 m honestly and that prizes would not be allocated based on the reported time.
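Because "49.30" denotes 49 minutes and 30 seconds rather than a decimal number of minutes, the self-reported times require explicit parsing before analysis. The sketch below shows one way to do this, under the assumption that seconds were reported with two digits; how the authors actually recoded the responses is not stated.

```python
def parse_reported_time(raw: str) -> int:
    """Convert a self-reported '<minutes>.<seconds>' string (e.g., '49.30'
    for 49 min 30 s) into total seconds. Assumes two-digit seconds; note the
    value is not a decimal number of minutes, so a naive float conversion
    would misorder finishing times."""
    minutes, _, seconds = raw.strip().partition(".")
    return int(minutes) * 60 + int(seconds or 0)


# Example: '49.30' -> 2970 seconds, '55.05' -> 3305 seconds.
```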

Strategy of analysis

Using IBM SPSS 25, Cronbach’s alpha was calculated to evaluate the internal consistency of the PSD scale in a sample of runners. Additionally, Pearson correlations were calculated to test Hypotheses 15 and 16. To test PSD’s factor structure again in a different sample, a CFA was conducted in the IBM SPSS AMOS 26 Graphics software package. In our attempt to replicate the factorial validity findings of phase 1, we used the same analytical strategy and interpretation guidelines for the fit indices as in phase 1’s CFA.

Results

Reliability

Internal consistency

The internal consistency of the scores generated by the instrument was α = .86 for designing fun and α = .82 for designing competition. The internal consistency of the overall PSD score was α = .87. The scores yielded by the PSD instrument were therefore found to be reliable in a homogeneous sample of runners.

Validity

Factorial validity

After replicating phase 1’s CFA procedure in a sample of runners, we again found that the two-factor model showed a better fit to the data than the one-factor model (see Table 4 for the fit indices of the different models). Factor loadings ranged between .54 and .87 for designing fun and between .34 and .87 for designing competition (see the Supplementary Material). The two factors were positively related (r = .63, p < .001).

Table 4. Model fit results of CFA phase 3.
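As an illustration of how such a one-factor versus two-factor comparison can be specified, the sketch below uses the open-source semopy package instead of the AMOS software the authors report; the item names are hypothetical placeholders, and the returned fit indices would be evaluated against the same cut-offs used in phase 1.

```python
# Two-factor vs. one-factor CFA specification (semopy, lavaan-style syntax).
import pandas as pd
from semopy import Model, calc_stats

TWO_FACTOR = """
fun  =~ fun_1 + fun_2 + fun_3 + fun_4
comp =~ comp_1 + comp_2 + comp_3 + comp_4
"""

ONE_FACTOR = """
psd =~ fun_1 + fun_2 + fun_3 + fun_4 + comp_1 + comp_2 + comp_3 + comp_4
"""

def fit_cfa(description: str, data: pd.DataFrame) -> pd.DataFrame:
    """Fit a CFA model and return its fit statistics (chi2, CFI, TLI, RMSEA)."""
    model = Model(description)
    model.fit(data)
    return calc_stats(model)

# data = pd.read_csv("phase3_psd_items.csv")  # hypothetical item-level data
# print(fit_cfa(TWO_FACTOR, data))
# print(fit_cfa(ONE_FACTOR, data))
```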

Predictive validity

On average, the athletes in our sample completed the run in 55.31 minutes (SD = 8.92). The mean score on PSD during training was 3.64 (SD = .90). Participants’ mean scores on designing fun and designing competition were 3.13 (SD = 1.05) and 4.45 (SD = 1.06), respectively. PSD showed a positive relation with self-reported goal attainment (r = .15, p = .029), which means that Hypothesis 14 received support. Designing fun was not related to self-reported goal attainment, whereas designing competition was (r = .05, p = .479 and r = .21, p = .003, respectively). Similarly, PSD was related to a faster time on the 10,000-m run (r = −.16, p = .017). Additional analyses revealed that the relation between PSD and time on the 10,000-m run remained significant after controlling for age and gender (β = −.13, p = .040). Therefore, Hypothesis 15 was supported. It must be noted, however, that the relationship with time on the 10,000-m run was non-significant for designing fun (r = −.09, p = .205) and significant for designing competition (r = −.19, p = .005).
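The control analysis mentioned above can be illustrated with a small regression sketch. It uses statsmodels rather than SPSS, assumes hypothetical column names, and z-scores the continuous variables so that the PSD coefficient approximates a standardized beta.

```python
import pandas as pd
import statsmodels.formula.api as smf

def psd_time_regression(data: pd.DataFrame):
    """Regress 10,000-m time on PSD while controlling for age and gender."""
    z = data.copy()
    for col in ["run_time", "psd", "age"]:   # z-score the continuous variables
        z[col] = (z[col] - z[col].mean()) / z[col].std(ddof=1)
    return smf.ols("run_time ~ psd + age + C(gender)", data=z).fit()

# fit = psd_time_regression(pd.read_csv("phase3.csv"))  # hypothetical data file
# print(fit.summary())  # a negative PSD coefficient indicates a faster time
```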

Summary of results phase 3

PSD was found to be positively related to subjective performance in a sample of runners. PSD was also found to be positively related to objective performance, i.e., a faster time on a 10,000-m run. Designing competition seemed to predict subjective and objective performance better than designing fun.

General discussion

In this paper, we introduced the concept Playful Sport Design (PSD) as a proactive cognitive-behavioral orientation that makes adult athletes playfully design their sports training. We found that athletes do this in two ways: by designing fun and designing competition. More specifically, designing fun, which reflects ludic play elements, was found to be captured by proactive use of imagination, creativity, and fantasy during training sessions. Designing competition, which represents agonistic play, was found to indicate proactively creating self-oriented challenges, approaching training sessions as a set of exciting tasks, keeping track of performance, and trying to set time records. These findings support the duality of play as portrayed in play literature (e.g., amusement and challenge, see Ellis, Citation1973; Huizinga, Citation1949; Scharp et al., Citation2019).

We developed a PSD instrument, which yields reliable and valid scores. These PSD scores reflect the frequency with which athletes integrate fun or challenging play elements into training sessions. We found support for convergent, discriminant, and predictive validity. Individuals who playfully design their training sessions are, in general, playful, prone to fantasy, fun-seeking, competitive, open to new experiences, and inclined to show personal initiative. Furthermore, PSD was unrelated to negative affect, perfectionism, and procrastination, which supports the discriminant validity of PSD. By showing that PSD predicts flow and performance indicators, we demonstrate that PSD is relevant for athletes. Table 5 presents a complete overview of the level of support for this study’s hypotheses.

Table 5. Synthesis of evidence.

Study’s contributions

We believe our study makes several contributions to the literature. First, by showing that adult athletes proactively add play elements to their sports training and that doing so aids flow and performance, we complement earlier studies that mainly focused on the benefits of play during youth sports training (Arufe-Giráldez et al., Citation2022; Barreiro & Howard, Citation2017; Côté et al., 2007; Erickson et al., Citation2017; Launder & Piltz, Citation2013; Reed, Citation2020). These results also align with previous suggestions that playing is not limited to children in its functionality and serves important purposes in adult life (e.g., Bakker et al., Citation2020; Proyer, Citation2012a; Van Vleet & Feeney, Citation2015). Interestingly, similar to observations that play decreases throughout childhood (Erickson et al., Citation2017), we also observe in our samples a slight decrease in PSD as adult athletes become older (r’s ranging from −.13 to −.18; see Supplementary Material), primarily driven by a decrease in designing competition. The physical decline associated with aging may result in less playful behavior, although earlier research highlights that competition motivates older age groups to engage in sport (Stenner et al., Citation2020). Furthermore, as athletes in our samples ranged from recreational to elite and participated in a wide range of sports, we suggest that athletes of all levels and across different types of sport may benefit from PSD in terms of a more positive training experience. Exploratory analyses indeed revealed that athletes of different levels and sports did not differ regarding the relationships found between PSD, flow, and sports performance.

Second, by showing that playfully designing tasks is relevant in the sports domain, we build on previous work showing that the playful design of work tasks benefits individuals (e.g., Bakker et al., Citation2020; Scharp et al., Citation2019). Our results underline earlier findings and recommendations that concepts developed within the work context can advance our understanding of athletic performance (Balk et al., Citation2019; DeFreese & Smith, Citation2013; Wagstaff, Citation2019). Indeed, it appears that athletes stand to gain from PSD during sports training just as employees stand to gain from playful work design (PWD) during work endeavors. Specifically, PSD was shown to benefit athletes through increased flow experiences and improved performance. It should be noted, however, that the effect sizes we found were modest. Even so, we argue that PSD can be meaningful. For instance, a slight improvement in performance matters for elite athletes, for whom the margin between success and failure can be extremely narrow (Swann, Citation2016). Furthermore, a slightly more positive training experience may be important for an individual who struggles to find motivation for the next sports session.

Third, we suggest that PSD is a valuable addition to existing forms of play during sports training because of its proactive nature. PSD refers to self-initiated play aimed at bringing about change in the training or in oneself (Parker et al., Citation2010). This makes PSD a bottom-up approach, which contrasts with top-down approaches in which a trainer, teacher, or coach facilitates or introduces play into sports training (e.g., see Arufe-Giráldez et al., Citation2022; Barreiro & Howard, Citation2017; Côté et al., 2007; Erickson et al., Citation2017; Launder & Piltz, Citation2013; Reed, Citation2020). As play is a voluntary pursuit (Caillois, Citation1961; Huizinga, Citation1949), we suggest that self-initiation increases its effectiveness. Theoretically, PSD overlaps with free play and deliberate play, in which youth are in control of deciding what they play, with whom they play, and how they play (Barreiro & Howard, Citation2017; Erickson et al., Citation2017). In contrast to PSD, free play is spontaneous rather than proactive. Furthermore, deliberate and free play often occur outside organized sports settings, while PSD occurs in both organized and unorganized sports settings. One might argue that, given PSD’s proactive nature, there is less room for PSD in organized settings and/or in settings in which an athlete is supervised. Interestingly, exploratory analyses revealed that PSD was higher among athletes who are supervised by a trainer than among athletes who are not, and that PSD occurred slightly more often in organized than in unorganized settings (see Supplementary Material). These findings align with previous research indicating that play and proactive behaviors can be promoted and facilitated in organized settings (e.g., Côté et al., 2007; Wu & Parker, Citation2017).

Fourth, by establishing that playful sport design has two dimensions (redesigning tasks or activities to be more fun or more challenging), we integrate aspects of previous playful sports training forms that often focused primarily on one of these approaches. For instance, free play and deliberate play are mainly focused on increasing fun (Barreiro & Howard, Citation2017; Côté et al., 2007), whereas teaching games for understanding (Griffin & Butler, Citation2005), play practice (Launder & Piltz, Citation2013), and structured play (Kingston et al., Citation2020) are primarily focused on increasing challenge. Although the individual subdimensions were sometimes differently related to outcomes (see Table 5), we contend that both strategies are manifestations of PSD, and that PSD can be seen as a higher-order cognitive-behavioral construct. A person is showing PSD if either of the two strategies is used, and the choice for a specific strategy and its usefulness may depend on the context or the goal (see also Scharp et al., Citation2021). For instance, we found that designing fun was related to flow and subjective performance, but not to objective performance. It is possible that designing fun helps to attain a specific performance goal, such as remaining flexible in tactics when a competition unfolds differently than planned, that is, in a context of uncertainty. Another explanation for this finding is that designing fun enhances objective performance indirectly, for example via flow (in additional exploratory analyses, we indeed found an indirect effect from designing fun to objective performance via flow; see Supplementary Material for results). Future research could be directed toward formulating specific hypotheses regarding the contexts in which a specific strategy is most effective and which outcomes are likely to follow.

Limitations and avenues for future research

Although the current findings with regard to the newly developed PSD instrument are promising, our study is not without limitations. First, the studies may have suffered from common method variance (Podsakoff et al., Citation2003). Because the surveys relied on multiple self-report measures, the correlations may have been inflated or deflated (Reio, Citation2010). This is primarily critical for interpreting PSD’s nomological network, as relationships between similar constructs are preferably tested with measures obtained through different methods. Therefore, it may be interesting to include other-ratings of PSD in future studies.

Second, the cross-sectional nature of most of our data should be taken into consideration when interpreting the findings. Although we also used longitudinal data to establish a temporal order, causal inferences cannot be drawn. Further, the relatively high test-retest correlations for the PSD dimensions suggest that PSD is rather stable over time and may even have trait-like properties. One might argue that the half-year time lag in phase 2 is too short to observe change in PSD. However, we would argue that even though some individuals may be more inclined to use PSD than others, PSD is a specific behavior that may fluctuate from training episode to training episode (see also Scharp et al. (Citation2021), who report fluctuations in playful work design across work days). Thus, we conceive of PSD as a state-like concept that may change over the course of days, weeks, and months. We recommend research designs with multiple assessment moments to better capture stability and change over time. It would also be valuable to test PSD’s predictive validity through an experimental study design in an attempt to establish causality. Lastly, future research could include proposed mechanisms underlying the PSD-performance relationship (e.g., learning, boredom) to gain insight into the psychological effects of PSD and into causality.

Third, our chosen objective performance indicators deserve attention. In phase 2, we measured objective performance as the frequency of one’s personal bests. Although personal bests apply to a heterogeneous sample of athletes, this indicator is not relevant in all sports, such as many team sports. Additionally, as personal bests may also depend on personal characteristics (e.g., age) and the environment (e.g., weather), this indicator may not always reflect one’s actual performance. Similarly, in phase 3, the time to complete the 10,000-m run was chosen as an objective performance indicator, which may also depend on personal characteristics and the environment. Notwithstanding these shortcomings of our objective performance measures, modest associations between PSD and objective performance were still found. These associations may be somewhat stronger when PSD is studied in a specific sports context with relevant objective performance indicators, which we recommend for future research (see, e.g., Hughes and Bartlett (Citation2002) for suggestions on how to measure performance).

Fourth, we did not include sport-specific measures in our attempt to establish PSD’s nomological network. It is conceivable that such sport-specific (vs. general) measures would correlate higher with PSD, and therefore our test of convergent validity should be seen as a conservative test. Indeed, the correspondence principle posits that measures should better predict criteria when there is a match in terms of the level of generality or specificity at which both are conceptualized (Ajzen & Fishbein, Citation1977). Future studies may consider sport-specific measures such as sport orientations (Gill & Deeter, Citation1988) to obtain further insight into PSD’s nomological network.

Finally, due to the COVID-19 pandemic, we were unable to time participants on 10,000 m in phase 3 ourselves. It would therefore be advisable to make an effort to replicate our results in better controlled conditions once the circumstances allow for such research designs.

Practical implications

The PSD instrument can be used to assess the frequency of PSD during training sessions, both for practical assessment of athletes’ training behavior and for scientific purposes. First, the PSD instrument can be used within the field of sport psychology to further study the antecedents, boundary conditions, mechanisms, and effectiveness of this new concept. Additionally, the newly developed PSD instrument opens the door for practitioners to develop PSD interventions in which athletes learn how to redesign their training activities playfully. Coaches may use the PSD instrument to investigate differences between athletes. In this way, they may identify which athletes use PSD most, and these athletes could then share possible PSD tactics and strategies with others. However, coaches should keep in mind the importance of stimulating athletes to develop PSD strategies themselves, rather than presenting athletes with top-down initiatives to shape their training in a playful way.

Conclusion

PSD was introduced as a new proactive cognitive-behavioral orientation through which athletes introduce ludic and agonistic play elements into training sessions. Throughout this study, the psychometric properties of the scores yielded by the proposed PSD instrument were found to be promising. Additionally, the findings regarding the nomological network and predictive validity of the new concept PSD generally matched previous work on play in adulthood and sport. We found evidence that the efforts of athletes to playfully design training sessions affect training experiences (i.e., flow) and effectiveness (i.e., performance). The new PSD measurement tool can be used to detect PSD behavior and can be utilized to further deepen our understanding of this new construct.

Supplemental material


References

  • Ajzen, I., & Fishbein, M. (1977). Attitude-behavior relations: A theoretical analysis and review of empirical research. Psychological Bulletin, 84(5), 888–918. https://doi.org/10.1037/0033-2909.84.5.888
  • Amiot, C. E., Gaudreau, P., & Blanchard, C. M. (2004). Self-determination, coping, and goal attainment in sport. Journal of Sport and Exercise Psychology, 26(3), 396–411. https://doi.org/10.1123/jsep.26.3.396
  • Aron, A., & Aron, E. N. (1996). Self and self-expansion in relationships. In G. J. O. Fletcher & J. Fitness (Eds.), Knowledge structures in close relationships: A social psychological approach (pp. 325–344). Lawrence Erlbaum Associates, Inc.
  • Arufe-Giráldez, V., Sanmiguel-Rodríguez, A., Ramos Álvarez, O., & Navarro-Patón, R. (2022). Can gamification influence the academic performance of students? Sustainability, 14(9), 5115. https://doi.org/10.3390/su14095115
  • Bakker, A. B., Scharp, Y. S., Breevaart, K., & De Vries, J. D. (2020). Playful work design: Introduction of a new concept. The Spanish Journal of Psychology, 23(19), 1–6. https://doi.org/10.1017/SJP.2020.20
  • Balk, Y. A., de Jonge, J., Geurts, S. A., & Oerlemans, W. G. (2019). Antecedents and consequences of perceived autonomy support in elite sport: A diary study linking coaches’ off-job recovery and athletes’ performance satisfaction. Psychology of Sport and Exercise, 44, 26–34. https://doi.org/10.1016/j.psychsport.2019.04.020
  • Barreiro, J. A., & Howard, R. (2017). Incorporating unstructured free play into organized sports. Strength & Conditioning Journal, 39(2), 11–19. https://doi.org/10.1519/SSC.0000000000000291
  • Burke, R. (1971). “Work” and “Play”. Ethics, 82(1), 33–47. https://doi.org/10.1086/291827
  • Caillois, R. (1961). Man, play and games. Free Press.
  • Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81–105. https://doi.org/10.1037/h0046016
  • Carver, C. S., & White, T. L. (1994). Behavioral inhibition, behavioral activation, and affective responses to impending reward and punishment: The BIS/BAS instruments. Journal of Personality and Social Psychology, 67(2), 319–333. https://doi.org/10.1037/0022-3514.67.2.319
  • Chow, J. Y., Davids, K., Betton, C., & Renshaw, I. (2019). Nonlinear pedagogy in skill acquisition. Taylor & Francis Ltd.
  • Comrey, L. A., & Lee, H. B. (1992). A first course in factor analysis (2nd ed.). Lawrence Erlbaum Associates.
  • Costa, P. T., & McCrae, R. R. (1992). Normal personality assessment in clinical practice: The NEO Personality Inventory. Psychological Assessment, 4(1), 5–13. https://doi.org/10.1037/1040-3590.4.1.5
  • Costello, A. B., & Osborne, J. W. (2005). Best practices in Exploratory Factor Analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research, and Evaluation, 10(7), 1–9.
  • Côte, J., Baker, J., & Abernethy, B. (2003). From play to practice: A developmental framework for the acquisition of expertise in team sports. In J. Starkes & K. A. Ericsson (Eds.), Expert performance in sports: advances in research on sport expertise (pp. 89–110). Human Kinetics.
  • Csikszentmihalyi, M. (1975). Beyond boredom and anxiety. Jossey-Bass Inc. Publishers.
  • Csikszentmihalyi, M. (1988). The flow experience and its significance for human psychology. In M. Csikszentmihalyi & I. S. Csikszentmihalyi (Eds.), Optimal experience: Psychological studies of flow in consciousness (pp. 15–35). Cambridge University Press.
  • Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. Harper & Row.
  • Csikszentmihalyi, M. (2014). Play and intrinsic rewards. In M. Csikszentmihalyi (Eds.), Flow and the foundations of positive psychology (pp. 135–153). Springer.
  • DeFreese, J. D., & Smith, A. L. (2013). Teammate social support, burnout, and self-determined motivation in collegiate athletes. Psychology of Sport and Exercise, 14(2), 258–265. https://doi.org/10.1016/j.psychsport.2012.10.009
  • Deterding, S. (2016). Make-believe in gameful and playful design. In P. Turner & J. Tuomas (Eds.), Digital make-believe (pp. 101–124). Springer.
  • Deterding, S., O’Hara, K., Sicart, M., Dixon, D., & Nacke, L. (2011, May 7–12). Gamification: Using game design elements in non-game contexts. In Human factors in computing systems. ACM.
  • Diana, B., Argenton, L., Muzio, M., Inghilleri, P., Riva, G., & Riva, E. (2015). Positive change and flow in sport. In P. Inghilleri, G. Riva, & E. Riva (Eds.), Enabling positive change: Flow and complexity in daily experience (pp. 138–151). De Gruyter Open Poland.
  • Ellis, M. J. (1973). Why people play. Prentice-Hall.
  • Erickson, K., Côté, J., Turnnidge, J., Allan, V., & Vierimaa, M. (2017). Play during childhood and the development of expertise in sport. In D. Z. Hambrick, G. Campitelli, B. N. Macnamara, & R. Plomin (Eds.), The science of expertise (pp. 398–416).
  • Ericsson, K. A. (2020). Towards a science of the acquisition of expert performance in sports: Clarifying the differences between deliberate practice and other types of practice. Journal of Sports Sciences, 38(2), 159–176. https://doi.org/10.1080/02640414.2019.1688618
  • Faul, F., Erdfelder, E., Lang, A. G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191. https://doi.org/10.3758/bf03193146
  • Feher, A., Smith, M. M., Saklofske, D. H., Plouffe, R. A., Wilson, C. A., & Sherry, S. B. (2020). The Big Three Perfectionism Instrument–short form (BTPS-SF): Development of a brief self-report measure of multidimensional perfectionism. Journal of Psychoeducational Assessment, 38(1), 37–52. https://doi.org/10.1177/0734282919878553
  • Fluegge-Wolf, E. R. (2014). Play hard, work hard: Fun at work and job performance. Management Research Review, 37(8), 682–705.
  • Frese, M., Fay, D., Hilburger, T., Leng, K., & Tag, A. (1997). The concept of personal initiative: Operationalization, reliability and validity in two German samples. Journal of Occupational and Organizational Psychology, 70(2), 139–161. https://doi.org/10.1111/j.2044-8325.1997.tb00639.x
  • Furley, P. (2021). The nature and culture of physical activity play: Towards a biocultural model of play, practice, and teaching in sports. Evolutionary Behavioral Sciences, 15(2), 208–229. https://doi.org/10.1037/ebs0000254
  • George, L. A., Schkade, J. K., & Ishee, J. H. (2004). Content validity of the relative mastery measurement instrument: A measure of occupational adaptation. OTJR: Occupation, Participation and Health, 24(3), 92–102. https://doi.org/10.1177/153944920402400303
  • Gill, D. L., & Deeter, T. E. (1988). Development of the sport orientation questionnaire. Research Quarterly for Exercise and Sport, 59(3), 191–202. https://doi.org/10.1080/02701367.1988.10605504
  • Griffin, L. L., & Butler, J. (2005). Teaching games for understanding: Theory, research, and practice. Human Kinetics Publishers.
  • Hall, H. K., & Byrne, A. T. (1988). Goal-setting in sport: Clarifying recent anomalies. Journal of Sport and Exercise Psychology, 10(2), 184–198. https://doi.org/10.1123/jsep.10.2.184
  • Helmreich, R. C., & Spence, J. T. (1978). The work and family orientation questionnaire: An objective instrument to assess components of achievement motivation and attitude toward family and career. JSAS Catalog of Selected Documents in Psychology, 8, 35.
  • Hinsz, V. B., & Jundt, D. K. (2005). Exploring individual differences in a goal‐setting situation using the motivational trait questionnaire. Journal of Applied Social Psychology, 35(3), 551–571. https://doi.org/10.1111/j.1559-1816.2005.tb02135.x
  • Horn, T. S. (2008). Coaching effectiveness in the sport domain. In T. S. Horn (Ed.), Advances in sport psychology (pp. 239–267). Human Kinetics.
  • Hoyle, R. H. (1995). The structural equation modeling approach: Basic concepts and fundamental issues. In R. H. Hoyle (Ed.), Structural equation modeling, concepts, issues, and applications (pp. 1–15). Sage.
  • Hughes, M. D., & Bartlett, R. M. (2002). The use of performance indicators in performance analysis. Journal of Sports Sciences, 20(10), 739–754. https://doi.org/10.1080/026404102320675602
  • Huizinga, J. (1949). Homo ludens: A study of the play-element in culture. Routledge & Kegan Paul.
  • Jackson, S. A. (1992). Athletes in flow: A qualitative investigation of flow states in elite figure skaters. Journal of Applied Sport Psychology, 4(2), 161–180. https://doi.org/10.1080/10413209208406459
  • John, O. P., Naumann, L. P., & Soto, C. J. (2008). Paradigm shift to the integrative big five trait taxonomy: History, measurement, and conceptual issues. In O. P. John, R. W. Robins, & L. A. Pervin (Eds.), Handbook of personality: Theory and research (pp. 114–158). Guilford.
  • Jöreskog, K. G., & Sörbom, D. (1993). LISREL 8: User’s reference guide. Scientific Software International.
  • Kingston, U., Adamakis, M., & Costa, J. (2020). Acute effects of physical education, structured play, and unstructured play in children’s executive functions in primary school. Journal Of Physical Education And Sport, 20(6), 3260–3266.
  • Kobiela, F. (2018). Should chess and other mind sports be regarded as sports? Journal of the Philosophy of Sport, 45(3), 279–295. https://doi.org/10.1080/00948705.2018.1520125
  • Lai, K., & Green, S. B. (2016). The problem with having two watches: Assessment of fit when RMSEA and CFI disagree. Multivariate Behavioral Research, 51(2–3), 220–239. https://doi.org/10.1080/00273171.2015.1134306
  • Launder, A. G., & Piltz, W. (2013). Play practice: Engaging and developing skilled players from beginner to elite. Human Kinetics Publishers.
  • Locke, E. A. (1965). Interaction of ability and motivation in performance. Perceptual and Motor Skills, 21(3), 719–725. https://doi.org/10.2466/pms.1965.21.3.719
  • Locke, E. A., & Latham, G. P. (1990). A theory of goal-setting and task motivation. Prentice-Hall.
  • Lord, R. G., Diefendorff, J. M., Schmidt, A. M., & Hall, R. J. (2010). Self-Regulation at Work. Annual Review of Psychology, 61(1), 543–568. https://doi.org/10.1146/annurev.psych.093008.100314
  • Mace, R. (1990). Cognitive behavioral interventions in sport. In G. Jones & L. Hardy (Eds.), Stress and performance in sport (pp. 203–231). Wiley.
  • Mainemelis, C., & Ronson, S. (2006). Ideas are born in fields of play: Towards a theory of play and creativity in organizational settings. Research in Organizational Behavior, 27, 81–131. https://doi.org/10.1016/S0191-3085(06)27003-5
  • Mareš, L., & Ryall, E. (2021). ‘Playing sport playfully’: on the playful attitude in sport. Journal of the Philosophy of Sport, 48(2), 293–306. https://doi.org/10.1080/00948705.2021.1934689
  • Martin, A. J., & Jackson, S. A. (2008). Brief approaches to assessing task absorption and enhanced subjective experience: Examining ‘short’ and ‘core’ flow in diverse performance domains. Motivation and Emotion, 32(3), 141–157. https://doi.org/10.1007/s11031-008-9094-0
  • McCarthy, P. J. (2011). Positive emotion in sport performance: Current status and future directions. International Review of Sport and Exercise Psychology, 4(1), 50–69. https://doi.org/10.1080/1750984X.2011.560955
  • McCrae, R. R., & Costa, P. T. (1997). Conceptions and correlates of openness to experience. In R. Hogan, J. Johnson, & S. Briggs (Eds.), Handbook of personality psychology (pp. 825–847). Academic Press.
  • Meier, K. V. (1988). Triad trickery: Playing with sport and games. Journal of the Philosophy of Sport, 15(1), 11–30. https://doi.org/10.1080/00948705.1988.9714458
  • Memmert, D., Baker, J., & Bertsch, C. (2010). Play and practice in the development of sport‐specific creativity in team ball sports. High Ability Studies, 21(1), 3–18. https://doi.org/10.1080/13598139.2010.488083
  • Merriam-Webster (n.d). Design. Merriam-Webster.com dictionary. Retrieved November 2, 2022, from https://www.merriam-webster.com/dictionary/design
  • Mollick, E., & Rothbard, N. (2014). Mandatory fun: Consent, gamification, and the impact of games at work. The Wharton School Research Paper Series (pp. 1–54). Available at SSRN: http://doi.org/10.2139/ssrn.2277103
  • Parker, S. K., Bindl, U. K., & Strauss, K. (2010). Making things happen: A model of proactive motivation. Journal of Management, 36(4), 827–856. https://doi.org/10.1177/0149206310363732
  • Petelczyc, C. A., Capezio, A., Wang, L., Restubog, S. L. D., & Aquino, K. (2018). Play at work: An integrative review and agenda for future research. Journal of Management, 44(1), 161–190. https://doi.org/10.1177/0149206317731519
  • Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. The Journal of Applied Psychology, 88(5), 879–903. https://doi.org/10.1037/0021-9010.88.5.879
  • Preston, C. C., & Colman, A. M. (2000). Optimal number of response categories in rating scales: Reliability, validity, discriminating power, and respondent preferences. Acta Psychologica, 104(1), 1–15. https://doi.org/10.1016/s0001-6918(99)00050-5
  • Proyer, R. T. (2012a). Examining playfulness in adults: Testing its correlates with personality, positive psychological functioning, goal aspirations, and multi-methodically assessed ingenuity. Psychological Test and Assessment Modeling, 54(2), 103–127.
  • Proyer, R. T. (2012b). Development and initial assessment of a Short Measure for Adult Playfulness: The SMAP. Personality and Individual Differences, 53(8), 989–994. https://doi.org/10.1016/j.paid.2012.07.018
  • Proyer, R. T. (2017). A new structural model for the study of adult playfulness: Assessment and exploration of an understudied individual differences variable. Personality and Individual Differences, 108, 113–122. https://doi.org/10.1016/j.paid.2016.12.011
  • Ravizza, K. (1977). Peak experiences in sport. Journal of Humanistic Psychology, 17(4), 35–40.
  • Reed, J. P. (2020). Using games to enhance skill and fitness. Strategies, 33(2), 35–37. https://doi.org/10.1080/08924562.2020.1706986
  • Reio, T. G. (2010). The threat of common method variance bias to theory building. Human Resource Development Review, 9(4), 405–411. https://doi.org/10.1177/1534484310380331
  • Scharp, Y. A., Bakker, A. B., Breevaart, K., Kruup, K., & Uusberg, A. (2022). Playful work design: Conceptualization, measurement and validity. Human Relations, 00(0), 001872672110709. https://doi.org/10.1177/00187267211070996
  • Scharp, Y. S., Breevaart, K., & Bakker, A. B. (2021). Using playful work design to deal with hindrance job demands: A quantitative diary study. Journal of Occupational Health Psychology, 26(3), 175–188. https://doi.org/10.1037/ocp0000277
  • Scharp, Y. S., Breevaart, K., Bakker, A. B., & van der Linden, D. (2019). Daily playful work design: A trait activation perspective. Journal of Research in Personality, 82, 103850. https://doi.org/10.1016/j.jrp.2019.103850
  • Stenner, B. J., Buckley, J. D., & Mosewich, A. D. (2020). Reasons why older adults play sport: A systematic review. Journal of Sport and Health Science, 9(6), 530–541. https://doi.org/10.1016/j.jshs.2019.11.003
  • Stenros, J. (2017). The game definition game: A review. Games and Culture, 12(6), 499–520. https://doi.org/10.1177/1555412016655679
  • Suits, B. (1977). Words on play. Journal of the Philosophy of Sport, 4(1), 117–131. https://doi.org/10.1080/00948705.1977.10654132
  • Suits, B. (1988). Tricky triad: Games, play, and sport. Journal of the Philosophy of Sport, 15(1), 1–9. https://doi.org/10.1080/00948705.1988.9714457
  • Swann, C. (2016). Flow in sport. In L. Harmat, F. Orsted. Andersen, F. Ullen, J. Wright, & G. Sadlo (Eds.), Flow experience: Empirical research and applications (pp. 51–64). Springer International Publishing.
  • Thompson, B. (2002). Score reliability: Contemporary thinking on reliability issues. Sage publications.
  • Thompson, E. R. (2007). Development and validation of an internationally reliable short-form of the Positive and Negative Affect Schedule (PANAS). Journal of Cross-Cultural Psychology, 38(2), 227–242. https://doi.org/10.1177/0022022106297301
  • Tuckman, B. W. (1991). The development and concurrent validity of the procrastination instrument. Educational and Psychological Measurement, 51(2), 473–480. https://doi.org/10.1177/0013164491512022
  • Van Vleet, M., & Feeney, B. C. (2015). Play behavior and playfulness in adulthood. Social and Personality Psychology Compass, 9(11), 630–643. https://doi.org/10.1111/spc3.12205
  • Velasco, F., & Jorda, R. (2020). Portrait of boredom among athletes and its implications in sports management: A multi-method approach. Frontiers in Psychology, 11, 831. https://doi.org/10.3389/fpsyg.2020.00831
  • Visek, A. J., Achrati, S. M., Mannix, H. M., McDonnell, K., Harris, B. S., & DiPietro, L. (2015). The fun integration theory: Toward sustaining children and adolescents sport participation. Journal of Physical Activity & Health, 12(3), 424–433. https://doi.org/10.1123/jpah.2013-0180
  • Visek, A. J., Mannix, H., Mann, D., & Jones, C. (2017). Integrating fun in young athletes’ sport experiences. In C. J. Knight, C. G. Harwood, & Gould, D. (Eds.), Sport psychology for young athletes (pp. 68–80). Routledge.
  • Wagstaff, C. R. (2019). A commentary and reflections on the field of organizational sport psychology. Journal of Applied Sport Psychology, 31(1), 134–146. https://doi.org/10.1080/10413200.2018.1539885
  • Wu, C.-H., & Parker, S. K. (2017). The role of leader support in facilitating proactive work behavior: A perspective from attachment theory. Journal of Management, 43(4), 1025–1049. https://doi.org/10.1177/0149206314544745