
The development and psychometric properties of a scalable digital measure of social and emotional wellbeing for middle childhood


Abstract

The need for a new measure of social-emotional wellbeing for children 6–12 years emerged in our collaboration with schools and community agencies in a disadvantaged region of Brisbane, Australia. Our search for an age-appropriate medium led us to develop Clowning Around, a computer game generating data which was assessed for validity and reliability in Study 1 (N = 3,461), revealing four wellbeing factors with satisfactory properties. The imperative to support autonomous use at scale led to development of the Rumble’s Quest user management system and game app, incorporating a slightly modified version of the measure. Reassessment of the factor structure (Study 2: N = 4,333) supported Study 1 results but extended factors to five: School Attachment, Social Wellbeing, Emotional Wellbeing, Family Support, and Behavioral Conformity. The measure exhibits sound convergent and concurrent validity and is a reliable and practical tool for data-guided service planning and evaluation, and for monitoring child wellbeing trends.

This paper presents a series of studies that describe the development of a new self-report measure of social and emotional wellbeing for children in middle childhood. Our aims are: (1) to establish the need for a utilitarian measure of social and emotional wellbeing for middle childhood; (2) to describe the measure’s early construction and its evolution in content and mode of delivery during testing with schools and community agencies; and (3) to report the results of two studies that explored the measure’s psychometric properties.

We first identified the need for a practical and robust wellbeing measure for children in middle childhood whilst undertaking the Pathways to Prevention Project in collaboration with educators and community practitioners (Homel et al., Citation2006). This project was designed to foster positive child development and strengthen family functioning in a socially disadvantaged community in Brisbane, Australia over the period 2002 to 2011 (Freiberg et al., Citation2005). Development of the wellbeing measure began with the formulation of a set of questions focused on matters of fundamental importance to primary school children’s lives, particularly friends, family, school, and the child’s growing sense of self. We were very conscious of the need for an age-appropriate medium to create a meaningful context that would support children’s capacity to report on their lives in a reliable way. We decided that a computer-based game, initially named Clowning Around, would be an efficient and effective medium for administering the survey. Study 1 assessed the psychometric properties of this measure including its factor structure, convergent and concurrent validity, test-retest reliability, and internal consistency.

A second critical need we identified was for a system that would support practitioners’ capacity to independently collect, understand, and act on the data that the measure generated. This became a core goal of the Creating Pathways to Child Wellbeing Project (2013–2020) through which, building on our experiences in the Pathways to Prevention Project (Homel et al., Citation2015a), we constructed a new software package that combined the game-based measure with a data report and other support resources that practitioners could utilize through a secure management dashboard. As this integrated system was being developed, rapid developments in the world of gaming afforded the opportunity to reimagine the game as a more interactive experience for children. Study 2 assessed this revised measure, called Rumble’s Quest, which introduced a new integrated system as well as a small number of improvements to the measure. Study 2 aimed: (1) to test whether the factor structure of the modified measure delivered through the enhanced game format preserved the factor structure of the original measure; and (2) to use exploratory factor analyses to identify a subset of items with a factor structure that fitted the data as closely as possible while preserving interpretability and utility for practitioners and researchers. A five-factor solution with an excellent fit emerged from these exploratory factor analyses, based on a subset of approximately half the items.

Establishing the need for a utilitarian measure of child wellbeing in the context of community practice

The Pathways to Prevention Project

Here we describe how our collaborative work with schools and community agencies in the early 2000s led us to conclude that a new, practitioner-friendly measure was required. The Pathways to Prevention Project combined a wide range of preventive, early intervention, and remedial activities for families and children, one key goal of which was to improve children’s wellbeing. While project staff were concerned with a wide range of practical wellbeing issues such as a child’s regular access to nutritious meals or having a bed of their own (Homel et al., Citation2006), we made an early decision to focus for evaluation purposes on children’s social and emotional wellbeing. We made this decision in light of the growing evidence linking social-emotional wellbeing to positive short-term outcomes such as improved learning, peer relationships, and classroom behavior (Durlak et al., Citation2011; Greenberg et al., Citation2001; Payton et al., Citation2008) as well as to long-term developmental trajectories across domains such as education, employment, antisocial behavior, and mental health, including the ability to manage stress and avoid substance misuse (Jones et al., Citation2015; Olsson et al., Citation2013).

Our decision to focus on a measure of social-emotional wellbeing was also influenced by the diverse nature of the Pathways family support services which varied markedly in focus, duration, and intensity in response to the needs of each family. A core measure was needed to help project partners: (1) understand how children were faring, (2) demonstrate change in child wellbeing arising from family participation in the suite of Pathways activities, and (3) draw attention to issues where children may have needed support. As planned, this wellbeing measure would be one component of the evaluation of the holistic family support system that also included measures of parental empowerment and efficacy, classroom behavior, and child communication skills (Freiberg et al., Citation2005, Citation2014; Homel et al., Citation2006).

The Pathways to Prevention Project activities aimed to promote family capacity to support some of the key developmental tasks of middle childhood such as readiness for learning, the ability to deal with everyday challenges, and positive social behavior. Positive behavior includes such indicators of social-emotional wellbeing as prosocial orientation and skills to get along with others, the ability to understand and manage one’s own and others’ emotions, and the capacity to resolve interpersonal conflicts respectfully and to refrain from engaging in challenging behaviors. Mastery of these key tasks is important because they prepare children for adaptive responses to the challenges that arise during subsequent developmental transitions in adolescence and young adulthood. Early identification of children who struggle with these tasks is therefore of great value in highlighting the need for social-emotional learning (SEL) interventions to address these issues (Catalano et al., Citation2021; Guhn et al., Citation2012; Jones et al., Citation2015; Moffitt et al., 2010).

The challenge we faced in the early 2000s was finding a reliable, user-friendly measure for practitioners to identify the conditions and competencies that underpin children’s social-emotional wellbeing to guide their service decisions and to evaluate the effectiveness of their work. This challenge persists, highlighting its difficulty. As McKown and Taylor (Citation2018:1) observed: “In contrast to the wide availability of effective SEL programs, there are few tools educators can use to assess children’s SEL that are usable, feasible, and scalable.” A more recent review of tools for measuring child wellbeing by Cho and Yu (Citation2020) generally confirms the continuing relevance of McKown and Taylor’s observation.

Criteria for measuring child wellbeing in the context of community practice

Researchers have defined and measured wellbeing in a great variety of ways (Australian Institute of Health and Welfare (AIHW), 2020; Cho & Yu, Citation2020; Fernandes et al., Citation2012; Guhn et al., Citation2012; Hamilton & Redmond, Citation2010; Sanson et al., Citation2010). A number of conceptual frameworks for child wellbeing have also been developed. For example, the Collaborative for Academic, Social, and Emotional Learning (CASEL) enumerates five personal domains of social and emotional competency: self-awareness, self-management, social awareness, relationship skills, and responsible decision-making (Devaney et al., Citation2006; Payton et al., Citation2008). Another noteworthy framework stems from Developmental Assets Theory, which outlines 40 personal and contextual factors that promote wellbeing and emphasizes the importance of the opportunities that exist in a child’s life to attract and use multiple sources of support and to experience a sense of belonging through sustained relationships with attentive and nurturing adults in structured, safe, and affirming care environments (Scales et al., Citation2006; Scales & Leffert, Citation1999). These frameworks can help guide the generation or selection of measures because they outline key competencies that underpin social and emotional development.

A systematic review of the empirical literature on child wellbeing by Pollard and Lee (Citation2003) particularly influenced our early thinking. However, as these authors noted, the field at the time lacked clarity, being characterized by methodological eclecticism rather than any kind of consensus. They located 173 measures for 2–10-year-olds and 256 for 8–13-year-olds, concluding that “there is no standard method to assess well-being in children” and that many measures tap only a single dimension. As a result, “the majority of authors used multiple separate measures of presumed indicators of well-being in an effort to capture a more complete assessment of the child’s well-being” (p. 68). This conclusion raised several points for consideration for the Pathways Project, including whether to use a bundle of available measures to piece together a multidimensional picture of child wellbeing. Other considerations related to who would collect, provide, and interpret the data, how often information would be gathered, and the uses to which it would be put.

Our experience in working with schools and community agencies, as well as our search of the literature, led us to formulate seven conditions or criteria that a wellbeing measure for children aged 6–12 years should satisfy to be suitable for widespread use in community and school practice. Taken together, these criteria instantiated McKown and Taylor’s (Citation2018) concerns about the need for measures to be “usable, feasible, and scalable.”

The data the measure produces must be scientifically valid and reliable

While this criterion is a given for scientific research, it needs emphasis because it is sometimes contested in community practice. In our experience, some practitioners do not sufficiently value standardized and dependable measures for evaluation and decision-making, viewing their use as a distraction from the main task of cultivating trusting relationships with clients (Freiberg et al., Citation2005). A growing body of research shows that practitioners’ use of psychometrically reliable measures can enhance community practice and program evaluation (e.g. Shapiro et al., Citation2013).

The measure must be responsive, or sensitive to change

In particular the measure must be sufficiently sensitive to detect changes in response to the effects of program participation or of life experiences (Aaronson et al., Citation2002).

The measure must capture the voices of children directly

There are several reasons for this criterion. First, involving children in the research process is consistent with child rights approaches that highlight the importance of including young people’s insights in decisions about their welfare and their right to be heard regarding policies and practices that may affect their lives (United Nations, Citation1989). Secondly, not all aspects of children’s social and emotional wellbeing manifest in ways that allow adults to report reliably on the child’s behalf. Bernard et al. (Citation2007) for example noted that teachers provide very different reports from their students on the children’s social and emotional states. Thirdly, despite debate about the capacity of young children to report accurately on their feelings and state of mind there is sufficient evidence to support the reliability of children’s responses (Melton, Citation2005; Riley, Citation2004). For instance, Luby et al. (Citation2007) reported that even as preschoolers, children can provide accurate information on their experience of core symptoms of psychological conditions such as depression and anxiety if they are provided with age-appropriate methods.

The measure must be delivered in a way that enables children’s meaningful participation

If we take seriously the subjective element of wellbeing then efforts should be made to use data collection methods that are sufficiently engaging to sustain even young children’s attention, do not place a burden on their limited literacy skills, and provide a meaningful context for their understanding and response to questions which can be abstract and difficult to relate to when presented in traditional questionnaire or survey formats (Darling-Churchill & Lippman, Citation2016; Kempf, Citation2018; McKown, Citation2019).

The data capture process must be practical, resource and time efficient, able to be used at scale, and not disrupt busy school or agency routines

As noted earlier, we needed a consistent way to measure wellbeing during the school day on an ongoing basis for all children aged 6 to 12 years. The practicalities of working in schools and other service settings demanded that methods of data collection be efficient, not place an undue burden on staff, and not require specialists to administer.

The data generated by the measure must be able to guide action by practitioners on behalf of children

The measure should generate “knowledge for understanding and knowledge for advocacy” (Shonkoff, Citation2004, p. 3), highlighting issues of concern at both group and individual levels to guide practitioner decisions on how best to address the needs of children. The data-based insights may lead to the more efficient use of program resources and help over time to build a culture of continuous, data-guided improvement. An excellent example of this approach is the Communities That Care Youth Survey (Fagan et al., Citation2019), which measures community-level risk and protective factors for adolescent behavioral health problems and guides the selection of evidence-based interventions by community coalitions.

The content of the measure must be developmentally appropriate, meaningful, and multidimensional

During middle childhood (6 to 12 years) children’s experiences help to lay the foundations of their personal identity as competent individuals capable of making independent decisions about their actions, friendships, interests, and responsibilities (Eccles, Citation1999; Erikson, Citation1968). The most significant contextual domains for children’s development during these years are family, school, and peer group. A developmentally appropriate measure of social and emotional wellbeing for middle childhood should encompass both the contextual and the personal: a child’s relationships in these three domains and their emerging sense of self. In this way it should reflect the inherent multidimensionality of wellbeing. Pollard and Lee (Citation2003) observed that multiple measures are frequently used to deal with multidimensionality through the construction of composite indices of child wellbeing. Although this approach may be possible as part of a well-resourced national project or as a component of large-scale cohort studies in the social indicators or policy development arenas, such extensive, expensive, and resource intensive methods are a barrier to use in frequently data-hesitant practice settings. The easier it is for practitioners to collect, understand, and act on data the more likely they are to embrace the methodology. This highlights the need to capture some of the most significant aspects of social-emotional wellbeing through a unified and user-friendly measurement and delivery platform.

Constructing a new measure for community practice

During the Pathways Project we could find no established child wellbeing measure(s) that simultaneously met all seven criteria. By using a small bundle of measures based on teacher as well as child reports we could probably have satisfied several of our criteria (scientific integrity, sensitivity to change, capturing children’s voices, multidimensionality) but the administration of these scales on a regular basis would have validated practitioner concerns about evaluation fatigue and made the whole process dependent on the research team indefinitely. We therefore commenced the development of a new measure and the construction of a delivery platform suitable for practitioners as well as for researchers.

In developing the new measure, we drew on two wellbeing frameworks: Developmental Assets Theory (Scales et al., Citation2006; Scales & Leffert, Citation1999) and the Collaborative for Academic, Social, and Emotional Learning framework (Devaney et al., Citation2006; Payton et al., Citation2008). These frameworks were critical in identifying factors related to personal skills and to contexts for positive child development (people, places, and institutions). We were also guided in item construction by the broad aims of the Pathways to Prevention Project which were to promote nurturing developmental environments and children’s sense of connection to school and readiness to learn. Connection to school is an important protective factor for many children which contributes to educational, health, social, and emotional outcomes (Bowles & Scull, Citation2019). Readiness to learn includes competencies that support engagement in classroom settings, social problem solving, and the ability to manage one’s behavior and emotions.

As noted by Pollard and Lee (Citation2003), indicators can generally be grouped according to five domains of child wellbeing: physical (e.g. health, nutrition, physical activity, smoking and use of drugs or alcohol); psychological (e.g. depression, anxiety, confidence, self-esteem); social (e.g. interpersonal skills and relationships, support mechanisms); cognitive (e.g. academic achievement, classroom behavior, school attachment); and economic (e.g. material resources and financial hardship). Pollard & Lee also noted that each domain is commonly made up of a combination of positive and negative indicators (e.g. depression as a negative state and confidence as a positive one), which supported our preference for including items that explored strengths as well as challenges.

A similar organizational structure for conceptualizing child wellbeing was proposed by Moore et al. (Citation2008). Like Pollard and Lee (Citation2003), Moore and colleagues used the four individual domains of wellbeing: cognitive-educational, social, psychological, and physical. However, they drew more attention than Pollard and Lee to contextual domains to reflect the distinct importance of family and community contexts in addition to socio-economic influences on child wellbeing.

While we recognize the importance of physical and economic influences on child wellbeing, these are largely objective elements that adults can report easily. For the child report we decided to focus on the cognitive-educational, social, psychological-emotional, and contextual domains. We settled on a set of 55 questions reflecting the four domains we considered most important in assessing children’s social and emotional wellbeing:

  • Educational wellbeing. Items included perception of school atmosphere, connection to and enjoyment of school, interest and engagement in learning, perception that one’s efforts are noticed, adherence to school rules, discussion of school life at home.

  • Social wellbeing. Items included positive peer relationships, social engagement and interactions, exposure to and engagement in conflict, peer behavior, own behavior.

  • Emotional wellbeing. Items included positive outlook, positive and negative affect, self-concept.

  • Protective contexts. Items included feelings of safety, presence of and attachment to caring adults, opportunities for personal growth through structured family routines, supervision, and shared family activities.

After developing the question set, we conducted focus groups with children. These groups confirmed the comprehensibility, relevance, and value of the questions and demonstrated children’s willingness to report honestly on their lives at and beyond school. The candid nature of children’s answers occasionally prompted the researcher to gently query some responses, but ensuing discussion within the small groups of supportive friends who all knew each other well generally endorsed the authenticity of the information that children shared. The focus groups also helped us simplify the linguistic and grammatical structure of a small number of questions.

From survey to computer game: Clowning Around

To meet our goals of an efficient and engaging model of assessment, the questions were embedded in a simple game-like format with colorfully illustrated auditory-visual animation. The cartoon style had wide appeal to children. The game was called Clowning Around because the thematic backdrop depicted movement through a set of circus events (the big top show ring, sideshow alley). The central character in the game was a comic child with an androgynous appearance. As the game opened, this character invited the child player to join them at the circus and asked the 55 questions over a series of scenes.

Questions were delivered in a paced sequence that gave the child time to consider their response before proceeding to the next item. All questions were presented verbally and accompanied by text. After each question was posed, the child listened to the response options (icons and text lit up as each response on the scale was voiced), then selected their answer by clicking an icon. This made it possible for children whose literacy skills were still developing to participate with ease and confidence. During the game, blocks of questions were separated by three tasks that challenged the player to use memory, attention, and cognitive skills (see Day et al., Citation2019 for a description of these executive function tasks).

Study 1: The psychometric properties of Clowning Around

We began using Clowning Around as a data collection tool in seven Pathways schools in 2008. Study 1 reports the results of tests of the measure’s factor structure and internal consistency; convergent validity; test-retest reliability; and concurrent validity. We also briefly report wellbeing variations by age, gender, and the socio-economic status of a child’s area of residence.

For the study of convergent validity, we compared Clowning Around scores to responses on three validated paper and pencil measures, each of which relates to a specific aspect of wellbeing: The Self Perception Profile for Children: SPPC (Harter, Citation1985); The Psychological Sense of School Membership: PSSM (Goodenow, Citation1993); and The Personal Wellbeing Index: PWI-SC (Cummins & Lau, Citation2005). The properties of these measures are described below.

Concurrent validity can be assessed by comparing groups that the measure theoretically should be able to distinguish (Anastasi & Urbina, Citation1997). In this study we report analyses of Clowning Around scores from the Pathways Project database (Homel et al., Citation2015c) in relation to two measures that are known to influence or reflect social-emotional wellbeing: (1) self-reported adversity (Ray et al., Citation2020) and (2) first school disciplinary suspension (Homel et al., Citation2016; Laurens et al., Citation2021). We also examine (3) the relationship between Clowning Around scores and family participation in Pathways services (Homel et al., Citation2015b) on the basis that some parenting and family interventions have been shown to improve children’s social and emotional development (Huang et al., Citation2017; Li et al., Citation2021).

Method

Participants

Factor structure

A total of 3,461 children from 11 primary schools in Brisbane played Clowning Around, representing approximately 95% of the schools’ total enrollments in the age range 6 to 12 years. The participation rate was high because the study had strong support from school authorities and the Department of Education, and because informed consent from parents or guardians was obtained by the schools directly using their standard procedures. Our sample was broadly representative of the participating schools’ Grade 1–7 population. Although most children in these grades were aged 6–12, the sample did include a small number of children aged 5 (2.7%) and 13 years (1.8%). Nearly half the sample (n = 1,599; 46.2%) attended schools involved in the Pathways to Prevention Project with the remaining 1,862 children enrolled at one of five schools located outside the Pathways community.

There was a roughly uniform age distribution across ages 6–12 years with a drop in the rate of participation by 12-year-olds (1.7% compared with 14.3% aged 11 and 15.7% aged 10). This was probably because at that time most children turned 12 during Year 7 and a proportion of that grade would not yet have reached their 12th birthday at the time testing was completed. Boys (53.0%) slightly outnumbered girls. Schools were from geographic areas (suburbs) representing High (43.4%), Medium (27.3%) and Low (29.3%) socio-demographic bands according to the Australian Bureau of Statistics (ABS) Socio-Economic Indexes for Areas (SEIFA) (Australian Bureau of Statistics, Citation2018). All students from Pathways schools resided in suburbs that scored at the lowest SEIFA level (1st decile), with the other five schools selected to span the range of higher SEIFA deciles.

Convergent validity

A subsample of 1,822 children out of the 1,862 children who completed Clowning Around at one of the five non-Pathways schools attempted one or more of the three validation scales. The criterion for selection was parental consent for the validation data collection. Complete Clowning Around data were obtained from 1,757 children. The Personal Wellbeing Index (PWI-SC) was completed by 885 children, many of whom (n = 795) also completed both subscales of the Self Perception Profile for Children (SPPC) but not the Psychological Sense of School Membership (PSSM), which, to reduce the data collection burden, was completed by a different group of 831 children from the five schools. The PSSM sample was drawn from different classes, where possible within the same grade levels, from those that completed the PWI-SC and the SPPC scales.

Test-retest reliability

Four weeks after the initial data collection 347 children across grade-levels 1–7 at three of the five non-Pathways schools were chosen at random by the schools to contribute test-retest reliability data by completing Clowning Around a second time.

Concurrent validity

Analyses of concurrent validity were based on subsamples of the 4,858 children in the Pathways to Prevention Child Database, which consolidated data from service providers (Freiberg et al., Citation2005; Homel et al., Citation2006), schools (including Clowning Around data), the Queensland Department of Education (Homel et al., Citation2016), and surveys of Grade 7 children (Homel et al., Citation2015c). We report wellbeing and behavioral variations across the following groups: (1) four levels of life adversity reported by a sample of 210 Grade 7 children who had attended one of the Pathways schools since preschool (Homel et al., Citation2015c); (2) 433 children receiving their first out-of-school disciplinary suspension and a control group of 1,168 individually matched children who had never been suspended (Homel et al., Citation2016); (3) 123 children whose families had participated in Pathways services at some time between preschool and Grade 7 and a matched sample of 123 children whose families had never participated (Homel et al., Citation2015b).

Wellbeing variations by age, gender, and the socio-economic status of a child’s area of residence

The full factor analysis sample from all 11 primary schools (N = 3,461) was used for these descriptive analyses.

Measures

Clowning around

Clowning Around comprised 55 questions presented to the child in blocks that roughly reflected the four anticipated domains of educational, social, emotional, and contextual factors. The four question blocks were separated by the three problem solving tasks. Scores were assigned using a 2- or 3-point scale ranging from a low of 0 to a high of either 1 or 2. For example, the item ‘How do you like school?’ used a 3-point response scale (Don’t like it much, wish I didn’t have to go = 0; Ok, I sort of like it = 1; Great, I really like it = 2) whereas the item ‘Do you feel safe at school?’ used a 2-point response scale (Not always = 0; Always = 1). Low scores always indicated lower wellbeing. Since 12 of the 55 items asked about emotions, behaviors, and experiences that are generally understood as negative, answers to these items were reverse scored. For example, Item 15, ‘I get in trouble in class’ was scored All the time = 0; Sometimes = 1; Never = 2.
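To make the scoring scheme concrete, the sketch below illustrates one way the scoring and reverse scoring just described could be implemented. It is a minimal sketch only: the item identifiers and the assumed raw coding (ascending frequency order) are hypothetical, and this is not the project’s actual scoring code.

```python
from typing import Dict

# Maximum score per item: 1 for 2-point items, 2 for 3-point items (hypothetical item IDs).
ITEM_MAX: Dict[str, int] = {
    "likes_school": 2,             # 3-point item
    "feels_safe_at_school": 1,     # 2-point item
    "gets_in_trouble_in_class": 2, # 3-point item, negatively worded
}
REVERSE_SCORED = {"gets_in_trouble_in_class"}  # negatively worded items

def score_item(item_id: str, raw: int) -> int:
    """Return a wellbeing-oriented score, assuming raw responses are coded 0..max
    in ascending frequency order (a hypothetical convention); reverse scoring flips
    negative items so that low scores always indicate lower wellbeing."""
    if item_id in REVERSE_SCORED:
        return ITEM_MAX[item_id] - raw
    return raw

def total_wellbeing(responses: Dict[str, int]) -> int:
    """Sum the scored items into an overall raw wellbeing score."""
    return sum(score_item(item, value) for item, value in responses.items())

# Example: a child who really likes school, always feels safe, and never gets in trouble.
print(total_wellbeing({"likes_school": 2, "feels_safe_at_school": 1, "gets_in_trouble_in_class": 0}))  # -> 5
```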

Measures of convergent validity

(1) The Self Perception Profile for Children: SPPC (Harter, Citation1985) measures children’s self-worth across six domains of self-competence. Twelve items were selected from two of the original subscales: Social Acceptance and Behavioral Conduct. Items are presented as dichotomous statements (e.g. Some kids have lots of friends BUT Other kids don’t have very many friends) to which children use a two-step process to respond. First, they decide which of the two statements is more like them, and then they choose the degree to which it is true for them (sort of true for me vs. really true for me). Psychometric evaluations of the SPPC as a valid and reliable measure of self-concept have been reported in several child populations (Ferro & Tang, Citation2017; Harter, Citation2012; Muris et al., Citation2003). (2) The Psychological Sense of School Membership: PSSM (Goodenow, Citation1993) assesses perceived belonging at school (a component of educational wellbeing). It consists of 18 items and children respond to each statement using a 5-point Likert scale (ranging from 1 = not at all true, to 5 = completely true). Evidence of the reliability and validity of the PSSM has been provided in studies of adolescent and pre-adolescent children across a range of cultural contexts (Castro‐Kemp et al., Citation2020; Gaete et al., Citation2016; Goodenow, Citation1993; Wagle et al., Citation2018). (3) The Personal Wellbeing Index-School Children: PWI-SC (Cummins & Lau, Citation2005) is an 8-item measure of subjective wellbeing. Children are asked how happy they are with several domains (e.g. How happy are you with your health?). Responses are scored using an 11-point scale (ranging from 0 = very sad to 10 = very happy). Studies of the psychometric properties of the PWI-SC provide evidence of satisfactory levels of convergent validity and internal reliability (Alfaro et al., Citation2016; Cummins et al., Citation2003; Singh et al., Citation2015).

Measures of concurrent validity

(1) To measure adversity, we used a single survey item asking Grade 7 children “Up to the age you are now, how many bad things have happened in your life?” with response options of 0, 1, 2–3, and 4+ (Homel et al., Citation2015c). (2) We accessed annual school disciplinary absences, or suspensions, as officially recorded for every state school child by the Queensland Department of Education (Homel et al., Citation2016). We combined short suspensions (1–5 days) with long suspensions (6–20 days) but did not include the relatively small number of suspensions with a proposal/recommendation for exclusion and cancelation of enrollment, or actual exclusions. Each child’s classroom behavior was assessed toward the end of each school year by class teachers using the Rowe Behavioral Rating Inventory (RBRI), a validated checklist consisting of 12 items related to difficult behavior (Rowe & Rowe, Citation1995). (3) The level of family participation in Pathways services was assessed from project records and classified as no contact; 1–5 contacts; 6–22 contacts; 23+ contacts. Clowning Around scores were derived from an early principal factor analysis of the sample of 3,461 children that yielded three factors (enjoys supportive positive social relationships; attachment to school; capacity to self-regulate behavior and emotions) (Homel et al., Citation2015b). A child’s cultural background was classified as First Nations (Aboriginal or Torres Strait Islander); Anglo-Celtic; Pacific Islander; Vietnamese; Other.

Socio-economic status of a child’s area of residence

Suburb of residence was coded at the Statistical Area Level 2 as defined in the Australian Statistical Geography Standard. The socio-economic status of an SA2 was scored using the Socio-Economic Indexes for Areas (SEIFA), specifically the Index of Relative Socio-Economic Advantage and Disadvantage, defined as “people’s access to material and social resources, and their ability to participate in society” (Australian Bureau of Statistics, Citation2018, p.6). SEIFA indexes are based on the key dimensions of residents’ income, education, employment, occupation, and housing as reported in the 2016 Population Census.

Procedures

All Clowning Around testing was conducted during normal school hours. Children attending the seven Pathways schools completed Clowning Around during an annual Pathways to Prevention Project evaluation test round, while those at the five non-Pathways schools completed the game at times convenient to each school. Children played Clowning Around in class groups, usually supervised by a staff member who was not the class teacher, since staff had observed a tendency for children to hesitate or even pause their gameplay when their classroom teacher was actively looking on. Children wore headphones to reduce distractions. Generally, whole class groups were able to complete the game in 30–35 min. Children at the non-Pathways schools completed the pencil-and-paper validation tests under the supervision of project staff after completing Clowning Around, usually on the same or the following school day.

Data analysis

For the factor analysis part of the study, we performed exploratory maximum likelihood factor analyses with oblimin rotation on the correlation matrix of the children’s (N = 3,461) responses to the 55 wellbeing items; 94 cases (2.7%) had missing values. Pairwise deletion did not produce negative eigenvalues, but because non-responses did not appear to be related to variable values, we analyzed the 3,370 complete records. Factor analyses were carried out using Mplus 8.0 (Muthén & Muthén, Citation2017). Model fit was assessed with the Root Mean Square Error of Approximation (RMSEA), where values smaller than .08 indicate acceptable fit and values smaller than .06 good fit; the Comparative Fit Index (CFI), where values larger than .90 indicate adequate fit, but values higher than .95 are better; and the Standardized Root Mean Square Residual (SRMR), where values smaller than .08 indicate adequate model fit (Hu & Bentler, Citation1999). However, in selecting the number of factors we balanced the value of goodness-of-fit indices, whose use in exploratory factor analysis has been criticized (Montoya & Edwards, Citation2021), with factor interpretability and practical utility in schools and other settings.
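The exploratory analyses reported here were run in Mplus 8.0. Purely to illustrate the analytic steps, the hedged sketch below shows how a comparable maximum likelihood extraction with an oblique (oblimin) rotation could be run with the open-source Python package factor_analyzer, assuming the complete item responses sit in a pandas DataFrame with one column per item (the file name is hypothetical). The fit indices reported in the text (RMSEA, CFI, SRMR) are not reproduced by this sketch.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical file of complete scored item responses (one column per item).
items = pd.read_csv("clowning_around_items.csv")

# Eigenvalues of the correlation matrix help gauge dimensionality.
eigenvalues, _ = FactorAnalyzer(rotation=None).fit(items).get_eigenvalues()
print(eigenvalues[:5])

# Maximum likelihood extraction with an oblique (oblimin) rotation and four factors.
efa = FactorAnalyzer(n_factors=4, method="ml", rotation="oblimin")
efa.fit(items)

loadings = pd.DataFrame(efa.loadings_, index=items.columns)
print(loadings.round(2))                 # rotated pattern matrix
print(efa.get_communalities().round(2))  # item communalities
```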

Analyses of convergent and concurrent validity, test-retest reliability, and correlates of wellbeing used analysis of variance, Spearman correlations, and multiple regression. In the analysis of suspended children to assess concurrent validity, the 433 children who had received their first suspension were individually matched with a control group of 1,168 never-suspended children on: school grade in the calendar year of first suspension; school attended; teacher-rated classroom behavior in the year prior to first suspension; prior involvement in Pathways services; gender; and cultural/linguistic background (with more than one match permitted per suspended child) (Homel et al., Citation2016). Weighted regression analyses were performed so that the ratio of weighted controls to cases within each group was the same across all subclasses. For the assessment of concurrent validity using children whose families had participated in the Pathways service, the 123 Pathways children were individually matched with 123 children whose families had never participated, so that the treatment and control groups were equivalent at the beginning of preschool (the year before Grade 1) in terms of teacher-rated behavior, gender, cultural background, and child’s level of adversity self-reported in Grade 7.
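As a hedged illustration of the weighted comparison described above (not the analysis code actually used), the sketch below fits a weighted least squares regression of a post-suspension factor score on suspension status, assuming a long-format dataset with hypothetical columns 'self_regulation', 'suspended' (0/1), and 'weight' (matching weights that equalize the control-to-case ratio across subclasses).

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical matched dataset of suspended children and their matched controls.
df = pd.read_csv("suspension_matched.csv")

X = sm.add_constant(df["suspended"])  # intercept + group indicator
model = sm.WLS(df["self_regulation"], X, weights=df["weight"]).fit()

# The coefficient on 'suspended' estimates the weighted group difference.
print(model.summary())
```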

Results

Factor analysis

The 55 items spanned the domains of Educational Wellbeing, Social Wellbeing, Emotional Wellbeing, and Protective Contexts as described earlier. While these four domains were a useful conceptual tool for item development, we recognized that each domain was likely to be heterogeneous (i.e. containing various subdomains), and that the domains were also likely to be strongly correlated with each other. Thus, we did not expect that an empirical factor analysis would produce four factors exactly corresponding to these domains. However, we did expect that the factor analysis would identify three to five factors readily interpretable by reference to these domains. Given the strong correlations between domains of child wellbeing reported in the literature, we also anticipated a number of interpretable item cross-loadings.

The first eigenvalue of 10.52 was much larger than the others (2.38, 2.31, 1.57, 1.36), suggesting a dominant general wellbeing factor. A one-factor solution revealed positive loadings exceeding .25 for all but one item (Do you ever feel worried? with a loading of .18). Since the Cronbach alpha for overall child wellbeing using all 55 items was .92, we felt justified in including these scores in data reports to user organizations. However, the analysis also showed, as expected, that the one-factor model fit was not satisfactory (χ2 = 22754.9; df = 1539; p < .001; RMSEA (90% CI) = .056 (.056-.057); CFI = .616), leading us to explore more factors. We chose a readily interpretable 4-factor solution with a satisfactory fit (χ2 = 4985.2; df = 1271; p < .001; SRMR = .025; RMSEA = .030, 90% CI (.029, .030); CFI = .905). We labeled the four factors: 1. Educational Wellbeing (10 items, not including Item 43 which cross-loaded with Factor 2; α = .82); 2. Social Wellbeing (16 items including Item 43; α = .80); 3. Self-Regulation (15 items; α = .79); and 4. Protective Contexts (14 items; α = .78). Table 1 sets out the loadings and item communalities.
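For readers who wish to reproduce internal consistency figures of this kind with their own data, the sketch below computes Cronbach’s alpha from a matrix of scored items using the standard formula; the demonstration data are random and purely illustrative.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Usage with random placeholder data (real input would be the 55 scored items).
rng = np.random.default_rng(0)
demo = pd.DataFrame(rng.integers(0, 3, size=(100, 55)))
print(round(cronbach_alpha(demo), 2))
```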

Table 1. 4-Factor exploratory analysis: Clowning around (55 items; n = 3370).

Overall, the Educational Wellbeing (Factor 1) and Protective Contexts (Factor 4) factors were relatively close to the corresponding domains used to generate and organize the items. Factor 1 Educational Wellbeing clearly reflected children’s feelings about school and engagement with learning (e.g. 1. Likes school with a loading of .772). We originally grouped items related to feelings of safety in the Protective Contexts domain but in the analysis Item 43. Feel safe at school loaded on both Educational Wellbeing (.265) and Factor 2 Social Wellbeing (.281). Factor 4 (Protective Contexts) related to both family and community contexts. Family and parents featured strongly (e.g. 46. Talk to someone at home about school: .536) but support from other caring adults also featured (e.g. 38. A grown-up always listens and helps me: .312).

The highest loadings on the Social Wellbeing factor (Factor 2) were three items related to children’s feelings about other children (e.g. 19. Other kids make me feel happy: .495) and other people (e.g. 36. People like me as I am: .460). The factor was also characterized by items reflecting low levels of negative affect (15. Feeling generally worried: .418). Item 44: Feel safe in the neighborhood (.280) joined feelings of safety at school in loading on the Social Wellbeing factor.

Items with the largest loadings on Factor 3 (Self-Regulation) related to low levels of rule-breaking and conflict (e.g. 32. Gets into fights: .610). Indeed, eight of the 15 items related to rule-breaking, conflict, and aggression of self and peers (e.g. 39. Friends get into trouble; 3. Behave self). Other items related to (low levels of) negative affect and poor self-esteem (e.g. 12. Have a lot of problems), emotion regulation (33. Get mad and lose temper), and higher levels of educational wellbeing (6. Try hard at school). We therefore interpreted Factor 3 broadly as relating to both emotion and behavior regulation, including the self-discipline required to try hard at school.

As shown in Table 2, scores on the four factors were significantly correlated, with Factor 4 (Protective Contexts) and Factor 2 (Social Wellbeing) correlating most strongly at .47. The overall wellbeing factor (not shown in Table 2) computed using the regression score method also correlated strongly with each subfactor, with all values around .8.

Table 2. Factor correlations: Clowning around 4-factors (n = 3370).

Convergent validity

Table 3 shows the correlations of the overall wellbeing measure and the factors from the 4-factor model with previously validated measures of similar constructs.

Table 3. Clowning around validation correlations: 1-factor and 4-factor models.

Clowning Around’s Overall Wellbeing score correlated well with measures of the constructs of subjective wellbeing or satisfaction with the quality of one’s life (PWI-SC), self-esteem (SPPC Total), behavioral conduct (SPPC Behavior), and sense of belonging at school (PSSM). In accordance with what might be expected at a conceptual level, Factor 1 (Educational Wellbeing) correlated well with the PSSM which measures the construct of school belonging; Factor 3 (Self-Regulation) correlated well with the SPPC construct of behavioral conduct; and the SPPC subscale that taps the construct of social acceptance had its highest correlation with Factor 2 (Social Wellbeing). Factor 4 (Protective Contexts) correlated best with the constructs of school belonging (PSSM) and subjective wellbeing (PWI-SC).

Test-retest reliability

The correlation between total raw score (overall wellbeing) at Time 1 and Time 2 for the children who completed Clowning Around twice within a four-week period was .80 (p < .001, n = 314). Test-retest correlations for the four factors were: Factor 1 Educational Wellbeing .74 (p < .001, n = 328); Factor 2 Social Wellbeing .69 (p < .001, n = 320); Factor 3 Self-Regulation .75 (p < .001, n = 314); Factor 4 Protective Contexts .68 (p < .001; n = 314).
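A minimal sketch of the test-retest computation is shown below, assuming paired Time 1 and Time 2 overall wellbeing scores in a file with hypothetical column names; the same call would be repeated for each factor score.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical file of paired scores for children who played the game twice.
retest = pd.read_csv("retest_scores.csv").dropna(subset=["total_t1", "total_t2"])

r, p = pearsonr(retest["total_t1"], retest["total_t2"])
print(f"test-retest r = {r:.2f}, p = {p:.3g}, n = {len(retest)}")
```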

Concurrent validity

Self-reported adversity

In response to the question about the number of really bad things that had happened in their life, answers ranged across none (n = 69), one (n = 52), 2–3 (n = 41), and 4+ (n = 48). This analysis yielded a significant effect (F(3,206) = 11.05, p < .001), reflecting the fact that as adversity level increased children recorded decreasing levels of wellbeing. The effect was non-linear and most marked for the group experiencing most adversity, with the 69 children who reported no bad things recording an overall wellbeing score 1.04 standard deviations higher than the 48 children who reported four or more bad things.
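The analysis just described can be sketched as a one-way ANOVA across the four adversity groups followed by a standardized difference between the extreme groups. The sketch below is illustrative only: the column names are hypothetical, and it standardizes by the full-sample standard deviation, which may differ from the exact standardization used in the original analysis.

```python
import pandas as pd
from scipy.stats import f_oneway

# Hypothetical file: 'adversity' coded 0 (none), 1, 2 (2-3), 3 (4+); 'wellbeing' = overall score.
df = pd.read_csv("grade7_adversity.csv")

groups = [g["wellbeing"].to_numpy() for _, g in df.groupby("adversity")]
F, p = f_oneway(*groups)

# Standardized difference between the 'no bad things' and '4+' groups.
d = (groups[0].mean() - groups[-1].mean()) / df["wellbeing"].std(ddof=1)
print(f"F = {F:.2f}, p = {p:.3g}, standardized difference = {d:.2f}")
```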

First out-of-school disciplinary suspension

The 433 suspended children had significantly worse behavior in the year following suspension compared with the year preceding suspension (effect size .41; p < .001), and also had lower post-suspension scores on the Clowning Around Self-Regulation factor compared with the matched control group of 1,168 non-suspended children (effect size .38; p < .001).

Family participation in the Pathways Project

The focus of analysis was changes between Grade 1 and 7 in teacher-rated classroom behavior and between Grade 5 and 7 for wellbeing scores from Clowning Around (which only became available for use when this cohort had reached Grade 5). Compared with the no contact control group, Pathways family support (for those with five or fewer contacts) was associated with subsequent improvements in both classroom behavior (effect size .58; p = .003) and in Clowning Around’s Self-Regulation and Social Wellbeing scores (effect sizes of .71 (p = .054) and .59 (p = .033), using the early 3-factor solution). No significant changes were found for families who had six or more contacts.

Wellbeing variations by age, gender, and socioeconomic status

Overall wellbeing scores declined monotonically with grade, the Grade 1 mean being one third of a standard deviation (.33; p < .001) higher than the Grade 7 mean. Girls also reported higher levels of wellbeing than boys (.43 standard deviation difference; p < .001). Finally, we observed a small SES effect when comparing total wellbeing scores across the three SES groups. The mean was highest for children in the highest SES group and lowest for children in the lowest group (.17 standard deviations; p < .001).

Discussion

Study 1 provides proof-of-concept that an engaging computer game can be devised for primary-aged children that examines multiple dimensions of social-emotional wellbeing and generates scores that are psychometrically reliable and valid. A 4-factor solution yielded a satisfactory fit to the data and aligned broadly with the conceptual domains that were used as a framework for developing the question set. Importantly, the four factors were readily interpretable and actionable by practitioners seeking to implement data-guided interventions. Educational Wellbeing tapped mainly children’s feelings about school and engagement with learning; Social Wellbeing reflected feelings about other children and other people; Self-Regulation encompassed behavior and emotion regulation, particularly rule-breaking and conflict; and Protective Contexts captured perceived support from both family and other caring adults.

Tests of convergent validity verified that Clowning Around measures some core aspects of children’s wellbeing in a streamlined way, while the real-world performance of the measure provided evidence for its utility for program evaluation and potentially as a social indicator. For example, our study of the effects of a first official primary school suspension on children produced results (deterioration in the year following a first suspension in both teacher-rated classroom behavior and in Clowning Around Self-Regulation scores) consistent with the longitudinal literature that asserts a range of adverse consequences of disciplinary exclusions including school disengagement, academic failure, and intensified behavior problems (Bowman-Perrott et al., Citation2013; Laurens et al., Citation2021; Noltemeyer et al. Citation2015; Raffaele Mendez, Citation2003). The same association between teacher-rated classroom behavior and Self-Regulation factor scores was found in the study of the effects of Pathways family support, consistent with the literature on the impact of family programs on child risk factors for youth antisocial behaviors (Fagan & Benedini, Citation2016) and with what would be expected if the Clowning Around Self-Regulation construct measures what it purports to measure: rule-breaking and conflict with school authorities.

The correlations of the Clowning Around factor scores with children’s demographic characteristics were also broadly in line with what is known about the epidemiology of children’s wellbeing. The decline of overall wellbeing scores with age is consistent with a pattern reported by researchers who have examined changes in children’s life satisfaction or happiness as they move through childhood and into adolescence (Beatton & Frijters, Citation2012; Ho, Citation2013). Similarly, the fact that girls reported higher levels of wellbeing than boys is in line with population level research using other instruments such as the Strengths and Difficulties Questionnaire (AIHW, Citation2020). Again, the small SES effect we observed when comparing total wellbeing scores across the three SES groups is in line with the Australian findings using the SDQ (AIHW, Citation2020) and with a range of evidence cited by the American Psychological Association (Citation2022) that lower levels of SES are associated with higher levels of child and youth emotional and behavioral difficulties.

A limitation of Study 1 is that about half the items we used to interpret the four factors had loadings of less than .30. A logical next step would be to assess and potentially eliminate many of these low loading items and examine the resulting models for goodness of fit, interpretability, and utility for practitioner decision making. However, these analyses were postponed to Study 2 when new funding and expanded partnerships made possible enhancement of the game technology and development of an integrated user support system to facilitate the measure’s use at scale.

Study 2: The development and factor structure of Rumble’s Quest

The process of scaling up for wide dissemination and autonomous use by user groups (Homel et al., Citation2015a) required us to integrate the data capture application (game) into a user management system for self-guided use by practitioners in front-line services. Because rapidly advancing technology had rendered aspects of the game’s code base outdated, we also rebuilt the game platform.

Enhancement of the game technology

Rumble’s Quest, the successor to Clowning Around, offers a new generation assessment with enhanced game technology that provides an authentic context for children to express what they think and feel. The new, more interactive elements and scripting of the storyline frame questions in a child relevant way that feels natural and meaningful. Questions are posed not in the abstract but as part of a conversation in a way that makes immediate sense and therefore promotes response reliability. When children play Rumble’s Quest, they adopt an avatar and enter a mythical world where they meet Rumble, who is lost, and go on a quest to help him find his way home and his place in the world. This affords a natural context within which Rumble can ask the child questions about their own lives and their world. As with Clowning Around, all questions are voiced, and children answer by selecting labeled icons from a response scale.

Development of an integrated assessment system

Making data capture easy was considered essential to local ownership of information on children’s wellbeing: schools and agencies require direct access to practical data collection and reporting tools that identify the key issues affecting children’s social and emotional wellbeing. Such data can provide a rationale for immediate action and a focal point for a variety of interventions. Moreover, if these data are aggregated to the community level, they can help galvanize the use of collective strategies across the community and guide the shape that they take (e.g. Fagan et al., Citation2019). When embedded in regular (usually annual) practice they can help drive a cycle of outcomes assessment and ongoing improvement of the initiatives that are put in place.

The Rumble’s Quest integrated system incorporates background videos, administration dashboards, user support and training materials, a secure database, and data visualization tools for instant report generation, as well as resources to facilitate data interpretation, decision making, and the process of planning action in response to priority issues identified in the data profile (https://www.realwell.org.au/rumbles-quest/). The data management and reporting infrastructure supports the aggregation of data from multiple data collection sites since we wanted to support use by networks of community-based service providers and whole school districts, not just individual schools, agencies, or community coalitions. The establishment of Rumble’s Quest was completed during the CREATE project. As part of this process in Study 2 we reviewed the factor structure of the measure.

Method

Participants

Rumble’s Quest’s release in May 2016 quickly led to adoption by 18 Queensland primary schools clustered in three mostly urban regions: two in Greater Brisbane (48.2% of the sample) and one in North Queensland. As with Study 1, the schools obtained informed consent from parents or guardians, with participation rates of approximately 95%, comparable with those in Study 1. The 4,333 children were enrolled across Grades 1 to 6 (generally ages 6 to 11 years; Grade 7 children, mostly aged 12, were not included because Grade 7 had moved to high school following government reforms in 2015). The age distribution was approximately uniform with about 17.5% for each age from 7 to 11, but children aged 6 and 12 were underrepresented at 8.4% and 3.9% respectively. Nearly half the sample were girls (48.8% of the 3,956 cases where gender was recorded by the school), and 841 (19.4%) were recorded by the school as being First Nations. Only 16.6% of the children lived in areas of above average socio-economic status (SEIFA deciles 6–9, with none from decile 10). However, with a modal SEIFA decile of 4 (45.3%) and only 26.5% in deciles 1 and 2, the sample could be described as skewed toward disadvantage but not heavily weighted to the most disadvantaged communities.

Measures

In addition to the major improvements in game technology, three key changes were made to the measure: (1) greater contextualization of the questions embedded in a video story, necessitating small changes to the ordering of items used in Clowning Around; (2) response options were expanded from two or three to five for all items; (3) two items were added to better capture children’s experiences of adversity and victimization. A consequence of the more engaging and conversational format was that game duration typically increased to about 45 min. Three variants of the five response options were introduced. For example, ‘Do you like your school?’ (No; A bit; Sometimes; Mostly; Yes); ‘How do your teachers make you feel?’ (Unhappy; OK; A bit happy; Mostly happy; Happy); ‘Do you get to do things you enjoy in your spare time?’ (Never; A bit; Sometimes; Often; A lot). We added Item 56: ‘How often is someone mean to you?’ (Never to A lot) to capture, in a non-threatening way, feelings of victimization, and to balance Item 52: ‘How often are you mean to someone?’ We also added Item 57 ‘Do bad things happen to you?’ (Never to A lot) to capture children’s perceptions of adversity, an important issue that we had identified in the validity analyses of Clowning Around.

Procedures

Children were briefed about the game by a supervising school staff member and participated in small groups during timetabled sessions in normal school hours. When the game was opened children entered their code that linked them to their school’s Rumble’s Quest account, and then selected their avatar through whom they entered the game world. When each child finished the game, the system posted their data to a secure Australian web server.

Data analysis

As in Study 1 we performed maximum likelihood factor analyses on the correlation matrix of the children’s (N = 4,333) responses to the 57 wellbeing items. There were no missing values. We began with a confirmatory analysis of the Clowning Around solution, followed by exploratory analyses using oblimin rotation and the geomin criterion (with epsilon value .5). Geomin rotation has a good track record of satisfactory solutions and can produce factor loadings and factor correlations like those of confirmatory factor analysis without the need to specify the factor loading pattern (Hattori et al., Citation2017).

We took the additional step in Study 2 of exploring solutions based on subsets of items in order to identify an interpretable model with good overall fit, thus improving on solutions using all 57 items while also satisfying our Criterion 6 (able to guide action by practitioners on behalf of children). Guided by the Study 1 results, we aimed for a pattern matrix that approximated simple structure and was potentially replicable through confirmatory analysis with a fresh sample, with loadings of at least .4, internal consistencies of at least .6, communalities of .2 or higher, and an RMSEA of less than .05. These rules of thumb are in line with the recommendations of Hair et al. (Citation2018) for exploratory analyses, except for communalities, for which their preference is .50 or more, a stringent criterion that would have eliminated most items in our data. We also aimed to improve the comparative fit index (CFI) from around .90 to at least .95, a threshold recommended for structural equation modeling (Hu & Bentler, Citation1999).
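As a concrete illustration, these retention rules can be expressed as a simple check on a table of loadings and communalities such as the one produced by the sketch above. The loading and communality thresholds come from the text; the functions themselves, and the secondary-loading flag, are our illustrative additions rather than the authors’ procedure.

```python
# Sketch of the stated retention rules: primary loading >= .40 and
# communality >= .20. The secondary-loading flag (.30) is an illustrative
# addition for spotting potential cross-loadings, not a rule from the text.
import pandas as pd

def retain_items(loadings: pd.DataFrame, communalities: pd.Series,
                 min_loading: float = 0.40, min_communality: float = 0.20) -> pd.Series:
    """True where an item's largest absolute loading and its communality
    both meet the stated thresholds."""
    primary = loadings.abs().max(axis=1)
    return (primary >= min_loading) & (communalities >= min_communality)

def flag_cross_loadings(loadings: pd.DataFrame, threshold: float = 0.30) -> pd.Series:
    """True where an item's second-largest absolute loading reaches the threshold."""
    second = loadings.abs().apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
    return second >= threshold
```

Applied to the loadings and communalities from the previous sketch, retain_items would flag the candidate pool from which trimmed models could then be explored by hand.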

Results

Factor structure

As expected, given the changes in technology and response options, a confirmatory factor analysis of the 4-factor model from Study 1 did not fit well (SRMR = .058; RMSEA = .047 (90% CI: .046, .048); CFI = .742). We therefore repeated the approach used in Study 1, conducting exploratory factor analyses of 4- and 5-factor solutions.

Mirroring the Study 1 results, the first eigenvalue of 10.14 was much larger than the others (3.21, 2.57, 1.80, 1.63), but in contrast to Study 1, nine eigenvalues exceeded 1.00, suggesting the possibility of more than four factors. A dominant general wellbeing factor was once again apparent from the 1-factor model, although two loadings in that model, both related to negative affect, were quite low: Item 15, Do you ever feel worried? (.075), and Item 25, Do you get sad? (.099). The internal consistency of the overall wellbeing factor was nevertheless high (.91).

The 4-factor solution fitted less well than in Study 1 (χ2 = 8583.8; df = 1374; p < .001; SRMR = .029; RMSEA = .035 (90% CI: .034, .036); CFI = .870). However, the four rotated factors were broadly comparable to the Clowning Around 4-factor solution, with 18 variations in item loadings that in most cases enhanced factor interpretation.

The 5-factor model (Table 4) provided an adequate statistical fit (χ2 = 6594.9; df = 1321; p < .001; SRMR = .024; RMSEA = .030 (90% CI: .030, .031); CFI = .905). As with Clowning Around, three factors corresponded closely with the conceptual domains: Factor 1 (Educational Wellbeing), Factor 2 (Social Wellbeing), and Factor 4 (Protective Contexts); importantly, all five rotated factors were highly interpretable. The fifth factor arose from a split of the Self-Regulation factor in the 4-factor solution into Factor 3 (Emotional Wellbeing) and Factor 5 (Behavior Regulation/Conformity with Norms).
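For readers less familiar with these indices, RMSEA can be recovered (approximately, and depending on the exact formula a given program uses; the N − 1 form below is one common convention) from the model chi-square, its degrees of freedom, and the sample size:

$$\mathrm{RMSEA}=\sqrt{\max\!\left(\frac{\chi^{2}-df}{df\,(N-1)},\,0\right)}=\sqrt{\frac{6594.9-1321}{1321\times 4332}}\approx .030,$$

which matches the estimate reported for the 5-factor model; the same calculation for the 27-item model reported below gives approximately .034.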

Table 4. 5-Factor exploratory analysis: Rumble’s Quest (57 items; n = 4333).

Inspection of Table 4 reveals many loadings of less than .3, often corresponding to low communalities (e.g. Item 46, I talk to someone at home about school, had a loading of .268 on Educational Wellbeing and a communality of .14). In addition, some items with communalities greater than .2 had two small cross-loadings (e.g. Item 22, I feel good when I help others, had a communality of .29 but loadings of only .258 and .245 on Educational and Social Wellbeing respectively). Using the criteria outlined earlier, we explored a range of trimmed models omitting most items with loadings of less than .4 and/or with low communalities, which eliminated most cross-loadings. We also had an eye to the spread of items across the five factors and the interpretability of solutions, and experimented with the effects of adding or dropping a range of marginal variables, such as Item 27: If your friends took chocolate from the shop, would you take some too? Through this process we arrived at a trimmed model with 27 items using the geomin criterion (Table 5). Compared with oblimin solutions, the geomin rotations generally led to better-fitting models with fewer cross-loadings and more loadings exceeding .4.

Table 5. 5-Factor trimmed model: Rumble’s Quest (27 items; n = 4333).

The 5-factor 27-item model provided a good statistical fit (χ2 = 1354.7; df = 226; p < .001; SRMR = .021; RMSEA = .034 (90% CI: .032, .036); CFI = .955) and had a clear structure that clarified the interpretation of the five factors. The seven items that loaded on Factor 1, which we had hitherto labeled Educational Wellbeing, all clearly reflect feelings of attachment to school (e.g. Item 4, School is a nice place: loading .618; h2 = .49), so we relabeled the factor School Attachment (7 items; α = .82).

We retained the name Social Wellbeing for Factor 2 (5 items; α = .62) since it reflects positive social interactions (e.g. Item 16, Good things happen to me: loading .480; h2 = .29), social acceptance (e.g. Item 34, People trust me: loading .459; h2 = .30), and getting along with peers (e.g. Item 35, Other kids like to play with me: loading .416; h2 = .25). There was one potential cross-loading: Item 30, Do you behave yourself? (loading .305; h2 = .34). This question reflects prosocial behavior and could therefore be included under Social Wellbeing, but the loading was below our threshold of .4, so we grouped it with the other behavioral items loading on Factor 5, which we relabeled Behavioral Conformity. Although Item 30 also has a relatively low loading on Factor 5 (.409), the square of the ratio of the two loadings was sufficiently high, at 1.8, to support excluding it from the Social Wellbeing factor. Two other possible cross-loadings (Items 49 and 8) were clearly ignorable (Hair et al., Citation2018).
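The decision rule here compares Item 30’s two competing loadings via their squared ratio; spelling out the arithmetic:

$$\left(\frac{.409}{.305}\right)^{2}\approx(1.34)^{2}\approx 1.8.$$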

We retained the name Emotional Wellbeing for Factor 3 (5 items; α = .70) since its items reflected negative affect (e.g. Item 12, Do you feel like you have problems? .621; h2 = .42), poor emotion regulation (e.g. Item 33, Do you get mad and lose your temper? .494; h2 = .33), and negative social interactions (Item 57, How often are people mean to you? .611; h2 = .43).

We relabeled Factor 4 from Protective Contexts to Family Support (6 items; α = .73) because it reflected children’s perceptions of family safety (e.g. Item 42, Do you feel safe at home? .610; h2 = .40), emotional support (e.g. Item 18, How do your parents make you feel? .574; h2 = .40), relationships and attachment (e.g. Item 49, Do you do fun things with your parents? .455; h2 = .38), and routine (Item 45, Do you have dinner together with your family? .443; h2 = .24). Since all items related to the child’s family, we judged that, as with Factor 1, a more specific label than Protective Contexts would better reflect the pattern of loadings and be more useful for practitioners.

Finally, as noted above, Factor 5, Behavioral Conformity (4 items; α = .61), is a relabeling of the full-model Behavior Regulation factor. All four items clearly reference (low levels of) rule-breaking, such as getting into trouble (e.g. Item 48, How often do you get detention or sent to the principal’s office for being in trouble? .594; h2 = .35).
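The internal consistencies quoted for each factor are Cronbach alphas. A minimal sketch of that computation is given below; the data frame and column names are hypothetical, and the item groupings would simply be the factor memberships reported above.

```python
# Minimal sketch: Cronbach's alpha from the classical variance decomposition,
# alpha = k/(k-1) * (1 - sum of item variances / variance of the total score).
import pandas as pd

def cronbach_alpha(responses: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of item columns (one row per child)."""
    k = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)
    total_variance = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# e.g. alpha for the seven School Attachment columns (column list hypothetical):
# cronbach_alpha(items[school_attachment_columns])
```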

The factor correlations for the 27-item trimmed model are shown in Table 6. Factor 2 (Social Wellbeing) retained its relatively high correlation with Factor 4 (Family Support) (r = .35). Behavioral Conformity correlated with Emotional Wellbeing (r = .25) and with School Attachment (r = .31).

Table 6. Factor correlations: Rumble’s Quest 5-Factor Trimmed Model (n = 4333).

Variations in wellbeing by age, gender, and socioeconomic status

The overall wellbeing scale, computed as the sum of the 27 items in the trimmed model, had an internal consistency of .83. As in Study 1, girls reported higher overall wellbeing than boys (p < .001), but the gap of .32 standard deviations was smaller than the value of .41 in Study 1. The gender gap was strongest for Behavioral Conformity (.47; p < .001), followed by School Attachment (.30; p < .001), but was significant for all five factors. Consistent with Study 1, older children tended to score lower on overall wellbeing, but the effect was very weak (r = −0.032; p = .038). Correlations were very slightly stronger for School Attachment (r = −0.039; p = .010) and Behavioral Conformity (r = −0.043; p = .005).
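The gender gaps reported here are standardized mean differences. The sketch below shows one common (pooled-SD) form of that statistic; the paper does not specify the exact formulation used, so the pooling choice is an assumption, and the variable names are hypothetical.

```python
# Sketch of a pooled-SD standardized mean difference (Cohen's d style) of the
# kind reported above. Variable names are hypothetical.
import numpy as np
import pandas as pd

def standardized_gap(group_a: pd.Series, group_b: pd.Series) -> float:
    """(mean_a - mean_b) / pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = ((n_a - 1) * group_a.var(ddof=1) +
                  (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2)
    return (group_a.mean() - group_b.mean()) / np.sqrt(pooled_var)

# e.g., with `wellbeing` the summed 27-item score and `gender` a matching Series:
# d = standardized_gap(wellbeing[gender == "girl"], wellbeing[gender == "boy"])
```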

The SES gradient was not significant for overall wellbeing but was evident for Social Wellbeing (children in the low SEIFA category scored .11 standard deviations lower than children in the high and medium SEIFA groups; p < .001), Family Support (low SEIFA children scored .06 standard deviations lower than medium/high SEIFA children; p = .017), and Behavioral Conformity (low SEIFA children scored .08 standard deviations lower than high SEIFA children; p = .007). Contrary to expectations, in this sample low SES children scored slightly higher on School Attachment than the highest SEIFA group (.15 standard deviations; p = .004). SES differences were not significant for Emotional Wellbeing.

General discussion

In this paper we aimed to explain our decision to develop a new digitally administered game to measure wellbeing in middle childhood; to describe how the content of the measure and its mode of delivery evolved over a 15-year period; and to report the outcomes of two studies that explored the measure’s psychometric properties. We first identified the need for a new measure through a collaboration with schools and a family support agency in a socially disadvantaged community, a collaboration that made all partners acutely aware of the need to fuse scientific integrity with robust practicality. We sought a reliable and valid measure that could be administered easily, and in a way that would encourage children to report naturally on how they felt about their friends, family, school, neighborhood, and self. We also wanted a measure that would be useful for the evaluation of multifaceted interventions and capable of generating data that could guide practitioner responses to children’s needs, whether at the level of the individual child or at the school or community level. As our research program expanded to encompass multiple communities, the scalability of the measure and its reporting system became a high priority.

Factor structure

Study 1 used the first version of the measure, Clowning Around, which yielded an overall indicator of children’s general wellbeing, as well as a 4-factor solution that captured the inherent multidimensionality of the concept. A major contribution of Study 2 using Rumble’s Quest was to demonstrate the need for five factors which clearly differentiated emotional and behavioral wellbeing by splitting the Clowning Around Self-Regulation factor. This differentiation provides clearer guidance for schools and communities on where to target pastoral responses or intervention programs.

We explored the five-factor model further in Study 2 by eliminating, as far as possible, items with low loadings and/or communalities. We arrived at a meaningful and well-fitting solution with 27 items (CFI = .955; Table 5) that loaded on five factors: 1. School Attachment (relabeled from Educational Wellbeing in the untrimmed Rumble’s Quest model in Table 4); 2. Social Wellbeing; 3. Emotional Wellbeing; 4. Family Support (Protective Contexts in the untrimmed model); and 5. Behavioral Conformity (Behavior Regulation in the untrimmed model). The internal consistencies (Cronbach alphas) of the five factors in the 27-item model ranged from .82 for Factor 1, School Attachment (7 items), to .61 for Factor 5, Behavioral Conformity (4 items), and .62 for Factor 2, Social Wellbeing (5 items). While reliabilities exceeding .7 are desirable, it can be difficult, when relying on the self-reports of young children or even adolescents, to keep the item pool large enough for every factor to include the minimum number of items required to reach this threshold. These constraints need not, however, limit the value of the factor scores as a guide for practitioner decision making. For example, the well-validated Communities That Care (CTC) prevention model uses the CTC Youth Survey to compute a range of risk and protective factor scores that guide community coalitions in strategically planning and monitoring the implementation of evidence-based interventions (Fagan et al., Citation2019). While most of these factors have satisfactorily high alphas, one critical construct, antisocial behavior, is derived from five questions and in Australian implementations has an alpha of .62 (Toumbourou et al., Citation2019).

In the trimmed model, eight items referenced emotions, behaviors, attitudes, or experiences that are generally understood as negative (e.g. Item 12, Do you feel like you have problems?). These items loaded either on Emotional Wellbeing (5 items) or on Behavioral Conformity (3 items), which raises the possibility that these factors are to some extent methodological artifacts, since people tend to respond differently to questions about negative feelings or experiences than to positive ones (Brown, Citation2015). Against this view, however, is the fact that they clearly tap the broad internalizing and externalizing constructs that are central to the literature on child wellbeing (Pollard & Lee, Citation2003). For example, although the items loading on Factor 3, Emotional Wellbeing, referenced only negative emotions and experiences, children’s responses usually reflected a capacity to stay calm and positive in the face of challenge. Similarly, for Factor 5, Behavioral Conformity, most children’s responses to the three items referencing negative behaviors indicated that they seldom initiated conflict with school authorities or with other children.

Even so, the trimmed model could be viewed as falling short of a fully comprehensive measure of children’s social-emotional wellbeing, because the items for the Emotional Wellbeing and Behavioral Conformity factors focus more on risk than on protective factors, and on the absence of ill-being rather than the presence of wellbeing. For example, Item 12, ‘Do you feel like you have problems?’ loads heavily on Factor 3, Emotional Wellbeing (h2 = .42), but items like Item 11, ‘When things go bad do you know how to get help?’ or Item 38, ‘Do you have a grown-up who always listens and helps when you need them or feel upset?’ were dropped in the reduction process. While this tendency to exclude protective factors was not absolute (Item 30, for example, ‘Do you behave yourself?’ could be viewed as protective), the current form of the Emotional Wellbeing and Behavioral Conformity factors does to some extent run counter to the strengths-based foundations we were striving for in a wellbeing measure. We explore the practical implications of this feature of the trimmed model below.

Validity and reliability

Although we do not yet have direct evidence for the convergent validity and reliability of the Rumble’s Quest measure based on the trimmed model, we suggest that Study 1 provides a reasonable basis for confidence, given that the changes to the scale involved only the addition of two items to the pool of 55 and the expansion of response categories from two or three to five alternatives. The overall similarity of the factor structures in Studies 1 and 2 suggests that the upgrades to the animation and game technology did not fundamentally alter what the instrument measures.

This conclusion is supported by the pattern of Study 2 correlations with age, gender, and the SEIFA score of the suburb in which the school was located. Except for age, correlations were very similar to the patterns reported in Study 1. In contrast to Study 1, the age effect was very weak, although in the same direction (older children reporting slightly lower wellbeing). The differences between the two studies likely reflect the nature of the Study 2 sample, in which the SES distribution was skewed to low SEIFA areas and older children were somewhat under-represented. Nevertheless, the distributions of some Rumble’s Quest items were broadly in line with similar measures in other Australian studies of children’s wellbeing that used representative samples. For example, the Australian Institute of Health and Welfare (Citation2020) found that 91% of children aged 12–13 felt safe in their neighborhood in 2015–2016. This compares with the 90% of the Study 2 sample who, in response to Item 44, Do you feel safe in your neighborhood?, indicated at least some degree of safety. These comparisons strengthen confidence that Rumble’s Quest produces accurate estimates of how Australian children feel about their lives.

A study completed in 2022 using post-2016 samples (Allen et al., Citation2023) bears on the question of concurrent validity because it used Rumble’s Quest to evaluate the impact of the 2020 COVID-19 lockdowns on children’s social-emotional wellbeing in three Australian states: Queensland, Tasmania, and Western Australia. Analyses were conducted using a trimmed model with the same five factors reported in the present paper, but with slightly different items loading on some factors. The treatment group of 580 children was tested once during 2019 (Time 1) and again, post-lockdowns, in mid-to-late 2020 and early 2021 (Time 2). The comparison group of 841 children was tested twice prior to the pandemic, so changes in Rumble’s Quest factor scores in this group were used as a benchmark for evaluating treatment group changes. The analyses showed a range of effects conditioned by the child’s gender, particularly on Family Support but also on Behavioral Conformity and Emotional Wellbeing. The changes observed were consistent with research carried out during the pandemic highlighting the increased risk to already vulnerable families (Spencer et al., Citation2021).

Rumble’s Quest as a guide for action by practitioners and policymakers

The trimmed model draws on a parsimonious subset of 27 items to provide an elegant solution for measuring wellbeing that approximates simple structure. The five factors have clear interpretations that facilitate the computation of a group’s wellbeing profile, providing schools and child agencies with a range of reliable and otherwise difficult-to-access information about how the children in their care feel about their lives. But what of the 30 omitted items, many of which, as noted earlier, can be understood as referencing conditions that promote, protect, or are otherwise fundamental to children’s wellbeing? Omitted items refer, for example, to prosocial orientation, positive identity and self-belief, positive affect and an optimistic outlook, ready access to caring adults who can provide support and recognition, and interesting experiences and structured opportunities to develop confidence and competence. These are the ‘hard-to-see’ issues that, in our experience, schools and community practitioners are frequently most interested in, because they provide some of the background story that explains what lies behind the troublesome or worrying behavior of some children.

Our response to this challenge is to provide an Assets Report with item-by-item breakdowns aggregated across children in the whole group or in user-specified subgroups, in addition to the Summary Report based on the 5-factor trimmed model. The Summary Report offers schools and other users a starting point for decision making because it presents a broad wellbeing profile that facilitates identification of general issues, while the Assets Report, which closely reflects the Developmental Assets Framework (Search Institute, Citation2016), supports users’ capacity to take a deep dive into their data and develop more nuanced action plans. For example, insights into how to strengthen the potential assets that underpin the Behavioral Conformity dimension may be gained by investigating items omitted from the trimmed model that could indicate a casual attitude to norms, such as thinking rules are stupid (Q29), contemplating shoplifting under peer pressure (Q27), or having friends who get into trouble (Q39). Similarly, a pattern of responses indicating that children do not know how to get help (Q11), or do not have access to assets such as adults who will protect them (Q40) or listen to them and help when needed (Q38), may suggest ways to strengthen Emotional Wellbeing.
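To illustrate only the kind of aggregation such reports rely on (this is not the RealWell platform’s implementation, and the column names are hypothetical), an item-by-item subgroup breakdown is essentially a grouped summary of the response data:

```python
# Purely illustrative: mean response per item within user-specified subgroups,
# the basic operation behind an item-by-item breakdown. Column names are
# hypothetical; this is not the platform's actual reporting code.
import pandas as pd

def item_breakdown(responses: pd.DataFrame, group_col: str) -> pd.DataFrame:
    """Mean response for each item column, split by a grouping column
    such as grade, gender, or class."""
    item_cols = [c for c in responses.columns if c.startswith("item_")]
    return responses.groupby(group_col)[item_cols].mean().round(2)

# e.g. item_breakdown(survey_data, "grade")
```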

Further research

Validation studies

Given the development of the measure since Study 1 and its use in a much wider range of schools and communities, it is important to replicate and extend the study of convergent validity using Rumble’s Quest and to include newer non-clinical measures from sources such as the CASEL Assessment Guide (https://measuringsel.casel.org/) or the Developmental Assets Profile (Search Institute, Citation2016). We will also explore further the ecological validity of the measure through structured interviews with children, parents, and practitioners.

As opportunities arise to upgrade the game, it will be of great value, while maintaining our focus on utility for practitioners, to assess the effects of introducing a small number of new items and removing some others with low communalities, in order to improve the internal consistencies of the Behavioral Conformity and Social Wellbeing factors. Several features of the loadings and communalities of the Social Wellbeing factor (including a potential cross-loading and a communality of only .18 for Item 50, Do you ever play at your friend’s house?) suggest that this factor in particular might benefit from further development.

Importantly, analyses are planned using newly collected Rumble’s Quest data to further investigate how both age and gender influence responses. The results of Studies 1 and 2 were inconsistent with respect to how the various wellbeing dimensions varied by age, although both clearly showed that girls scored more highly than boys on most factors. Although middle childhood constitutes a distinct phase of life, many developmental changes nevertheless occur over this period. Explicitly modeling the effects of age and testing for the age and gender invariance of Rumble’s Quest’s factor structure is a priority.

Rumble’s Quest at different levels of aggregation

An important consideration in the development of our new measure was the need to provide practitioners with indicators of children’s wellbeing, both at the child level, to facilitate the delivery of individual support to vulnerable children, and at the group level, to facilitate action planning at the school or community level. Although the measure was never intended to serve as a diagnostic tool for specific mental health or behavioral conditions, there is nevertheless a need to identify clear cut-points flagging vulnerability, which we plan to do by using the measure with a sample of children with diagnosed social and emotional disorders. Such cut-points might support the measure’s use as a screening tool for the clinical targeting of children at higher risk of poor outcomes.

The capacity for group-level reporting provides the basis for several new research initiatives. As we have worked with schools implementing Rumble’s Quest, we have accumulated a considerable body of anecdotal and qualitative data on how principals or wellbeing champions in schools have used their data reports to respond to identified needs, particularly through improved pastoral care. The next steps are to quantify and systematize this information through structured interviews and then to evaluate the impacts of a range of strategies on Rumble’s Quest scores and behavioral measures.

Finally, as the AIHW (Citation2020, p. 128) has noted, the measure has potential for use as part of a comprehensive system of social indicators that can be collected at a population level for policy planning and monitoring purposes. We are exploring this option since national action to measure and improve child wellbeing is more likely to succeed if it integrates strategies at all levels of the social ecology.

Conclusion

Rigorous assessment of child wellbeing at the individual, school cohort, and community levels is a prerequisite for the development of effective responses to children’s needs. Yet there are few demonstrably valid and reliable tools for measuring wellbeing in middle childhood in ways that: (1) allow the child to make their own report, (2) tap wellbeing as a multi-dimensional construct, and (3) are suitable for independent use by practitioners in non-clinical settings with large numbers of children. In this paper we have addressed these challenges by providing evidence for the validity and reliability of a new digital measure of child social and emotional wellbeing. We have in addition described our work to build local capacity for data collection and use through the development of an integrated on-line platform.

The authors gratefully acknowledge the funding and in-kind support of our partner organizations: Anglicare Tasmania; Australian Department of Social Services; Australian Primary Principal’s Association; The Benevolent Society; Children’s Health Queensland; Mission Australia; New South Wales Department of Communities and Justice; New South Wales Department of Education; New South Wales Department of Family and Community Services; Parenting Research Center Inc; Queensland Department of Children, Youth Justice and Multicultural Affairs; Queensland Department of Communities, Child Safety and Disability Services; Queensland Department of Education; Queensland Family and Child Commission; The Salvation Army; The Smith Family; and The Trustee for Logan Child Friendly Community Charitable Trust.

Acknowledgments

The authors wish to thank the Editor and a reviewer for very helpful critiques of earlier versions of this paper. The authors are also very grateful to InVision Media Pty Ltd for their partnership in developing the software and for helping transform the measure into an engaging computer game. The authors are especially indebted to our partner school principals and the many children who enthusiastically played Clowning Around and Rumble’s Quest.

Data availability statement

The data for Studies 1 and 2 are available on request.

Disclosure statement

RealWell (https://www.realwell.org.au), the platform which licensed Rumble’s Quest for use by schools and agencies while the studies reported in this paper were being conducted, was a social enterprise developed wholly within Griffith University. In July 2022 RealWell Pty Ltd was established as a for-purpose business with a continuing research relationship with Griffith University. The authors declare that they derive no financial benefit from RealWell Pty Ltd nor do they have personal relationships that could have appeared to influence the work reported in this paper. The funding agencies had no role in the study design, data collection, analysis, interpretation, writing of the paper, or decision to submit the paper for publication. The corresponding author had full access to all data in the study and had final responsibility for the decision to submit for publication. The authors declare their independence from the funders and from anybody substantially funded by one of these organizations.

The views presented in this paper are those of the authors and are not necessarily those of the funders or partner organizations.

Additional information

Funding

This research was supported by Australian Research Council grants DP0984675, LP0560771, DP140100921, LP130100142, and LP170100480 and Criminology Research Grant CRG 30/11-12. Ethics approval was granted by Griffith University Human Research Ethics Committee (LEJ/04/05/HREC; LEJ/02/09/HREC; LEJ/03/13/HREC; Protocol number 2017/805) and by the Queensland Department of Education.

References

  • Aaronson, N., Alonso, J., Burnam, A., Lohr, K. N., Patrick, D. L., Perrin, E., & Stein, R. E. (2002). Assessing health status and quality-of-life instruments: Attributes and review criteria. Quality of Life Research, 11(3), 193–205. https://doi.org/10.1023/a:1015291021312
  • Alfaro, J., Guzman, J., Garcia, C., Sirlopu, D., Reyes, F., & Varela, J. (2016). Psychometric properties of the Spanish Version of the Personal Wellbeing Index-School Children (PWI-SC) in Chilean school children. Child Indicators Research, 9(3), 731–742. https://doi.org/10.1007/s12187-015-9342-2
  • Allen, J., Homel, R., McGee, T., & Freiberg, K. (2023). Child wellbeing before and after the 2020 COVID-19 lockdowns in three Australian states. Australian Journal of Social Issues, 58(1), 41–69. https://doi.org/10.1002/ajs4.258
  • American Psychological Association. (2022). Children, youth, families and socioeconomic status. https://www.apa.org/pi/ses/resources/publications/children-families
  • Anastasi, A., & Urbina, S. (1997). Psychological testing (7th ed.). Prentice-Hall.
  • Australian Bureau of Statistics. (2018). Socio-Economic Indexes for Areas (SEIFA). Technical Paper Catalogue No. 2033.55.001.
  • Australian Institute of Health and Welfare. (2020). Australia’s children. Cat. no. CWS 69. AIHW.
  • Beatton, T., & Frijters, P. (2012). Unhappy young Australian: A domain approach to explain life satisfaction change in children (No. 289). School of Economics and Finance, Queensland University of Technology.
  • Bernard, M., Stephanou, A., & Urbach, D. (2007). ASG student social and emotional health report. Australian Council for Educational Research and Australian Scholarships Group.
  • Bowles, T., & Scull, J. (2019). The centrality of connectedness: A conceptual synthesis of attending, belonging, engaging and flowing. Journal of Psychologists and Counsellors in Schools, 29(01), 3–21. https://doi.org/10.1017/jgc.2018.13
  • Bowman-Perrott, L., Benz, M. R., Hsu, H.-Y., Kwok, O.-M., Eisterhold, L. A., & Zhang, D. (2013). Patterns and predictors of disciplinary exclusion over time: An analysis of the SEELS National Data Set. Journal of Emotional and Behavioral Disorders, 21(2), 83–96. https://doi.org/10.1177/1063426611407501
  • Brown, T. A. (2015). Confirmatory factor analysis for applied research (2nd ed.). Guilford Press.
  • Castro‐Kemp, S., Palikara, O., Gaona, C., Eirinaki, V., & Furlong, M. (2020). The role of Psychological Sense of School Membership and postcode as predictors of profiles of socio‐emotional health in primary school children in England. School Mental Health, 12(2), 284–295. https://doi.org/10.1007/s12310-019-09349-7
  • Catalano, R. F., Hawkins, J. D., Kosterman, R., Bailey, J. A., Oesterle, S., Cambron, C., & Farrington, D. P. (2021). Applying the social development model in middle childhood to promote healthy development: Effects from primary school through the 30s and across generations. Journal of Developmental and Life-Course Criminology, 7(1), 66–86. https://doi.org/10.1007/s40865-020-00152-6
  • Cho, E. Y. N., & Yu, F. (2020). A review of measurement tools for child wellbeing. Children and Youth Services Review, 119, 105576. https://doi.org/10.1016/j.childyouth.2020.105576
  • Cummins, R. A., Eckersley, R., Pallant, J., van Vugt, J., & Misajon, R. A. (2003). Developing a national index of subjective wellbeing: The Australian Unity Wellbeing Index. Social Indicators Research, 64(2), 159–190. https://doi.org/10.1023/A:1024704320683
  • Cummins, R., & Lau, A. (2005). Personal Wellbeing Index – School Children (PWI-SC) manual (3rd ed.). Australian Centre on Quality of Life.
  • Darling-Churchill, K., & Lippman, L. (2016). Early childhood social and emotional development: Advancing the field of measurement. Journal of Applied Developmental Psychology, 45, 1–7. https://doi.org/10.1016/j.appdev.2016.02.002
  • Day, J., Freiberg, K., Hayes, A., & Homel, R. (2019). Towards scalable, integrative assessment of children’s self-regulatory capabilities: New applications of digital technology. Clinical Child and Family Psychology Review, 22(1), 90–103. https://doi.org/10.1007/s10567-019-00282-4
  • Devaney, E., O’Brien, M., Resnik, H., Keister, S., & Weissberg, P. (2006). Sustainable schoolwide social and emotional learning: Implementation guide and toolkit. Chicago: Collaborative for Academic, Social, and Emotional Learning.
  • Durlak, J., Weissberg, R., Dymnicki, A., Taylor, R., & Schellinger, K. (2011). The impact of enhancing students’ social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82(1), 405–432. https://doi.org/10.1111/j.1467-8624.2010.01564.x
  • Eccles, J. (1999). The development of children ages 6 to 14. The Future of Children, 9(2), 30–44. https://doi.org/10.2307/1602703
  • Erikson, E. H. (1968). Identity: Youth and crisis. Norton.
  • Fagan, A. A., & Benedini, K. M. (2016). How do family-focused prevention programs work? A review of mediating mechanisms associated with reductions in youth antisocial behaviors. Clinical Child and Family Psychology Review, 19(4), 285–309. https://doi.org/10.1007/s10567-016-0207-0
  • Fagan, A. A., Hawkins, J. D., Catalano, R. F., & Farrington, D. P. (2019). Communities that care: Building community engagement and capacity to prevent youth behavior problems. Oxford University Press.
  • Fernandes, L., Mendes, A., & Teixeira, A. (2012). A review essay on the measurement of child well-being. Social Indicators Research, 106(2), 239–257. https://doi.org/10.1007/s11205-011-9814-9
  • Ferro, M., & Tang, J. (2017). Psychometric properties of the Self-Perception Profile for Children in children with chronic illness. Journal of the Canadian Academy of Child and Adolescent Psychiatry, 26(2), 119–124. https://psycnet.apa.org/doi/10.1037/t05338-000
  • Freiberg, K., Homel, R., Batchelor, S., Carr, A., Hay, I., Elias, G., Teague, R., & Lamb, C. (2005). Creating pathways to participation: A community-based developmental prevention project in Australia. Children & Society, 19(2), 144–157. https://doi.org/10.1002/chi.867
  • Freiberg, K., Homel, R., & Branch, S. (2014). The Parent Empowerment and Efficacy Measure (PEEM): A tool for strengthening the accountability and effectiveness of family support services. Australian Social Work, 67(3), 405–418. https://doi.org/10.1080/0312407X.2014.902980
  • Gaete, J., Montero-Marin, J., Rojas-Barahona, C. A., Olivares, E., & Araya, R. (2016). Validation of the Spanish Version of the Psychological Sense of School Membership (PSSM) Scale in Chilean adolescents and its association with school-related outcomes and substance use. Frontiers in Psychology, 7, 1901. https://doi.org/10.3389/fpsyg.2016.01901
  • Goodenow, C. (1993). The psychological sense of school membership among adolescents: Scale development and educational correlates. Psychology in the Schools, 30(1), 79–90. https://doi.org/10.1002/1520-6807(199301)30:1%3C79::AID-PITS2310300113%3E3.0.CO;2-X
  • Greenberg, M. T., Domitrovich, C., & Bumbarger, B. (2001). The prevention of mental disorders in school-aged children: Current state of the field. Prevention & Treatment, 4(1), 1–62, 1a. https://doi.org/10.1037/1522-3736.4.1.41a
  • Guhn, M., Schonert-Reichl, K. A., Gadermann, A. M., Marriott, D., Pedrini, L., Hymel, S., & Hertzman, C. (2012). Well-being in middle childhood: An assets-based population-level research-to-action project. Child Indicators Research, 5(2), 393–418. https://doi.org/10.1007/s12187-012-9136-8
  • Hair, J. F., Babin, B. J., Anderson, R. E., & Black, W. C. (2018). Multivariate data analysis (8th ed.). Cengage Learning Australia.
  • Hamilton, M., & Redmond, G. (2010). Conceptualization of social and emotional wellbeing for children and young people, and policy implications. The Australian Research Alliance for Children and Youth, and the Australian Institute of Health and Welfare.
  • Harter, S. (1985). Manual for the self-perception profile for children. University of Denver.
  • Harter, S. (2012). Self-perception profile for children: Manual and Questionnaires (Grades 3–8). University of Denver.
  • Hattori, M., Zhang, G., & Preacher, K. J. (2017). Multiple local solutions and geomin rotation. Multivariate Behavioral Research, 52(6), 720–731. https://doi.org/10.1080/00273171.2017.1361312
  • Ho, L. S. (2013). Happiness of children as they grow into their teens: The Hong Kong case. Centre for Public Policy Studies: CPPS Working Paper Series, Paper 93.
  • Homel, R., Freiberg, K., & Branch, S. (2015a). CREATE-ing capacity to take developmental crime prevention to scale: A community-based approach within a national framework. Australian and New Zealand Journal of Criminology, 48(3), 367–385. https://doi.org/10.1177/0004865815589826
  • Homel, R., Freiberg, K., Branch, S., Haskard, K., Teague, R., Thompson, P., & Mobbs, S. (2016). Can family support moderate the relationship between disciplinary suspensions and child outcomes? Report to Qld Dept Education. Griffith University.
  • Homel, R., Freiberg, K., Branch, S., & Le, H. (2015b). Preventing the onset of youth offending: The impact of the Pathways to Prevention Project on child behaviour and wellbeing. Trends and Issues in Crime and Justice, 481, 1–10. https://www.aic.gov.au/publications/tandi/tandi481
  • Homel, R., Freiberg, K., Branch, S., & Le, H. (2015c). Preventing the onset of youth offending: The impact of the Pathways to Prevention Project on the behaviour and wellbeing of children and young people (88 pp). Report to the Criminology Research Advisory Group.
  • Homel, R., Freiberg, K., Lamb, C., Leech, M., Hampshire, A., Hay, I., Elias, G., Carr, A., Manning, M., Teague, R., & Batchelor, S. (2006). The pathways to prevention project: The first five years, 1999–2004. Griffith University & Mission Australia.
  • Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118
  • Huang, J., Kim, Y., & Sherraden, M. (2017). Material hardship and children’s social-emotional development: Testing mitigating effects of Child Development Accounts in a randomized experiment. Child: Care, Health and Development, 43(1), 89–96. https://doi.org/10.1111/cch.12385
  • Jones, D. E., Greenberg, M., & Crowley, M. (2015). Early social-emotional functioning and public health: The relationship between kindergarten social competence and future wellness. American Journal of Public Health, 105(11), 2283–2290. https://doi.org/10.2105/AJPH.2015.302630
  • Kempf, A. (2018). The challenges of measuring wellbeing in schools: A review prepared for the Ontario Teachers’ Federation. https://www.otffeo.on.ca/en/wp-content/uploads/sites/2/2018/02/The-challenges-of-measuring-wellbeing-in-schools-Winter-2017-web.pdf
  • Laurens, K. R., Dean, K., Whitten, T., Tzoumakis, S., Harris, F., Waddy, N., Prendergast, T., Taiwo, M., Carr, V. J., & Green, M. J. (2021). Early childhood predictors of elementary school suspension: An Australian record linkage study. Journal of Applied Developmental Psychology, 77, 101343. https://doi.org/10.1016/j.appdev.2021.101343
  • Li, N., Peng, J., & Li, Y. (2021). Effects and moderators of Triple P on the social, emotional, and behavioral problems of children: Systematic review and meta-analysis. Frontiers in Psychology, 12, 709851. https://doi.org/10.3389/fpsyg.2021.709851
  • Luby, J., Belden, A., Sullivan, J., & Spitznagel, E. (2007). Preschoolers’ contribution to their diagnosis of depression and anxiety: Uses and limitations of young child self-report of symptoms. Child Psychiatry and Human Development, 38(4), 321–338. https://doi.org/10.1007/s10578-007-0063-8
  • McKown, C. (2019). Challenges and opportunities in the applied assessment of student social and emotional learning. Educational Psychologist, 54(3), 205–221. https://doi.org/10.1080/00461520.2019.1614446
  • McKown, C., & Taylor, J. (2018). Introduction to the special issue on social-emotional assessment to guide educational practice. Journal of Applied Developmental Psychology, 55, 1–3. https://doi.org/10.1016/j.appdev.2017.12.002
  • Melton, G. (2005). Treating children like people: A framework for research and advocacy. Journal of Clinical Child and Adolescent Psychology, 34(4), 646–657. https://doi.org/10.1207/s15374424jccp3404_7
  • Moffitt, T. E., Arseneault, L., Belsky, D., Dickson, N., Hancox, R. J., Harrington, H., Houts, R., Poulton, R., Roberts, B. W., Ross, S., Sears, M. R., Thomson, W. M., & Caspi, A. (2011). A gradient of childhood self-control predicts health, wealth, and public safety. Proceedings of the National Academy of Sciences of the United States of America, 108(7), 2693–2698. https://doi.org/10.1073/pnas.1010076108
  • Montoya, A. K., & Edwards, M. C. (2021). The poor fit of model fit for selecting number of factors in exploratory factor analysis for scale evaluation. Educational and Psychological Measurement, 81(3), 413–440. https://doi.org/10.1177/0013164420942899
  • Moore, K., Theokas, C., Lippman, L., Bloch, M., Vandivere, S., & O’Hare, W. (2008). A microdata child well-being index: Conceptualization, creation and findings. Child Indicators Research, 1(1), 17–50. https://doi.org/10.1007/s12187-007-9000-4
  • Muris, P., Meesters, C., & Fijen, P. (2003). The Self-Perception Profile for Children: Further evidence for its factor structure, reliability, and validity. Personality and Individual Differences, 35(8), 1791–1802. https://doi.org/10.1016/S0191-8869(03)00004-7
  • Muthén, L. K., & Muthén, B. O. (2017). MPlus user’s guide (8th ed.). Muthén & Muthén.
  • Noltemeyer, A. L., Ward, R. M., & Mcloughlin, C. (2015). Relationship between school suspension and student outcomes: A meta-analysis. School Psychology Review, 44(2), 224–240. https://doi.org/10.17105/spr-14-0008.1
  • Olsson, C. A., McGee, R., Nada-Raja, S., & Williams, S. M. (2013). A 32-year longitudinal study of child and adolescent pathways to well-being in adulthood. Journal of Happiness Studies, 14(3), 1069–1083. https://doi.org/10.1007/s10902-012-9369-8
  • Payton, J., Weissberg, R., Durlak, J., Dymnicki, A., Taylor, R., Schellinger, K., & Pachan, M. (2008). The positive impact of social and emotional learning for kindergarten to eighth-grade students: Findings from three scientific reviews. Collaborative for Academic, Social, and Emotional Learning. https://files.eric.ed.gov/fulltext/ED505370.pdf
  • Pollard, E., & Lee, P. (2003). Child well-being: A systematic review of the literature. Social Indicators Research, 61(1), 59–78. https://doi.org/10.1023/A:1021284215801
  • Raffaele Mendez, L. M. (2003). Predictors of suspension and negative school outcomes: A longitudinal investigation. New Directions for Youth Development, 2003(99), 17–33. https://doi.org/10.1002/yd.52
  • Ray, D. C., Angus, E., Robinson, H., Kram, K., Tucker, S., Haas, S., & McClintock, D. (2020). Relationship between adverse childhood experiences, social-emotional competencies, and problem behaviors among elementary-aged children. Journal of Child and Adolescent Counseling, 6(1), 70–82. https://doi.org/10.1080/23727810.2020.1719354
  • Riley, A. (2004). Evidence that school-age children can self-report on their health. Ambulatory Pediatrics, 4(4 Suppl), 371–376. https://doi.org/10.1367/A03-178R.1
  • Rowe, K. J., & Rowe, K. S. (1995). RBRI profile user’s guide. Centre for Applied Educational Research.
  • Sanson, A. V., Misson, S., Hawkins, M. T., & Berthelsen, D., the LSAC Consortium. (2010). The development and validation of Australian indices of child development – Part 1: Conceptualisation and development. Child Indicators Research, 3(3), 275–292. https://doi.org/10.1007/s12187-009-9058-2
  • Scales, P. C., Benson, P. L., & Mannes, M. (2006). The contribution to adolescent well-being made by nonfamily adults: An examination of developmental assets as contexts and processes. Journal of Community Psychology, 34(4), 401–413. https://doi.org/10.1002/jcop.20106
  • Scales, P., & Leffert, N. (1999). Developmental assets: A synthesis of the scientific research on adolescent development. Search Institute.
  • Search Institute. (2016). User guide for the developmental assets profile.
  • Shapiro, V. B., Oesterle, S., Abbott, R. D., Arthur, M. W., & Hawkins, J. D. (2013). Measuring dimensions of coalition functioning for effective and participatory community practice. Social Work Research, 37(4), 349–359. https://doi.org/10.1093/swr/svt028
  • Shonkoff, J. (2004). Evaluating early childhood services: What’s really behind the curtain. The Evaluation Exchange, 10(2), 3–4. https://archive.globalfrp.org/evaluation/the-evaluation-exchange/issue-archive/early-childhood-programs-and-evaluation/evaluating-early-childhood-services-what-s-really-behind-the-curtain
  • Singh, K., Ruch, W., & Junnarkar, M. (2015). Effects of the demographic variables and psychometric properties of the Personal Well-Being Index for school children in India. Child Indicators Research, 8(3), 571–585. https://doi.org/10.1007/s12187-014-9264-4
  • Spencer, A. E., Oblath, R., Dayal, R., Loubeau, J. K., Lejeune, J., Sikov, J., Savage, M., Posse, C., Jain, S., Zolli, N., Baul, T. D., Ladino, V., Ji, C., Kabrt, J., Mousad, L., Rabin, M., Murphy, J. M., & Garg, A. (2021). Changes in psychosocial functioning among urban, school-age children during the COVID-19 pandemic. Child and Adolescent Psychiatry and Mental Health, 15(1), 73. https://doi.org/10.1186/s13034-021-00419-w
  • Toumbourou, J. W., Rowland, B., Williams, J., Smith, R., & Patton, G. C. (2019). Community intervention to prevent adolescent health behavior problems: Evaluation of Communities That Care in Australia. Health Psychology, 38(6), 536–544. https://doi.org/10.1037/hea0000735
  • United Nations. (1989). Convention on the rights of the child. Office of the High Commissioner. https://www.ohchr.org/en/professionalinterest/pages/crc.aspx
  • Wagle, R., Dowdy, E., Yang, C., Palikara, O., Castro, S., Nylund-Gibson, K., & Furlong, M. J. (2018). Preliminary investigation of the psychological sense of school membership scale with primary school students in a cross-cultural context. School Psychology International, 39(6), 568–586. https://doi.org/10.1177/0143034318803670