
Missing in action: queer(y)ing the educational implications of data justice in an age of automation

Pages 213-225 | Received 23 May 2022, Accepted 29 Nov 2022, Published online: 04 May 2023

ABSTRACT

In the recent Australian 2021 census, the socio-technical construct of algorithmically driven decision-making processes made LGBTQI+ data as a category of diversity, inclusion and belonging an absent presence. In this paper, we position the notion of ‘data justice’ in relation to the entrenchment of inequalities and exclusion of LGBTQI+ lives and consider the implications of LGBTQI+ data being missing in action. As we look at the notion of ‘data justice’, we consider five critical socio-technical imaginaries with different kinds of data to think through the implications of technical democracy, data justice and post-automation. Finally, we consider the imaginary of citizenship when LGBTQI+ data is habitually missing in action from systematic power integrated into forms of governance. This paper positions ‘data’ not as a static ‘object or process’ but as a dynamic ecology that carries with it a multi-faceted set of coded meanings requiring constant review and reconsideration.

Introduction – it ain't no Mardi Gras here

As life worlds become more divergent and their boundaries become more blurred, the centrality of data and the multiplicity of its meanings continually intersect with the multiple discourses surrounding digital data and automation. Learning how to be and become proficient in negotiating and encountering automation in our everyday lives requires an interpretation of the social and cultural contexts in which data and digital data are gathered, and of how data-informed knowledge is produced and used.

We position automation as a concept through which a machine or computer becomes an informed system that aims to ‘operate with maximum efficiency through adequate measurement, observation, and control of its behaviour’ (Bagrit Citation1966, 14). Interacting with automation often involves a process of ‘analysing’: analysing text, analysing code, and so on. Furthermore, automation depends on digital data to function. Our ‘critical analysis’ in this paper speaks back to the efficiency of measurement, control and what ‘data’ is. We apply a genealogical perspective to understanding the pedagogy of digital forms of automation and query the notion of democratic instruction and engagement with technology and new media. We relate the term ‘analysis’ in this paper to:

Genealogy as a form of analysis opposes a pursuit of origins … it usefully shows how the emergence of self is subject to the transforming forces of power that entwine subjects in a series of subjugations … of dynamic relationships of struggle … genealogy embraces the confrontations, the conflicts and systems of subjection … no one is responsible for an emergence it is merely an effect of the play of dominations. (Smart Citation1985, 57)

Smart’s (Citation1985) notion of an ‘effect of the play of dominations’ means that it is difficult for us to provide the reader with a ‘grand’ narrative of Australia’s gay history to inform the age of automation. This is due to the geographical and legislative differences that characterize the federation, and the intangible nature of algorithmic systems. However, in providing an account of pivotal moments in the history of homosexuality in Australia, the absent presence of Lesbian, Gay, Bisexual, Transgender, Queer and Intersex (LGBTQI+) data within official government and policy departments attests to the significant efforts to suppress these issues in Australia’s public spaces. It is here we begin. To narrate the genealogy of LGBTQI+ as excluded subjects, we turn first to the grey documents of the recent past: the ‘Gay Hate Decades’.

Narrating the genealogy of LGBTQI+ as excluded subjects

The Gay Hate Decades refers to a period from the 1970s to 2010 in which almost 90 queer-bashing murders remained un/investigated in Sydney. Illustrative of how the police and legal system systematically ignored the reporting and recording of crimes against LGBTQI+ people for decades (Hawley Citation2022), these decades make it possible to understand how data becomes performative in LGBTQI+ lives, and how it is connected to and fashioned by municipal attitude.

In Australia, there continues to be ‘data-informed’ legislation shaped by wider public opinion. A recent example was the campaign for the legalization of same-sex marriage. Debated by a predominantly heterosexual Australia in 2017, the vote returned a positive result from 61.6% of participating Australians. However, the public debate and underpinning municipal attitude that preceded the result were unrelenting and damaging. As part of the debate, many LGBTQI+ individuals reported stigma-related stress due to homophobic reporting, advertising, and discussion (Ecker, Aubry, and Sylvestre Citation2019). For example, Chonody et al.’s (Citation2020) study of the effects on the LGBTQI+ community during the Australian gay marriage plebiscite reported that interpersonal micro-aggressions (that is, day-to-day forms of subtle or unconscious discrimination often articulated in language) were heightened during the lead-up to the vote. All such reporting influenced Australia’s wider public opinion, becoming ‘data’ to inform legislation.

Turning our focus to educational systems, there are many instances where the ‘data’ informing educational programmes, policy and legislation is shaped by public opinion. In 2013, the Australian Federal government passed the Sexual Orientation, Gender Identity and Intersex Status Amendment Act. This included an exemption that continues into 2022 and allows religious schools and organizations in Australia to discriminate based on sexual orientation. Further, in 2017, the Safe Schools programme, centred on inclusive curriculum practice for all children regardless of sexual orientation, was shut down amid claims that schools were trying to unnaturally sexualize children. In 2019, in Queensland, the Health Legislation Amendment Bill 2019 was passed by a narrow majority. Although this legislation prohibits shock therapy treatments for LGBTQI+ youth, it does not ban conversion therapy outside of healthcare domains. There is yet to be a comprehensive understanding of the world that LGBTQI+ youth navigate (Ashley Citation2020). It is clear that the ‘data’ surrounding these lived experiences reinforces stigmatizing relational dynamics and shapes how, and on what terms, social justice, inclusion and agency are experienced in LGBTQI+ lives. The types of data given voice and chosen to inform educational programmes, policy, and legislation have implications for educational systems.

Institutions enact the various acts, bills, policies and legislation built on public opinion and based on what ‘data’ has been given voice. Despite increasing rhetoric around inclusivity in classrooms, in 2020, Mark Latham, the ex-leader of the Australian Labor Party, introduced the Education Legislation Amendment (Parental Rights) Bill 2020. This bill aims to prohibit the teaching of gender fluidity in schools across the state of New South Wales and involves the introduction of an explicit, conceptual language to support those who then have the power to influence educational systems. Empowered by freedoms afforded by such laws, in 2022, the principal of a Brisbane school demanded that families sign anti-gay and anti-trans enrolment contracts which branded homosexuality as ‘sinful, offensive and destructive’ and lumped it into the same category as paedophilia and incest (Foster Citation2022). The ‘data’ that empowered such actions had and still has implications for LGBTQI+ youth, which we argue is because queer data is missing from the development of various forms of governance.

In this paper, we stress that we are neither near the start, in the middle, nor at the end of understanding, seeing, feeling and being impacted by the implications of what LGBTQI+ data being missing in action means. Instead, we reimagine LGBTQI+ citizenship as a form of genealogical endeavour, to reconsider the implications of what it can mean when queer data is habitually missing in action from systematic power-integrated forms of governance.

Not a beginning, middle or end

If a bullet should enter my brain, let that bullet destroy every closet door. (Harvey Milk, 22 May 1930–27 November 1978)

In the challenge of ‘knowing’, we pay close attention to performative utterances – those words and phrases that create new meanings (Austin Citation1962) – by tracing how data practices define, describe and enact new knowledge. New knowledge in the so-called ‘Age of Automation’ in education sees big data, analytics and online social networks offering a finite number of interactions in emergent forms of ‘meta-edtech’ (Williamson Citation2021): interactions that often operate as socio-technical imaginaries of educational futures. Jasanoff and Kim (Citation2009, 120) define sociotechnical imaginaries as ‘collectively imagined forms of social life and social order reflected in the design and fulfilment of nation-specific and/or technological projects’. With determinism built into them, looking closely at sociotechnical imaginaries can enable the recovery of ‘excluded subjects’ within and through discourses that promulgate the values of normativity (Tamboukou and Ball Citation2003, 5).

Digital data is often claimed to be comprehensive in its scope, extremely detailed, and malleable in how it may be combined across large data sets to provide knowledge (Selwyn Citation2015). Large volumes of digital data are generated, analyzed, and used in automated decision-making processes, and are now ‘routine operations’ of education and social systems, social networks and the media (Perrotta and Selwyn Citation2020). Digital data is thus part of ‘heterarchic, “systemless systems” of governance’ (Gulson and Witzenberger Citation2022, 146) that enable, encourage and promote automated power, exercised through automated decision-making. Thinking about ‘digital data’ through a socio-cultural lens, we operate within socio-technical imaginaries to understand broader performative claims and to challenge what ‘knowing’ means. It is in thinking about ‘digital data’ as a socio-cultural practice that we can begin to unpack and understand the wider performative utterances associated with:

the violent appropriation of interpretation [and] the processes of contingent unities and dispersions … [that is] the epistemological lesson of history … its affirmation of knowledge as perspective. There is no outside from which to view history, since all history is struggle. (May Citation1993, 156)

By reaffirming and reappraising what has remained seemingly insignificant in everyday events, our approach to utilizing sociotechnical imaginaries becomes one that can deconstruct narratives of normalcy derived from digital data. This concept has guided our thinking in framing data justice as an ontological-ethical-epistemic space from which to reconsider forms of public knowledge-making and argumentation. As we unpack, problematize and disrupt understandings of digital data, we see how power comes to be circulated in terms of technical democracy and socio-material collectives, data justice, agency, epistemic orders, and post-automation. We consider these topics in terms of how they performatively fuel heteronormative systematic advantage. Five critical socio-technical imaginaries are described to allow us to disrupt the ways automation adds to the institutionalized absent-presence of LGBTQI+ lives.

Socio-technical imaginary 1: LGBTQI+ data missing from the age of automation

The validity of digital data from a rights-based approach is part of this genealogy. In the 80s, we didn’t have automation; all we had was our anger at the censorship of how LGBTQI+ lives were being framed in multiple policies, bills, and laws as intersecting systems of oppression. Automation, however, is now part of our educational systems (Perrotta and Selwyn Citation2020). For example, in the United States, it took President Ronald Reagan four years to recognize that an HIV/AIDS pandemic was occurring; it was only being reported and recorded as a disease impacting gay men. The data on HIV+ was missing in action. Perhaps it will take our educational systems just as long.

It wasn’t until the Centre for Disease Control started to record data on ‘innocent victims’ that the political implications of HIV+ were acknowledged, and greater scrutiny was applied to how new cases became counted. Further, the consequences were then costed against categories of belonging that consequently funded medical research. As the disease was left to the community to respond to, data was produced by the community. The free-floating signifier of the rainbow flag, designed by Gilbert Baker in 1978 as a symbol of pride and defiance, was added to in 1987 with the creation of the NAMES Project: the AIDS Memorial Quilt that announced to the world, ‘we’re here, we’re queer and we’re dying’. The ‘Quilt’, constituted of 48,000 panels, covers three acres of land once unfolded. It presents visible data of the effects of HIV and has become a Queer political symbol and form of data that is still sustained by the community.

Moving forward ten years, in June 1990 a leaflet entitled QUEERS READ THIS was distributed at a New York Pride march. The rhetorical question in the leaflet asked ‘Why Queer?’, urging its readers to interrogate and decolonize their everyday lives in the context of what Wittig (1980) conceptualized as ‘The Straight Mind’, Rich (1993) as ‘Compulsory Heterosexuality’ and Warner (1993) as ‘Heteronormativity’. It claimed:

Well, yes, ‘gay’ is great. It has its place. But when a lot of lesbians and gay men wake up in the morning we feel angry and disgusted. Using ‘queer’ is a way of reminding us how we are perceived by the rest of the world. Affirm queerness in the face of hatred and invisibility as displayed in a recent governmental study of suicides that states at least one third of all teen suicides are Queer kids. This is further exemplified by the rise in HIV transmission among those under 21. Our difference, our otherness, our uniqueness can either paralyze us or politicize us. Good queers don’t get mad. They’ve taught us so well that we not only hide our anger from them, we hide it from each other. WE EVEN HIDE IT FROM OURSELVES. We hide it with substance abuse and suicide and overachieving in the hope of proving our worth. For the last decade they let us die in droves and still we thank President Bush for planting a fucking tree, applaud him for likening PWAs to car accident victims who refuse to wear seatbelts. Let yourself be angry that the price of our visibility is constant threat of violence, anti-queer violence to which practically every segment of this society contributes. Go tell them until they have spent a month walking hand in hand in public with someone of the same sex, after they survive that, then you’ll hear what they have to say about queer anger. Otherwise, tell them to … listen. (Anon Citation1990)

Socio-technical imaginary 2: LGBTQI+ digital data are missing from the black boxes

The 2021 Census in Australia did not include a question on gender diversity or sexual orientation; although the Australian Bureau of Statistics acknowledges that such data is needed, the only question in the Census about sex/gender was limited to male/female/non-binary options. The Australian Census of Population and Housing (‘the census’), collected every 5 years, provides a temporal snapshot of Australian people, culture and society and is vital for decision-making processes. Implicit in critiquing the lack of representation in the census data is paying ‘attention to the “doing” and the dialectic [of] “being”’ (Vicars Citation2018, 204). Without representation in the data, the inequalities associated with ‘being Other’ cannot be authentically and effectively addressed via ‘doing’ changes in policy, research, and practice. The 2021 Census systematically prevented accurate data from being collected and steered funding away from vital services for LGBTQI+ communities.
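To make this schema-level absence concrete, the following minimal sketch (ours, in Python, with hypothetical field names and records that do not reproduce the actual Census instrument) shows how a form’s fixed fields determine what can ever be counted downstream:

```python
# A minimal sketch, assuming hypothetical field names and records. If the form
# defines no sexual-orientation or gender-identity field, no later query can
# recover that population: it is structurally absent, not merely counted as zero.
from collections import Counter
from dataclasses import dataclass

@dataclass
class CensusRecord:
    age: int
    sex: str  # constrained to 'male' / 'female' / 'non-binary'
    # Note: no sexual_orientation or gender_identity field exists to fill.

records = [
    CensusRecord(age=24, sex="male"),
    CensusRecord(age=31, sex="female"),
    CensusRecord(age=19, sex="non-binary"),
]

# Downstream aggregations can only group over fields the schema defines.
print(Counter(r.sex for r in records))

# Any attempt to count LGBTQI+ respondents fails at the schema level:
# getattr(records[0], "sexual_orientation")  # -> AttributeError
```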

Becoming inculcated, or not, into the virtual allows us to consider how different identities connected to gender, race, ability and class can overlap in socio-technical systems. For example, LGBTQI+ individuals may experience gender-based discrimination, ableism, misogyny and racism, which become datafied as proxies in automated decision-making processes (Arantes Citation2019). With people quantified and datafied into digital representations of the self, automated decisions have long been found to perpetuate algorithmically created discrimination that disadvantages queer sexual citizenship (Arantes Citation2019; Baker and Hawn Citation2022; Goggin and Soldatić Citation2022).
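The proxy mechanism can be illustrated with a toy sketch on entirely synthetic data; the ‘community app’ feature and all the rates below are our hypothetical constructions, not drawn from any real system:

```python
# A toy sketch on synthetic data: a 'neutral' proxy feature correlated with a
# protected attribute carries that attribute into an automated decision even
# though the attribute itself is never an input.
import random

random.seed(0)

def make_person():
    queer = random.random() < 0.10
    # Hypothetical proxy: community-app use correlated with being queer.
    uses_community_app = random.random() < (0.80 if queer else 0.10)
    return queer, uses_community_app

people = [make_person() for _ in range(10_000)]

def approved(uses_community_app):
    # The decision rule only sees the proxy, never the protected attribute.
    return not uses_community_app

for group, label in ((True, "queer"), (False, "non-queer")):
    outcomes = [approved(app) for q, app in people if q == group]
    print(f"{label:>9} approval rate: {sum(outcomes) / len(outcomes):.2f}")
# Approval rates diverge sharply by group: the proxy has 'datafied' the
# protected attribute into the decision-making process.
```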

Because data on LGBTQI+ Australian lives has historically not always been collected, and because data about Australia’s gender and sexually diverse population is not included, a significant presence is obfuscated within Australia. When data has been collected, it has been for medicalized and pathologizing purposes that have implications concerning the entrenchment of inequalities and exclusion of LGBTQI+ lives. The implications of this data missing in action continue to shape how, and on what terms, data and social justice amalgamate (Arantes Citation2019). It raises questions associated with LGBTQI+ inclusion and agency. It raises questions about how LGBTQI+ lives can authentically and effectively ‘open the black box’. And it raises questions about how the black box (Pasquale Citation2015) prevents us from fighting back against pathologized algorithmically informed decision-making processes, predictions, insights and recommendations based on dirty and black-boxed data.

These black-boxed systems continue to be promoted even though how they actually function remains unclear (Pasquale Citation2015), and they have repeatedly been shown to have a deleterious impact (Elish and boyd Citation2018; Arantes Citation2022). In a case study regarding Houston teachers, Dawson et al. (Citation2019) detail how the district used commercial algorithmic systems to make teachers’ impact transparent and hold them to account by analyzing test scores over time. ‘The results were then used to dismiss teachers deemed ineffective by the system’ (Dawson et al. Citation2019, 34). As the algorithmic systems were proprietary and not able to be scrutinized, they were considered to be ‘a potential violation of the teachers’ civil rights’ (Dawson et al. Citation2019, 34).

Therefore, the algorithmic systems that underpin decisions interconnect social-cultural constructs within a broader infrastructure without being effectively scrutinized according to principles of data justice (Perrotta and Selwyn Citation2020). Taylor (Citation2017) describes data justice as ‘fairness in the way people are made visible, represented and treated due to their production of digital data’. The interplay of data justice and the data invisibility of LGBTQI+ lives leads us to the question: ‘What does this mean in an age of automation?’ Was the Australian 2021 census unique, a one-off? As socio-technical constructs, digital data, algorithms, and automated decision-making have become an educative presence in all of our lives; does the adoption of ‘a more technical, democratic focus’ locate ‘data users/decision-makers as essential stakeholders within these systems’ (Thompson Citation2021, 183)?

Socio-technical imaginary 3: normative framings and automated governance

Automation is all the rage these days, but we must consider how automation is failing LGBTQI+ communities. The burgeoning literature on automation and associated forms of governance, from predicting whom to employ through to whether a human should be forced to pay back welfare, fails ‘to acknowledge that non-human, automated governance has long been a reality’ (Mormann Citation2021, 3). It is by considering this construct and problematizing it through conflict that we may provide a means to explicitly locate a spectrum of benefits and harms and to understand what validity can be claimed when data is missing from consideration. Mormann (Citation2021) argues that for more than a century, policy-makers have entrusted regulation to rules that automatically adjust to variations in conditions, in order to mitigate the negative implications of flawed datasets that feed into forecasts, insights, and recommendations. Arguably, the recent Australian 2021 census could be described as flawed.

Data about how much funding is required for public services is based on geographic location, employment, demographics, and health; it provides critical insights and informs where funding for vital services is directed. Although representative data is lacking, Lyons et al. (Citation2021) estimate that Australia’s gender and sexually diverse population is approximately 3.5–11% of the population. Situating LGBTQI+ as absent in the socio-technical construct of inclusion allows normative archetypes of LGBTQI+ people to advise algorithmically informed decision-making. The implications in terms of governance can be summarized through two themes.

First, with authentic data missing in action from algorithmically informed decision-making, only shadows or proxies inform the decision-making matrix (Cheney-Lippold Citation2011). Such proxies have been shown to perpetuate discrimination as a result of algorithmic bias, which in turn skews services and funding towards arguably normative forms of governance (Hitczenko et al. Citation2022). Second, with proxies or mere data shadows used to represent LGBTQI+ people, the consequences for inclusion and agency, in terms of data justice as an educative presence in LGBTQI+ lives, remain amiss. To imagine a socio-technical future, we consider these themes through the lens of post-automation, which celebrates social movements as data within an industrious space (Smith and Fressoli Citation2021). This imaginary encourages us to consider grassroots responses to missing data and how people have attempted to enact change through the questions within the Australian 2021 Census.
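To make the first theme concrete, consider a minimal sketch of proportional allocation driven by recorded counts (the budget figure and category names are hypothetical): a population the data never counts receives a share of exactly zero, not because need is zero but because its count is.

```python
# A minimal sketch with hypothetical figures: proportional funding allocation
# over whatever categories the census actually recorded.
budget = 1_000_000  # hypothetical service budget

census_counts = {
    "category_a": 52_000,
    "category_b": 45_000,
    # 'lgbtqi': never counted, so absent from the table entirely
}

total = sum(census_counts.values())
allocation = {name: budget * count / total for name, count in census_counts.items()}

print(allocation)
print("lgbtqi share:", allocation.get("lgbtqi", 0))  # 0: missing in action
```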

Consultation with LGBTQI+ communities for the 2021 census resulted in a proposal for a question collecting the gender identity of persons aged 15 years or older, and for a second question on sexual orientation to be included. The proposal was submitted to the Australian Senate for inclusion but was voted against. The normative framing of what is needed, informed by what data is collected, is what we are now seeing. As this normative data feeds into governance systems, we begin acknowledging that very few protections from discrimination based on sexual orientation currently exist in Australian federal law. We recognize a need to look for new ways to explore this discrimination and to consider the notion of data justice in datafied systems, in terms of the normative framings and meanings that feed into forms of governance and social organization.

The repercussions of LGBTQI+ data missing in action from the Census are imagined to be varied. We consider that denying LGBTQI+ populations representation in census data exposes LGBTQI+ populations as ‘being’ subaltern (Spivak Citation2003), and invites consideration of what such a decision is ‘doing’ in specific contexts. Based on the census data, we consider to what extent normative data is helping or not helping to direct funding and resources to support people. In this imaginary, we also consider notions of intersectionality and intersectional digital data. In specific contexts, intersectional digital data is required to address the unique service needs of LGBTQI+ groups, in everything from health to housing (Skerrett, Kõlves, and De Leo Citation2015). For example, LGBTQI+ youth who are also Culturally and Linguistically Diverse or Aboriginal and Torres Strait Islander, who have a disability, or who are experiencing homelessness require contextually specific support (McNair et al. Citation2017; Zeeman et al. Citation2018; Phelan and Oxley Citation2021).

The lack of appropriate questions in the Census to represent the LGBTQI+ population, let alone those facing intersectional challenges, via how they are ‘being’ represented or not represented in the data, has implications that need further consideration. Through the lens of post-automation, we imagine a need for a more complex, diverse, and open approach to automation. We realize that there is literally nothing automatic about automation and that automated forms of governance need time to consider the complexities associated with LGBTQI+ communities.

Socio-technical imaginary 4: labels and subjugating LGBTQI+ groups

A grouping of LGBTQI+ communities into labels for the process of automation is ‘neither conceptually consistent nor truly inclusive’ (Veradonir Citation2022). Gulson and Witzenberger (Citation2022) provide a concrete example, referring to a 30-minute workshop about Microsoft’s Azure for Teachers:

During this part of the presentation, a statement stands out: ‘Computers are better than humans at speech, text and image recognition’. While this statement is clearly aimed at the idea that efficiency has primacy, some falsely labelled picture tags indicate that being better at recognizing speech, text and images means quicker, not more accurate. (151)

It is in LGBTQI+ discourse that ‘personal identity labels have emerged as a nearly all-encompassing system of thought, actively walling in discussions of sex, sexuality, gender, and relationship models’ (Veradonir Citation2022). Genealogically, therefore, automation promotes a seemingly objective understanding of LGBTQI+ communities whilst masking that the labels form an accident in historical representation. A concrete example is the lack of questions in the 2021 Australian census to represent LGBTQI+ youth. This data is missing from the Census and presents implications for understanding the intersectional challenges in social justice of groups in society considered to be particularly vulnerable to exploitation (McNair et al. Citation2017; Zeeman et al. Citation2018; Phelan and Oxley Citation2021): exploited in terms of the level of ‘fairness in the way people are made visible, represented and treated as a result of their production of digital data’ (Taylor Citation2017, 1).

The consequence of these walls is that LGBTQI+ culture becomes trapped and ensnared in a singular monoculture that arguably presents a dominant perspective: a perspective that erases the individual person and treats individuals as little more than ‘microcosms of group identities’ (Veradonir Citation2022). This trend is represented in the LGBTQI+ terminology commonly used in society. These labels provide specific meanings for some but are representative of limited contexts. Although they allow automated decisions to function efficiently, the lack of representation and meaning invalidates the findings.

This has mixed results for society and for the process of automation. On the one hand, these labels attempt to capture the idiosyncrasies of sexuality. On the other hand, the reasons for the labels become black-boxed (Pasquale Citation2015), lacking explicit knowledge of a system’s internal workings. Automated insights can be convoluted and contradictory, as many of these labels have made conversations about sexuality difficult; the data is thus not valid for the labels it claims to represent. In societal discourse, these labels can be considered as contributing to a toxic polarization that is arguably holding back progress for LGBTQI+ communities. When such discourse is automated, black boxes render decisions intangible, unknown and silenced … yet powerful, as insights and recommendations remain largely considered objective.

In the unfolding of a larger narrative, we imagine LGBTQI+ representation in terms of data justice. The concept of data justice ‘presupposes a reasonable state of information equity, where factual evidence and knowledge are thoughtfully integrated into decision-making’ (Taylor et al. Citation2020, 199). As LGBTQI+ groups are not represented in the institutional census data, and no questions are asked about intersectional role identities, data justice may encourage us to consider data produced within and from LGBTQI+ communities. If we look back to the NAMES Project – the AIDS Memorial Quilt that announced to the world, ‘we’re here, we’re queer, and we’re dying’ – we might learn from our lack of collective understanding of the extent to which lived experiences, viewed through an intersectional lens, relate to data justice. Without representation in the census data, LGBTQI+ populations in Australia exist in a state of epistemic injustice.

This broader narrative could speak to people perceiving similar outcomes from the census data, and the resultant interventions, differently. The data points could be re-orientated towards community-produced data sets, the data produced by the population that the labels claim to represent, and feed into government decision services, funding and other resources. Pointing the labels away from census data and towards a digitalized genealogy positions the most valid data points in systems that do not subjugate LGBTQI+ groups through black-boxed automation.

Socio-technical imaginary 5: what could go wrong? Automated mental health assessment

Given developmental stressors during adolescence, such as self-identifying (McDaniel, Purcell, and D'Augelli Citation2001), and continued stigma in schools (Mackie, Patlamazoglou, and Lambert Citation2021), chatbots may present opportunities for tailored mental health services that address and support LGBTQI+ youth. Automated mental health chatbots have been considered promising in consulting patients who may need therapy (Insel Citation2017). However, there is sufficient debate to suggest that automated chatbots are not yet mature enough to replace health professionals (Parviainen and Rantala Citation2022).

A concrete example of automation in mental health is the semi-automated counselling tools discussed by Gooding and Clifford (Citation2021). They argue that the arrival of ‘algorithmic video patient monitoring and surveillance will raise pressing questions in the mental health context but also in the health and social care context more broadly’ (546), questions that have yet to be sufficiently considered in terms of impact and implications. Calling for a rights-based discussion of biometric technology, they conclude by stating that ‘current legal and regulatory responses to video monitoring tend to be centred on CCTV’ and must be monitored to consider ‘major ethical and legal questions’ (546). Gooding and Clifford’s (Citation2021) narrative aligns with Parviainen and Rantala (Citation2022), who argue that the enactment of automated mental health chatbots may have consequences concerning the ways they amplify forms of rationalities and alter the decision-making practices of patients.

To that end, automation in terms of rights and mental health is relatively youthful by genealogical standards. As such, digital data from oral interviews, records, analyses, and other data points require further investigation into how automation’s kinship to diagnosing, treating and assessing the mental health of LGBTQI+ youth may have impacts and implications yet to be scrutinized. Notably, the Australian Human Rights Commission is only beginning to address these concerns, illuminating potential repercussions for LGBTQI+ youth exposed to automated systems such as chatbots. Perhaps a desirable future is one where mental health assessment with LGBTQI+ youth adopts a more technically democratic focus: a future that produces a form of technical democracy, whereby problem-solving about automated mental health diagnosis requires the orchestration of conflict, and where we/they/the practitioners scrutinize to what extent these systems localize the datafied and intersectional representations of LGBTQI+ youth as essential stakeholders within these systems (Thompson Citation2021).

Imagining how consequences may emerge through a lens of technical democracy, we consider both opportunities and challenges associated with digital data and automated chatbots in mental health for LGBTQI+ communities. For example, tools such as natural language processing, automatic speech recognition, and facial recognition claim to comparatively benchmark language, behaviour, and facial expression (Hitczenko et al. Citation2022). Such forms of automation may ‘serve as a resource multiplier for clinicians and researchers alike, allowing them to rigorously assess marginalized individuals who may otherwise have fallen through the cracks, helping to reduce existing disparities in mental health outcomes’ (Hitczenko et al. Citation2022, 285). However, according to the Australian Human Rights Commission (AHRC), the debate surrounding the implications of these technologies has not been sufficient, and such systems are not free from the same harmful biases that are part of human-to-human interaction (Arantes Citation2022; Baker and Hawn Citation2022). Nonetheless, the unfair or unequal treatment of a group or individual on the basis of protected attributes such as gender, and the secondary use of data, are receiving increased scrutiny.

Those who have scrutinized such systems have referred to automated systems as ‘snake oil’ due to the lack of validity in the prediction models for underrepresented populations (Elish and boyd Citation2018). Further, Bell and Alvarez-Jimenez (Citation2019), who researched the use of the digital phenotype (a process in which digital data collected from smart devices is used to build a digital persona that claims to diagnose illness and detect relapse before it occurs), flag the lack of questioning about such automated processes. They highlight that ‘many important questions are yet to be answered regarding digital phenotyping, including the reliability of prediction models based on the data and ethical considerations pertaining to the recording and use of personal data’ (Bell and Alvarez-Jimenez Citation2019, 264). In this imaginary, it is reasonable to question any assumption that data lacking valid representation of the individual LGBTQI+ person and their multi-dimensionalities may provide authentic insights into their mental health. Instead, chatbots may present tensions for those working with youth to negotiate critical socio-technical imaginaries such as this within any sociotechnical construct or context.
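This validity problem can be illustrated with a toy sketch on synthetic scores, not any real diagnostic tool: a decision threshold fitted to pooled data dominated by a majority group flags an underrepresented group at a very different rate for identical behaviour.

```python
# A toy sketch on synthetic data, not any real clinical instrument: a decision
# threshold fitted to pooled training data is dominated by the majority
# group's score distribution and lacks validity for the minority group.
import random

random.seed(1)

def sample(group, n):
    # Hypothetical scores: the same behaviour yields different score
    # distributions per group (e.g. dialect or expression differences).
    mean = 0.0 if group == "majority" else 1.5
    return [(random.gauss(mean, 1.0), group) for _ in range(n)]

train = sample("majority", 9_500) + sample("minority", 500)

# 'Model': flag anyone whose score exceeds the pooled training mean.
threshold = sum(score for score, _ in train) / len(train)

test = sample("majority", 1_000) + sample("minority", 1_000)
for group in ("majority", "minority"):
    flagged = [score > threshold for score, g in test if g == group]
    print(f"{group}: {sum(flagged) / len(flagged):.0%} flagged")
# The minority group is flagged at a far higher rate for the same underlying
# behaviour, illustrating the 'snake oil' critique of under-validated models.
```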

Discussion: automation is failing LGBTQI+ communities

By reframing the socio-technical gaze towards data justice, we provide the reader with a means to explicitly locate a spectrum of benefits and harms caused by the use of data systems in all sorts of contexts. Moving from a consideration of the discourses around data and exclusion, care, maintenance, and wellbeing discussed in these socio-technical imaginaries towards an understanding of ways to locate data as a form of systematic power integrated into systems of governance can shed light on the dynamics of automation concerning LGBTQI+ lives. That is, contemporary examples of digital data are habitually missing in action; the five socio-technical imaginaries demonstrated where, and to what extent.

Consider how Moore and Currah (Citation2015) show that a person’s ability to identify as a transgender citizen in US population databases legally correlates to income. Consider how Taylor et al. (Citation2020) suggest that data justice is underpinned by (in)visibility, (dis)engagement with technology and anti-discrimination. Consider Duarte, Llanso, and Loup (Citation2017, 3), who state:

Today’s tools for automating social media content analysis have limited ability to parse the nuanced meaning of human communication, or to detect the intent or motivation of the speaker. […] Without proper safeguards, these tools can facilitate overbroad censorship and biased enforcement of laws and of platforms’ terms of service.
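The overbreadth this passage warns about can be illustrated with a toy sketch; the one-word blocklist and example posts are hypothetical:

```python
# A toy sketch with a hypothetical blocklist: token matching cannot parse the
# nuanced meaning or intent that Duarte, Llanso, and Loup describe.
BLOCKLIST = {"queer"}  # hypothetical moderation rule

posts = [
    "we're here, we're queer",     # self-affirming community speech
    "queer people do not belong",  # abusive speech
]

for post in posts:
    flagged = any(word in post.split() for word in BLOCKLIST)
    print(f"flagged={flagged}: {post}")
# Both posts are flagged identically: the rule detects a token, not intent.
```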

Understanding how normative framings and meanings feed into forms of algorithmic governance and social organization raises issues around how technical democracy is maintained and informs decisions impacting LGBTQ+ lives.

In considering the digital genealogy narrative in an age of automation, anthropologist Mary L. Gray has stated that artificial intelligence will ‘always fail LGBTQ people’ (Wareham Citation2021). In the recent Australian Census, it is evident that automated decision-making situated LGBTQI+ representation as missing from the context of available data. There is an implied delegation of LGBTQI+ people to a low-priority status.

The impact can be clearly articulated in funding. Under the current operational norms discussed in the socio-technical imaginaries, the flow of funding for educational support and the implementation of initiatives in society is tied to the ability of such programmes to demonstrate results (Schwenke Citation2022), which in turn justifies that stance by comparison to digital data that inculcates a perpetual heteronormative position. Demonstrable results are compared to a baseline grounded in the dominant ideology of heteronormativity but black-boxed from close scrutiny. To Queer(y) the realities of automation, as a result, means interrogating how LGBTQI+ identities become (re)constituted in the data and, as such, requires problematizing forms of power that manifest themselves in educational, health and other support systems (Rosenberg Citation2022). Funding for LGBTQI+ programmes needs to demonstrate progress compared to empirically robust baselines, yet for LGBTQI+ people such baselines simply don’t exist if they are not part of the dataset. The forms of power thus made manifest miscarry the very programmes in which research has clearly shown LGBTQI+ individuals need to be represented.

As we think through the progressive implications of this automation, we consider how post-automation can ‘enable people to subvert and appropriate technologies for more open futures and thereby challenge through practical demonstration automation’s future essentialism’ (Smith and Fressoli Citation2021, 132). We consider how the heteronormativity of democratic engagement with technology can be subverted via a digital queer practice based on the concept of imaginaries of sexual citizenship (Benjamin Citation1999).

Reflection: sexual citizenship; queer automation as a counter-discourse

Earlier in this paper, we said we would utilize critical socio-technical imaginaries of sexual citizenship. As Plakhotnik (Citation2019) suggests,

The term ‘sexual citizenship’ has been circulating in academic discourse since the 1990s (Evans 1993), with its aim to alter traditional understandings of citizenship as a status that is solely defined by the state .... It can be described as a ‘momentum concept’ ... that broadens the scope of ‘citizens’ identities, and uncovers the gendered and racialized nature of citizenship (Evans 1993). Despite somewhat varying meanings of sexual citizenship … the queer theoretical account of sexual citizenship calls for deconstruction of its normative assumptions in the struggle for recognition .... Sexual citizenship … can be understood as discourses [and a] performative framework [that] allows seeing citizenship as a political process of ‘becoming’ sexual citizens as ‘recognized subjects’ (Cossman 2007) or ‘political subjects’ .... From this perspective, the rights that are demanded by LGBT+ communities have a regulatory role but also a performative aspect that makes comprehensible an imaginary about the desired configuration of citizenship, as well as the political community of citizen-subjects. Finally, understanding citizenship as a political practice of constant differentiation from those who are not (should not be) citizens is crucially important … since it allows seeing how (normative) sexual citizenship is produced through and productive of its ‘constitutive outside’....

Counter-discourses can displace the normative semiotic analytical representations in census data to challenge the notion of data justice. Sexual citizenship, or the rights and responsibilities of people in sexual and intimate life, affords a counter-discourse about automation. Counter-discourses enable alternative imaginaries of belonging, subjectivities and solidarities to emerge beyond hetero-norming and storming digital data and automation. Thinking about the socio-technical lives of LGBTQI+ as something represented in data (notably, not digital data) reveals that it is possible to weave a queer narrative from the socio-technical, globally and temporally fluid landscape of LGBTQI+ digital data.

With the census data awash in a swathe of normative representations, queer automation considers how representation has been systematically scaffolded, like buildings, into how certain data types are not collected. With the thought that the normativity of democratic instruction forges data constructs, the lack of data representation in the recent Australian Census modulates the ‘out of sight, out of body’ understanding; however, we know that data representation is ‘not out of mind’ (Vicars Citation2006, 350). Cogitating on the lack of representative queer census data means meandering nomadically in ever-unfolding language and an algorithmic logic to subvert the normativity of democratic instruction and the data representing sexual citizenship. Queering the imaginary of ways of belonging as sexual citizens opens the door to interpreting and understanding how a lack of representation in census data can be challenged in socio-technical environments.

Acknowledging that digital inclusion policy in Australia does not explicitly recognize nor address automated decision-making and other aspects of automation (Goggin and Soldatić Citation2022) is to identify how inclusion and exclusion in areas of automated technologies are forged as constructs by the normativity of democratic instruction on which we focus. The lack of data representation in the recent Australian Census could be argued to be a modulated process, one confronted by a queer engagement with technology that is antagonistic towards dominant ideologies and stares straight back into the constructs that represent LGBTQI+ data and that inform, feed into and automate decision-making: shaping, modulating and changing our educational futures.

The queer digital imaginary reflects and refracts revolutionary changes in socio-technical environments associated with sexual citizenship, automation and data justice. In this conceptualization, the queer imaginary is a challenge to an acceptance of normative socio-political cultures. Those who remain subterranean, not represented by or in the Census or other institutionalized forms of data in an age of pervasive automation, can now feel and find a sense of belonging through the semiotic data they share, view and collect online. Further, becoming inculcated into the virtual allows us to consider how different queer identities connected to gender, race, ability and class can overlap in socio-technical systems. This positioning rejects the dominant discourse in which a lack of LGBTQI+ representation in census data can be applied across a country, with approximately 900,000 to 2.8 million people being systematically silenced (Lyons et al. Citation2021). As a way of giving the silenced a voice, voices have urged:

Value your favourite Queer accounts. We are not doing OK. Vague ‘community guidelines’ keep tightening. Automated censors and random a**holes flag our posts. Queers lose online (and physical) spaces constantly. This is our cultural melting ice cap, eroding expression and for those who depend on social media for coins, a threat to our livelihood. We need to ask these platforms and ourselves: Who will create online spaces for us by us? These corporations are so eager to rainbow in June, but how do they support LGBTQ/political/sex-positive content year-round? How will they truly understand we are in a symbiotic relationship? Young Queers shouldn’t grow up in a gentrified internet. Like the censored network TV we never saw ourselves in, where same-sex kisses lost sponsors. That f*cked up generations. Queers will always find ways to communicate. We invented entire languages. We’ll xerox our posts and tape them up in bathrooms if we have to. But many of us are tired. We’ve been booted off every platform from eBay to Tumblr to P*rnHub. (Herrera Citation2022)

Concluding remarks

All things are subject to interpretation; whichever interpretation prevails at a given time is a function of power, not truth. (Nietzsche Citation1998)

In this paper, our comparative analysis of democratic instruction and engagement with technology and new media has taken us from a consideration of discourses around data and exclusion towards an understanding of ways to locate data as a form of systematic power integrated into systems of governance of LGBTQI+ lives. Foucault (Citation1980) tells us that the process by which we select the experiences we tell is a power-saturated one and, given sexuality’s protean quality, its shape-shifting excess, its self-reflexivity and performativity, data justice in an age of automation mediates between interior and exterior existence and is an expression of power-knowledge relations.

Shedding light on the dynamics of automation in relation to the lives of LGBTQI+ youth, the paper has aimed to provide a contested consideration of the implications of LGBTQI+ digital data that is habitually missing in action. In considering how data can function as the grey documents of a genealogical endeavour, we have considered how LGBTQI+ lives emerge as the product of the perilous play of representational forces through acts of reinterpretation. We have aimed to think about how data automation as a socio-cultural practice constitutes subjects within and through discourses that promulgate and promote the values of normativity. Indeed, well-established struggles around algorithmic fairness and discrimination issues underpin the discourse of LGBTQI+ voices missing from automated decisions. Considering discourses around data and exclusion, the paper contributes an understanding of ways to locate data as a form of systematic power integrated into governance systems.

The consideration of automation through an examination of critical socio-technical imaginaries concerned with technical democracy, data justice and post-automation has provided a means to explicitly locate a spectrum of benefits and harms caused by using data systems in educational futures. It unmakes LGBTQI+ groups. We/they are not there anymore. A recent example is Oxford Street, the space and place of the Sydney Gay Mardi Gras, where the council painted a rainbow zebra crossing; when Mardi Gras week was over, it reverted back to the normative. Through engaging with processes such as post-automation, data justice, and technical democracy, we can recognize how digital data can do the same thing to our youth.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Anon. 1990. Queers Read This. Pamphlet distributed at an ACT UP demonstration, New York.
  • Arantes, J. A. 2019. “Equity Implications of Predictive Analytics in K-12 Classrooms.” Ubiquitous Learning: An International Journal 12 (2): 63–84. doi:10.18848/1835-9795/CGP/v12i02/63-84.
  • Arantes, J. A. 2022. “Personalization in Australian K-12 Classrooms: How Might Digital Teaching and Learning Tools Produce Intangible Consequences for Teachers’ Workplace Conditions?” The Australian Educational Researcher, 1–18. doi:10.1007/s13384-022-00530-7.
  • Ashley, F. 2020. “Homophobia, Conversion Therapy, and Care Models for Trans Youth: Defending the Gender-Affirmative Approach.” Journal of LGBT Youth 17 (4): 361–383. doi:10.1080/19361653.2019.1665610.
  • Austin, J. L. 1962. How To Do Things With Words. Cambridge, MA: Harvard University Press.
  • Bagrit, L. 1966. “The Age of Automation.” British Journal for the Philosophy of Science 17 (1).
  • Baker, R. S., and A. Hawn. 2022. “Algorithmic Bias in Education.” International Journal of Artificial Intelligence in Education, doi:10.1007/s40593-021-00285-9.
  • Bell, I. H., and M. Alvarez-Jimenez. 2019. "Digital Technology to Enhance Clinical Care of Early Psychosis." Current Treatment Options in Psychiatry 6: 256-270.
  • Benjamin, W. 1999. The Arcades Project. Harvard: Harvard University Press.
  • Cheney-Lippold, J. 2011. “A New Algorithmic Identity.” Theory, Culture & Society 28 (6): 164–181. doi:10.1177/0263276411424420.
  • Chonody, J. M., J. Mattiske, K. Godinez, S. Webb, and J. Jensen. 2020. "How did the Postal Vote Impact Australian LGBTQ+ Residents?: Exploring Well-Being and Messaging." Journal of Gay & Lesbian Social Services 32 (1): 49–66.
  • Dawson, D., E. Schleiger, J. Horton, J. McLaughlin, C. Robinson, G. Quezada, J. Scowcroft, and S. Hajkowicz. 2019. “Artificial Intelligence: Australia’s Ethics Framework – A Discussion Paper.”
  • Duarte, Natasha, Emma Llanso, and Anna Loup. 2017. “Mixed Messages? The Limits of Automated Social Media Content Analysis.”
  • Ecker, J., T. Aubry, and J. Sylvestre. 2019. "A Review of the Literature on LGBTQ Adults who Experience Homelessness." Journal of Homosexuality 66 (3): 297–323.
  • Elish, M. C., and danah boyd. 2018. “Situating Methods in the Magic of Big Data and AI.” Communication Monographs 85 (1): 57–80. doi:10.1080/03637751.2017.1375130.
  • Foucault, M. 1980. "The History of Sexuality: Interview." Oxford Literary Review 4 (2): 3–14.
  • Foster, Andrew. 2022. “Citipointe Christian College Principal Steps Down Over Anti-Gay Contract.” https://www.news.com.au/lifestyle/parenting/school-life/citipointe-christian-college-principal-steps-down-over-antigay-contract/news-story/19dc52f3c9c0cb99eef927f724b8638f.
  • Goggin, G., and Karen Soldatić. 2022. “Automated Decision-Making, Digital Inclusion and Intersectional Disabilities.” New Media & Society 24 (2): 384–400. doi:10.1177/14614448211063173.
  • Gooding, P. M., and D. M. Clifford. 2021. “Semi-Automated Care: Video-Algorithmic Patient Monitoring and Surveillance in Care Settings.” Journal of Bioethical Inquiry, 1–6. doi:10.1007/s11673-021-10139-7.
  • Gulson, K. N., and K. Witzenberger. 2022. “Repackaging Authority: Artificial Intelligence, Automated Governance and Education Trade Shows.” Journal of Education Policy 37 (1): 145–160. doi:10.1080/02680939.2020.1785552.
  • Hawley, Samantha. 2022. “The Gay Hate Killings Ignored for Decades.”
  • Herrera, L. 2022. https://www.instagram.com/herreraimages/?hl=en.
  • Hitczenko, K., H. R. Cowan, M. Goldrick, and V. A. Mittal. 2022. “Racial and Ethnic Biases in Computational Approaches to Psychopathology.” Schizophrenia Bulletin. doi:10.1093/schbul/sbab131.
  • Insel, T. R. 2017. "Digital Phenotyping: Technology for a New Science of Behavior." Jama 318 (13): 1215–1216.
  • Jasanoff, S., and S.-H. Kim. 2009. “Containing the Atom: Sociotechnical Imaginaries and Nuclear Power in the United States and South Korea.” Minerva 47 (2): 119–146. doi:10.1007/s11024-009-9124-4.
  • Lyons, A., M. L. Rasmussen, J. Anderson, and E. Gray. 2021. “Counting Gender and Sexual Identity in the Australian Census.” Australian Population Studies 5 (1): 40–48. doi:10.37970/aps.v5i1.80.
  • May, T. 1993. Between Genealogy and Epistemology: Psychology, Politics and Knowledge in the Thought of Michel Foucault. University Park: Pennsylvania State University Press. doi:10.1515/9780271071671.
  • Mackie, G., L. Patlamazoglou, and K. Lambert. 2021. "The Experiences of Australian Transgender Young People in School Counseling: An Interpretative Phenomenological Analysis." Psychology of Sexual Orientation and Gender Diversity.
  • McDaniel, J. S., D. Purcell, and A. R. D'Augelli. 2001. "The Relationship between Sexual Orientation and Risk for Suicide: Research Findings and Future Directions for Research and Prevention." Suicide and Life-Threatening Behavior 31(Supplement to Issue 1), 84–105.
  • McNair, R., C. Andrews, S. Parkinson, and D. Dempsey. 2017. GALFA LGBTI Homelessness Research Project.
  • Moore, Lisa Jean, and Paisley Currah. 2015. “Legally Sexed: Birth Certificates and Transgender Citizens.” Feminist Surveillance Studies, 58–76. doi:10.2307/j.ctv1198x2b.8.
  • Mormann, F. 2021. “Beyond Algorithms: Toward a Normative Theory of Automated Regulation.” BCL Review 62: 1.
  • Nietzsche, F. 1998. On the Genealogy of Morality. (M. Clark & A. Swensen, Trans.). Hackett Publishing Company. (Original work published 1887).
  • Plakhotnik, O. 2019. Imaginaries of Sexual Citizenship in Post-Maidan Ukraine: A Queer Feminist Discursive Investigation. (Doctoral dissertation, The Open University).
  • Parviainen, J., and J. Rantala. 2022. “Chatbot Breakthrough in the 2020s? An Ethical Reflection on the Trend of Automated Consultations in Health Care.” Medicine, Health Care and Philosophy 25 (1): 61–71. doi:10.1007/s11019-021-10049-w.
  • Pasquale, F. 2015. “The Black Box Society.” In The Black box Society. Harvard University Press. doi:10.4159/harvard.9780674736061.
  • Perrotta, C., and N. Selwyn. 2020. “Deep Learning Goes to School: Toward a Relational Understanding of AI in Education.” Learning, Media and Technology 45 (3): 251–269. doi:10.1080/17439884.2020.1686017.
  • Phelan, P., and R. Oxley. 2021. “Understanding the Social and Emotional Wellbeing of Aboriginal LGBTIQ(SB)+ Youth in Victoria’s Youth Detention.” Social Inclusion 9 (2): 18–29. doi:10.17645/si.v9i2.3770.
  • Rosenberg, R. 2022. “Political Entanglements: Practicing, Designing and Implementing Critical Trans Politics Through Social Scientific Research.” In Rethinking Transgender Identities, 76–90. Routledge. doi:10.4324/9781315613703-6.
  • Schwenke, C. 2022. “Social Inclusion: Measuring the Invisible and the Insignificant.” In Rethinking Transgender Identities, 61–75. Routledge. doi:10.4324/9781315613703-5.
  • Selwyn, N. 2015. “Data Entry: Towards the Critical Study of Digital Data and Education.” Learning, Media and Technology 40 (1): 64–82. doi:10.1080/17439884.2014.921628.
  • Skerrett, D. M., K. Kõlves, and D. De Leo. 2015. “Are LGBT Populations at a Higher Risk for Suicidal Behaviors in Australia? Research Findings and Implications.” Journal of Homosexuality 62 (7): 883–901. doi:10.1080/00918369.2014.1003009.
  • Smart, B. 1985. Michel Foucault. London: Routledge.
  • Smith, A., and M. Fressoli. 2021. “Post-Automation.” Futures 132: 102778. doi:10.1016/j.futures.2021.102778.
  • Spivak, G. C. 2003. “Can the Subaltern Speak?” Die Philosophin 14 (27): 42–58. doi:10.5840/philosophin200314275.
  • Tamboukou, M., and S. J. Ball. (Eds.), 2003. Dangerous Encounters: Genealogy and Ethnography (Vol. 17). Peter Lang Incorporated, International Academic Publishers.
  • Taylor, L. 2017. “What is Data Justice? The Case for Connecting Digital Rights and Freedoms Globally.” Big Data & Society 4 (2). doi:10.1177/2053951717736335.
  • Taylor, L., G. Sharma, A. Martin, and S. Jameson. 2020. Data Justice and Covid-19. London: Meatspace Press.
  • Thompson, Greg. 2021. “Digital Disruption in Teaching and Testing.” In Digital Disruption In Teaching And Testing, 182–199. Routledge. doi:10.4324/9781003045793-11.
  • Veradonir, R. 2022. Queerness is Universal. Queer Majority. https://queercafe.net.
  • Vicars, M. 2006. “Who Are You Calling Queer? Sticks and Stones Can Break My Bones But Names Will Always Hurt me.” British Educational Research Journal 32 (3): 347–361. doi:10.1080/01411920600635395.
  • Vicars, M. 2018. “What did I say That was Wrong? Re/Worlding the Word.” Qualitative Research Journal. doi:10.1108/QRJ-D-17-00049.
  • Wareham, J. 2021. “Why Artificial Intelligence is Set Up To Fail LGBTQ People.” Forbes.
  • Williamson, B. 2021. “Meta-edtech.” Learning, Media and Technology 46 (1): 1–5. doi:10.1080/17439884.2021.1876089.
  • Zeeman, L., N. Sherriff, K. Browne, N. McGlynn, M. Mirandola, L. Gios, R. Davis, et al. 2018. “A Review of Lesbian, gay, Bisexual, Trans and Intersex (LGBTI) Health and Healthcare Inequalities.” European Journal of Public Health 29 (5): 974–980. doi:10.1093/eurpub/cky226.