
Parents’ understandings of social media algorithms in children’s lives in England: Misunderstandings, parked understandings, transactional understandings and proactive understandings amidst datafication

Pages 506-522 | Received 22 Mar 2023, Accepted 21 Jul 2023, Published online: 07 Aug 2023

ABSTRACT

In this paper, I ask how parents understand and make sense of their children’s relationships with social media algorithms. Drawing upon 30 think-aloud interviews with parents raising children aged 0 to 18 in England, I pay attention to parents’ understandings of, and consequent approaches to, platform algorithms in relation to their children’s lives. I locate this work within user-centric research on people’s understandings of algorithms, and research about parents’ perspectives on data and datafication in relation to sharenting. Through my data, I draw out four modes – misunderstandings, parked understandings, transactional understandings and proactive understandings. I suggest that parents’ often flawed understandings of their children’s myriad interfaces with algorithms deserve scrutiny not through a lens of blame or individualised parental (ir)responsibility but within cross-cutting contexts of parenting cultures and families’ diverse contextual resources and restraints. I conclude by highlighting attention to parents’ approaches to algorithms in children’s lives as critical to parents’ data and algorithm literacies.

IMPACT SUMMARY

Prior State of Knowledge: Parents in diverse contexts try to understand and support their children’s digital lives, and also often share content about their children on a variety of platforms. Prior research has shed significant light on the datafication of childhood.

Novel Contributions: This study investigates parents’ diverse understandings of algorithms underlying social media platforms and the ways in which they approach algorithms in their children’s lives.

Practical Implications: Parents’ knowledge about algorithms and datafication is uneven. Policymakers need to better support adult media literacies, including data and algorithm literacies. Schools’ communication to families and carers could also become a key vehicle to raise awareness about datafication.

In this paper, I present findings about parents’ deeply contextual understandings – in their plural form – of social media algorithms in relation to their children. I draw upon a qualitative project with 30 parents of children aged 0 to 18 in England, which looked at parents’ experiences of algorithms in various domains, from search, newsfeeds and recommendations to algorithms in the public domain (Das, 2023), and at parents’ literacies with algorithms. Here, I focus on parents’ understandings of social media algorithms in their children’s lives, moving away from considering technical skills or in/accuracies in parents’ understandings and instead treating understandings in their plural and fluid form. There are myriad ways in which children interface with algorithms, but how do parents approach and understand algorithms (cf. Lomborg & Kapsch, 2020; Siles et al., 2022) in relation to their children’s interfaces with these? Sharenting (Blum-Ross & Livingstone, 2017; Siibak & Traks, 2019), connected devices within the household (Mascheroni, 2020), information about children on public domains or apps, children’s self-directed online practices, younger children’s media use and consumption as shaped by parental mediation (Alper et al., 2023; see also Warren, 2003), and indeed the growing adoption of algorithms in public-domain decision-making (Das, 2023; Lomborg et al., 2023) which impacts households and children – all are instances where children, and data about them, interface with algorithms and algorithmic systems. Parents’ understandings of these interfaces influence parental choices and actions which shape children’s lives, as research on parental perceptions of media use indicates (Joginder Singh et al., 2021; Schlesinger et al., 2019). In this paper, I draw out that parents’ deeply contextual and widely divergent understandings of platform algorithms in relation to their children are rarely driven by apathy or inattention to the implications of datafication, but are shaped, often, by an unclear sense of the implications of algorithms and of datafication more broadly.

Parents, algorithms and everyday parenthood

Our insights into parents’ understandings of and approaches to algorithms in their children’s lives might draw upon advances in user-centric algorithm studies on users’ awareness (cf. Siles et al., 2022) and understanding of algorithms in complex, everyday circumstances (Siles, 2023) and in cross-media contexts (Hasebrink & Hepp, 2017). Dogruel et al. (2022) inquire into users’ definitions of algorithms and the sources upon which they are based, but also place a normative emphasis on users’ expectations and desires with regard to how algorithms would affect their decision-making. Gran et al. (2021) analyse people’s views regarding algorithms in Norway, developing a typology of six positions: unaware, uncertain, affirmative, neutral, sceptic, and critical. Users’ understandings are also often entangled with the messiness of feelings. Participants in Eslami et al.’s (2019) study of Facebook users who were found to be unaware of the Facebook news feed curation algorithm initially reacted with amazement and wrath. Indeed, as van der Nagel (2018) suggests, feelings, emotions and beliefs matter in terms of people making sense of how algorithms operate. People’s understandings link to their myriad, deeply contextual beliefs about how algorithmic interfaces might operate, as Karizat et al. (2021) demonstrate, in relation not solely to technical understandings but also to the implications of these beliefs for users’ subsequent actions in relation to platforms. As Willson (2017) observes, users’ bottom-up techniques for interacting with algorithmic interfaces matter because they shape and influence these spaces and how they function, as users work within, or sometimes against, interface norms. As Lomborg and Kapsch (2020) explore, drawing upon Stuart Hall’s encoding and decoding theory, although it may appear that users lose agency once they are entangled in algorithmic interfaces, there are myriad opportunities for active involvement, both working within and against algorithms.

The hermeneutic moment of interpretation (Iser, 1974) is valuable in unpacking parents’ understandings, for algorithms are not dry technological rules or items used devoid of context. As we have long known of the use of technologies in everyday life (Bakardjieva, 2005; Ytre-Arne, 2023), parents’ understandings and interpretations of algorithms are implicated in the complex power dynamics of platform societies. Algorithms are encountered and made sense of by parents in socially shaped contexts, with wide-ranging inequalities and inequities (Katz & Rideout, 2021). Phenomenological theorisations of the hermeneutic circle (see Grondin, 2015; Wilson, 2018) help us locate parents’ interpretations of what algorithms are, or what they might mean in their interfaces with their children, as something constantly in movement: contextually shaped knowledge of parenting cultures and traditions, and indeed of digital systems and their workings, is carried into an ongoing dance with making sense of algorithms in the here and the now. Of course, algorithms are not texts, unlike the text of hermeneutic reception theory or reader-response. Nonetheless, parents’ interpretative work (Iser, 1974) is indeed demanded, invited, shaped, resourced and restrained by algorithmic norms.

Yet, parents’ interpretative work around algorithms is but one component of the broader approach to algorithms in children’s lives which I am interested in here. Understanding algorithms in the context of children’s diverse lives, across diverse stages of parenthood, also involves structural shapers and the complex conditions of parenting cultures (cf. Faircloth, 2013; Livingstone & Blum-Ross, 2020), including intensive parenting cultures (Lee et al., 2014). As scholarship on intensive parenting (Lee et al., 2014) and transcendent parenting (Lim, 2019) has argued, parenting sits within broader cultures which are gendered, classed and raced. Sharenting, or indeed grand-sharenting (Staes et al., 2023), where content is shared about children on platforms, generating not solely digital footprints for both carer and child but also fodder for algorithmic systems, invites questions not solely about the sophistication, or lack thereof, of adults’ understanding of algorithms; it is equally a site where myriad other forces, around intensive parenting and societally shaped imperatives to self-represent in certain ways, might be involved. There is much to be gleaned about parents’ sense-making of algorithms in children’s lives by looking at the conversation on parents’ approaches to datafication and datafied childhoods (Mascheroni, 2020). Mascheroni et al. (2023) draw attention to how caring dataveillance (Lupton, 2020), sharenting (Blum-Ross & Livingstone, 2017; Holiday et al., 2022; Plunkett, 2019) and the broader contexts of the parent-child relationship are now intertwined in a way that weaves together care, caring, datafication and surveillance, resulting in a specific configuration of the mediated parent-child relationship. Work by Barassi (2017) and Leaver (2017) also draws attention to the complex intertwining of everyday practices of care, surveillance, hopes and anxieties about digital futures (Livingstone & Blum-Ross, 2020), and care-related surveillance (Lupton, 2020) within platform societies and datafication in general (Boyd & Crawford, 2011). There are also diverse institutions parents and children engage with, with differently articulated claims about children’s data. The complexity of children’s and parents’ circumstances might mean some children and some parents find their data more frequently sought, collected, shared or lost than others (see Alper, 2023; Goggin & Ellis, 2020). In addition, as Kamleitner and Mitchell (2019) note, parents’ data are intertwined with children’s data. This literature on sharenting and dataveillance in relation to the parent-child dyad has at times explored the potential risks for children (Esfandiari & Yao, 2022) and parents’ lack of digital literacy (Barnes & Potter, 2021) in terms of protecting children’s privacy, furthering the commodification of children’s lives amidst a datafied childhood (Mascheroni, 2020), from the womb (Leaver, 2017) to the brink of adulthood.

Positioning parents at par with private and public institutions, or assigning blame to parents for unthinking, irresponsible or even callous actions, might result in binary discourses of parent blame. These discourses might fail to contextualise parents and parenthood critically within the myriad gendered, classed and raced pressures under which parents parent in digital societies, and might ignore what Clark drew attention to as the “emotional work that digital media have introduced into contemporary family life” (Clark, 2011, p. 1) in her theorising of parental mediation. Cino (2022) notes that there is a need to step away from parent blame, or from positioning parents as irresponsible, shallow or deliberately damaging their children’s safety, and instead to look at data-related practices as complex and often dilemmatic entanglements. Cino highlights a “potential double bind for parents in the digital age” (p. 143), which beckons us to reframe conversations about the datafication of the parent-child rapport, and parents’ handling of children’s data, as more complex than one which necessarily allocates blame to parents. Campana et al. (2020), writing on Instadads, for instance, also draw attention to notions of protection as fathers attempt to negotiate profit motives around online sharing with practices of shielding and protecting children. Cino’s reminder to consider parents’ dilemmatic entanglements and how parents are positioned here is vital, alongside Mascheroni et al.’s (2023) consideration of the many gendered and generational nuances of parents’ approaches to data. In what follows, I draw upon this rich scholarship to approach parents in their real, lived contexts, seeking to unpack their discourses around how they position, approach and essentially understand social media algorithms in terms of their children and their children’s lives.

Methods

The work presented in this paper is part of a broader project – Parents talking algorithms – which involved 30 online “think aloud” interviews (Swart, 2021) with parents of children aged zero to 18 across England (Tables 1 and 2). Algorithms are often fuzzy and invisible, representing behind-the-scenes workings for lay users. Following methodological precedents within user-centric algorithm studies (Gruber et al., 2021; Swart, 2021), I consciously avoided using the word algorithm itself during interviews unless parents volunteered the term themselves, only asking what the word meant to them as the final question. Instead, the “think aloud” process saw parents go online on their own devices, on a variety of platforms of their own choice as relevant to their everyday parenting, as we spoke through the complex ways in which algorithmic filtering, shaping and recommending worked, and what parenting decisions, choices or emotions these prompted, in our in-depth, qualitative interviews. This meant that, in all cases, parents decided which technological instance to speak of and which platform we would discuss, depending on what was central in that particular parent’s life.

Table 1. Participant profiles.

Table 2. Participants cited in this paper.

Parents were offered flexibility, with most of the interviews occurring outside of my working hours. The call for participants was circulated on public Facebook groups aimed at local sharing, selling and buying, and participants filled in an initial “Expression of Interest” form hosted on Qualtrics. The final sample was diverse. Parents’ children ranged from a baby yet to be born, through weeks-old infants and the primary and secondary school years, to children old enough to have completed schooling. A quarter of the parents came from minority ethnic backgrounds. Exactly half of the sample were mothers, and half were fathers. Parents’ professions also varied, including stay-at-home parents, parents on parental leave, those with small businesses, those in professional services, and those in private and public sector industries. Participants received a £10 voucher as a token of gratitude. Interviews were transcribed within Microsoft Teams, checked for consistency and analysed through a combination of NVivo and then immersive pen-and-paper coding.

The project underwent an ethics and governance review at the University of Surrey, UK, which included scrutinising all research instruments. In this paper, I focus only on parents’ understandings of social media algorithms in relation to their children, as opposed to the broader focus on algorithms within the Parents talking algorithms project.

Results

Below, I draw out patterns in parents’ sense-making practices as they try to understand social media algorithms in their children’s lives – their misunderstandings, parked understandings, transactional understandings and proactive understandings. At the outset, it is important to clarify that these are not watertight sub-types within a rigid typology. No parent is fixed to any one position in their constantly unfolding journey of sense-making around algorithms and data. The same parent’s interpretations and approach might well shift fluidly across the long journey of parenthood. Likewise, parents’ understandings of children’s interfaces with algorithms are also context-dependent and platform-dependent. A parent who has significant misunderstandings about a particular platform in relation to a toddler might have a set of very different understandings and expectations of a different kind of platform, for instance, in relation to a teenager.

Misunderstandings

In this strand of my findings, my attempt is not to establish misunderstanding by indicating parents’ technical errors in understanding algorithms, but rather to draw out a variety of mistaken assumptions around how algorithms work and what they seek to achieve in relation to their children. Lisbeth is a nursery worker in the north of England, where she and her partner are raising two secondary school-aged children, aged 14 and 11. Lisbeth speaks of her perceived and self-reported technical difficulties and lack of competencies. She is particularly perplexed by why her search results for homework help look a certain way and privilege certain things, or why, for instance, her Facebook timeline appears replete with posts from other parents that prompt her to feel inadequate or sad. When speaking about algorithms in relation to her children, particularly thinking about their interfaces with algorithmically shaped platforms, she says that whether a child is a “subscriber” to a YouTube channel would wholly determine what is recommended to them on YouTube. Lisbeth struggles to reconcile this understanding of subscriptions with the somewhat to highly inappropriate content her son appears to be recommended online, which requires her to pay attention to what he is watching and consuming.

There are things where you’ll hear a word and say that’s inappropriate and try and double check that they’re not subscribing to these people’s videos so that it doesn’t happen. But despite having parental guidance on and not allowing them to look at certain pages, still these videos keep coming through … . How does YouTube decide what to show you next?

Lisbeth, mother of a 14-year-old and an 11-year-old, Newcastle, England

Lisbeth explains to me as we progress in our conversation that subscriptions and whether her son has subscribed to a particular channel would ultimately determine the stream of content he is recommended. When she figures out during our conversation that this is not the case, Lisbeth is surprised and does not know what to do because she does not quite understand what shapes the stream of content or generates the auto-play recommendations that her 11-year-old has access to.

Hettie is a young mother who, when speaking to me, had given birth only a few weeks earlier. Hettie lives in the South of England and has a low income. She has particularly struggled with intense anxiety around a news story in England about a nurse accused of murdering infants in a hospital, which was making national headlines at the time of our interview. Hettie is perplexed that, whenever she is online on various platforms, stories from local newspapers or national or regional news outlets keep cropping up on her news feed and her timeline, often even as pop-ups inviting her to read further and further about this nurse. Why, she ponders, do these keep cropping up on her feeds? Whilst her child is only a few weeks old, she muses about the future, drawing my attention to an incident involving Peppa Pig videos on YouTube giving way to sinister content for toddlers.

I know that my little sister used to watch episodes of Peppa Pig on YouTube, and then the next recommended one would be somebody who has done some awful stuff to Peppa Pig and … put on YouTube, which kids can see, and it’s not good … . I don’t like the idea of it because I know what one lead to, you know, you end up getting down a rabbit hole … .I just feel like people use the right keywords and buzzwords to get it be the next recommended … . I I’m not one of those superstitious people who thinks my phones listening to me.

Hettie, mother of a newborn, Southampton

In my conversation with Hettie, this remembered account of a toddler’s encounter with sinister Peppa Pig videos on YouTube aligns with her own stumbling across stories about the nurse making headlines at a very advanced stage in her pregnancy. Her musings about the Peppa Pig instance involve her misunderstandings about why certain things come to be recommended over others. Although she attempts explanations around the right keywords and “buzzwords” that people might use to get something to feature higher on the platform, she does not quite grasp how recommendation algorithms function or why a toddler might encounter sinister content.

Nandini is a single parent, raising a seven-year-old and a five-year-old in the west of England. She describes her day-to-day challenges of coping with single parenthood, her son’s special educational needs and her daughter’s middle-ear deafness. She relies on fellow parents of children with special educational needs or those who need additional support at school. Her social media use is prolific, with much sharenting (cf. Staes et al., 2023). She ponders the recommended list of content her son sees on video platforms. She argues that the way to be successfully selective about YouTube videos is to read the title: if something is violent, the title should say so, and it should not (emphasis on should) come up as a recommendation for her seven-year-old. Except that it indeed does. Nandini muses –

Videos do say contains violence. The titles say … OK, so he’s quite savvy in that way … Will it be based on what he types it, isn’t it? And then what? That what that generates on the other side … .We set rules … .I say to him. like last week, he watched something and I said no, darling, that’s inappropriate. You don’t watch that anyway.

Nandini, mother of a 7-year-old and a 5-year-old, Bristol

Nandini’s ongoing conversation with her son shows the care and attention she brings to her engagement with these algorithmic interfaces in relation to her little boy’s viewing experience. The intentions Nandini reveals are far from those of disinterest or disengagement. However, the misunderstandings and confusion around what he is invited to consume punctuate and, in many ways, shape and impede these acts of care.

Parked understandings

A significant strand of parents spoke about their children’s interfaces with algorithms as something they would think about and act on, but only later. These are what I call parked understandings, where parents make sense of the presence of algorithms and their implications for their children, but in a somewhat removed and distant way, allocating them time and space at a later point down the road. Parked understandings are complex, I suggest, because they involve some awareness juxtaposed with feelings of being overwhelmed, resignation, or sometimes incorrect assumptions about arbitrary moments when datafication and algorithms apparently begin to “matter.”

Isabel, a single mother of a four-year-old daughter, talked very clearly about how her stance has changed from being quite “cynical” about algorithms to being reasonably relaxed about them. She describes a state of complete saturation, as a function of being a single mother, in which she cannot find the bandwidth to worry about such things. She tries to do the absolute best for her child, investing a lot of time and energy to portray “a very positive front”. She says –

I am aware that obviously there’s computerised systems that are just picking up odd words and things that we’re saying or typing into the system, and that is being used to generate all of the media advertisements and probably much more twisted. … . I am aware of those things. It’s not something I’ve looked into or spent a great amount of time thinking about in all honesty … . I’d say … at the moment … I do sort of have an awareness that over the years that I may feel differently about this … .

Isabel, mother of a four-year-old, Shropshire, England

There is a juxtaposition in Isabel’s talk: an awareness of algorithmic interfaces and data journeys sits alongside limited technical competence. There is still practical awareness (Cotter, 2022), which sits alongside her resignation (Draper & Turow, 2019) and her determination not to think about it, so as to put forward a happy front for her child. The interesting dichotomy here is that Isabel’s parking of concerns for later works alongside the future-focused worries she expresses in our conversations about her own child’s safety and security online (see also Alper, 2019, on future talk).

When we spoke, Rhianne was on maternity leave, with a toddler and an infant at home. She lives in the south of England, and it became swiftly evident in the course of our conversation that theirs was an affluent household, with her partner working in a technology company. She noted to me, with examples, that there was always a great deal of technology available for access and use. However, in keeping with the interviews with other parents of toddlers and small children, there was only a very mild sense of being bothered by datafication or algorithmic shaping in any specific way.

I think my stance would change. I think it’s different, but I’m an adult. I think we’d certainly have parental blocks on and kind of I don’t know much about it. I haven’t had to use them, but I know that there’s like a parent setting for YouTube and Google … which my husband’s talked about putting in place.

Rhianne, mother of a 2-year-old and an infant, Surrey, England

Rhianne’s parked understandings relate closely to filters, technical blocks and various things she does not “know much about,” relegated to the future. On the one hand, these understandings relate to the broader predominance of filters and blocks as the defining features of conversations with parents about platforms, where parents often cited communication from schools as focusing their attention on staying safe online (rather than on datafication, for instance). On the other hand, there is minimal, if any, broader concern in our conversation regarding her sharing content about her children online or her use of an extensive set of applications to track and monitor her toddler and baby. The significantly high availability of technology, and a partner working within the technology industry, shape to a great extent how Rhianne feels, as she expresses a great deal of confidence about making alternative decisions about algorithms at a later point in parenting, but not just yet, relating in many ways to Alper’s (2019) exposition of future talk.

This sense of not-yet-ness was equally visible in my conversation with Jackson, a design engineer in the East Midlands and the father of daughters aged 4 and 2. Despite having noticed that the baby Annabelle videos his daughters are fond of on YouTube frequently change automatically to inappropriate content, Jackson articulates a set of concerns about platforms parked in the future. His concerns about the future, though, are distinct, clear and on the right track –

I think my big concern would be the fact that my daughters will be hitting an age where they’re using social media. … … People paying YouTube for showing their videos so it will push you in certain directions … Videos and it shows makeup or something and they’re clicking on that and then next thing you know that it’s talking about … at a breast implant or something completely extreme … .and stuff like that would worry me.

Jackson, father of a 4-year-old and a 2-year-old, East Midlands

Jackson’s daughters are only 2 and 4 years old. As Livingstone and Blum-Ross outline in their work on parenting for digital futures (2020; also Alper, 2019), Jackson makes sense of the future in relation to his own childhood – “as a child, I never had social media. It was never a worry.” He connects the algorithmic suggesting, shaping and potential manipulation of his little girls, ten years on, to the logics of platform societies, but as he speaks to me in the here and the now, these concerns are parked for later.

Transactional understandings

Cutting across my conversations with parents, there was often a sense of transaction and resignation (Draper & Turow, 2019), where some parents spoke of algorithmic shaping and manipulation in contemporary societies as part and parcel of modern life, accepted in relation to personal gains (Eslami et al., 2019). This was not necessarily expressed in an entirely emotion-free manner. However, there was a profound sense of resignation and eventual acceptance that these were trade-offs in modern life and that, for certain conveniences, one had to part with specific amounts of personal information amidst datafication, even if these related to one’s children.

Freddie works in the public services in Greater London and lives in the South of England with his family, including a 12-year-old son and an eight-year-old daughter. Throughout, Freddie appears somewhat distanced from the day-to-day banalities of parenting, which is reflected in what he reports of his newsfeeds and timelines, which appear tailored entirely to his hobbies, with much advertising. He takes a transactional view of datafication and his family’s relationship with algorithmic interfaces, and displays a general technical awareness of data flows. But, he argues, with resignation (Draper & Turow, 2019), that this is a part of contemporary life.

Well, it is not something I can directly control. They are gonna do it … They’re all using the data that that you presented on a search point soon as you press the button, it saves a little clip of you because … . that it now links across to all your devices … . pretty much resigned to it. Yeah, it’s a facet of modern life that … .

Freddie, father of a 12-year-old and an 8-year-old, Southampton, England

Freddie reiterates some concerns around messages on online bullying and grooming, as parents often seemed to, perhaps shaped by the content of school curricula and schools’ communication to families; but a broader sense of resignation, of not entirely caring about datafication in general, is also evident. This was a distinct strand in my data set, which I call a transactional approach, or a transactional sense-making of algorithmic interfaces: parents show a certain amount of technical awareness and competency, simultaneously accompanied by a degree of distance, resignation and a sense that there are certain privileges one must give up in return for the advantages and benefits that these interfaces apparently bring – a transaction.

Raising a five-year-old boy and a two-year-old girl in Kent, Andrew is a healthcare professional who has made significant changes to his work patterns to stay at home for part of the week to look after his toddler. Sharing care equally with his partner, he is a heavy social media user, finding his content increasingly tailored towards cheap days out, eateries and local cafes. Whilst his children are still small, when Andrew reflects on algorithmic shaping – of searching, finding, sharing, recommendations and myriad other things – he appears distanced and resigned to accommodating it as part and parcel of modern life –

An algorithm would be … I guess, programmed in to … . It’s like Facebook … to track … .I do think I will say that I am not kind of ignorant of it … I don’t know how you, how you kind of even start, what kind of maintain or police.

Andrew, father of a 5-year-old and a 2-year-old, Kent, England

Andrew’s resignation, and his view of algorithmic shaping and indeed datafication as transactional, does not emerge from a position of uncaring apathy. Instead, a complex mix of entanglements (Cino, 2022) – ranging from a lack of clarity, parenting contexts and a degree of digital overwhelm to a sense of inability to act, as has often been heard in user-centric research on algorithms – leads Andrew to develop his stance.

Rijula, who spoke to me whilst nursing her months-old daughter and with her toddler around the house, was, unlike Freddie or Andrew, and indeed unlike the vast majority of the parents, clear from the outset that she was very aware of data traces and the journeys of search data. She illustrated to me clearly why she thought her searches for infant feeding on search engines were more likely to return websites relating to breastfeeding, natural weaning and attachment parenting. With this significantly high level of technical awareness, Rijula draws my attention to the fact that she feels “OK” about datafication and algorithmic shaping, but with glimpses, at times, of her view shifting: she begins the quote below with a transactional view of algorithms and her own data, but moves, later on, to the implications for her children –

It makes me feel OK because then it is relevant to me … . So I made my peace with it per se … … generally, because we play a lot of music using Alexa. So then that Amazon Music … becomes a bit pushy … … . So I mean we have we have put parental … We have put parental lock on everything. Even now, like even before they were born, the day we had internet in this house, we had been into locks and everything because even accidental searches sometimes can be like I feel very iffy about it.

Rijula, mother of a 3-year-old and an infant, Bristol, England

This quote illustrates Rijula’s movement and flux between a transactional view of algorithms, drawing out at the start that they make the world more relevant for her, and a different stance in relation to her children’s futures, where the conversation shifts to feeling “iffy” and she resorts to physical barriers – locks, blocks and filters – similar to many others I spoke to. This relates to parents’ seeming separation between algorithms in relation to themselves and their own data, and algorithms in relation to their children’s data, without often grasping that the two are not as disconnected as they might like to think (Kamleitner & Mitchell, 2019). These transactional understandings of relationships with algorithms are, I suggest, a complex mix: partly a functional, uses-and-benefits approach to technology, partly a lack of clarity and technical knowledge, and partly a sense of overwhelm and resignation, but rarely, perhaps, apathy or a lack of care and consideration.

Proactive understandings

I now turn attention to parents’ understandings of their children’s interfaces with algorithms which compel them to act in the here and the now. These proactive understandings vary, however, in precisely what the action is, and in whether it ends up being in the best interests of children at all. There is also, amidst such action, a marked absence of holding institutions – mainly commercial institutions, including technology companies – responsible for protecting children’s best interests, and a corresponding imperative to act at the level of the individual parent.

Brett is a single father with shared custody of children aged 4 and 1. He works in technology support in higher education and has a good functional grasp of algorithmic shaping on platforms. In discussing his quest for information as a parent, he draws attention, for instance, to the different search results he and his ex-partner would see on parenting approaches, which, he said, were algorithmically shaped and tailored. Brett appeared relaxed, indeed curious about and accepting of algorithms feeding out of and shaping different worldviews on parenting. But there was a shift in relation to his children, and a range of actions he described when it came to their interfaces with algorithmic systems –

My son’s quite into kind of the same things all boys are, say, Ghostbusters and Power Rangers and stuff like that, and obviously on YouTube you … . can obviously watch the original episodes of Power Rangers … . He was watching YouTube … and then notice the next video kind of moved into some kids playing with some Nerf guns and then a couple of videos later, it started getting slightly more into kind of, you know, Modern Warfare.

Brett, father of a 4-year-old and a 1-year-old, Southampton, England

Brett is clear with me that his strategy involves both an acceptance that inappropriate content will be recommended to his child and an acceptance that he needs to monitor his child constantly. Unlike many of the parents we met previously, who had a range of misunderstandings around algorithmic shaping, what he says about personally needing to monitor YouTube recommendation algorithms showing inappropriate content after his son watched Ghostbusters material online is driven by clarity about how algorithms function. This is followed rapidly in conversation by him indicating that platforms, including YouTube, cannot be expected to take responsibility for algorithms, equating such notions with censorship and firmly placing the responsibility with the parent (see Das & Graefer, 2017). However, this individually proactive parental responsibility around algorithms gives way to clear expectations of institutions when Brett, later, in a conversation about algorithms in the future and the public domain, expresses disdain about “fat cats and tech bosses”. Such dichotomies are abundant in parents’ understandings and draw our attention to how these understandings and expectations are fluid, in flux, and often uneven across contexts and platforms.

Proactive understandings do not necessarily drive technical action alone. Audrey, a church learning coordinator raising three children between 2 and 18, notices that her 12-year-old and her nearly 18-year-old are recommended and suggested worldviews and beliefs. She tries proactively to initiate conversations with them on their quests for information, views and “their truth”. She is worried mainly about TikTok, but instead of mechanical blocks and filters, she is willing to speak to her children about the relationship between recommendations, the nature of newsfeeds and timelines (which she finds perplexing, in the broader context of our conversations) and her children’s searches for facts and views. Audrey says –

How kids get the truth … . That has been a worry. That’s been a constant conversation between us and the 12 year old and the nearly 18 year old because the 12 year old’s kind of using TikTok and stuff more and stuff’s coming to him … . Been suicides on TikTok and we’ve had to sit him down and have big conversations like that.

Audrey, mother of a 2-year-old, a 12-year-old and a 17-year-old, Southampton, England

Audrey spoke at length about the emotional implications of algorithmic filtering of news feeds for her own parenting journey, starting, at the outset of our conversation, with assumptions of a chronological timeline on social media platforms, and moving on, then, to anger and upset (Eslami et al., 2019) about how emotions and algorithmic shaping might lie in relationships of manipulative entanglement. Audrey’s actions differ from those of parents who speak first not of conversations but of technical blocks. However, they align in the sense that they never quite articulate the responsibilities of platforms or commercial enterprises to behave differently, placing the locus of being proactive on parents alone.

Felix, very different to any other parent in this work, introduced me to his deep distrust of platforms, his equating of all technology with the gravest of risks, and his resultant policy of a near-complete technology ban in his home. Felix presented a very clear account of algorithmic shaping and manipulation on platforms and then spoke to me about his conclusion that his children must not use devices at all, to search, share or anything more, developing his views, as ever, in relation to his own remembered childhood and past (cf. Livingstone & Blum-Ross, 2020) –

I grew up in a house where TV was the dominant factor, and I swore that I would never expose my children to that, so my children really have no access. … It’s at least about a year ago they had no access to digital devices whatsoever. There were no tablets in our house, no screens. They were allowed to have one hour of TV a week. …

Felix, father of an 11-year-old and a nine-year-old.

Felix’s actions, entirely different from those of any other parent I spoke to, sit perhaps at the extreme of parental action in relation to children’s real, perceived and anticipated interfaces with algorithms. The blanket ban their household has maintained so far may or may not work in the family’s future journey and, indeed, raises questions about what sort of parental action is apt and desirable in terms of children’s best interests in relation to algorithms.

Discussion

There are critical methodological lessons I gleaned from this work. Not only are algorithms and algorithmic shaping challenging to pinpoint, articulate and speak about during fieldwork, but parents’ understandings are also essentially inseparable from their actions concerning their children. Whilst parents might repeatedly claim a different stance towards their own data and their own actions on search engines, shopping portals, entertainment systems or social media platforms than towards their children’s data, in reality there are clear connections between the two. This is not just because of data overlaps and the ways in which family data intertwine between members of a household (see Kamleitner & Mitchell, 2019), but also because parents’ literacies (Barnes & Potter, 2021) and practices around technology in relation to themselves shape, inspire and determine what they are able to do in relation to their children.

Particular attention needs to be paid to the shaping influence of children’s ages and stages on parents’ approaches to and perspectives on platform algorithms in relation to their children. As we often saw, parents of toddlers located thinking about algorithms, datafication or personal data in general almost entirely as a future concern. Juxtaposed with the richness of the scholarship we have seen so far on datafication (Bucher, 2012; Van Dijck, 2014) and on pregnancy and early parenthood (see Barassi, 2019; Leaver, 2017), it is interesting to see these concerns manufactured as future worries rather than matters for the here and the now. Also, parents’ approaches sit across the categories I identified, depending on their purposes in engaging with the digital and, often, the kinds of platforms they are engaging with.

Future research in this area needs to continue to pay contextualised, in-depth attention to parents’ interpretations, understandings and approaches to emerging and ever newer algorithmic interfaces, in their own lives and their children’s – not solely, or at all, in relation to allocating blame to parents, but teasing out how their actions – what Cino (2022) calls dilemmatic entanglements – are contextually shaped, and in what ways their literacies might be supported, to protect the diverse best interests of parents, children and families. Thus, it is critically important that we go beyond technical understanding and listen to the stories (see Karizat et al., 2021; Swart, 2021) parents share. This is, of course, in recognition that parents’ perspectives and approaches, all of which have contexts and histories, go on to shape the environments in which decisions are made about children’s data and practices. But it is also perhaps in seeking to understand parents’ own contexts and lived experiences around algorithmic interfaces that we might be able to shift from blaming parents for being unreasonable or lacking sensibility around data, or even implying such, towards trying to make sense of parents’ own sense-making practices, and how these come to be.

Indeed, such a shift in the locus of inquiry, from potential parent blame to unpacking parents’ deeply contextualised sense-making, might lay critical groundwork for thinking about enhancing and developing parents’ own data, algorithm and AI literacies (see Barnes & Potter, 2021), not solely in their roles as users and citizens, but also, critically, in their roles as parents raising children in contemporary datafied societies which continue to be profoundly unequal. This means that attention to developing adult media literacies, including data and algorithm literacies, and ensuring that schools’ communication to parents and families about technology includes and focuses on the challenges of datafication, is of vital importance.

Acknowledgments

I thank the University of Surrey for the sabbatical leave which made this work possible. I thank the wonderful anonymous peer reviewers and JOCAM Editor Vikki Katz for their constructive feedback and support, and Paul Hodkinson for his comments on an earlier version of this manuscript. I thank the parents who contributed their time for this study and the Media Use Research Group at the University of Bergen, Norway, where Brita Ytre-Arne and her colleagues provided valuable feedback on my writing. My thanks also to those colleagues who have informally read drafts of the broader project this is part of, including Tereza Pavlickova, Ana Jorge, Francisca Porfirio and Ana Kubrusly.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The work was supported by the University of Surrey [FASS Sabbatical Scheme].

Notes on contributors

Ranjana Das

Ranjana Das is a Professor in Media and Communication, in the Department of Sociology, at the University of Surrey. Her research interests are technology users, user-centric algorithm studies, algorithmic literacies, datafication and families. Contact r (dot) das (at) surrey (dot) ac (dot) uk.

References

  • Alper, M. (2019). Future talk: Accounting for the technological and other future discourses in daily life. International Journal of Communication, 13, 715–735. https://ijoc.org/index.php/ijoc/article/view/9678/2558
  • Alper, M. (2023). Kids across the spectrums: Growing up autistic in the digital age. MIT Press.
  • Alper, M., Manganello, J., & Colvin, K. F. (2023). Parental mediation and problematic media use among US children with disabilities and their non-disabled siblings during the COVID-19 pandemic. Journal of Children and Media, 17(2), 1–9. https://doi.org/10.1080/17482798.2023.2180045
  • Bakardjieva, M. (2005). Internet society: The Internet in everyday life. SAGE Publications Ltd. https://doi.org/10.4135/9781446215616
  • Barassi, V. (2017). BabyVeillance? Expecting parents, online surveillance and the cultural specificity of pregnancy apps. Social Media + Society, 3(2), 2056305117707188. https://doi.org/10.1177/2056305117707188
  • Barassi, V. (2019). Datafied citizens in the age of coerced digital participation. Sociological Research Online, 24(3), 414–429. https://doi.org/10.1177/1360780419857734
  • Barnes, R., & Potter, A. (2021). Sharenting and parents’ digital literacy: An agenda for future research. Communication Research & Practice, 7(1), 6–20. https://doi.org/10.1080/22041451.2020.1847819
  • Blum-Ross, A., & Livingstone, S. (2017). “Sharenting,” parent blogging, and the boundaries of the digital self. Popular Communication, 15(2), 110–125. https://doi.org/10.1080/15405702.2016.1223300
  • Boyd, D., & Crawford, K. (2011, September). Six provocations for big data. In A decade in internet time: Symposium on the dynamics of the internet and society.
  • Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. https://doi.org/10.1177/1461444812440159
  • Campana, M., Van den Bossche, A., & Miller, B. (2020). # dadtribe: Performing sharenting labour to commercialise involved fatherhood. Journal of Macromarketing, 40(4), 475–491. https://doi.org/10.1177/0276146720933334
  • Cino, D. (2022). Beyond the surface: Sharenting as a source of family quandaries: Mapping parents’ social media Dilemmas. Western Journal of Communication, 86(1), 128–153. https://doi.org/10.1080/10570314.2021.2020891
  • Clark, L. S. (2011). Parental mediation theory for the digital age. Communication Theory, 21(4), 323–343. https://doi.org/10.1111/j.1468-2885.2011.01391.x
  • Cotter, K. (2022). Practical knowledge of algorithms: The case of BreadTube. New Media & Society. https://doi.org/10.1177/14614448221081802
  • Das, R. (2023). Algorithms in the public domain: Parents’ fears and expectations about invisible and super-visible children. Parenting for a Digital Future.
  • Das, R., & Graefer, A. (2017). Provocative screens: Offended audiences in Britain and Germany. Springer International Publishing. https://doi.org/10.1007/978-3-319-67907-5
  • Dogruel, L., Masur, P., & Joeckel, S. (2022). Development and validation of an algorithm literacy scale for internet users. Communication Methods and Measures, 16(2), 115–133.
  • Draper, N. A., & Turow, J. (2019). The corporate cultivation of digital resignation. New Media & Society, 21(8), 1824–1839. https://doi.org/10.1177/1461444819833331
  • Esfandiari, M., & Yao, J. (2022). Sharenting as a double-edged sword: Evidence from Iran. Information, Communication & Society, 1–19. https://doi.org/10.1080/1369118X.2022.2129268
  • Eslami, M., Vaccaro, K., Lee, M. K., Elazari Bar On, A., Gilbert, E., & Karahalios, K. (2019, May). User attitudes towards algorithmic opacity and transparency in online reviewing platforms. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–14).
  • Faircloth, C. (2013). Militant lactivism?: Attachment parenting and intensive motherhood in the UK and France (Vol. 24). Berghahn Books.
  • Goggin, G., & Ellis, K. (2020). Disability, communication, and life itself in the COVID-19 pandemic. Health Sociology Review, 29(2), 168–176. https://doi.org/10.1080/14461242.2020.1784020
  • Gran, A. B., Booth, P., & Bucher, T. (2021). To be or not to be algorithm aware: A question of a new digital divide? Information, Communication & Society, 24(12), 1779–1796.
  • Grondin, J. (2015). The hermeneutical circle. In N. Keane & C. Lawn (Eds.), A Companion to Hermeneutics (pp. 299–305). https://doi.org/10.1002/9781118529812
  • Gruber, J., Hargittai, E., Karaoglu, G., & Brombach, L. (2021). Algorithm awareness as an important internet skill: The case of voice assistants. International Journal of Communication, 15, 1770–1788. https://doi.org/10.5167/uzh-204503
  • Hasebrink, U., & Hepp, A. (2017). How to research cross-media practices? Investigating media repertoires and media ensembles. Convergence: The International Journal of Research into New Media Technologies, 23(4), 362–377. https://doi.org/10.1177/1354856517700384
  • Holiday, S., Norman, M. S., & Densley, R. L. (2022). Sharenting and the extended self: Self representation in parents’ Instagram presentations of their children. Popular Communication, 20(1), 1–15. https://doi.org/10.1080/15405702.2020.1744610
  • Iser, W. (1974). The reading process: A phenomenological approach. In New directions in literary history (pp. 125–145). Routledge. https://doi.org/10.4324/9781003247937-8
  • Joginder Singh, S., Mohd Azman, F. N. S., Sharma, S., & Razak, R. A. (2021). Malaysian parents’ perception of how screen time affects their children’s language. Journal of Children and Media, 15(4), 588–596. https://doi.org/10.1080/17482798.2021.1938620
  • Kamleitner, B., & Mitchell, V. (2019). Your data is my data: A framework for addressing interdependent privacy infringements. Journal of Public Policy & Marketing, 38(4), 433–450. https://doi.org/10.1177/0743915619858924
  • Karizat, N., Delmonaco, D., Eslami, M., & Andalibi, N. (2021). Algorithmic folk theories and identity: How TikTok users co-produce knowledge of identity and engage in algorithmic resistance. Proceedings of the ACM on Human-Interaction, 5(CSCW2), 1–44. https://doi.org/10.1145/3476046
  • Katz, V., & Rideout, V. (2021). Learning at home while under-connected: Lower-income families during the COVID-19 Pandemic. New America. https://files.eric.ed.gov/fulltext/ED615616.pdf
  • Leaver, T. (2017). Intimate surveillance: Normalising parental monitoring and mediation of infants online. Social Media + Society, 3(2), 2056305117707192. https://doi.org/10.1177/2056305117707192
  • Lee, E., Bristow, J., Faircloth, C., & Macvarish, J. (2015). Parenting culture studies. Springer.
  • Lim, S. S. (2019). Transcendent parenting: Raising children in the digital age. Oxford University Press. https://doi.org/10.1093/oso/9780190088989.001.0001
  • Livingstone, S., & Blum-Ross, A. (2020). Parenting for a digital future: How hopes and fears about technology shape children’s lives. Oxford University Press.
  • Lomborg, S., & Kapsch, P. H. (2020). Decoding algorithms. Media, Culture & Society, 42(5), 745–761. https://doi.org/10.1177/0163443719855301
  • Lomborg, S., Kaun, A., & Scott Hansen, S. (2023). Automated decision-making: Toward a people-centred approach. Sociology Compass, e13097. https://doi.org/10.1111/soc4.13097
  • Lupton, D. (2020). Caring dataveillance: Women’s use of apps to monitor pregnancy and children. In The Routledge companion to digital media and children (pp. 393–402). Routledge.
  • Mascheroni, G. (2020). Datafied childhoods: Contextualising datafication in everyday life. Current Sociology, 68(6), 798–813. https://doi.org/10.1177/0011392118807534
  • Mascheroni, G., Cino, D., Amadori, G., & Zaffaroni, L. G. (2023). (Non-)sharing as a form of maternal care? The ambiguous meanings of sharenting for mothers of 0-to-8-year-old children. Italian Sociological Review, 13(1), 111–130.
  • Plunkett, L. A. (2019). Sharenthood: Why we should think before we talk about our kids online. The MIT Press. https://doi.org/10.7551/mitpress/11756.001.0001
  • Schlesinger, M. A., Flynn, R. M., & Richert, R. A. (2019). Do parents care about TV? how parent factors mediate US children’s media exposure and receptive vocabulary. Journal of Children and Media, 13(4), 395–414. https://doi.org/10.1080/17482798.2019.1627227
  • Siibak, A., & Traks, K. (2019). The dark sides of sharenting. Catalan Journal of Communication & Cultural Studies, 11(1), 115–121. https://doi.org/10.1386/cjcs.11.1.115_1
  • Siles, I. (2023). Living with algorithms: Agency and user culture in Costa Rica. MIT Press.
  • Siles, I., Gómez-Cruz, E., & Ricaurte, P. (2022). Toward a popular theory of algorithms. Popular Communication, 21(1), 1–14. https://doi.org/10.1080/15405702.2022.2103140
  • Staes, L., Walrave, M., & Hallam, L. (2023). Grandsharenting: How grandparents in Belgium negotiate the sharing of personal information related to their grandchildren and engage in privacy management strategies on Facebook. Journal of Children and Media, 17(2), 1–27. https://doi.org/10.1080/17482798.2023.2177318
  • Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society, 7(2), 20563051211008828. https://doi.org/10.1177/20563051211008828
  • van der Nagel, E. (2018). ‘Networks that work too well’: Intervening in algorithmic connections. Media International Australia, 168(1), 81–92. https://doi.org/10.1177/1329878X18783002
  • Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. https://doi.org/10.24908/ss.v12i2.4776
  • Warren, R. (2003). Parental mediation of preschool children’s television viewing. Journal of Broadcasting & Electronic Media, 47(3), 394–417. https://doi.org/10.1207/s15506878jobem4703_5
  • Willson, M. (2019). Algorithms (and the) everyday. In The Social Power of Algorithms (pp. 137–150). Routledge.
  • Wilson, T. (2018). Consumption, psychology and practice theories: A hermeneutic perspective. Routledge.
  • Ytre-Arne, B. (2023). The politics of media use in digital everyday life. In Media use in digital everyday life (pp. 69–77). Emerald Publishing Limited.