Research Article

“Those blimmin Ts and Cs”: a mixed methods analysis of how people manage personal information, privacy, and impressions

Received 01 Nov 2022, Accepted 26 Feb 2024, Published online: 21 Mar 2024

ABSTRACT

Interconnected and smart technologies complicate personal information management (PIM) because users delegate the storing, organizing, and retrieving of personal information to smart and mobile service providers. Meta-level PIM activities are required to maintain the privacy and security of personal information. This study provides insights into how users of location tracking, mobile apps, and smart home technologies perceive PIM and privacy. We turn to the privacy as contextual integrity (CI) and impression management (IM) literatures to explore informational norms and interpersonal dynamics in PIM. This study is based on a mixed methods design to analyze focus groups and interviews with 106 British and Dutch respondents. Combining unsupervised Latent Dirichlet allocation (LDA) topic modeling and thematic analysis, we reveal discursive patterns in respondent accounts of technology use and provide an in-depth interpretation of these patterns. Our findings indicate that PIM practices are associated with the perceived appropriateness of information flows, anthropomorphic interpretations of technologies, and interpersonal surveillance. Thus, impressions are managed toward social actors as well as technology providers. We contribute to PIM research with a demonstration of how PIM in mobile, smart, and location-based technology use cannot be separated from contextual factors and strategies to manage impressions of habits and behaviors.

1. Introduction

Interconnected and smart technologies complicate personal information management (PIM) practices. At the basis of PIM strategies lies an understanding of how personal information is stored, processed, and shared (Bergman et al., Citation2004). When using mobile and smart technologies, users place information management in the hands of service providers, device manufacturers, app developers, and third parties. As such, users have only limited insights into and control over what happens to their personal information. There are many privacy implications to be considered in the ever-increasing use of personal information by digital devices and applications (Beldad & Citra Kusumadewi, Citation2015; Havelka, Citation2021; Liao et al., Citation2019). Not only can unapproved third parties access personal information, but users also often lack control over their personal information. However, the fact that these risks exist does not necessarily mean that they impact PIM strategies.

This study provides insights into everyday PIM and privacy perceptions. While many studies focus on privacy perceptions around a particular (type of) technology (such as smart speakers, Lutz & Newlands, Citation2021 or social networks, Ayalon & Toch, Citation2017), this study includes three different technologies: location tracking tools, mobile apps, and smart home technologies. Together, these technologies encompass the processing of different information types at stake in privacy considerations (such as text-based posts and messages, photos, videos, browsing history, e-mails, location data, social media posts, phone call log data, and physical activity; Vitak et al., Citation2022). We consider different contexts of use connected to intimate homes, communication, entertainment, and mobility. Aiming to contextualize PIM practices, we build on Nissenbaum’s (Citation2004) theory of privacy as contextual integrity (CI) to answer the question: How do different technologies and contexts condition people’s perceptions of privacy across personal information flows?

To explore perceptions of PIM and privacy in different contexts, we conducted focus groups and interviews with 106 British and Dutch respondents. Through a mixed-methods research design, the interview and focus group transcripts were analyzed with unsupervised Latent Dirichlet allocation (LDA) topic modeling and qualitative thematic analysis. This approach allows us to simultaneously reveal discursive patterns in respondent accounts of technology use and to provide in-depth interpretation of these patterns.

Respondents personalize personal information processing and privacy considerations by reflecting on the impressions they make on technologies and technology providers. Indeed, during our interactions with respondents, we realized that interpersonal dynamics were a key aspect of how they understood privacy. Therefore, we make use of impression management (IM) literature (Goffman, Citation1949; Kilic et al., Citation2022; Rosenberg & Egbert, Citation2011) to provide insights into the (inter)personal dynamics of PIM practices.

The results section shows how respondents engage in meta-level PIM practices whereby they hand over the management of their personal information to the providers of these services. These practices are associated with the management of appropriate flows of information (characterized by a lack of control) as well as the management of impressions toward devices and platforms. In this process, users default to anthropomorphism when they approach their devices as if they are human beings who hold opinions about them. Moreover, they are explicitly concerned about interpersonal relations in their everyday technology use. Personal information, privacy, and impression management practices are thus associated with concerns around commercial and interpersonal surveillance in location tracking, smart device security, and nontransparent user consent procedures. By building on CI and IM, we contribute to PIM research with a demonstration of how PIM in mobile, smart, and location-based technology use cannot be separated from contextual factors and strategies to manage impressions of habits and behaviors.

2. Theoretical framework

2.1. Personal Information Management (PIM)

Information is personal to someone when it is controlled/owned by that person, about that person, directed toward that person, sent or shared by that person, generated through experiences of that person, or potentially useful for that person (Jones et al., Citation2018). Whittaker (Citation2011) describes PIM as a curation process that entails keeping, managing, and exploiting familiar information. PIM practices can entail digital information as well as paper documents and can range from a birthday card to a social media post, an e-mail attachment, or a digital picture. These are different information items or forms: particular packages of information that can be managed (e.g., created, stored, named, moved, copied, shared, or deleted). Information items are not set in stone; they can be transformed. PIM practices establish, use, and maintain a connection between information and a particular need (Jones et al., Citation2018). For instance, people save a friend’s phone number and retrieve it when they need to reach out to plan a social gathering, or they store location data to monitor their sports progress.

Jones et al. (Citation2018) distinguish between information finding (searching for particular information in public or private spaces), information keeping (such as filing or archiving textual information or images), and meta-level practices (efforts to manage information). Meta-level practices can include organizing information, measuring and evaluating information, and managing privacy, security, and information flows (Jones et al., Citation2018). In this study, we focus on meta-level PIM practices related to mobile, smart, and location-based technologies. We do this because the use of AI-driven technologies and data-driven services has made the management of personal information highly complex. Smart, mobile, and location-based technologies manage information on behalf of a user in automated ways. This can be information actively provided by the user, but also information derived from their behavior or compiled on the basis of various information flows. Smart and mobile technologies thus take away individuals’ control over their personal information.

Whether such complex forms of information management are appropriate or problematic is context-dependent. Jones et al. (Citation2018, p. 3585) refer to Shannon’s theory of communication, which indicates that “the value of information is not absolute, but relative to a context that includes the intentions of the sender, method of delivery, and the current state of a recipient’s knowledge.” Huvila et al. (Citation2014, para. 2) posit that “all PIM activities are contextual, and the role and contexts of particular pieces of personal information change in time.” For example, students were more active in PIM in an academic context than in a health context (Syn et al., Citation2020). To effectively address these contextual factors of PIM and processing, we turn to privacy as CI.

2.2. Privacy as Contextual Integrity (CI)

PIM practices are interconnected with normative considerations. There is no element of human lives that is “not governed by [context-specific] norms of information flow” (Nissenbaum, Citation2019, p. 119). Religious, moral, cultural, and other norms influence the way people behave and act in their daily lives. Nissenbaum’s (Citation2004) framework of privacy as CI supports the understanding of how norms around information management differ across social contexts.

CI focuses on flows of information and on the norms that determine their appropriateness. Norms of appropriateness are shaped by four parameters. The first parameter is context: structured, differentiated spheres in social life such as family homes, schools, workplaces, and hospitals. In other words, the context in which information management happens informs which set of norms governs interactions (Vitak & Zimmer, Citation2020). For instance, a patient consultation in a doctor’s office forms a different normative context than the home where personal information is processed by smart technologies.

Second, actors concern the senders, data subjects, and recipients of information. The distinctions between these groups are blurry. Whereas in this study the senders are users of location tracking, mobile, and smart technologies, the technologies and devices themselves may also be considered senders when they automatically share information with other users, technology platforms, or third parties (the recipients). In turn, data subjects can be technology users as well as others, such as children whose locations are monitored by their parents or family members in smart homes.

Third, attributes, or information types, describe the nature of the information in question. For example, in the context of this study, these can encompass location data, mobile phone log data, recordings (of prompts as well as behavior) in the home, and related metadata.

Finally, transmission principles constrain the flow of information, for instance, doctor-patient confidentiality (Vitak & Zimmer, Citation2020). In the context of this study, transmission principles can be the user consent procedures required for the use of location tracking, smart home services, and mobile apps. Transmission principles shape how personal information flows are processed and managed (Nissenbaum, Citation2009).

These parameters promote an understanding of the many ways that personal information flows and is managed, thus challenging the assumption, implied by many legal frameworks, that the legitimacy of information flows rests solely on consent (Nissenbaum, Citation2019). Furthermore, the legitimacy of information management should rest not solely on one parameter (the data subject’s consent at one point of data collection) but on the aggregated identification and appropriateness of all four parameters.

This study focuses on location tracking, mobile apps, and smart home technologies, three types of technologies functioning (partly) on the basis of algorithms. According to Nissenbaum (Citation2019), algorithms pose new challenges for CI because they process various data streams that in amalgamation might trigger privacy norms that individual flows of data do not. Aggregation and derivations of personal information can pose novel challenges for PIM because many algorithmic processes take place in nontransparent forms outside the scope of control of individual users (Nissenbaum, Citation2019). Whereas PIM focuses on practices, procedures, and systems for managing data pertaining to one’s personal life, CI focuses on the norms and values that govern the flow of that information. PIM practices, procedures, and systems must align with the expected norms and values of particular contexts.

2.3. Privacy as a form of Impression Management (IM)

The focus of this paper resides in personal devices and personal information flows. Accordingly, an understanding of appropriate flows of information under a CI framework requires an account of interpersonal expectations and practices regarding PIM. While device usage in personal contexts does not necessarily exclude corporate or state actors who may have effective or potential access to device data, we argue that interpersonal dynamics, particularly impression management, are especially relevant. In the context of a home, for instance, much of the generated data is interpersonal in nature (Kilic et al., Citation2022).

Key notions of IM can be traced back to the idea that personal interactions follow a theater-like dynamic of information disclosure and probing, aiming to provoke favorable impressions and avoid embarrassment (Goffman, Citation1949). The idea of the backstage and frontstage is at the forefront of Goffman’s work, with the backstage closely relating to private settings, where IM is no longer a concern. Therefore, while informal in most personal interaction settings, the boundary between the frontstage and the backstage is a key component of appropriate flows of information. Personal technologies could be a threat to IM mechanisms such as politeness (Brown et al., Citation1987).

While IM theory has been commonly employed to study online and offline interactions in the context of organizational communication (Bolino et al., Citation2016), its roots in interpersonal behavior also make it flexible enough to study the different technologies approached in our study. We also align with IM’s sociological origins and understand it as a necessary and encompassing aspect of social interaction.

Rosenberg and Egbert (Citation2011) translate IM theory into an online context, identifying self-presentation tactics of self-promotion, manipulation, damage control, and role-modeling. Location sharing functions of social media can also be utilized to manage online impressions and to come across as a likeable, cool, socially desirable, and pleasant person (Beldad & Citra Kusumadewi, Citation2015; Chen & Ha, Citation2019). Kilic et al. (Citation2022) find that IM and the fear of giving the wrong impression are among the main concerns in a smart home setting. Considering the context of our study, these are key aspects of how people may understand information flows. A parent going to a McDonald’s restaurant for lunch might be less worried about the impression that Google might have of them (when they receive location-based personalized fast-food ads) than about the impression their children, partners, or colleagues might have of them as a healthy-eating role model when they access this information.

The implications of personal device usage for IM are twofold. First, individuals are more prone to relax their impression management behavior when interacting with devices such as virtual assistants (Lucas et al., Citation2014). This hints that understandings of personal information flows may also shift according to the technology employed. Second, devices and technologies themselves can be used as tactics, or masks, in IM. An exercise tracking app with location tracking may be used to convey impressions of sportiness, healthy lifestyles, or sustainability in an interpersonal context.

2.4. Privacy perceptions in context

Many studies focus on privacy perceptions around a particular (type of) technology (e.g., Ayalon & Toch, Citation2017; Lutz & Newlands, Citation2021). To provide insights into how PIM practices are contextualized by the use of different technologies, this study focuses on smart home technologies, location tracking tools, and mobile apps. This way, we aim to find out whether particular technologies invoke different PIM, CI, and IM considerations for users. By including location-based, mobile, and smart technologies, we can study considerations across personal use contexts. These three types of technologies mainly differ in their purpose of use: smart home technologies are used to control the home, and location tracking tools are used to share and monitor locations. Whereas smart home control and location tracking can also be done through mobile apps, this research differentiates mobile apps as a separate technology because of their multipurpose use and distinct privacy aspects. Recent studies identify different privacy aspects and perceptions for the three types of technologies.

In the context of location tracking, this research focuses on users tracking their own location for personal use (e.g., through fitness trackers) or sharing their location with others to allow interpersonal monitoring through social media or location tracking apps like Apple’s “Find My” or Life360. In light of such (inter)personal purposes, privacy concerns can affect long-term location sharing practices, whereby privacy perceptions revolve around trust in institutional actors as well as social media users (Beldad & Citra Kusumadewi, Citation2015; Chen & Ha, Citation2019). Moreover, location tracking by parents involves privacy considerations on an interpersonal and a commercial level. Parents actively negotiate the privacy of their children (Mols et al., Citation2023; Widmer & Albrechtslund, Citation2021). Some parents’ privacy concerns focus on location tracking apps limiting children’s privacy and autonomy, whereas others feel that privacy is something to be earned, or worry primarily about third parties accessing location data (Sukk & Siibak, Citation2021).

Mobile apps offer interaction, practical support, shopping, finance, and entertainment, among other functions. The multidimensional nature of mobile apps entails several types of information being collected, processed, and shared by a variety of actors. Privacy considerations and PIM are geared toward commercial platforms, mobile technology providers, and digital infrastructure owners. Research shows how privacy perceptions differ among technology users. For instance, American and German students display awareness of personal information processing and privacy risks, but their attitudes range from valuing the benefits of personalization, to accepting privacy risks, to apathetic attitudes (Havelka, Citation2021). The CI parameters have been used to gather contextualized insights into privacy perceptions (Vitak et al., Citation2022). That study indicates that American and Dutch respondents were concerned about actors like HR departments and online data brokers using their data, were mixed about local governments and law enforcement, and seemed unconcerned about doctors and social media. Regarding data attributes, the respondents were less concerned about social media posts, phone location data, and physical activities and more concerned about text messages, photo and video posts, phone call log data, and e-mails (Vitak et al., Citation2022).

Smart home technologies provide convenience, information, and amusement. Privacy attitudes in the context of smart homes are often studied in relation to smart speakers. Smart speaker privacy concerns among Dutch university personnel revolve around surveillance, security, and platforms. They were concerned about a lack of transparency around and control over data collection, processing, and sharing by smart speakers (Mols et al., Citation2022). In turn, American research indicates that privacy concerns form a barrier to smart speaker adoption (Liao et al., Citation2019). Connecting smart speakers to CI privacy norms, Abdi et al. (Citation2019) found that privacy norms are context-specific and depend on the type of information (e.g., banking is considered less appropriate than playlists) and recipient (technology providers/platforms and law enforcement are deemed more appropriate than ad agencies). Moreover, the (lack of clear) transmission principles matter in privacy perceptions of smart home technologies (Abdi et al., Citation2019).

While location tracking tools, mobile apps, and smart home technologies are not mutually exclusive and there can be some overlap in use, they emerged in our data as salient and delimited types. The analysis indicates that our participants shaped their concerns according to these terms or related ones (meaning that we are using the participants’ own conceptualization and delimitation of technologies rather than imposing an a priori categorization) and that the motivation for adoption and use of location tracking differs from that for smart home control and mobile apps. CI helps us to carve out which factors (contexts, attributes, actors, or transmission principles) impact PIM practices and IM behavior.

3. Materials and methods

3.1. Data sampling and collection procedures

To explore perceptions of personal information privacy in different contexts, we conducted focus groups and interviews with 106 British and Dutch respondents aged between 23 and 60, between 2018 and 2021 (see supplementary materials for an overview of the respondents). British and Dutch mobile technology users reside in comparable contexts as the technologies available in the U.K. and the Netherlands are relatively similar, and the countries are similar in how they score on indulgence, power distance, and individualism on Hofstede’s cultural dimensions (Hofstede Insights, Citationn.d.) – factors that can play a role in technology use and privacy norms. Focus groups were conducted to explore perceptions and to gather opinions about different technologies. Focus groups enable researchers to observe how, through interaction, the participants’ views are established, stabilized, debated, and challenged (Stewart & Shamdasani, Citation2007). Group interaction was used “to produce data and insights that would be less accessible without the interaction found in a group” (Morgan, Citation1997, p. 12).

Whereas the focus groups allowed us to gather comprehensive insights into perceptions and normative assessments of privacy, information processing, information management, and technology use, we also wanted to collect information about everyday situated use of such technologies. Therefore, we conducted semi-structured interviews in nine families. We used intensive interviewing, a form of interviewing guided by open-ended, non-judgmental questions which “permits an in-depth exploration of a particular topic or experience and, thus, is a useful method for interpretive inquiry” (Charmaz, Citation2014, p. 24). The combination of focus groups and interviews allowed us to reach more in-depth and thick descriptions of perceptions of technology use and information flows (Lambert & Loiselle, Citation2008).

Participants were recruited through an agency approved by the authors’ institutions. The participants were recruited based on convenience of location, with an equitable distribution amongst genders to ensure the collection of balanced data. In line with the ethics guidelines and approval process at the authors’ institutions, to promote ethical and sustainable consumption and to reward participants for their time, all participants received a gift voucher to spend at an (online) organization of their choice.

Vignettes were used within focus groups to prompt respondents to think about technologies being used in different contexts in their daily lives. These scenarios and activities were formed around smart home technologies, location tracking, and mobile apps with the purpose of prompting the respondents to reflect on the use of these technologies and their perceptions of privacy concerns and risks. Vignettes help uncover the way participants socially construct risks and how the interplay of arguments unfolds. Vignettes have been widely used by researchers aiming to examine the complexity of social aspects (Barter & Renold, Citation2000), such as the convoluted nature of people’s interactions with the technologies focused on here.

For this study, we set up different vignettes for the three technologies. For location tracking, we asked the participants to respond to several scenarios about people tracking and sharing their location in their everyday life (e.g., during the work-home commute, during a run). For example: “Kris, a bank clerk, bikes daily to work. Her commute takes approximately 40 minutes. Whenever she works late, or is not home when she is usually expected, her husband checks her location through the “Find My Friends” app. Kris’ friends think there may be a lack of trust in their relationship, whereas Kris and her husband feel safer knowing each other’s whereabouts.” Participants were asked to reflect on this and other scenarios (with a variety of use purposes, characters, and considerations toward trust to allow for a variety of responses) and discuss what they did and did not find appropriate uses of information.

Because mobile app use for a variety of purposes, like entertainment and communication, is widespread, we asked participants to reflect on the use of these technologies in their daily lives by prompting them to consider scenarios such as sharing Spotify lists with friends, using WhatsApp to communicate with colleagues, and accepting the terms of service of a social media app that requires access to location data and other apps.

Because smart home technologies were relatively new around the time the first round of focus groups took place, we showed participants a video of a family using a Google Home device during their morning routine (Peek of the Net, Citation2017). The respondents were then invited to interact with a Google Home device that was installed in the focus group space and was connected to a lamp and a smart TV. We provided example prompts, such as “OK Google, what is the weather forecast,” and “OK Google, switch on the lamp” and respondents came up with their own prompts. In the latter two rounds of interviews and focus groups, we relied on scenarios based on the aforementioned commercial and on examples of use (like switching on the thermostat while still at work, connecting smart speakers to streaming services, setting up a smart home with smart lights and a smart doorbell, etc.). The respondents were asked to reflect on appropriate and inappropriate uses and whether these are integrated in their own life or, if not, whether they would consider such uses themselves.

The interview and focus group respondents were briefed on the study, signed consent forms, and pseudonyms were used. All focus groups took place at the authors’ universities and lasted for approximately one hour. Prior to the start of the discussion, the authors, in the role of moderator, introduced themselves and the study. Most interviews, in contrast, took place at the respondent’s home to facilitate open conversations in a comfortable setting. Due to COVID-19 restrictions, some of the interviews were conducted online through Zoom and Google Meet.

From the focus group and interview transcripts, we selected 188 text fragments in which the respondents discussed privacy, PIM, and user practices related to the three technologies. These text fragments were analyzed quantitatively by means of topic modeling to identify recurring topics, and qualitatively via an abductive thematic analysis to explore perceptions of privacy through the lens of CI.

3.2. Topic modeling

To gather insights into salient aspects of the data with minimal human intervention, we applied unsupervised topic modeling using Latent Dirichlet Allocation (LDA) with the Python gensim package. We chose to conduct topic modeling rather than qualitative inductive coding for two main reasons: (1) we wanted our approach to be replicable and scalable for other researchers working on these topics, thus making a methodological contribution; (2) given that our research goal focuses on perceptions, we wanted our analysis to emerge as much as possible from participant accounts. While qualitative inductive analysis could also be used, we opted for topic modeling given its effectiveness in reducing the amount of researcher involvement and interpretation that could condition the identified perceptions. Each interview snippet related to privacy was treated as a document, for a total of 188 instances. Words with a frequency below 5, as well as words present in more than 20% of documents, were excluded from modeling. Five topics initially emerged from the data; however, after reliability checks with two researchers external to the research team, only the topics corresponding to Surveillance, Security, and User Consent were deemed reliable enough to extract conclusions. The word distributions for each of the topics can be found below. The full topic model is available for inspection and use.Footnote1

Surveillance: 0.008*“everyth” + 0.007*“data” + 0.007*“sound” + 0.006*“privat” + 0.006*“servic” + 0.006*“happen” + 0.006*“monitor” + 0.006*“facebook” + 0.005*“turn” + 0.005*“privac”

Security: 0.008*“need” + 0.007*“sometim” + 0.007*“hack” + 0.007*“data” + 0.007*“becom” + 0.006*“happen” + 0.006*“easier” + 0.005*“perhap” + 0.005*“anymor” + 0.005*“trust”

User consent: 0.009*“permiss” + 0.008*“data” + 0.008*“turn” + 0.007*“listen” + 0.007*“condit” + 0.007*“mayb” + 0.007*“whatev” + 0.006*“locat” + 0.006*“share” + 0.006*“inform”
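The vocabulary filtering rule described above (excluding words that occur fewer than 5 times in the corpus or that appear in more than 20% of documents) can be sketched in plain Python. The function name and the toy corpus below are our own illustration, not code from the study, which used the gensim package:

```python
from collections import Counter

def build_vocabulary(docs, min_freq=5, max_doc_share=0.2):
    """Keep only tokens that occur at least `min_freq` times in the corpus
    and that appear in no more than `max_doc_share` of the documents."""
    term_freq = Counter(tok for doc in docs for tok in doc)
    doc_freq = Counter(tok for doc in docs for tok in set(doc))
    n_docs = len(docs)
    return {
        tok for tok, tf in term_freq.items()
        if tf >= min_freq and doc_freq[tok] / n_docs <= max_doc_share
    }

# Toy example with stemmed tokens: "data" appears in every document
# (dropped as too common), "rare" occurs only twice (dropped as too
# infrequent), "privaci" occurs 5 times across 2 of 10 documents (kept).
snippets = (
    [["data", "privaci", "privaci", "privaci"],
     ["data", "privaci", "privaci", "rare"],
     ["data", "rare"]] + [["data"]] * 7
)
print(sorted(build_vocabulary(snippets)))  # ['privaci']
```

In gensim, the equivalent step is `Dictionary.filter_extremes(no_below=5, no_above=0.2)` on the token dictionary before fitting `LdaModel` on the bag-of-words corpus.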

3.3. Qualitative thematic analysis

To explore privacy perceptions related to the use of the three types of technologies, we identified overarching themes in the data via a thematic analysis (Braun & Clarke, Citation2006). The analysis took an abductive approach (Thompson, Citation2022) that was neither fully data- nor fully theory-driven. Privacy as CI (Nissenbaum, Citation2004) allows for a focused analysis because it offers a means to analyze how people evaluate information flows across contexts. To provide insights into how respondents perceive privacy and PIM in the context of the three different technologies, CI and the three topics identified through topic modeling formed the basis of the coding. More specifically, we created base codes informed by the results of the topic modeling (security, user consent, and surveillance) and the CI parameters (context, actors, attributes, and transmission principles). These seven concepts were combined with the three technologies to form 21 base codes (e.g., the parameter context in relation to mobile apps formed the base code Context_MA). Based on the in-depth analysis of the transcripts, we added data-driven signifiers to establish meaningful codes (e.g., a comment related to the attributes of smart home technologies was labeled as Attributes_SH_sensitive financial info). The codebook with a selection of codes to illustrate this process is included in the supplementary material. This coding process was iterative, meaning that many codes were applied to multiple text segments and codes were adjusted during the coding to ensure that they remained representative of the content and not overlapping. Two authors analyzed each transcript to corroborate the codes. The analysis resulted in 165 open codes. In the results section, we provide an in-depth account of how the concerns of the respondents are connected to specific CI parameters, and how, in turn, their practices are motivated by PIM goals and IM strategies.
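The 21 base codes are a simple cross-product of the seven concepts and the three technologies. The sketch below illustrates this construction; the abbreviations MA and SH are taken from the paper's own examples (Context_MA, Attributes_SH), while "LT" for location tracking and the exact concept labels are our assumptions:

```python
# Three topic-model topics plus four CI parameters give seven concepts.
concepts = ["Surveillance", "Security", "UserConsent",
            "Context", "Actors", "Attributes", "TransmissionPrinciples"]

# "MA" and "SH" appear in the paper's examples; "LT" is our assumed label.
technologies = ["MA", "SH", "LT"]

base_codes = [f"{concept}_{tech}" for concept in concepts for tech in technologies]
print(len(base_codes))             # 21
print("Context_MA" in base_codes)  # True
```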

4. Results

4.1. Topic modelling

The topic model described above was applied to the corpus in order to predict the probability that each topic was present in a snippet. To examine differences across technologies, we examined the average salience of each topic, as shown in Figure 1.

Figure 1. Mean presence of topics per technology type.

We tested for statistically significant differences in average topic prevalence across categories with one-way ANOVA: Surveillance, F(2, 185) = 0.33, p = .177; Security, F(2, 185) = 0.04, p = .796; User consent, F(2, 185) = 0.05, p = .804. None of the differences were statistically significant at the p < .05 level, which is to be expected given the small sample size and the large variance in the topic prediction values. Therefore, mean comparisons should be interpreted as descriptive of the sample and as providing context for the thematic analysis, not as generalizable differences. In addition, the presence of topics may also be influenced by the overall discussion, limiting generalizability even further.
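One-way ANOVAs like those reported above are typically run with a statistics package (e.g., scipy.stats.f_oneway), but the computation itself can be sketched with the standard library only. The per-snippet prevalence values below are synthetic placeholders, not the study's data:

```python
from statistics import mean

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: mean square between groups
    divided by mean square within groups."""
    all_vals = [v for g in groups for v in g]
    grand_mean = mean(all_vals)
    k, n = len(groups), len(all_vals)
    # Variation of group means around the grand mean, weighted by group size
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    # Variation of individual values around their own group mean
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Synthetic per-snippet prevalence values for one topic (NOT the study's data)
location_tracking = [0.30, 0.25, 0.29]
mobile_apps = [0.20, 0.22, 0.18]
smart_home = [0.19, 0.21, 0.23]
f_stat = one_way_anova_f([location_tracking, mobile_apps, smart_home])
```

With the study's degrees of freedom (2, 185), the p value for such an F statistic would then be read from the F distribution (e.g., via scipy.stats.f.sf).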

Looking at the data, the surveillance topic appears to be especially salient for location tracking (M = .28), with the prevalence of this topic being lower for mobile apps and smart homes. This means that, when discussing location tracking, respondents in our studies were more likely to resort to vocabulary related to the surveillance topic, which may indicate that this is a key concern for this type of technology. From a PIM point of view, this suggests that location raises particular concerns among our respondents regarding who accesses the information, meaning that location tracking requires different meta-level PIM strategies than mobile apps or smart home technologies.

In contrast to these sizable differences, asymmetries in the remaining topics were subtler. While the topic of user consent was prevalent overall in our sample (M = .25), its presence was consistent across the three technologies, possibly hinting that vocabulary related to consent is ubiquitous when discussing privacy, regardless of technologies and applications. Finally, security was less discussed within the scope of location tracking than in the remaining technologies. This may be seen as a surprising finding, given the risks that are usually associated with location data (e.g., stalking). However, the difference in means is quite small and, therefore, one should be cautious in extrapolating conclusions from these numbers.

4.2. Qualitative thematic analysis

The three types of technologies are embedded in personal contexts. The following sections provide an in-depth exploration of how respondents' discussions of surveillance, security, and user consent are connected to the CI parameters actors, attributes, and transmission principles, respectively. PIM and IM come to the fore as motivational factors in how users deal with concerns around these topics.

4.2.1. Surveillance

The quantitative analysis indicates that surveillance is a more prevalent topic in discussions about location tracking than in those about mobile apps and smart homes. In turn, the qualitative analysis shows that the concerns voiced around surveillance are connected to specific actors that take part in information flows. In smart home contexts, actors are personalized as they are identified by product names instead of their parent companies (e.g., Alexa instead of Amazon or Siri instead of Apple). When referring to these actors, only a few respondents voice surveillance-related concerns, and if they do, these relate to commercial actors and devices listening in.

A connection can be drawn here to how actors are described in the context of mobile apps. These are mainly digital platforms, social media services, and other technology companies, which users are concerned may be eavesdropping. Jessica mentions that she disabled the smart assistant on her phone because of such concerns:

Well sometimes with colleagues or friends, when you are among others, then you suddenly hear: “I didn’t understand you, can you repeat that?” I actually think that it is a very scary idea that he [smart assistant] actually listens all day, even though, and you don’t know what happens, you don’t know what Android does with your data. (Jessica)

When respondents talk about technologies and platforms, they not only refer to product names, but also often discuss them as “he” or “she.” These are all signs of anthropomorphism, a tendency to ascribe human characteristics to non-human actors, which often happens in the use of smart voice assistants (Seymour & Van Kleek, Citation2021). This sometimes happens in a joking manner, for instance, George states: “It’s super cool that you, at home, you say, ‘Good night, Google’ and everything is arranged” to which Fiona responds: “And who is Google? Your imaginary friend at home?”

In this study, anthropomorphism extends beyond smart home technologies because mobile services and platforms are also often described as persons in a more incidental or casual manner. When talking about mobile apps, Jessica for instance states: “You never know what Android does with your data.” This shows how, apart from smart technologies listening in, respondents are also concerned about the processing of personal information by mobile app providers. Jay for instance says: “Those large companies such as Google and Facebook who know everything from us, you could almost say.” It seems to be the case that, on the one hand, respondents fear that surveillance by commercial actors will pervade their personal sphere through devices listening in on them. On the other hand, concerns are raised about commercial data collection and data processing practices. Respondents hand over their personal information to commercial actors, thereby losing control over it. Privacy concerns are in such cases caused by a lack of control over personal information.

Surveillance is the most prevalent topic for location tracking tools, where respondents tend to identify actors as people, mainly friends and family. In an interview, Nadia describes how she makes use of location tracking because “It’s handy that I know when she [daughter] comes home. If I’m cooking and she comes home and then I can set the plates, you know, just, it’s just very practical.” Nadia also describes how her daughter monitors her location: “She said to me: ‘Hey, were you at Sonia’s last night?’ So, I said: ‘Yes, how so?’ She responded: ‘I saw it.’ So, Sonia is the mother of her friend, so she knows where she lives and saw in Live360 that I was there.” Nadia and her daughter engage in interpersonal surveillance. Nadia refers to her daughter as an actor who monitors information flows. Barbara describes that she shares her location with her two best friends and that she once checked their driving speed: “Actually, one of them was driving faster than they should have been. And it’s just, it’s literally a joke. So, I WhatsApp her: ‘Slow down!’ ha.” Libby responds to this anecdote with a similar experience:

So, my son doesn’t drive because he’s got learning difficulties, but his friends do. He has a journey from college to go home when I’m at work, so he goes from college to home and then he gets the bus or he walks or he cycles, or whatever. But one time, he got home in like seven minutes from college, and I didn’t say anything until the evening, I said, “So how’d you get home today?” And he looked at me as if to say well you obviously know [laughing] and he goes “Bailey gave me a lift home and so, you can check the speed.” (Libby)

Whereas respondents discuss such forms of surveillance in a joking manner, many engage in lateral, or interpersonal surveillance (Andrejevic, Citation2002), a horizontal form of peer-to-peer surveillance. This often happens in the family context, and when respondents keep track of the activities and associations of family members, they engage in family surveillance practices (Mols et al., Citation2023). Some respondents do not feel comfortable with interpersonal surveillance through location tracking apps. In one of the focus groups, Fred, Sharon, and Claire engage in a conversation about this:

Fred: My wife wanted me to install that, and I said nah [laughter].

Sharon: You said no, but why? Because that’s what we’re talking about. Isn’t it? Having an app that’s spying where you’re going.

Fred: Yeah, yeah.

Sharon: You don’t like that.

Claire: Because he’s going for a cheeky pint [laughter].

Fred: Exactly [laughter]. I can’t go and see my bit on the side [laughter].

Sharon: Yes, because I’m going for a cheeky McDonald’s breakfast. Yeah, and I didn’t want to share it. [Laughter]

This exchange shows that, when personal information about one’s location flows to friends and family through an app, surveillance concerns are also about IM. Just like facial expressions, clothing, and tone of voice shape impressions, one’s location and the spaces where one is seen also impact the perception that others have of an individual. This exchange, aligned with the topic modeling results, highlights that respondents were particularly concerned about how location tracking could break a carefully constructed front stage by revealing habits and behaviors that were previously confined to the backstage. In particular, role-modeling impressions could be compromised, as the excerpt above shows.

4.2.2. Security

One form of PIM is meta-level management of privacy, security, and information flows (Jones et al., Citation2018). Such meta-level activities entail indirect and digital information management, whereby people do not have direct (tactile) contact with their information. To use connected devices, mobile apps, and digital services, people must “let out” information like credit card details and they have to accept terms and conditions (Jones et al., Citation2018). The fact that people have to reveal certain information to be able to use particular technologies and services creates security concerns.

The quantitative analysis indicated that security is a key concern of the respondents, but that it was less prevalent in the context of location tracking apps than mobile apps and smart home technologies. These differences are connected to the attributes of information flow (Nissenbaum, Citation2004), or the information items (Jones et al., Citation2018). In other words, the type of information brings up different concerns, especially when this information is managed by third parties. For location tracking, there are two types of information mentioned. The first is location information, and the previous section about surveillance indicates that most respondents find it convenient to share their location with close contacts. Second, they mention battery status as an information type which location tracking apps (like Live360) monitor to warn a user that the phone battery of someone whose location they track needs to be charged. Users find this convenient as well.

However, where the use of location tracking services and mobile apps collides, concerns arise. Multiple respondents are concerned about the security of location data in connection to other apps and services. They fear that the multiplicity of services that use location data can create detailed insights into movements, habits, and travels of a user. As Hannah states: “You have a lot of these apps that store exactly where you go and when and so on and so on … Which have in their terms and conditions ‘your location is traced’ without any indication of why this is relevant.” In line with this, Peter describes his own experience and concerns:

So yes, I have looked at Google in location searches and it has indeed just updated where I have been in the last 8 years, you can see everything, vacations, hotels, everything … I was just like, okay, I thought it was funny, it is nice to see where you have been. But that’s the dilemma, you can misuse it in a terrible way. But it is also great fun to look back. (Peter)

For location data, it becomes clear that the context of use determines whether respondents find it appropriate that their location information flows across services. Sharing location information with a location tracking app for personal use is deemed convenient and appropriate, whereas an app requesting access to location information without a clear purpose is deemed inappropriate and concerning.

In the broader context of mobile app use, respondents voice concerns about various information types, such as browsing history, pictures, videos, phone calls, and streaming history. Considering PIM, respondents are afraid of mobile apps combining different flows of personal information types, whereby users lose control over who accesses their information. As Susan remarks: “With Facebook, it is of course all kinds of websites that they get data from. So not even 80% of what they know about you is what you are entering, but from what they can fetch from there.” In this example, Susan is not able to exercise control over the information Facebook manages about her, nor does she know exactly which information types they have. This creates uncertainties and stress.

For smart home technologies, respondents’ concerns revolve around the security of their personal information, accounts, and services. Most respondents are not concerned about their devices registering their everyday behavior, as Anya remarks: “I’m really boring, so I don’t think I care. My Alexa is probably listening to me permanently, and all it can hear is me talking to the cats. So, I don’t really care, they’ll be very bored with me, very quickly.” This quote displays signs of anthropomorphism connected to IM; Anya reflects on her smart device not only listening in on her but also becoming bored with her. She ascribes human feelings like boredom to her smart speaker.

Moreover, an interview with a married couple makes clear that respondents care more about the security of their personal information than about smart technologies getting to know their in-home activities. Grace indicates: “We disabled the smart energy meter. We are interested in technology, but it should not be that we end up being hacked, or something. We are incredibly careful about that. Eh, a smart home is nice, but it should not be the case that, eh…” To which her husband Oscar adds:

That I get hacked and that I can’t access my software, or whatever … I think that is interesting and it’s nice to see if people have that [smart technologies] but it also has a very dark side … The risk is that people abuse it, well, through hacking for instance. I think that such [smart] devices are easy to follow and to manipulate. (Oscar)

The role of additional actors (in this case mobile app, location tracking, and smart home service providers) makes PIM a complicated challenge and poses questions about how people can trust that their information is secure and safe (Jones et al., Citation2018). Respondents like Grace and Oscar are concerned that their information is not secure in the hands of smart technology providers and that this creates the risk that malicious parties hack into their systems and harm their digital and physical safety.

Based on the quotes above, as well as the results from the topic modeling, we can also establish a connection between security concerns and IM. One can argue that, if security concerns were driven purely by a sense of [physical] safety, participants would have been particularly cautious with location technologies given the potential for burglary, stalking, and other crimes that can be facilitated by gaining information on the location of the victim. However, we see that security concerns are raised around mobile apps and smart speakers, which are much more prone to compromising IM tactics. Even when participants claim that they are not concerned about security in these technologies, they do so through an impression management angle: “I’m really boring,” stated Anya above. When it comes to specifying the actual security risks posed by accessing information, participants tend to be vague. However, the type of information they highlight, such as home activities and behaviors, suggests that IM concerns are also at the forefront of safety considerations. For instance, Rick states: “[Google Home] knows everything about your life and I would not choose to have such a device in my home because of that.”

4.2.3. User consent

When people have to decide whether to allow third parties access to their information, user consent procedures can offer some control. However, Jones et al. (Citation2018) indicate that meta-level PIM practices also include digital information flows whereby users must accept terms and conditions that they do not necessarily feel comfortable with. The topic modeling indicates that consent is a topic prevalent in discussions of all three technologies. The qualitative analysis establishes that, across contexts, having control over their information and autonomy to decide who may access it, is crucial to respondents. Most respondents are aware of consent procedures, which in connection with CI can be understood as transmission principles, guaranteeing informed, effective, and, ideally, acceptable flows of information (Nissenbaum, Citation2014).

Some respondents are carefree when it comes to smart home technologies; Brad, for instance, argues that he is “well-connected in everything, smart lives, assistants, health trackers.” Many respondents voice questions about the information collected by smart devices listening in on conversations at home. Pascal argues: “your data’s out there, what’s happening with the data, all your information? And like, Alexa, or your smart app, it’s constantly listening to you all the time. It records everything you do in your room, in your house. What happens to all that information?” The transmission principles of smart home technologies entail the processing of audio recordings made in the home. While users have to consent to the recording of audio and/or video in order to use smart technologies, respondents are alarmed by news messages and rumors about smart devices listening in and recording conversations of users. These concerns are amplified by the fact that respondents do not feel in control over how smart home technologies manage their personal information. This becomes clear in the concerns described by Joel:

It just doesn’t feel good. For example, if there is a smart device with a camera in it or something. And that there is just information that is private, that it is used in one way or another. Purely the feeling that I have no grip, that prevents me [from using smart home technologies] … That it is completely opaque. (Joel)

In the context of location tracking, many respondents have provided consent for the collection of location data in order to disclose their location to Snapchat and WhatsApp contacts. However, respondents like Sharon do not feel comfortable with sharing their location through social media or messaging apps. She states: “We’re all secretive, that we don’t want people or things or robots or whatever taking our data and where we’ve been and what we’re doing and what we’re buying and what we’re eating and when we’re pooing. No, we don’t want them to know that then, do we?” Sharon’s quote displays uncertainty about which actors can access which data attributes. This, in combination with a lack of knowledge about the purpose for doing it, falls in line with shared sentiments about mobile apps in general.

More specifically, in the context of a broader range of mobile apps, the transmission principles identified by the respondents were considerably less clear. Users (senders and data subjects) identified as transmission principles the conditions under which the data is collected and processed, as well as the application’s terms and conditions, which they tend to grudgingly accept. Lucian states that “when you download an app to your phone you get all the blimmin Ts&Cs [terms and conditions] nobody bothers to read them … you just sign and that’s it.” Several respondents discuss the lack of transparency in policy documents. They feel this leads to a lack of knowledge about what is collected and processed, by whom, and why. All these aspects seem to be obfuscated by overcomplicated and long privacy policies and terms and conditions, resulting in some degree of concern amongst respondents. Arnold, for example, argues that “we’ve probably all agreed to Twitter’s terms and conditions, and who knows what it says?” Leo adds that users also lack the autonomy to engage in negotiations about the management of their personal information:

It is a consideration that occurs every time you want to continue to use an app. Because it is non-negotiable, you can’t say, I want those conditions and not these conditions. So, if you do not accept it [terms and conditions], you cannot use the app. So, it is a consideration for me every time, that I must make in a split second: do I want to keep this app even longer or not, yes? So, then I must accept the conditions. (Leo)

Leo’s quote touches upon a frustration that almost all respondents share. Mobile phone users lack the power to negotiate how their personal information is managed by mobile app providers. Whereas respondents would like to actively negotiate transmission principles, they experience a lack of control to do so as well as a lack of transparency around data use. Because the desire or necessity to use mobile apps prevails, many respondents are resigned to having their personal information managed by mobile app providers in nontransparent ways. Some respondents do not comply with forced terms and conditions and engage in acts of resistance instead. They actively search for settings to control, to the best possible extent, how much and what kind of data is being shared with whom. As a result, they devise pragmatic strategies to manage access to their personal information flows and the transmission principles guiding them.

The respondents provide examples of their PIM strategies in response to uncertainties about consent procedures and data collection and processing. For example, Bjorn rejects access requests: “The first thing I do with any device is to turn everything off as much as possible.” Peter carefully weighs which apps he wants to use and states that he has “categorically refused an app because it wanted to access my photos and location. I couldn’t imagine what they should do with that, so I didn’t install it. But, well, I actually could do that because there is an alternative app for the same service.” Similarly, Jay describes how an app “wanted to know everything from me, wanted to know who my contacts were, I thought: what does that have to do with it? So, I didn’t want it. I really pay attention to that, yes.” Only for a few respondents do such considerations lead to not using particular services at all. Jay does not want social media apps on his smartphone: “I also don’t have Twitter on my phone, I don’t have that kind of apps.” Lukas also states: “So, for me, I don’t have many apps on my phone because I don’t generally like, um, sharing data.”

The control over transmission principles, a key component of CI, is essential to one’s PIM. What the interviews make clear is that respondents feel frustrated by the lack of transparency of terms and conditions across contexts, the limited control they have over how commercial parties manage their personal information, and the limited availability of services that seem more trustworthy. Acts of resignation seem to be caused by a shared feeling that there is no alternative service or technology that gives users more control over appropriate personal information flows. User consent procedures are perceived as unavoidable mechanisms that offer neither autonomy nor control.

A salient point in many of the quotations above is the tendency to personalize technologies and the concerns behind consent (e.g., “[the app] wanted to know,” “we don’t want them to know,” “[Alexa] is listening”). This again reinforces the trend that participants may be seeing technologies and the consent procedure as an interpersonal dynamic more than an institutional or technical one. By attributing human traits and considerations to technologies and platforms, respondents not only manage their personal information, but also the impressions they could make on technology providers. The empirical data underlines the notion that legitimacy of information management cannot depend on only a legal basis of consent, but that it instead rests on aggregated appropriateness of all CI parameters and context-specific norms (Nissenbaum, Citation2019) as well as on personal interpretations of information processing.

5. Discussion and conclusion

Our study shows how meta-level PIM practices are contextualized by the use of different technologies. Respondents in our study adjust their behaviors according to the technology used, the context in which it is used, and express different concerns depending on these very technologies. We add to Whittaker’s (Citation2011) understanding of PIM as the curation of personal information by demonstrating that technological mechanisms in which this curation takes place and personal interpretations of data collection influence the management process itself.

We indicate that different theoretical strands are useful to study contextual factors of PIM by connecting it to privacy as CI (Nissenbaum, Citation2004) and IM (Brown et al., Citation1987; Rosenberg & Egbert, Citation2011). The former emphasizes how different dimensions of privacy are associated with information management needs, based on key parameters governing the flow of information, while the latter highlights how interpersonal dynamics shape what is understood as personal information and the consequences of its disclosure. Our multi-method approach identifies key discursive patterns as well as the narratives in respondent accounts of technology use. This methodological approach may be a template for future studies on PIM, especially those that have a meso-level focus, aiming to connect individual interpretations of personal information with general patterns for meta management practices.

The findings demonstrate that the way in which respondents construe risks associated with using digital technologies is directly connected to how they perceive information flows between actors, its norms, and contexts. Results indicate that respondents uphold different norms of appropriate use of information for smart home technology providers, mobile platforms, and personal connections. The perceived appropriateness and envisioned impressions of information flows are essential to one’s PIM and particular concerns and practices revolve around the topics surveillance, security, and user consent.

Surveillance is the most frequent topic in the context of location tracking. Respondents’ concerns particularly focus on specific actors (both commercial entities and one’s social circle) that take part in information flows. They are concerned about how these may process one’s personal information and about what impressions they get from the user. Here a process of anthropomorphism becomes visible as users personalize devices and platforms by approaching them as human beings with subjective views and human feelings about their modes of use, routines, and habits.

When it comes to security, respondents are mainly concerned about the security of their smart devices, as they fear that these can be hacked. Moreover, respondents fear that smart home platforms breach their privacy by listening in on their conversations. Mobile app providers cause concerns because users experience a lack of control over their personal information. Respondents are anxious about their online security because they are uncertain about how different flows of personal information are combined. In mobile app use, people lose sight of who is in possession of their data, and for what purposes. PIM concerns are perceived as interpersonal challenges between users, personalized technologies, and commercial actors.

Privacy considerations align with uncertainties about user consent in all three contexts. Respondents feel that they lack control over their personal information because they do not have the opportunity to decline or negotiate terms of services and lack the autonomy to decide who can access one’s personal information. They display expectations of appropriate personal information flows and the need to exercise autonomy over their own information when this is managed by digital service providers. The lack of control that is voiced by many respondents mirrors existing research about PIM (Alon and Nachmias (Citation2020) identify it as one of seven salient affective aspects of PIM) and privacy (see for instance Auxier et al., Citation2019). A lack of control can limit users’ ability to actively manage their personal information.

As a contribution to the existing literature (like Lutz & Newlands, Citation2021), our topic modeling and qualitative analysis show how interpersonal dynamics play a key role in shaping perceptions of different technologies. This has led to some findings that at first seem counterintuitive, such as a lack of security concerns with location tracking technologies, but that are then explained when we examine user interaction with technology from an interpersonal angle. These interpersonal dynamics mean that impression management forms an important theoretical tool to foster understanding of CI and PIM practices.

This study indicates a complexity of responses that highlights how PIM needs to be seen as a dynamic phenomenon intricately woven into people’s everyday lives. This directly influences how people adopt and interact with digital technologies and applications. The contextual and everyday nature of technology use needs to inform policy and regulations around technology for these to be of actual use in people’s lives, instead of the “one-size-fits-all” approach to privacy that we see in legal frameworks. Conceptually, we demonstrate that privacy as CI and IM complement current knowledge on PIM in that the appropriate flow and various stages of information (data collection and processing) coalesce with one’s management of personal information and interpersonal interpretations of data processing.

As such, our study aligns with calls for a theoretical update of PIM to incorporate new affordances and data generating and management technologies (Feng & Agosto, Citation2019). Location tracking, smart-home devices, and mobile apps are tools for PIM, but they are also associated with meta-level PIM activities that relate to interpersonal or commercial surveillance concerns regarding how information is managed by others. While this may suggest that all privacy preserving behaviors or responses to surveillance concerns are a form of PIM, we should also delineate that these domains often address data that is not personal (e.g., business information or intellectual property), and behaviors that do not fall under the scope of information management (e.g., wearing sunglasses to avoid being recognized in public versus limiting access to one’s location history). Our study illustrates how incorporating privacy and surveillance concerns in a PIM framework can shed light on motivations for and practices of everyday information management. We therefore join the calls of others (Cushing, Citation2023; Feng & Agosto, Citation2019) in suggesting a review of PIM to consider the role of audiences and information flows, under frameworks such as CI, to better understand behaviors and expand the scope of personal information in line with new technological affordances.

The limitations of this study bring opportunities for future PIM research. The data collection is based on focus groups and interviews whereby the respondents were prompted in different manners (a variety of vignettes, a commercial, and interacting with a smart speaker). We acknowledge that the way they were prompted might have influenced the outcomes, including the topic modeling results. Moreover, whereas the sample is large for a qualitative analysis, it is small for the quantitative analysis, which constrains statistical power. It does, however, allow for the in-depth qualitative exploration of the three topics surveillance, security, and user consent, which provides insights relevant for PIM and CI. In addition, the set-up of the quantitative analysis allows for the use of the topics and approach in future research to validate or supplement the findings of this study. To further explore the connections between the three themes and CI, future research could focus on the statistical relationship between the three themes surveillance, security, and user consent and the level of acceptability of an information flow. This way, the topics constructed in this study can form the basis for follow-up explorations of privacy attitudes and PIM strategies and how these differ along contextual factors.

Moreover, by expanding the research scope to other groups of users in different (cultural) contexts, future research can further explore differences in how surveillance, security, and consent issues around the three technologies are perceived (e.g., surveillance being a more salient topic for location tracking and mobile apps than for smart home technologies). Such research could also include factors such as the domestication of the technologies, public debates and news discourses about privacy in the context of the technologies, digital literacy, and anthropomorphism in technology use.

Acknowledgments

The authors would like to thank all the research participants for sharing their experiences.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The work was supported by the Nederlandse Organisatie voor Wetenschappelijk Onderzoek [628.001.024].

Notes on contributors

Anouk Mols

Anouk Mols is a senior researcher in the Research Group Communication in Digital Transition and lecturer in Communication at University of Applied Sciences Utrecht in the Netherlands. Her research focuses on digital literacy, media use, AI, privacy, and surveillance.

Jorge Pereira Campos

Jorge Pereira Campos is an Assistant Professor at Lusíada University in Porto (Portugal) and affiliated researcher with the COMEGI research center. His research focuses on consumer and organizational behavior with a particular interest in how consumers and workers navigate AI developments in different contexts.

João Fernando Ferreira Gonçalves

João Fernando Ferreira Gonçalves is an assistant professor at the Media and Communication department of Erasmus University Rotterdam (the Netherlands). His research focuses on artificial intelligence, online discussions, incivility, and deliberation. He also studies how to use machine learning techniques to analyze online discourse and tackle practical issues like automatic hate speech detection and moderation.


References

  • Abdi, N., Ramokapane, K. M., & Such, J. M. (2019). More than smart speakers: Security and privacy perceptions of smart home personal assistants. In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). Santa Clara, CA, USA.
  • Alon, L., & Nachmias, R. (2020). Anxious and frustrated but still competent: Affective aspects of interactions with personal information management. International Journal of Human-Computer Studies, 144, 102503. https://doi.org/10.1016/j.ijhcs.2020.102503
  • Andrejevic, M. (2002). The work of watching one another: Lateral surveillance, risk, and governance. Surveillance & Society, 2(4), 479–497. https://doi.org/10.24908/ss.v2i4.3359
  • Auxier, B., Rainie, L., Anderson, M., Perrin, A., Kumar, M., & Turner, E. (2019, November 15). Americans and privacy: Concerned, confused and feeling lack of control over their personal information [report]. Pew Research Center.
  • Ayalon, O., & Toch, E. (2017). Not even past: Information aging and temporal privacy in online social networks. Human–Computer Interaction, 32(2), 73–102. https://doi.org/10.1080/07370024.2016.1203791
  • Barter, C., & Renold, E. (2000). ‘I wanna tell you a story’: Exploring the application of vignettes in qualitative research with children and young people. International Journal of Social Research Methodology, 3(4), 307–323. https://doi.org/10.1080/13645570050178594
  • Beldad, A., & Citra Kusumadewi, M. (2015). Here’s my location, for your information: The impact of trust, benefits, and social influence on location sharing application use among Indonesian university students. Computers in Human Behavior, 49, 102–110. https://doi.org/10.1016/j.chb.2015.02.047
  • Bergman, O., Boardman, R., Gwizdka, J., & Jones, W. (2004). Personal information management. CHI ’04 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’04): Association for Computing Machinery, 1598–1599. https://doi.org/10.1145/985921.986164
  • Bolino, M., Long, D., & Turnley, W. (2016). Impression management in organizations: Critical questions, answers, and areas for future research. Annual Review of Organizational Psychology and Organizational Behavior, 3(1), 377–406. https://doi.org/10.1146/annurev-orgpsych-041015-062337
  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
  • Brown, P., & Levinson, S. C. (1987). Politeness: Some universals in language usage (Vol. 4). Cambridge University Press.
  • Charmaz, K. (2014). Constructing grounded theory. SAGE.
  • Chen, J. C., & Ha, Q. A. (2019). Factors affecting the continuance to share location on social networking sites: The influence of privacy concern, trust, benefit and the moderating role of positive feedback and perceived promotion innovativeness. Contemporary Management Research, 15(2), 89–121. https://doi.org/10.7903/cmr.19268
  • Cushing, A. L. (2023). PIM as a caring: Using ethics of care to explore personal information management as a caring process. Journal of the Association for Information Science and Technology, 74(11), 1282–1292. https://doi.org/10.1002/asi.24824
  • Feng, Y., & Agosto, D. E. (2019). Revisiting personal information management through information practices with activity tracking technology. Journal of the Association for Information Science and Technology, 70(12), 1352–1367. https://doi.org/10.1002/asi.24253
  • Goffman, E. (1959). The presentation of self in everyday life. Doubleday.
  • Havelka, S. (2021). Typologies of mobile privacy behavior and attitude: A case study comparing German and American library and information science students. The Serials Librarian, 81(1), 42–58. https://doi.org/10.1080/0361526X.2021.1875961
  • Hofstede Insights. (n.d.). Country comparison tool. Retrieved from https://www.hofstede-insights.com/country-comparison-tool
  • Huvila, I., Eriksen, J., Häusner, E., & Jansson, I. (2014). Continuum thinking and the contexts of personal information management. Information Research, 19(1). http://informationr.net/ir/19-1/paper604.html
  • Jones, W., Dinneen, J. D., Capra, R., Diekema, A. R., & Pérez-Quiñones, M. A. (2018). Personal information management. In J. D. McDonald & M. Levine-Clark (Eds.), Encyclopedia of library and information sciences (4th ed., pp. 3584–3605). CRC Press.
  • Kilic, D., Crabtree, A., McGarry, G., & Goulden, M. (2022). The cardboard box study: Understanding collaborative data management in the connected home. Personal and Ubiquitous Computing, 26(1), 1–22. https://doi.org/10.1007/s00779-021-01655-9
  • Lambert, S. D., & Loiselle, C. G. (2008). Combining individual interviews and focus groups to enhance data richness. Journal of Advanced Nursing, 62(2), 228–237. https://doi.org/10.1111/j.1365-2648.2007.04559.x
  • Liao, Y., Vitak, J., Kumar, P., Zimmer, M., & Kritikos, K. (2019). Understanding the role of privacy and trust in intelligent personal assistant adoption. In N. Taylor, C. Christian-Lamb, M. Martin, & B. Nardi (Eds.), Information in contemporary society. iConference 2019. Lecture notes in computer science (Vol. 11420, pp. 102–113). Springer. https://doi.org/10.1007/978-3-030-15742-5_9
  • Lucas, G. M., Gratch, J., King, A., & Morency, L. P. (2014). It’s only a computer: Virtual humans increase willingness to disclose. Computers in Human Behavior, 37, 94–100. https://doi.org/10.1016/j.chb.2014.04.043
  • Lutz, C., & Newlands, G. (2021). Privacy and smart speakers: A multi-dimensional approach. The Information Society, 37(2), 147–162. https://doi.org/10.1080/01972243.2021.1897914
  • Mols, A., Campos, J. P., & Pridmore, J. (2023). Family surveillance: Understanding parental monitoring, reciprocal practices, and digital resilience. Surveillance & Society, 21(4), 469–484. https://doi.org/10.24908/ss.v21i4.15645
  • Mols, A., Wang, Y., & Pridmore, J. (2022). Household intelligent personal assistants in the Netherlands: Exploring privacy concerns around surveillance, security, and platforms. Convergence: The International Journal of Research into New Media Technologies, 28(6), 1841–1860. https://doi.org/10.1177/13548565211042234
  • Morgan, D. (1997). Focus groups as qualitative research. SAGE Publications.
  • Nissenbaum, H. (2004). Privacy as contextual integrity. Washington Law Review, 79, 119–157. http://www.kentlaw.edu/faculty/rwarner/classes/internetlaw/2011/materials/nissenbaum_norms.pdf
  • Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.
  • Nissenbaum, H. (2019). Contextual integrity up and down the data food chain. Theoretical Inquiries in Law, 20(1), 221–256. https://doi.org/10.1515/til-2019-0008
  • Peek of the Net. (2017). Google home official ad [Video]. YouTube. https://www.youtube.com/watch?v=OsXedJq1aWE&t=2s
  • Rosenberg, J., & Egbert, N. (2011). Online impression management: Personality traits and concerns for secondary goals as predictors of self-presentation tactics on Facebook. Journal of Computer-Mediated Communication, 17(1), 1–18. https://doi.org/10.1111/j.1083-6101.2011.01560.x
  • Seymour, W., & Van Kleek, M. (2021). Exploring interactions between trust, anthropomorphism, and relationship development in voice assistants. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–16. https://doi.org/10.1145/3479515
  • Stewart, D. W., & Shamdasani, P. N. (2007). Focus groups: Theory and practice. SAGE Publications.
  • Sukk, M., & Siibak, A. (2021). Caring dataveillance and the construction of ‘good parenting’: Estonian parents’ and pre-teens’ reflections on the use of tracking technologies. Communications: The European Journal of Communication Research, 46(3), 446–467. https://doi.org/10.1515/commun-2021-0045
  • Syn, S. Y., Sinn, D., & Kim, S. (2020). Impact of contexts, resource types and perceptions on information management within the personal domain among college students. Aslib Journal of Information Management, 72(6), 909–927. https://doi.org/10.1108/AJIM-05-2020-0163
  • Thompson, J. (2022). A Guide to abductive thematic analysis. The Qualitative Report, 27(5), 1410–1421. https://doi.org/10.46743/2160-3715/2022.5340
  • Vitak, J., Liao, Y., Mols, A., Trottier, D., Zimmer, M., Kumar, P. C., & Pridmore, J. (2023). When do data collection and use become a matter of concern? A cross-cultural comparison of U.S. and Dutch privacy attitudes. International Journal of Communication, 17, 471–498. https://ijoc.org/index.php/ijoc/article/view/19391
  • Vitak, J., & Zimmer, M. (2020). More than just privacy: Using contextual integrity to evaluate the long-term risks from COVID-19 surveillance technologies. Social Media + Society, 6(3), 1–4. https://doi.org/10.1177/2056305120948250
  • Whittaker, S. (2011). Personal information management: From information consumption to curation. Annual Review of Information Science and Technology, 45(1), 1–62. https://doi.org/10.1002/aris.2011.1440450108
  • Widmer, S., & Albrechtslund, A. (2021). The ambiguities of surveillance as care and control: Struggles in the domestication of location-tracking applications by Danish parents. Nordicom Review, 42(s4), 79–93. https://doi.org/10.2478/nor-2021-0042