Commentary

Blood, Sweat, and Tears: Navigating Creepy versus Cool in Wearable Biotech

Pages 779-785 | Received 02 Oct 2017, Accepted 11 Jan 2018, Published online: 08 Feb 2018

ABSTRACT

This paper considers how personal data protections figure into the design process of wearable technology. The data question is becoming especially important in the face of recent innovations in biotechnology that capitalize on the new fungibility of biology and electronics, producing biotech wearables able to analyze and track changes in blood, sweat, and tears. Interviews and participant observation with wearable tech designers, data scientists, fashion tech entrepreneurs, and select experts in cybersecurity and intellectual property law reveal a range of approaches to data protection in design within the culture where wearables are beginning to merge with biotech. While a few respondents were extremely vigilant about protecting consumers’ privacy, the majority felt that when consumers ‘opt in’ to data sharing, they should be cognizant of the risks. From their perspective, it is not necessarily the producer’s responsibility to protect users’ personal data. These attitudes present a problematic logic, which leaves users vulnerable to data exploitation. The paper concludes by arguing that this laissez-faire culture is the environment in which wearable biotech is being developed and will be deployed. This emerging technology raises issues about bodies, data, and ownership in crucial need of analysis and critique to push its move into the mainstream toward more equitable and inclusive ends.

On a rainy Thursday, entrepreneurs, fashion executives, and intellectual property lawyers gathered on the 50th floor of a midtown office building to discuss technology and fashion. A particularly lively session found panelists considering the ‘creepiness’ of data collection vs. the convenience it affords. ‘Immediate satisfaction is what our customer wants. Customization and personal styling are also what she wants. Data collection is the only way to accomplish this, but when we talk about data collection, it gets a little creepy,’ explained a fashion house executive discussing their RFID-enabledFootnote1 smart store. In it, smart garments track customer behavior by interacting with a smart dressing room that assesses whether try-ons result in purchases, while mapping in-store traffic patterns. Not only does the technology help with stock choices and store geography, but their 18–30-year-old customers also love it. They can get a different size ‘without leaving the dressing room half naked,’ order drinks, or change to more flattering or occasion-specific lighting. For this target customer, these conveniences more than offset what the brand exec termed the ‘creep factor.’ Besides, he averred, knowing that ‘50 blue dresses went into the dressing room is not P.I.I.’ (industry lingo for personally identifiable information).
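To make the executive’s distinction concrete, the sketch below shows the kind of aggregation he is pointing to. It is purely illustrative: the event schema, field names, and counts are my own invention, not the brand’s actual system. The point is simply that per-garment tallies (‘50 blue dresses went into the dressing room’) can be produced without ever recording who wore them.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class TryOnEvent:
    """One RFID read from a smart dressing room (hypothetical schema)."""
    garment_sku: str   # e.g. "BLUE-DRESS-M"; no name, loyalty ID, or payment token
    purchased: bool    # whether the try-on converted into a sale


def aggregate_try_ons(events):
    """Roll raw dressing-room reads into per-SKU counts.

    The result answers stock and layout questions without retaining
    anything that identifies an individual shopper.
    """
    summary = defaultdict(lambda: {"try_ons": 0, "purchases": 0})
    for event in events:
        summary[event.garment_sku]["try_ons"] += 1
        if event.purchased:
            summary[event.garment_sku]["purchases"] += 1
    return dict(summary)


if __name__ == "__main__":
    sample = [
        TryOnEvent("BLUE-DRESS-M", purchased=True),
        TryOnEvent("BLUE-DRESS-M", purchased=False),
        TryOnEvent("RED-COAT-S", purchased=True),
    ]
    print(aggregate_try_ons(sample))
    # {'BLUE-DRESS-M': {'try_ons': 2, 'purchases': 1}, 'RED-COAT-S': {'try_ons': 1, 'purchases': 1}}
```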

A dressing room mirror watching you try on clothes may seem pretty creepy. As many of the fashion and tech industry professionals I spoke to for my researchFootnote2 pointed out, however, if that feels invasive, you can simply choose to avoid smart stores. When discussing data vulnerability, the majority of my respondents emphasized that users choose or ‘opt in’ to giving up P.I.I. as part of interacting with their product. In other words, these producers consistently cited the fact that in the creepy vs. convenient calculus, convenience tends to win, and this choice figured prominently in their treatment of customer data.

A tech designer I met at a fashion tech summit in New York CityFootnote3 typified this attitude, explaining that when a customer is sufficiently engaged, they give up all the data they can in exchange for relevant results:

One of the businesses that I’m involved with, which is a recommendation engine; an algorithm, for expecting mothers … the recommendation engine works – well, is being put into a play at – around moments where the user is very engaged, and involved in the process – abnormally so.

In this scenario, the convenience of one-stop shopping for all their baby’s needs prompts users to ‘opt in’ to sharing their data:

[Compared to] looking for a pair of socks on a website, I could go to a website, and my actual – my investment in that activity is pretty low … But, if I’m going to a website to make a list of baby products for my unborn child who’s inside my belly, then I’m going to give all of the information I possibly can to the system to get them to yield the better results, and the most personalized results for me. So, that’s our strategy with this business … In this scenario, we’re not making that data anonymous. We are sharing the user’s personal email address and information with our partners.

For this designer, the returns afforded by ‘opting in’ balance out any creepiness because, ‘if you put something in, it gives you a valuable response.’ In his view, customers choose to ‘give all the information’ they ‘possibly can to the system’ because it yields ‘better results.’ Keeping the data anonymous wouldn’t make sense to either the business or the consumer.
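A minimal sketch of the trade-off this designer describes might look like the following. The names and fields are invented for illustration and are not drawn from his actual product; what matters is that a single ‘opt in’ flag is all that separates an anonymous audience segment from a fully identified record handed to partners.

```python
from dataclasses import dataclass, field


@dataclass
class RegistryProfile:
    """Hypothetical profile for a baby-registry recommendation engine."""
    email: str
    due_date: str
    preferences: dict = field(default_factory=dict)
    share_with_partners: bool = False   # the 'opt in' switch


def payload_for_partner(profile: RegistryProfile) -> dict:
    """Build the record a partner retailer would receive.

    With the flag on, the payload carries the user's actual email address
    and preferences, as the designer describes; with it off, only a
    non-identifying segment label goes out.
    """
    if profile.share_with_partners:
        return {
            "email": profile.email,
            "due_date": profile.due_date,
            "preferences": profile.preferences,
        }
    return {"segment": "expecting-parent"}
```

Framed this way, the claim that keeping the data anonymous ‘wouldn’t make sense’ is a design choice about which branch of such a function ever runs, rather than a technical necessity.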

In the creepiness/convenience calculus, the choice seems pretty clear. If it creeps you out to have your personally identifiable information tracked and shared, don’t ‘opt in’ by shopping at the smart store or signing up for the personalized app. What happens, however, when potentially creepy, data-gathering fashionable technology becomes cool? How do users then navigate the creep-to-cool ratio?

In the age of the Fitbit, strapping technology onto your body to track and monitor yourself is not only doable, but desirable. Bio-sensing wearables meld together passive and active data collection, often offering round-the-clock, personal self-surveillance, knowledge, and control. In other words, wearable technologies that have come into fashion are making the line between creepy and cool much fuzzier. But, as my respondents pointed out, users ‘opt in’ by wearing the device, and get cool info and benefits in return, so what’s the problem?

Communication scholars have been sounding the alarm about the risk of possible data loss, leakage, or compromise inherent in self-tracking wearable technologies for quite some time (Ajana, 2017; Fotopoulou & O’Riordan, 2016; Lanzing, 2016; Levy, 2014; Nicholls, 2016; Sanders, 2017, among others). Opting in to gain self-knowledge, for instance, often opens the door to third parties (Crawford, Lingel, & Karppi, 2015, p. 481). When the body is datafied, it is more susceptible both to surveillance and to compromise by hacking (Haggerty & Ericson, 2000; Clough, Haber, Scannell, & Gregory, 2015). Some have gone so far as to claim that giving up data is a form of free labor, with the lion’s share of benefits going to corporations, not users (Lupton, 2016). Others have found evidence of the need for resistance and pushback via struggles over data ownership, and called for more equitable participation in data gathering and interpretation (Nafus & Sherman, 2014; Neff & Nafus, 2016, p. 7 and cf.).

Do users know what they are getting into when they strap on a device? Do they care? This is a deceptively simple question. A cultural fondness for the idea that technology is neutral, i.e., just a tool, complicates the notion of data protection or privacy in tracking and wearable devices, and threads through debates in communication and related circles regarding technological affordances.Footnote4 The rough idea that ‘affordances are the dynamic link between subjects and objects within sociotechnical systems’ (Davis & Chouinard, 2017, p. 1) belies the complexity involved in determining whether or not a device can ‘afford’ privacy to the user.

Privacy is significantly variable, as it depends on dynamic relations between ‘the technical features of a given medium, the knowledge and skills of a user to employ those features, and the individual’s attitudes toward restricting access to some or all content’ (Evans, Pearce, Vitak, & Treem, 2017, p. 44). While some of the examples discussed above involve passive data gathering, some users actively give up vulnerable data in exchange for ‘goods’ such as convenience or cool. As communications scholars have pointed out, ‘a desire for privacy is far from universal. Many users instead try to share content with as large an audience as possible to achieve information diffusion’ (Evans et al., 2017), as in social media, for example.

Grappling with privacy as affordance, outcome, or something in between brings up the intricate relationship between designer and user. While it is users who opt into data sharing, how their data will be shared is deeply affected by designers, ‘who have the power to enable and constrain certain action possibilities through their design choices’ (Bucher & Helmond, 2018, p. 6). At the same time, however, wearables’ portability and mobility demand another layer of analysis, since ‘what mobile media afford has nothing to do with a specific button, but rather with the kinds of communicative practices and habits they enable or constrain’ (Bucher & Helmond, 2018, p. 12). From this angle, the notion that users ‘opt in’ to data collection with any kind of comprehensive knowledge of what that means for the vulnerability of their data becomes politically contentious.

For the fashion and tech professionals I interviewed, the best course of action on this issue is unclear. While they deeply felt the importance of their design choices, they expressed a wide range of attitudes about where the responsibility for protecting consumer data should fall. Some blamed the victim, saying users should know better than to carelessly engage in the click-click-gimme-gimme culture of convenience and personalization. In a Skype interview, a founding designer for a wearable camera company was unconcerned about possible negative reactions to his product’s potential to invade privacy because people don’t understand how the process works. Citing a ‘man on the street’ interview of everyday passersby, he observed:

This guy takes the camera to the street and starts to ask regular American, ‘Who is Snowden?’ Nobody knows. Some people know the name. They know that he did something bad, but they don’t know exactly what. ‘You guys actually remember he revealed some secrets?’ … No, nobody could remember that. Then the guy … starts asking people, ‘Do you know did you ever send a picture of your dick?’ Of course, they all did. ‘But do you care that the NSA, your government, saved on its servers a picture of your dick?’ People immediately explode. They explode in rage.

For him, it is the public’s lack of understanding that is to blame for their choices, choices which render their data vulnerable.

In contrast, others took the power of their role as designers to heart. A designer of connected/smart jewelry I interviewed in a Soho café was very clear about how her company prioritized protecting its users from the product’s inception:

That was a huge thing we were thinking about as designers. There’s no GPS information, and there’s no – we don’t have any user data on our servers. So, we don’t have any personal identifying information. So, the signal’s – so my bot can recognize your bot, but it’s all encrypted and then it’s all, it’s not stored, so it has information about your bot when we’re in the same proximity, but it doesn’t store it.

While these protocols were important to her users, she admitted that they made it difficult to foster community, the bread and butter of many wearable products that track personal information. She explained that her company compensated by providing a website where wearers could ‘opt in’ and share with others on the site.
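Her description suggests a privacy-by-design pattern that can be sketched in rough form. The code below is my own schematic reconstruction under stated assumptions, not the company’s actual protocol: identifiers rotate rather than staying stable, the over-the-air token is a keyed hash rather than a readable ID, and sightings of nearby ‘bots’ live only in memory and expire.

```python
import hashlib
import hmac
import time


class ProximityBeacon:
    """Schematic, ephemeral 'my bot can recognize your bot' exchange.

    Simplifying assumption: paired devices share a secret, so they can
    derive matching rotating tokens. Nothing is written to disk or sent
    to a server; nearby peers are held briefly in memory and then dropped.
    """

    def __init__(self, shared_secret: bytes):
        self._secret = shared_secret
        self._nearby = {}   # token -> last-seen timestamp, RAM only

    def broadcast_token(self) -> str:
        """Advertise a token derived from the secret and the current minute."""
        minute = str(int(time.time() // 60)).encode()
        return hmac.new(self._secret, minute, hashlib.sha256).hexdigest()[:16]

    def hear(self, token: str) -> None:
        """Note that a peer's token was just heard (kept in memory only)."""
        self._nearby[token] = time.time()

    def forget_stale(self, max_age_seconds: float = 300) -> None:
        """Drop peers not heard from recently, so no history accumulates."""
        cutoff = time.time() - max_age_seconds
        self._nearby = {t: ts for t, ts in self._nearby.items() if ts >= cutoff}
```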

In a phone interview, another self-tracking bracelet designer was somewhat more comfortable about users sharing their data with her product. She was, however, still at pains to explain how the data is very well protected both between users and within the company, saying:

So, it’s – yeah, even within the company. It’s very few people who have the keys to really know me as a user, I’ve walked this many steps, or my weight is X. That’s just not information that we as a company internally pass around lightly. So, the data is very secure and protected even inside the company.

She attributed this aspect of her product to the company culture in which she worked. Significantly, however, she placed the bulk of the responsibility for data protection during use on users themselves, a false choice for many. On the one hand, inputs make the app smarter; on the other, the interface is ‘totally opt in.’ Thus, while ‘the user can remain as anonymous as they want to and withhold that information from us,’

when they do share that data with us, the smarter our algorithm can be in the future at providing them advice and insights and recommendations based on who they are.

In the face of such smart advice and useful insights, who would want to choose anonymity?

According to an intellectual property lawyer who lectures on privacy issues for wearables, it is a no-brainer that people lean toward the benefits in this sort of cost-benefit analysis, because they don’t realize what is really going on:

I think that when someone gets new technology, and you are about to stick it on your arm … and it asks if you agree to the terms of this device, of course I read it because I’m a lawyer, but most people probably sit there and say, well what’s the cost/benefit? Do I really care if this device knows where I’ve been all day? No! Because it’s gonna give me more useful information than do harm to me. But, having said that, you know, if you don’t realize that you are putting on something that has your data that could potentially be given to somebody else, you may not even understand or know what the ramifications are.

She emphasized that since privacy protections are in flux, it is up to the individual manufacturers to pay attention to how the field develops. From her perspective, it is the individual company’s responsibility to be sensitive to these issues, as the consumer either doesn’t care, or doesn’t know they should.

In the final panel that rainy afternoon in midtown, one I.P. lawyer joked, ‘maybe our leaking data should be protected as intellectual property!’ With the advent of personalized technologies that engage unique biological identifiers by measuring chemicals in blood, parsing the meaning of sweat, or analyzing tears, she might not be so far off the mark (Nissenbaum & Patterson, 2016; Prahl, 2015; Thompson, 2015; Wood, 2018). Newly invasive tech/body interfaces in biotech wearables grant unprecedented levels of access to currently unprotected human data, bringing the data question into sharp relief (Glazer, 2016; Lafrance, 2017).

In the face of these developments, the ongoing analysis and critique of the complex issues surrounding personal data protection in the realm of wearable technology demands a new urgency. The snapshot of attitudes in fashion tech design examined here, for instance, reveals a problematic logic permeating the culture where wearable biotech is developing. Saying users can rely on choice as protection against data exploitation resembles the claim that workers in a capitalist system can choose whether or not to sell their labor. If workers opt out of the system, they will be denied its benefits, like earning a living wage to pay for a place to live or put food on the table. Similarly, wearers can in principle opt out of sharing their data via their devices, but the point becomes moot when that anonymity interferes with function and with access to information crucial to daily life. Inevitably, personal identifiers get logged, location services get flipped on, and individual efforts at protecting vulnerable data go by the wayside.

Until now, the calculus of creepiness vs. convenience or cool has at least had the illusion of choice with regard to personally identifiable information. With new developments in biometrics pulling for increasingly intimate data excavation, however, engaged users actively sharing their blood, sweat, and tears may be giving away far more than they bargained for. Navigating these advances in an equitable manner urgently demands critical analysis of how wearable tech’s new entanglements with biotech trouble basic assumptions about bodies, technology, value, and ownership.

Disclosure statement

No potential conflict of interest was reported by the author.

ORCID

Elizabeth Wissinger http://orcid.org/0000-0001-6873-5933

Additional information

Funding

This work was supported by a City University of New York Chancellor's Research Fellowship, a Faculty Publication Program Research and Writing Grant, and a Professional Staff Congress of CUNY Research Award.

Notes

1 Radio Frequency Identification via tiny computer chips able to track items at a distance.

2 This paper draws on 22 interviews with fashion and tech designers, data scientists, and wearable tech entrepreneurs, as well as select experts in cybersecurity and intellectual property law. The interview data is combined with participant observation at fashion tech summits, conferences, and meetups in New York City. The term ‘fashion tech’ broadly refers to both wearing technology as fashion and technological innovations in the fashion industry.

3 Quote is from a personal interview a few weeks after the event.

4 The debate about this concept’s meaning and use is widespread and ongoing. For a sampling of the issues at hand, see Bucher and Helmond (2018), Davis and Chouinard (2017), Evans et al. (2017), and Nagy and Neff (2015).

References

  • Ajana, B. (2017). Digital health and the biopolitics of the quantified self. Digital Health, 3, 2. doi: 10.1177/2055207616689509
  • Bucher, T., & Helmond, A. (2018). The affordances of social media platforms. In J. Burgess, T. Poell, & A. Marwick (Eds.), The Sage handbook of social media (pp. 233–253). London: Sage.
  • Clough, P. T., Haber, B. R., Scannell, J., & Gregory, K. (2015). The datalogical turn. In Non-representational methodologies: Re-envisioning research (pp. 146–164). New York, NY: Routledge.
  • Crawford, K., Lingel, J., & Karppi, T. (2015). Our metrics, ourselves: A hundred years of self-tracking from the weight scale to the wrist wearable device. European Journal of Cultural Studies, 18(4–5), 479–496. doi: 10.1177/1367549415584857
  • Davis, J., & Chouinard, J. (2017). Theorizing affordances: From request to refuse. Bulletin of Science, Technology & Society, 36, 241–248. doi: 10.1177/0270467617714944
  • Evans, S. K., Pearce, K. E., Vitak, J., & Treem, J. (2017). Explicating affordances: A conceptual framework for understanding affordances in communication research. Journal of Computer-Mediated Communication, 22, 35–52. doi: 10.1111/jcc4.12180
  • Fotopoulou, A., & O’Riordan, K. (2016). Training to self-care: Fitness tracking, biopedagogy and the healthy consumer. Health Sociology Review, 25(3). Retrieved from http://sro.sussex.ac.uk/60044/
  • Glazer, A. (2016, March 9). Biometrics are coming, along with serious security concerns. Wired. Retrieved from https://www.wired.com/2016/03/biometrics-coming-along-serious-security-concerns/
  • Haggerty, K., & Ericson, R. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605–622. doi: 10.1080/00071310020015280
  • Lafrance, A. (2017, March 24). Who owns your face? Theatlantic.com. Retrieved from https://www.theatlantic.com/technology/archive/2017/03/who-owns-your-face/520731/
  • Lanzing, M. (2016). The transparent self. Ethics and Information Technology, 18(1), 9–16. doi: 10.1007/s10676-016-9396-y
  • Levy, K. E. C. (2014). Intimate surveillance. Idaho Law Review, 51, 679.
  • Lupton, D. (2016). The quantified self (1st ed.). Cambridge: Polity.
  • Nafus, D., & Sherman, J. (2014). Big data, big questions | This one does not go up to 11: The quantified self movement as an alternative big data practice. International Journal of Communication, 8, 11.
  • Nagy, P., & Neff, G. (2015). Imagined affordance: Reconstructing a keyword for communication theory. Social Media+Society, 1(2). Retrieved from http://journals.sagepub.com/doi/full/10.1177/2056305115603385
  • Neff, G., & Nafus, D. (2016). Self-tracking (1st ed.). Cambridge, MA: The MIT Press.
  • Nicholls, B. (2016). Everyday modulation: Dataism, health apps, and the production of self-knowledge. In H. Randell-Moon & R. Tippet (Eds.), Security, race, biopower (pp. 101–120). New York: Palgrave Macmillan.
  • Nissenbaum, H., & Patterson, H. (2016). Biosensing in context: Health privacy in a connected world. In D. Nafus (Ed.), Quantified: Biosensing technologies in everyday life (pp. 79–100). Cambridge, MA: MIT Press.
  • Prahl, A. (2015). Designing wearable sensors for preventative health: An exploration of material, form and function (PhD thesis). University of the Arts London.
  • Sanders, R. (2017). Self-tracking in the digital era: Biopower, patriarchy, and the new biometric body projects. Body & Society, 23(1), 36–63. doi: 10.1177/1357034X16660366
  • Thompson, C. (2015, October 19). Google just patented a smart, futuristic, solar-powered contact lens. Business Insider. Retrieved from http://www.slate.com/blogs/business_insider/2015/10/19/google_patents_smart_wearable_contact_lens_that_monitors_data_and_is_powered.html
  • Wood, J. (2018). Revolutions in wearable technology for apparel. In High performance apparel.
