Direct-to-Consumer Neurotechnologies and Quantified Relationship Technologies: Overlapping Ethical Concerns

In her target article, Karola Kreitmair (2019) discusses what she calls direct-to-consumer neurotechnologies (DTC neurotechnologies): technologies available on the market for monitoring or modulating neurological and psychological functioning. Kreitmair’s aim is to identify a set of basic ethical concerns that apply to this class of technologies, a class that, as Kreitmair acknowledges, is unwieldy. We recently undertook a similar project in a pair of joint papers (Danaher et al. 2018a, 2018b), in which we identified a comparably unwieldy class of what we call quantified relationship technologies (QR technologies): technologies used for tracking or logging aspects of romantic or other intimate relationships with the aim of improving them. In those papers, we identified and critically assessed a set of ethical concerns raised by such technologies. In this commentary, we explore the relationship between our treatment of the ethics of QR technologies and Kreitmair’s treatment of the ethics of DTC neurotechnologies.

We see the classes of DTC neurotechnologies and QR technologies as overlapping or intersecting. Accordingly, some of the ethical concerns that we discussed in relation to QR technologies should plausibly be added to the list of basic ethical concerns about DTC neurotechnologies that Kreitmair has assembled. Below, we first explain and motivate this claim, and then discuss the status of these different ethical concerns. In particular, we ask whether they should be regarded as knockdown arguments against these technologies, or as ethical constraints to be respected in their potential development or use. Before proceeding, however, we wish to note that we are in broad agreement with most—if not all—of what Kreitmair argues in her article. What follows is therefore not so much a criticism of her discussion as it is an attempt to develop and expand upon it.

KREITMAIR’S ETHICAL CONCERNS ABOUT DTC NEUROTECHNOLOGIES

Examples Kreitmair gives of DTC neurotechnologies include: (1) noninvasive brain stimulation technologies, such as transcranial direct current stimulation and vagus nerve stimulation; (2) virtual reality systems; (3) wearable devices for measuring blood pressure, breathing volume, and blood sugar; and (4) smartphone mental health apps meant to replace or augment traditional therapy. Kreitmair argues that each of these technologies is intended to promote the well-being or flourishing of its users. Accordingly, her ethical focus is on whether these technologies really do promote those ends or whether, on the contrary, they might in fact be detrimental to well-being and flourishing. In line with this, Kreitmair argues that it would be ethically problematic if DTC neurotechnologies were not:

  1. Safe

  2. Transparent with respect to how they work

  3. Respectful of privacy

  4. Epistemically appropriate by giving the user true and useful information

  5. Conducive to “existential authenticity”

  6. Justly distributed among people

  7. Subject to proper oversight

We agree that these are valid and important ethical concerns. However, it strikes us that they are mostly centered on “individualistic” risks or problems. In raising these concerns, Kreitmair’s main aim appears to be the protection of individual users’ safety, privacy, experience, understanding, access to the technologies, and so on. But how might these technologies affect individuals’ relationships, social experiences, or (other) interpersonal connections and commitments? Following what Jennings (2016) calls the “relational turn in bioethics,” we suggest that these latter dimensions deserve greater consideration (see also Clark et al. in press; Earp and Savulescu 2018, 2020; Griffy-Brown et al. 2018).

In establishing a set of basic ethical concerns about DTC neurotechnologies, we argue that the potential relational or interpersonal effects of the technologies should be given at least as much weight as individual-based concerns. To return to our earlier example of romantic relationships: might one’s use of DTC neurotechnologies be bad for one’s love life or sex life? Might it impair the flourishing of the relationship? When Kreitmair briefly discusses the concept of human flourishing in her piece (before laying out her ethical concerns), she does mention close, loving relationships as central to well-being. Given her own theoretical commitments, then, it seems to us that relational considerations should be among the basic ethical concerns one might raise about DTC neurotechnologies.

SOME QR TECHNOLOGIES FORM A SUBSET OF THE CLASS OF DTC NEUROTECHNOLOGIES

So what sort of effects might DTC neurotechnologies have on relationships? One way to approach this question is to explore the overlap between the QR technologies that we have identified and the DTC neurotechnologies identified by Kreitmair. In explaining what she means by DTC neurotechnologies, Kreitmair writes:

… neurotechnology refers to technology that enables the monitoring or modulation of neurological or psychological function … Direct-to-consumer or DTC refers to products that can be purchased directly by a consumer, without any involvement of a researcher or treating clinician. Moreover, DTC neurotechnology tends to be personal, digital, and mobile. These technologies are personal, because they tend to be used by only one individual at a time, and monitor or modulate the brain function of that individual. They are digital, because they utilize binary computing systems that enable powerful processing and online connectivity. Finally, these technologies are often mobile, because they are small and light-weight enough that individuals can carry them around on their persons.

Like the DTC neurotechnologies described in this passage, what we call QR technologies (see the definition above) are available directly to consumers. Some of them enable the measurement or modulation of neurological or psychological functioning (as it relates to intimate relationships). They also tend to be digital and mobile.

The only element of Kreitmair’s characterization of DTC neurotechnologies that might seem to clash with our account of QR technologies is her claim that DTC neurotechnologies “tend to be personal.” That is, they “tend to be used by only one individual at a time, and monitor or modulate the brain function of that individual.” But tendencies by their nature admit of exceptions, and we see no reason why at least some DTC neurotechnologies could not be used by couples for relationship-related purposes. For example, certain devices might be used to measure aspects of sexual performance or sexual experience, which is a subset of what some existing QR technologies are used for (see Danaher et al. 2018a). Such measurement could still be seen as “personal” if just one participant in the sexual encounter used the device, but there might nevertheless be interpersonal implications if the encounter involved two or more people. Even one person’s use of a technology to measure aspects of a sexual or other intimate encounter could affect the quality and meaning(s) of that encounter for the others involved. Indeed, Kreitmair herself makes this point both in her present essay and in a commentary she wrote in response to our target article on QR technologies (see Kreitmair 2018).

ADDING TO THE LIST OF BASIC ETHICAL CONCERNS ABOUT DTC NEUROTECHNOLOGIES

If at least some QR technologies can reasonably count as a subclass of DTC neurotechnologies, then some or all of the ethical concerns one might have about QR technologies may also pertain to DTC neurotechnologies (at least those whose use has an interpersonal aspect). In addition to the concern about privacy raised by Kreitmair, which appears on our list of ethical concerns as well, we identified seven other worries one might have about QR technologies. Specifically, we argued that they might:

  1. Ultimately be ineffective: they might not achieve what they purport to achieve, even if one accepts that the end to which they are being put is acceptable (whether at an individual or relational level). The ethical concern here would simply be one of misused or wasted resources.

  2. Put too much focus on what can be measured, ignoring or downplaying important aspects of relationships (or other aspects of life) that cannot—or perhaps should not—be measured or quantified.

  3. Promote objectionable exchange-based or bargaining models of interpersonal interaction, by virtue of tracking or logging different aspects of relationships (or other social phenomena).

  4. Undermine mutual trust by encouraging interpersonal surveillance.

  5. Encourage an overly instrumental view of the value of relationships.

  6. Reinforce problematic social scripts or stereotypes.

  7. Put too much responsibility on individuals to solve the problems associated with their relationships or lives (through tracking and optimizing behavior), and not enough on systemic or structural reform (e.g., social supports for vulnerable parties) (see note 1).

Some DTC neurotechnologies might be subject to one or more of these objections, insofar as they overlap with or count as instances of QR technologies. For DTC neurotechnologies that do have an interpersonal element, or that may be used in ways that substantially affect relationships, we therefore suggest that these concerns be added to the list (see note 2).

CONCLUDING DISCUSSION

Suppose that the seven ethical objections discussed by Kreitmair are expanded to include our seven additional objections, for a total of fourteen. At first blush, the sheer number of ethical concerns one might have about DTC neurotechnologies could be taken to suggest that they should simply be avoided or discouraged in some way, if not outright prohibited (for a related discussion, see Danaher et al. 2017).

But this depends on how one views the status of those fourteen concerns. It depends, for example, on whether one views any (or perhaps all) of them as knockdown arguments against the development, use, or availability of the technology, or whether they are instead better seen as ethical guidelines or constraints that should govern such development, use, or availability, rather than as reasons to avoid the technology altogether. As one of us has argued elsewhere:

as long as one does not subscribe to a strong technological determinism, according to which the mere availability of a given technology inevitably produces certain social outcomes (thus making most ethical discussion irrelevant), the solution does not lie in avoiding potentially problematic technologies altogether. Instead, it has to do with attempting to anticipate any difficulties that may be associated with those technologies and then modifying the context (social, legal, etc.) in which they would most likely be used (Earp et al. 2015, 329).

As we understand Kreitmair, it is something like this approach that she would endorse with respect to the original seven concerns raised in her piece. In our articles on QR technologies, however, we went a bit further, arguing for a stance of “cautious optimism.” That is, despite the existence of various valid ethical concerns, we suggested that QR technologies could plausibly be developed, used, and made available in a responsible way, such that they would have a genuinely positive impact on at least some people’s intimate relationships. Part of our motivation in arguing this was that it can be easy, when considering a vexing new technology, to think of all the ways it might go wrong. But through ethical deliberation and active efforts to put appropriate policy measures in place, even technologies whose risks may seem more salient than their benefits might be harnessed to better ends. Might there be grounds for cautious optimism about DTC neurotechnologies as well?

Kreitmair has raised a number of nuanced ethical concerns about DTC neurotechnologies. We have added another seven. These concerns suggest that care, and perhaps even caution, may be appropriate in this case. In general, however, concerns raised in ethics research about emerging technological developments need not be seen as automatic stop signs. Rather, they might be seen as the contours of a roadmap for potential regulation of the technologies, and perhaps even a green light for those technologies that demonstrate respect for, or adequately incorporate, the ethical constraints that have been identified.

Notes

1 It is worth noting that this last objection adds an important dimension to our consideration of the justice of these technologies. Justice is not simply a matter of equal access to the technology, but also of how the technology affects the distribution of other social burdens and benefits. This dimension seems to be overlooked in Kreitmair’s analysis of justice.

2 We wish to emphasize that some of the objections listed above might also apply to DTC neurotechnologies that are not QR technologies. For example, some non-QR DTC neurotechnologies might be ineffective and therefore wasteful of resources; some might reinforce problematic social scripts or stereotypes, undermine mutual trust among people, etc.

REFERENCES

  • Clark, M. S., B. D. Earp, and M. J. Crockett. in press. Who are “we” and why are we cooperating? Insights from social psychology. Behavioral and Brain Sciences.
  • Danaher, J., B. D. Earp, and A. Sandberg. 2017. Should we campaign against sex robots? In Robot sex: Social and ethical implications, ed. J. Danaher and N. McArthur, 47–71. Cambridge, MA: MIT Press.
  • Danaher, J., S. Nyholm, and B. D. Earp. 2018a. The benefits and risks of quantified relationship technologies. The American Journal of Bioethics 18(2): W3–W6. doi: 10.1080/15265161.2017.1422294.
  • Danaher, J., S. Nyholm, and B. D. Earp. 2018b. The quantified relationship. The American Journal of Bioethics 18(2): 3–19. doi: 10.1080/15265161.2017.1409823.
  • Earp, B. D., A. Sandberg, and J. Savulescu. 2015. The medicalization of love. Cambridge Quarterly of Healthcare Ethics 24(3): 323–336. doi: 10.1017/S0963180114000206.
  • Earp, B. D., and J. Savulescu. 2018. Love drugs: Why scientists should study the effects of pharmaceuticals on human romantic relationships. Technology in Society 52(1): 10–16. doi: 10.1016/j.techsoc.2017.02.001.
  • Earp, B. D., and J. Savulescu. 2020. Love drugs: The chemical future of relationships. Stanford: Stanford University Press.
  • Griffy-Brown, C., B. D. Earp, and O. Rosas. 2018. Technology and the good society. Technology in Society 52(1): 1–3. doi: 10.1016/j.techsoc.2018.01.001.
  • Jennings, B. 2016. Reconceptualizing autonomy: A relational turn in bioethics. Hastings Center Report 46(3): 11–16. doi: 10.1002/hast.544.
  • Kreitmair, K. 2018. Phenomenological considerations of sex tracking technology. The American Journal of Bioethics 18(2): 31–33. doi: 10.1080/15265161.2017.1409842.
  • Kreitmair, K. V. 2019. Dimensions of ethical direct-to-consumer neurotechnologies. The American Journal of Bioethics Neuroscience 10(4): 152–166.