Guest Editorial

Direct-to-Consumer Neurotechnology: What Is It and What Is It for?

This article refers to:
Dimensions of Ethical Direct-to-Consumer Neurotechnologies

WHAT IS DIRECT-TO-CONSUMER NEUROTECHNOLOGY?

The use of neurotechnology applications in the consumer domain for extra-clinical purposes has recently attracted significant attention among researchers. Different labels have been used in the attempt to define this heterogeneous and rapidly developing technological market, among them: ‘consumer’ (Ienca, Haselager, and Emanuel 2018), ‘consumer-directed’ (Kellmeyer 2018), and ‘direct-to-consumer (DTC)’ neurotechnology (Kreitmair 2019; Wexler and Reiner 2019).

A taxonomy is starting to emerge from this discussion. While prima facie synonymous, these labels identify overlapping but non-identical subsets of neurotechnology. Consumer neurotechnology is the subset of all neurotechnological applications used in the consumer market. This subset includes two main categories: (i) the study of consumer behavior in the neuromarketing setting; and (ii) DTC devices and services for personal use.

The former category involves the use of standard neurotechnology (e.g. functional magnetic resonance imaging, fMRI, or electroencephalography, EEG) to reveal hidden information about the consumer experience, including preference measurements (Ariely and Berns 2010). In recent years, the number of companies using neuromarketing research approaches has increased exponentially (Plassmann, Ramsøy, and Milosavljevic 2012).

The latter category is defined by Kreitmair (2019) as neurotechnological products “that can be purchased directly by a consumer, without any involvement of a researcher or treating clinician.” She observes that DTC neurotechnologies have three common features: they tend to be personal, digital, and mobile. They are considered personal as they tend to be used by only one individual at a time, digital as they utilize binary computing systems and online connectivity, and mobile because their hardware is typically “small and light-weight enough that individuals can carry them around on their persons” (153).

While this description adequately captures the essential features of DTC neurotechnologies, researchers debate where the boundaries of this subset should be drawn. Kreitmair advocates for an inclusive approach that encompasses not only consumer-grade brain-computer interfaces (BCIs) and personal neuromodulation devices but also more widely adopted technologies such as wristband activity trackers, smartphones, and mobile apps. Her rationale is that although these devices and programs neither read nor write neural information, they can nonetheless be used to indirectly monitor and make inferences about psychological functions. As an example, she cites the E4 wristband from Empatica, a wearable tracker that combines a Global Positioning System (GPS) with electrodermal activity data from galvanic skin responses to allegedly reveal “how emotional arousal varies with location” (Kreitmair 2019). In her view, since emotional arousal is a psychological function—and, we would add, since psychological functions are physically realized via neural activity—then even wearable trackers like the E4 might be considered a DTC neurotechnology. This argument can be formalized as follows: if T can be used to make inferences about PFs, then T∈N (T = technology, PF = psychological function, N = neurotechnology).
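Rendered as a display formula (our own notation, using the variables just defined), the inclusion criterion reads:

$$\forall T \;:\; \big(\exists\, \mathit{PF}\ \text{such that}\ T\ \text{can be used to infer}\ \mathit{PF}\big) \;\Rightarrow\; T \in N$$

The antecedent is satisfied by a very wide range of everyday technologies, which is the source of the over-inclusiveness discussed next.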

This liberal definition raises a taxonomic risk: if even devices like the Fitbit and the Apple Watch are considered neurotechnologies, then the definition of DTC neurotechnology becomes so broad as to be semantically and pragmatically vacuous. Psychological functions such as affective states can be reliably inferred from a huge variety of non-neural measures. Paul Ekman’s famous tests of emotion recognition, such as the Pictures of Facial Affect (POFA) stimulus set and the Facial Action Coding System, are well-known metrics for inferring emotions from facial expressions. Similarly, research in speech processing and machine learning has shown that both the content and prosody of spoken language can be used for emotion recognition or even for “inferring clinical depression” (Asgari, Shafran, and Sheeber 2014). This would imply that any voice recorder, or any technology that collects photographic or videographic records of facial expressions (e.g. any camera), should be considered a neurotechnology too, even though such devices do not establish any meaningful interface with someone’s nervous system. But if everything is neurotechnology, then nothing is.

These technologies are structurally and functionally diverse, and they do not directly process neural activity. It is therefore unlikely that a broad definition of DTC neurotechnology, like the one proposed by Kreitmair, can be of much use to scientists and regulators. In contrast, a narrower definition that restricts the neurotechnology label to “wearable devices for recording and uploading our brain activity” (Kellmeyer 2018) appears epistemically and normatively far preferable.

While perhaps taxonomically unhelpful, Kreitmair’s observations about the DTC neurotechnology “continuum” have the merit of capturing an important implication of consumer neurotechnology: if enough data are collected and adequately analyzed, it is possible to make inferences about psychological functions not only from neural recordings (e.g. EEG or fMRI data) but also from non-neural measures such as electrodermal activity, voice recordings, online searches, and so on. More broadly, “big data” allows researchers and companies to make health-related inferences from non-medical data. As we have argued elsewhere (Ienca et al. 2018; Ienca, Vayena, and Blasimme 2018; Vayena and Gasser 2016), novel scientific and ethical challenges are raised by the increasing possibility of collecting, aggregating, and mining differently structured data from myriad data points and subsequently identifying patterns in them using machine learning and other intelligent analytics. To meet these challenges adequately, scientists, ethicists, and regulators should focus not only on data types or on where the data originate but also on what the collected data can be used for, as the sketch below illustrates. This does not require conflating scientific or technological taxonomies, but it will likely benefit from dynamic approaches to technology assessment, ethics, and regulation.
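To make the inferential pattern concrete, consider the following minimal Python sketch. It uses entirely synthetic data and invented feature names (skin-conductance summaries, sleep estimates, browsing-log counts, typing telemetry); nothing here is drawn from any real product or study. The point is only that routine, non-neural, non-medical data streams can be aggregated and fed to an off-the-shelf classifier to yield a health-relevant inference.

```python
# Hypothetical sketch: inferring a health-related label from non-medical,
# non-neural consumer data. All data are synthetic and all feature names
# are illustrative assumptions, not measurements from any real device.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500  # one row per person-day

# Aggregated daily summaries from heterogeneous consumer sources.
electrodermal_peaks = rng.poisson(8, n)       # wearable skin-conductance events
nightly_sleep_hours = rng.normal(7, 1.2, n)   # phone/wearable sleep estimate
late_night_searches = rng.poisson(3, n)       # browsing-log summary
typing_speed_cpm = rng.normal(190, 30, n)     # keyboard telemetry

# Synthetic "ground truth": low mood loosely driven by short sleep and
# late-night activity (a toy generative assumption, for illustration only).
logit = 0.8 * (7 - nightly_sleep_hours) + 0.3 * (late_night_searches - 3)
low_mood = rng.random(n) < 1 / (1 + np.exp(-logit))

# Aggregate the differently structured signals into one feature matrix
# and fit a standard classifier to identify patterns.
X = np.column_stack([electrodermal_peaks, nightly_sleep_hours,
                     late_night_searches, typing_speed_cpm])
X_train, X_test, y_train, y_test = train_test_split(X, low_mood, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

Nothing in this sketch reads or writes neural activity, yet its output is a health-relevant inference about a psychological state, which is precisely why assessment should track what collected data can be used for, and not only where the data come from.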

WELLNESS, WELLBEING AND HUMAN FLOURISHING

Kreitmair’s analysis discusses what DTC neurotechnology is and what it is for, that is, what its ultimate function can be. The author rightly argues that “while they may address concerns that are also addressed by medical devices, DTC neurotechnologies (in the US), by definition, cannot be intended for the treatment or management of particular medical conditions, else they would be subject to FDA regulation” (155). Nonetheless, she correctly acknowledges that the lines are blurred and that several DTC products touch “on conditions the treatment of which may also fall into the clinical arena.” This is due not only to the health-related big data quandary described above but also to the fact that several DTC products make para-clinical claims (Coates McCall et al. 2019).

According to Kreitmair, once “we recognize that consumer products are not intended to fulfill the function of addressing medical needs” we become better-positioned to identify “the appropriate ethical desiderata for such products” and to recognize their “ultimate function […] namely to contribute to ‘the good life’ of the user.” In her view, this notion of “‘the good life’ corresponds broadly to the Aristotelian idea of ‘a life of human flourishing.’”

DTC neurotechnologies are complex sociotechnical constructs; hence we concur that their ethical desiderata are not reducible solely to assessing their safety and effectiveness. Ethical considerations regarding, among other things, fair access to technology, cognitive liberty, the privacy of brain data, the right to object to automated processing, and the ethical boundaries of cognitive enhancement are essential for an adequate assessment of this technological family. That being said, it is questionable whether human flourishing in the Aristotelian sense is the ultimate function of these devices, for two main reasons.

First, the function of a technology is not defined by its marketing claims. DTC neurotechnology companies frequently incorporate claims related to wellbeing in their marketing strategies. For example, the company Emotiv advertises that their DTC neuroheadsets can improve the “mental wellbeing” of users. Similarly, the rival company Neurosky claims their products can “deliver unique insights into body and mind health and wellness that can motivate people to make better lifestyle choices.”1 However, there is no evidence that these devices can actually improve wellbeing or wellness according to any of the established assessment scales such as Quality of Life (QoL), happiness, life-satisfaction, or preference-based measures.

Second, even if we assume for the sake of argument that DTC neurotechnologies could improve wellbeing by some relevant metric, those physical or psychological measures of wellbeing would not be equivalent to human flourishing in the Aristotelian sense. Human flourishing is one popular English translation of the Greek word “eudaimonia” (Greek: εὐδαιμονία), sometimes also translated as happiness or welfare. In Aristotle’s account, as articulated in the Nicomachean Ethics and the Eudemian Ethics, eudaimonia involves “virtuous activity in accordance with reason” [1097b22–1098a20]. In other words, Aristotle’s notion of human flourishing is much more complex and morally demanding than our modern concepts of wellness and wellbeing. Like many other ancient Greek philosophers, such as Socrates, Epicurus, and the Stoics, Aristotle considered virtue (aretē) and its exercise the primary constituent of eudaimonia, more important than external goods such as health or beauty.

If virtue is inherent to human flourishing in the Aristotelian sense, then in order for DTC neurotechnologies to contribute to human flourishing, they should enable users to exercise, exhibit or improve their virtue. However, there is no evidence that DTC neurotechnologies can have any positive effect on virtue. At present, most DTC products are relatively simple devices for brainwave sensing, real-time feedback and self-quantification, hence quite far from enabling any form of moral enhancement in terms of virtuous activity or practical wisdom. Enhancing moral behavior—for instance by augmenting empathy, reducing aggressiveness, improving practical wisdom, etc.—is not a priority for DTC neurotechnology companies and the current technology is far from enabling users to accomplish these goals.

To conclude, although DTC neurotechnology companies might vaguely refer to wellness and wellbeing in their marketing claims, there is no compelling argument that the ultimate function of DTC neurodevices is contributing to human flourishing in the Aristotelian sense.

DISCLOSURE STATEMENT

The author declares no conflict of interest.

Notes

1 See: https://www.neurosky.jp/company_en/ (last accessed August 27, 2019)

References

  • Ariely, D., and G. S. Berns. 2010. Neuromarketing: The hope and hype of neuroimaging in business. Nature Reviews Neuroscience 11(4): 284–292. doi:10.1038/nrn2795.
  • Asgari, M., I. Shafran, and L. B. Sheeber. 2014. Inferring clinical depression from speech and spoken utterances. In 2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), Reims, France, 1–5.
  • Coates McCall, I., C. Lau, N. Minielly, and J. Illes. 2019. Owning ethical innovation: Claims about commercial wearable brain technologies. Neuron 102(4): 728–731. doi:10.1016/j.neuron.2019.03.026.
  • Ienca, M., A. Ferretti, S. Hurst, M. Puhan, C. Lovis, and E. Vayena. 2018. Considerations for ethics review of big data health research: A scoping review. PLoS One 13(10): e0204937. doi:10.1371/journal.pone.0204937.
  • Ienca, M., P. Haselager, and E. J. Emanuel. 2018. Brain leaks and consumer neurotechnology. Nature Biotechnology 36(9): 805–810. doi:10.1038/nbt.4240.
  • Ienca, M., E. Vayena, and A. Blasimme. 2018. Big data and dementia: Charting the route ahead for research, ethics, and policy. Frontiers in Medicine 5:13. doi:10.3389/fmed.2018.00013.
  • Kellmeyer, P. 2018. Big brain data: On the responsible use of brain data from clinical and consumer-directed neurotechnological devices. Neuroethics: 1–16.
  • Kreitmair, K. V. 2019. Dimensions of ethical direct-to-consumer neurotechnologies. AJOB Neuroscience 10(4): 152–166.
  • Plassmann, H., T. Z. Ramsøy, and M. Milosavljevic. 2012. Branding the brain: A critical review and outlook. Journal of Consumer Psychology 22(1): 18–36. doi:10.1016/j.jcps.2011.11.010.
  • Vayena, E., and U. Gasser. 2016. Strictly biomedical? Sketching the ethics of the big data ecosystem in biomedicine. In The ethics of biomedical big data, 17–39. Berlin, Germany: Springer.
  • Wexler, A., and P. B. Reiner. 2019. Oversight of direct-to-consumer neurotechnologies. Science 363(6424): 234. doi:10.1126/science.aav0223.
