
Fake Research and Harmful Findings: Introduction to the Special Issue


ABSTRACT

The traditional mutual support of scientific progress and social advancement has given way to public reservation. Research is no longer regarded as worthwhile as a matter of course. Parts of the public have come to fear both scientific error and scientific success. This raises the question of how to deal with findings that could have a detrimental impact on society. In a different vein, fake research poses a serious challenge to science in that it could undermine the credibility of scientific accounts. Fake research actively produces ignorance rather than knowledge (agnotology). The question is how such actively misleading approaches are to be identified and distinguished from ordinary scientific error.

Aristotle famously opened his Metaphysics by stating that ‘all men by nature desire to know’, which expresses the appreciation of knowledge and understanding as such, that is, independent of any practical benefit garnered by this knowledge. It is intrinsically worthwhile to understand the world around us. As a result, seeking knowledge and doing research benefits the common good. The Scientific Revolution of the seventeenth century added the thought that scientific knowledge and understanding also contribute to improving the lot of humankind. Seeking the truth and enabling intervention in nature are twin enterprises. By teaching us to use the forces of nature as tools, science guides and empowers humans to become masters of their own destiny. Truths about natural processes can be utilised for developing capabilities or skills. Francis Bacon claimed that the new science was supposed to enrich human life and to ease its hardship by new inventions; and the new means to promote this end was knowledge of the causes and the laws of nature (Bacon 1620, I.§81, I.§129). In the same vein, René Descartes assumed that the new science promised to afford ‘knowledge highly useful in life’. If we come to know the forces of nature ‘as distinctly as we know the various crafts of our artisans, we might apply them in the same way to all the uses to which they are apt, and thus render ourselves the lords and possessors of nature’ (Descartes 1637, IV.2, p. 101). Bacon and Descartes conceived the notion of ‘applied science’ in the sense that understanding the workings of nature engenders practically relevant knowledge. In this vein, Bacon distinguishes between experiments that bring light and experiments that bear fruit and claims that, as in cultivating fruit, the light precedes the fruit (Bacon 1620, I.§70, p. 149). On the whole, scientific knowledge was treasured in itself and at the same time valued as a source of practical skills.

A few decades later, scientific knowledge came to be considered a driving engine of political progress. Eighteenth-century Newtonianism adopted John Locke’s political philosophy and sought to merge the growth of knowledge about nature with the advancement of political rights and the separation of powers, the launch pad for liberal democracy in the modern sense (Koyré 1965, 18). Fast-forwarding to the twentieth century, we see the Vienna Circle promoting the interconnection between empiricism, which included the appreciation of scientific knowledge, and utilitarianism as a cherished position in moral philosophy. For the Circle, the commitment to scientific knowledge and scientific philosophy was a major instrument for promoting social progress (Carnap, Hahn, and Neurath 1929, 202, 210, 221–222). Again, scientific knowledge was highly esteemed in itself and appreciated for making headway in social and political matters.

Today, this honeymoon is over. Research is no longer regarded as worthwhile as a matter of course. Parts of the public have come to fear scientific error and scientific success alike. As to the first item, technology and technological innovation are no longer blissfully accepted. Environmental disasters produced by new chemical substances, such as DDT, CFCs, or the presently debated PFAS, have suggested to some quarters of the wider public that technologies may be accompanied by unanticipated risks and side-effects. In some cases, people have come to see that the implementation of research findings has done more harm than good.

Whereas this kind of fear addresses the lack of knowledge and the risk of scientific error, a different kind of fear is generated by scientific success. People are not afraid of the failed cloning of humans, but of its accomplishment. The feeling is that if science is left to its own devices, it will create morally doubtful opportunities that are liable to transform society in an unwanted and awful way. The present debates about the prospects and risks of artificial intelligence are of the same type: will future AI advance human well-being or will it make living and working conditions more precarious and eventually throw humankind on the scrap heap?

It goes without saying that hardly any such issue will be clear-cut; many will leave room for debate. The chief point here is that it is generally and unmistakably recognised that research may do harm.

Janet Kourany strongly emphasises this conclusion. Starting from the natural sciences, she points out that certain kinds of biological research are widely acknowledged to be hazardous. A case in point is the highly efficient flu virus created by the Rotterdam biologist Ron Fouchier in 2011, which proved fatal to ferrets. For obvious reasons, its effect was not tried out in humans, but the moral and legal question arose whether the result should be published or better covered by the veil of beneficial ignorance. The fear was that Fouchier’s discovery could be abused by bioterrorists (Kourany and Carrier 2020, 10). Many scientists think that research of this kind has a high potential for being harmful and should not be pursued.

As Kourany argues, the same logic should be applied to the social sciences. The area she has in mind is cognitive differences research, which seeks to identify and explore disparities in the intellectual capacities of diverse segments of society. Such research has a high potential for doing damage to women and minorities and is likely to interfere with human flourishing. More concretely, what is harmful is detrimental information about others and the lack of helpful information, that is, knowledge that science does not offer. In order to avoid inflicting harm on people, science ought to promote egalitarian or equality-supporting social values. As Kourany argues, letting research be guided by such values can proceed in good harmony with epistemic demands. A socially engaged science is in no way tantamount to an epistemically crippled science; it will be neither biased nor incoherent. What is impaired, of course, is the freedom of research. Such freedom needs to give way to moral values such as the preservation of health or social equality.

Torsten Wilholt addresses the same question of how to deal with a trade-off between epistemic aspirations and social or moral values. He sides with Kourany and agrees that the moral and social damage done by research endeavours militates against their pursuit. Yet he also inquires into the impact such stopped or blocked projects have on the credibility and trustworthiness of science. It is clear that science would lose its competence in such abandoned fields, and such a loss of credibility could be damaging to socially and morally relevant ambitions. The withdrawal of science could cede the ground to scientifically uneducated laypeople. The attempt to avoid producing harm could, via the retreat of science, give free rein to political actors who produce harm themselves. Wilholt’s suggestion is that science should not explicitly withdraw from contested issues via bans and moratoria but rather tacitly scale back research activities. In fact, it is an old political slogan that one should never openly announce the closing down of an institution. What will happen is that its defenders will raise resistance and rally support, so that the institution comes out more robust and durable than before. The only way to close down an establishment is to slowly and tacitly starve it to death.

The second topical chunk of this special issue is fake research. Fake research is not to be confused with scientific error. Many bona fide research endeavours may make mistakes on some occasions and come out flawed, but they are unquestionably part of the scientific enterprise. In a Millian spirit, one might even say that they benefit this enterprise by providing opportunities for other scientists to hone their views and sharpen their positions. By contrast, fake research either deliberately misrepresents the data, distorts opposing approaches, and proposes misleading and ill-supported conclusions, or alternatively commits egregious methodological blunders to the same effect. In both varieties, fake research actively produces ignorance rather than knowledge. Pseudo-research of this kind was first identified by Robert Proctor in 1992, who coined the term ‘agnotology’ for the theory of such ‘agnogenetic’ or ‘agnotological’ moves (Proctor 2008, 27–28).

Not every maintenance or production of ignorance is harmful, though. Most of us care for the protection of privacy, and many would want to stay ignorant about an imminent fatal genetic disease they may be afflicted with if no therapeutic options are available. Such ignorance may be ‘virtuous’. Furthermore, it is a common feature of research that if certain aspects are highlighted, others are thereby eclipsed. Ignorance may thus be an unintended but unavoidable by-product of the gain of knowledge. For instance, if AIDS is conceived and addressed as a biomedical problem of how to combat the virus, the complementary public health problem of how to deal with those afflicted with the virus (that is, access to medical care, assistance in their daily lives, etc.) is left untreated (Kourany and Carrier 2020, 12).

Fake research, as dealt with in this special issue, is of the harmful variety. The question addressed is how to conceive of and conceptualise this kind of endeavour. Proctor had placed the intention to mislead at centre stage. His pivotal notion is the active and strategic construction of ignorance (Proctor 2008, 7–8). Such agnogenetic machinations are part of a plot to deceive the public and politics. It may, however, appear impractical to have agnotology turn on possibly hidden or actively concealed intentions and objectives. More tangible criteria seem desirable.

Inmaculada de Melo-Martín looks for such defining characteristics and comes out empty-handed. As she argues, no reliable indicators of fake research can be identified. Worse still, no research endeavour can be dependably branded as producing ignorance and harm, because each has the capacity to stimulate criticism and raise opposition, both of which would actually be beneficial to the production of knowledge. Conversely, labelling certain accounts as ignorance-producing and leaving them out of consideration is liable to weaken critical scrutiny and thereby hurts the epistemic quality of research. This applies, in particular, to the supposed criterion of bad-faith motivation or the willingness to misinform the public or politics. What is more important is to focus on the role that social, political, and moral values play in the formulation and implementation of science-based public policies. Agnotological categories should be replaced with the more comprehensive question of the role nonepistemic values play in respectable research.

By contrast, Mathias Girel argues that appealing to intentions in identifying agnogenetic moves is feasible and appropriate. Behavioural indicators suffice to narrow down the range of relevant intentions. The ascription of intentions is achieved by drawing on conceptual resources from the philosophy of action. Following Elizabeth Anscombe, describing actions in general presupposes the identification of critical intentions. Such intentions are constitutive of the nature of the corresponding action. Establishing intentions is thus not a requirement specific to pinpointing agnotological behaviour, but rather a prerequisite of deciding what agents are doing in the first place. Singling out agnogenetic behaviour requires disentangling the nested intentions at work, but this is what we achieve routinely in discursive interaction.

Martin Carrier also argues that agnogenetic machinations can be delineated adequately. Given this ability, they should, in fact, be identified and ostracised, since they are apt to produce severe harm in the area of political regulation. Two kinds of fake research are relevant. The first comprises agnogenetic ploys in which scientific dissent is created by interested parties from industry or politics in order to support their own partisan goals. Such endeavours are characterised by three common elements: a methodological rule is violated, economic or political goals are served, and a regulatory process (e.g. the market admission of a substance) is interfered with. The chief harm done by scientifically unjustified dissent does not arise in the sciences or in epistemic respects in the first place. The trouble lies rather with the impact of such ploys on the political sphere of regulating markets. Carrier also claims that such agnotological accounts can be identified by a variety of indicators. The second stronghold of agnotological accounts is the populist antiscience movement, which mistakenly suspects fake research in the scientific mainstream. Carrier suggests three remedies to reduce or eliminate the influence of such groups: disclosing fallacies, improving the understanding of scientific methods, and distinguishing more clearly between science and politics in political decision-making.

All contributors to this special issue agree that the old covenant between science and social goods has been abrogated. We have moved beyond attitudes popular in the 1960s that advertised technocratic thought and placed science at the helm of social evolution. Think of Jacques Monod, who promoted social and political institutions whose primary concern is to defend the enrichment of the kingdom of ideas. In his view, humans create this kingdom but are its subjects at the same time. Science is not only the source of truth but also of moral inspiration (Monod 1969, 180). This passage may easily be misunderstood to the effect that humans are primarily called upon to extend scientific knowledge. Such primacy of epistemic aims sounds distressing to contemporary ears. It is not universally accepted, but many scientists, politicians, philosophers, and social students of science agree that societal institutions need to guide, and do justifiably guide, scientific research in the social arena. The public should be protected against harmful and misleading research. That is the overall message of this special issue. The Scientific Revolution brought into the world the idea that humans are the masters and owners of nature. Now the time has come to add that humans, or rather society, are the masters and owners of science.

Disclosure Statement

No potential conflict of interest was reported by the author(s).

References

  • Bacon, Francis. 1620. Neues Organon. Edited by W. Krohn, translated by R. Hoffmann, Latin/German. Hamburg: Meiner, 1990. The New Organon. Translated by J. Spedding, R. L. Ellis, and D. D. Heath. The Works VIII. Boston: Taggard and Thompson, 1863.
  • Carnap, Rudolf, Hans Hahn, and Otto Neurath. 1929. “Wissenschaftliche Weltauffassung. Der Wiener Kreis.” In Logischer Empirismus. Der Wiener Kreis, edited by H. Schleichert, 201–222. München: Fink, 1975.
  • Descartes, René. 1637. Discours de la méthode. Hamburg: Meiner, 1960.
  • Kourany, Janet, and Martin Carrier. 2020. “Introducing the Issues.” In Science and the Production of Ignorance: When the Quest for Knowledge is Thwarted, edited by J. Kourany and M. Carrier, 3–25. Cambridge, Mass.: MIT Press.
  • Koyré, Alexandre. 1965. Newtonian Studies. Cambridge, Mass.: Harvard University Press.
  • Monod, Jacques. 1969. Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology. New York: Vintage Books, 1972.
  • Proctor, Robert N. 2008. “Agnotology: A Missing Term to Describe the Cultural Production of Ignorance (and its Study).” In Agnotology: The Making and Unmaking of Ignorance, edited by R. N. Proctor and L. Schiebinger, 1–33. Stanford: Stanford University Press.
