Evidence-based insights and potential solutions provided by science are now more important than ever: the credibility, reliability and trustworthiness of science are issues of crucial importance (Oreskes, 2019; Kitcher, 2001). Public distrust in science has increased in recent years and has become more vocal in the public sphere. A growing number of voices question whether scientific information is sufficiently valid, disinterested and objective. Additionally, many participants in public debate experience increasing polarization, and social media in particular offer space for distrustful, polarized and accusatory forms of communication. The public sphere – as an arena where citizens come together, exchange opinions and deliberate – has, so it seems, fragmented into multiple bubbles (Nguyen, 2020).
Even with its aim to foster trust in science, the IANUS project recognizes that “healthy skepticism” is crucial, not only as an intrinsic dimension of research methodologies, but also in the context of public debate, one purpose of which is to determine whether and when public trust in science is warranted. Still, beyond the question of how to foster trustworthiness in science, the aftermath of the COVID-19 experience has raised the question whether scientists can still trust public debate. We notice skepticism among scientists about whether a safe and respectful exchange of views is still possible in the public sphere and whether scientific expertise and validated knowledge are still sufficiently valued. Moreover, the term science may refer to a broad spectrum of perspectives, involving many disciplines and paradigms that at times endorse diverging positions, resulting in diverging views concerning policy and decision-making. Involving multiple disciplinary perspectives is an important requirement for developing a comprehensive approach, for instance in the case of the COVID-19 crisis (Sulik et al., 2021), which was not only about viruses and vaccines, but also about cultures, values, governance and behaviour. Yet the spectacle of a plethora of contradictory positions may either challenge public trust in academic knowledge (e.g., through the claim that, for any possible position, an expert can be found who supports it) or raise the question among scientists whether becoming involved in such debates is a meaningful exercise.
Guest editors from the IANUS project, Hub Zwart and Loreta Tauginienė, propose a special issue of the Journal of Academic Ethics to address several possible answers to these questions. In particular, the call addresses the following themes:
What is trust in science? Public trust in science can mean several things: trust in what scientists say (epistemic trust) and in what they do – trust in scientific methods (the reproducibility and replication of research), in research findings, in individual scientists, in research institutions, in the products of research and innovation, and in science as a system.
Science and society: proximity or distance? Societal trust in science hinges on many factors, including the cohesion of scientific consensus on a given topic, the role science is assigned by government and policymakers, the over-extension (or sobriety) of media reports on preliminary research results, and the dilution of scientific objectivity and political neutrality through industry funding and conflicts of interest. While collaboration with societal stakeholders and industry is part of interactive and participatory research, allowing science to broaden its knowledge base, safeguarding transparency and academic independence remains an important dimension of trustworthy interactive research.
Scientific misconduct and trust in science. The most lasting solution is to examine the forces that make such misconduct possible. This allows institutions to re-calibrate the modus operandi of science, including the role of ‘perverse incentives’. What is decisive is not the occurrence of misconduct per se, but the response of research-performing organisations to misconduct: can they move away from a defensive focus on minimising reputational damage towards a proactive approach, opting for prevention by strengthening the resilience of the research ecosystem (Zwart & Ter Meulen, 2019)?
Trust in science and university governance. How can university governance, as one of the internal factors affecting scientists, shape trust in science (e.g., through research security)? How are academic institutions fostering open, responsive, responsible, impact-driven, and inclusive research? How are academic institutions reconsidering their reward systems, e.g., when it comes to the acknowledgment and reward of impact-driven research, and how may this affect public trust in science?
Trust in science, conspiracy theories and the COVID-19 experience. The COVID-19 experience has been a watershed event, also concerning trust in science. On the one hand, scientific expertise was seen as decisive in addressing the global challenge and fostering preparedness. On the other hand, the COVID-19 experience fuelled conspiracy theories (e.g., the recently published document entitled The Conspiracist Manifesto). How to analyse and assess this experience, and what can we learn in terms of pathways for change?
The call for papers will be open until June 30th, 2024. All submitted papers will be subject to double-blind peer review and approval by the editorial team and the JAET editor-in-chief. At least two peer reviewers will be selected from the JAET reviewers’ database. You can follow this link for more information about submission details, and check the Journal’s Instructions for Authors before you start writing your paper.
References
Biddle, J. (2018). “Antiscience Zealotry”? Values, Epistemic Risk and the GMO Debate. Philosophy of Science, 85(3), 360–379. https://doi.org/10.1086/697749
Kitcher, P. (2001). Science, Truth, and Democracy. Oxford University Press.
Koch, S. (2020). Responsible research, inequality in science and epistemic injustice: an attempt to open up thinking about inclusiveness in the context of RI/RRI. Journal of Responsible Innovation, 7(3), 672–679. https://doi.org/10.1080/23299460.2020.1780094
Nguyen, C. T. (2020). Echo chambers and epistemic bubbles. Episteme, 17(2), 141–161. https://doi.org/10.1017/epi.2018.32
Oreskes, N. (2019). Why trust science? Princeton University Press.
Sulik, J., Deroy, O., Dezecache, G., Newson, M., Zhao, Y., El Zein, M., & Tunçgenç, B. (2021). Facing the pandemic with trust in science. Humanities and Social Sciences Communications, 8, 301. https://doi.org/10.1057/s41599-021-00982-9
Zwart, H., & Ter Meulen, R. (2019). Editorial: Addressing Research Integrity Challenges: From penalising individual perpetrators to fostering research ecosystem quality care. Life Sciences, Society and Policy, 15, 5. https://doi.org/10.1186/s40504-019-0093-6