Keywords: Metascience, science reform, methodology, philosophy of science
The scientific community has entered a challenging era, as noted by Wagenmakers (2012). High-profile instances of fraud, failures to replicate foundational studies in psychology, and admissions of questionable research practices (John et al., 2012) initially cast a shadow over the field of psychology and, over the decade that followed, over the broader enterprise of science. In response to these concerns, a movement aimed at reforming scientific practices has emerged (Field, 2022; Munafò et al., 2017; Spellman et al., 2018). This movement has introduced various initiatives to enhance research methods, reduce misconduct, and increase transparency (Ravenzwaaij et al., 2023). Direct replications and articles reporting null results and errors (Devine et al., 2020) have become easier to publish. Concepts such as preregistration and registered reports have gained significant popularity. Additionally, many aspects of science are now more "open," encompassing preprints, open-access publication, open peer review, and openly accessible data and code.
While these initiatives offer immediate benefits to both the scientific community and individual researchers, one might argue that the scientific reform movement is still in its early stages; the long-term effects of many interventions have yet to be assessed. Furthermore, academic science is a complex system, and the downstream consequences of modifying complex systems are notoriously difficult to anticipate. Notably, Devezer et al. (2021) have raised concerns about the "little evidentiary backing" for reform policies and the lack of a framework for assessing their validity and efficacy. Ioannidis (2014) and Tiokhin et al. (2021) share similar apprehensions about the potential unintended consequences of well-intentioned reforms. Additionally, adopting reform practices may carry negative consequences for students and early-career researchers, who are often beholden to supervisors and structures with their own dependence on traditional academia (Field, 2023).
Field, van Dongen, and Tiokhin arrived at an articulation of these concerns individually, yet at around the same time. In the early 2020s, Tiokhin came to van Dongen with his frustration that people were talking about the positive primary effects of the science reform movement without considering potential second- and even third-order effects. They agreed that unintended consequences drive change just as much as the primary catalysts do. Together, they decided they would write a book addressing unintended consequences in the reform movement, and that they would “make waves.” Although the book has yet to materialize, their discussion produced a workshop for the Center for Unusual Collaborations in Amsterdam in 2021 and a session at the virtual Metascience 2021 conference.
Field’s own concerns about the unintended consequences of science reform began in 2020, during her PhD which centred on the “science reform community” and its practices. While she was supportive of the changes the community had been ushering in, she was concerned about the lack of reflexivity that she saw in some of the community’s members and the discomfort some members clearly experienced when questioned about how certain initiatives and practices would affect other aspects of the academic enterprise. Field is an editor at the Journal of Trial & Error and had pitched a special issue concept to the JOTE team in mid-2021, but was unsure if she had the time and energy to bring it to fruition on her own. When she attended the 2021 Metascience session run by Tiokhin and van Dongen (among others), Field realized she could harness some of the fine brain power of her metascientific colleagues and reached out to Tiokhin and van Dongen in the hope that they would help her with the project as guest editors.
To address the challenges we identified, and to ensure that reform efforts remain credible and self-reflective, we argue that the scientific reform movement must continually interrogate its practices and proposed initiatives and remain vigilant to the unintended consequences that may arise. Against this backdrop, this special issue aims to inspire scholars to critically reflect on the trajectory of the scientific reform movement. We now outline the eight articles of the special issue, each of which deals with a different facet of the science reform movement.
Questionable Metascience Practices by Rubin (2023) considers a parallel concept to “questionable research practices”: questionable metascience practices (QMPs). Rubin explores 10 QMPs, including ignoring or rejecting criticism of one’s own proposed reforms and overemphasizing replication. He urges metascientists to “reflect on the ways in which they (a) handle criticism, (b) conceptualize replication, (c) consider researcher bias, (d) avoid sweeping generalizations, and (e) acknowledge the diversity and pluralism of science.”
Reflections on Preregistration: Core Criteria, Badges, Complementary Workflows (2023) by Thibault, Pennington, and Munafò explores the issues surrounding preregistration, a key pillar of the reform movement. While preregistration promises transparency in theory, challenges emerge in practice, such as the absence of analysis plans and issues with the awarding of preregistration badges, especially in clinical research trials. Their recommendations to the reform and clinical research communities in light of these concerns are straightforward: They propose the consideration of “parallel initiatives to simplify and standardize preregistration (e.g., adopt itemized core preregistration criteria), and to leverage complementary workflows that necessitate open research practices.”
Thomas Hostler’s article The Invisible Workload of Open Research (2023) interrogates the burden carried by researchers who practice open science, with concrete examples (a perspective that lends gravitas to existing abstract discourse). Hostler discusses how the adoption of open research practices may exacerbate stress, burnout, and workload pressures in academia, opening a Pandora’s box that some science reformers prefer to keep closed. He advocates for awareness from the science reform community (or communities), stating that “It is neither the specific role nor within the capability of open research advocates to tackle the root causes of workload issues, but they must be aware of the potential implications of their calls for systemic changes in incentives for open research.”
A team of researchers from the Quala Lab, led by Steltenpohl, contributed an article on qualitative open science: Rethinking Transparency and Rigor from a Qualitative Open Science Perspective (2023). They explain that imposing rigid quantitative standards on all research can have unintended negative consequences for many individuals and groups in the science reform movement, cautioning against the perspective that transparency is a “one-size-fits-all” concept. Qualitative researchers have unique considerations, such as reflexivity and positionality statements. Looking ahead, they argue that through “…expanding open science guidelines to leverage a broader array of rigor and transparency-promoting practices (e.g., reflexivity), we can truly begin to advance practices.”
In A Manifesto for Rewarding and Recognizing Team Infrastructure Roles, Bennett and colleagues (2023) consider team science in the context of the reform movement. They discuss professional team infrastructure roles (such as lab technicians, project managers, and data stewards), which involve the crucial work of supporting research but do not carry responsibility for leading team projects, and which therefore typically miss out on reward and recognition under current systems and structures. They underscore the need for fairness in how such personnel are treated, and they bring the issue to light using three case studies. “Acknowledging the contributions of all research roles,” they argue, “will help retain skill and expertise, and lead to collaborative research ecosystems that are well-positioned to address complex research challenges.”
Buzbas and Devezer’s (2023) article Tension Between Theory and Practice of Replication scrutinizes the popular yet somewhat divisive topic of replication. They describe a mismatch between the push for more replication studies, driven by concern about irreproducibility, and the slow, iterative (but crucial) pace of theoretical development. They advance theoretical considerations of “non-exact” replication studies and of meta-hypothesis testing in multi-lab replications, and they warn of the problems that enacting reforms without robust theoretical foundations can pose for the reform movement. They emphasize the need for a theoretical framework of metascience to guide science reform, especially in the context of large-scale replication studies. “Theoretical work is still in its early stages of development and needs to continue,” they explain, while noting that “another major challenge arises for the next generation of reform: How do we bridge the gap between theory and practice?”
Finally, in Reputation Without Practice? A Dynamic Computational Model of the Unintended Consequences of Open Scientist Reputations, Linde et al. (2024) delve into the dynamics of open science (OS) advocacy, exploring how being an advocate for open science can affect academic careers. The article introduces a dynamic model to examine two types of OS behavior (practicing OS and/or advocating OS) and how they can affect the career progression of academic researchers. The authors find that groups that both practice and advocate OS come to dominate a scientific community that values open science, and that advocating OS may not confer the same advantages as practicing it. They write: “These results are encouraging to those who feel practicing open science ‘is not worth it’: in addition to benefits to science at large, our results suggest engaging with OS benefits the individual researcher as well.”
With the objective of stimulating discourse in the realm of scientific reform, and with the structural backing of the Center of Trial and Error and the input of several metascience researchers, we have co-produced a special issue that focuses on "second-order" effects or "second-generation" challenges: issues that may arise during or as a consequence of addressing primary reform concerns. The peer-reviewed contributions in the special issue deal with a range of issues: problematic reformer behaviors, the mismatch between preregistration theory and practice in clinical trial research, the hidden workload of open science, open science challenges for qualitative research, neglected yet important roles in team science, the tension between the theory and practice of replication, the impact of open science advocacy and practice on researchers’ careers, and the role of moral positions on core values in the progress of the science reform movement.
The complexities of scientific reform require thoughtful, well-rounded solutions built on inclusive discussions, and the contributions in this special issue provide a rich tapestry of perspectives to guide our way forward. As we continue to refine our scientific practices, we must ensure that our shared journey towards scientific reform remains reflexive, critical, and balanced.
Bennett, A., Garside, D., Praag, C. G. van, Hostler, T. J., Garcia, I. K., Plomp, E., Schettino, A., Teplitzky, S., & Ye, H. (2023). A manifesto for rewarding and recognizing team infrastructure roles. Journal of Trial & Error, 4(1), 60–72. https://doi.org/10.36850/mr8
Buzbas, E. O., & Devezer, B. (2023). Tension between theory and practice of replication. Journal of Trial & Error, 4(1), 73–81. https://doi.org/10.36850/mr9
Devezer, B., Navarro, D. J., Vandekerckhove, J., & Buzbas, E. O. (2021). The case for formal methodology in scientific reform. Royal Society Open Science, 8(3), Article 200805. https://doi.org/10.1098/rsos.200805
Devine, S., Bautista-Perpinya, M., Delrue, V., Gaillard, S., Jorna, T., Meer, M., Millett, L., Pozzebon, C., & Visser, J. (2020). Science fails. Let’s publish. Journal of Trial & Error, 1(1), 1–5.
Field, S. M. (2022). Charting the constellation of science reform [Doctoral dissertation, University of Groningen]. Pure. https://doi.org/10.33612/diss.229114775
Field, S. M. (2023). Risk reform, or remain within the academic monolith? The Psychologist, 36, 45–47. https://www.bps.org.uk/psychologist/risk-reform-or-remain-within-academic-monolith
Hostler, T. J. (2023). The invisible workload of open research. Journal of Trial & Error, 4(1), 21–36. https://doi.org/10.36850/mr5
Ioannidis, J. P. A. (2014). How to make more published research true. PLOS Medicine, 11(10), Article e1001747. https://doi.org/10.1371/journal.pmed.1001747
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953
Linde, M., Pittelkow, M.-M., Schwarzback, N., & Ravenzwaaij, D. (2024). Reputation without practice? A dynamic computational model of the unintended consequences of open scientist reputations. Journal of Trial & Error, 4(1), 82–110.
Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), Article 0021. https://doi.org/10.1038/s41562-016-0021
Ravenzwaaij, D., Bakker, M., Heesen, R., Romero, F., Dongen, N., Crüwell, S., Field, S. M., Held, L., Munafò, M. R., Pittelkow, M. M., Tiokhin, L., Traag, V. A., Akker, O. R., Veer, A. E., & Wagenmakers, E. J. (2023). Perspectives on scientific error. Royal Society Open Science, 10(7), Article 230448. https://doi.org/10.1098/rsos.230448
Rubin, M. (2023). Questionable metascience practices. Journal of Trial & Error, 4(1), 5–21. https://doi.org/10.36850/mr4
Spellman, B. A., Gilbert, E. A., & Corker, K. S. (2018). Open science. In Stevens’ Handbook of Experimental Psychology and Cognitive Neuroscience (Vol. 5, pp. 1–47). Wiley. https://doi.org/10.1002/9781119170174
Steltenpohl, C. N., Lustick, H., Meyer, M. S., Lee, L. E., Stegenga, S. M., Reyes, L. S., & Renbarger, R. L. (2023). Rethinking transparency and rigor from a qualitative open science perspective. Journal of Trial & Error, 4(1), 47–59. https://doi.org/10.36850/mr7
Thibault, R. T., Pennington, C. R., & Munafò, M. R. (2023). Reflections on preregistration: Core criteria, badges, complementary workflows. Journal of Trial & Error, 4(1), 37–46. https://doi.org/10.36850/mr6
Tiokhin, L., Panchanathan, K., Lakens, D., Vazire, S., Morgan, T., & Zollman, K. (2021). Honest signaling in academic publishing. PLOS ONE, 16(2), Article e0246675. https://doi.org/10.1371/journal.pone.0246675
Wagenmakers, E. J. (2012). A year of horrors. De Psychonoom, 27, 12–13.