Much of the evidence in modern health economics is built on experiments, which have been critical to our understanding of everything from the behaviour of the insured and preferences for health and healthcare to health behaviours and lifestyle choices [1]. A problem for the discipline is that the publication process creates a persistent bias: novel positive findings are more likely to be published [2]. Researchers can manufacture positive findings by reanalysing the data until statistical significance is achieved, or by conducting multiple analyses and then focusing on the most striking results. Well-planned and well-executed experiments often produce “negative” findings that are more robust but harder to publish [3]. The Centre for Open Science has recently followed the initiative of the journal Cortex in championing a method of providing more reliable evidence: registered reports [4, 5]. Here we argue that registered reports should be taken up by health economics journals.

Probably the most famous experiment in health economics is the RAND health insurance experiment [6], which started back in the late 1960s and was designed to test how people’s behaviour responded to different types of insurance. Since then, randomised controlled trials have continued to be applied in health economics (such as the Oregon Experiment looking at the effects of Medicaid on clinical outcomes [7]), alongside the proliferation of other experimental approaches such as laboratory and online experiments (commonly, but not exclusively, discrete choice experiments [8]).

An inherent problem in journal publication is a filter that biases the selection of research for publication. Generally, humans find the unexpected interesting, and editors and reviewers favour results that reject the researcher’s null hypothesis [2]. Knowing this, researchers sometimes select which results to include in manuscripts submitted for publication. Often this takes the form of p-hacking, which occurs when researchers continue to collect data or undertake alternative analyses until they find a statistically significant result [9]. All of this means that studies which produce “positive” results with statistically significant p values are more likely to be published than “negative” ones.

There have been efforts to reduce publication bias through the pre-publication of protocols or the registration of trials, but these practices are not standard in health economics (though there have been calls for registers, e.g. [10]). As a result, researchers are free to retrofit study results with glossier research questions, cherry-pick favourable findings, perform unlimited numbers of analyses and exclude inconvenient data in order to generate (what are perceived to be) more interesting findings. In the absence of pre-publication protocols, then, any given finding in the health economics literature may just be a chance outcome that is not useful for evidence-based policy.

Recently, the editors of several health economic journals published a statement encouraging the publication of negative results [11]. This is a positive step, and evidence is emerging that the statement is having some impact [12], but a statement alone cannot fix the issues. Importantly, reviewers and journal editors are not blinded to the results when assessing a manuscript for publication. So, while negative results may not be used as grounds for explicitly rejecting a manuscript, the direction and level of the reported statistical significance of the results can still play a role unconsciously. There is also a possible unintended consequence of the statement: poorer-quality research may be favoured in an attempt to actively respond to it.

Registered reports are a new approach to journal publishing [13] in which peer reviewers are blinded to the results when considering the merits of an experiment. The authors submit a report to a journal covering the background, methods and proposed analyses before the experiment is conducted. The protocol may be modified in response to the peer reviewers’ recommendations, after which it can be provisionally accepted by the journal. The authors then perform the experiment and expand the article to include the results and discussion, which are again reviewed. However, provided the authors have implemented the agreed protocol, publication is guaranteed. Post hoc analyses are permitted but are clearly labelled as such, so that reviewers and readers alike can judge these findings accordingly.

As noted by Chambers [13], registered reports are not appropriate for all studies. However, there are many studies within the field of health economics that could have been, or indeed would be, eligible. To determine the scope for registered reports, we examined the three most recent issues of five prominent health economics journals. We considered the proportion of studies that would have been eligible for registered reports under two scenarios: a stricter criterion in which only randomised controlled trials (RCTs) were eligible, and a second in which eligibility was broadened to include quasi-experimental (QE) approaches. On average, 3% of the studies in these journals could have been published as a registered report under the strict eligibility criteria, while 29% of the studies were eligible under the broader criteria. The journal-specific results are presented in Fig. 1. Fig. 1 Proportion of published studies in health economics journals that would have been eligible for publication as registered reports. Journals: HE Health Economics, JHE Journal of Health Economics, MDM Medical Decision Making, Pharmacoecon PharmacoEconomics, VIH Value in Health. RCT randomised controlled trials, QE quasi-experimental

Natural experiments are excluded from this analysis because they involve no primary collection of data, which makes the prospective process of registered reporting difficult to apply. However, even for natural experiments, registered reports could readily be applied where the analysis can be specified prospectively, such as before a known policy change.

The take-up of registered reports has varied considerably across fields. As of June 2019, over 200 journals offered registered reports [13]. Psychology leads the way, perhaps in part due to that discipline’s recognition of a reproducibility crisis [14]. This followed the Reproducibility Project, a collaboration of 270 contributing authors who attempted to reproduce 100 published psychological studies. The results, published in the journal Science, found that only 36% could be reproduced [15]. Psychology’s response has been swift, with registered reports rapidly adopted across a range of journals.

While economists recognise the importance of experiments, as demonstrated by the awarding of the 2019 Nobel Prize in economics for experimental work [16], it is highly likely that economics also suffers from a reproducibility crisis [17], yet the adoption of registered reports in the discipline has been slow. Further, there has been little discussion among health economists of the reproducibility crisis and what steps might be taken to address it; for instance, are there any plans for a collaboration of health economists to replicate key empirical studies [18]? The Centre for Open Science currently lists only one economics journal (the Journal of Development Economics [5]) and no health economics journals that accept registered reports.

Beyond the scientific benefit of minimising the potential for publication bias, registered reports offer a potentially important efficiency gain [19]. While they may mean more administrative work for journals, they are likely to save a considerable amount of research funding overall. This is nicely illustrated by a recent controversy regarding a new value set for the EQ-5D-5L, the five-level version of the EQ-5D health status instrument [20]. The study has been criticised by a group at the University of Sheffield [21], partly on methodological grounds and partly because the new algorithm produces results that differ from the existing ones. Surely a better and more efficient approach would have been for this review to take place before the fieldwork component of the study was conducted.

Registered reports could also be extended to tackle the reproducibility crisis, with replication studies submitted as registered reports and peer reviewed by the original authors. This would need support and encouragement from journals. Guaranteeing the publication of replication experiments would significantly change the incentives, rewarding researchers who undertake the important job of validation.

A more radical approach would be to allow authors to make registered reports available on a journal website to encourage empirical researchers to implement them. This would shift a discipline such as health economics closer to a discipline such as physics, which has a clear separation between theoretical and applied researchers, with no expectation that the theoretical researchers conduct experiments to test their hypotheses. It is hard to know if Peter Higgs ever envisaged that the particle he predicted in 1964 would be found 49 years later [22]. Similarly, health economists developing new theories could propose hypotheses that would be tested by others.

Experiments are a powerful tool, but the process of publication means that there is no guarantee that the results of experiments will be reported without bias. We believe that registered reports, in which the protocol is peer reviewed before the experiment has been conducted, will greatly mitigate bias and reduce research waste. Rather than follow the crowd, it is time for health economics as a discipline to adopt registered reports and lead the way for economics as a whole.

References 1. Pauly MV, McGuire TG, Barros PP. Handbook of health economics. Amsterdam: Elsevier; 2011. 2. Emerson GB, et al. Testing for the presence of positive-outcome bias in peer review: a randomized controlled trial. JAMA Intern Med. 2010;170(21):1934–9. 3. Allen C, Mehler DMA. Open science challenges, benefits and tips in early career and beyond. PLoS Biol. 2019;17(5):e3000246. 4. Chambers C. Elsevier Connect: Cortex’s registered reports. 2015. https://www.elsevier.com/editors-update/story/peer-review/cortexs-registered-reports. Accessed 20 Nov 2019. 5. Center for Open Science. Registered reports: peer review before results are known to align scientific values and practices. 2019. https://cos.io/rr/. Accessed 20 Nov 2019. 6. Manning WG, et al. Health insurance and the demand for medical care: evidence from a randomized experiment. Am Econ Rev. 1987;77(3):251–77. 7. Baicker K, et al. The Oregon Experiment—effects of Medicaid on clinical outcomes. N Engl J Med. 2013;368(18):1713–22. 8. Soekhai V, et al. Discrete choice experiments in health economics: past, present and future. PharmacoEconomics. 2019;37(2):201–26. 9. Gelman A, Loken E. The garden of forking paths: Why multiple comparisons can be a problem, even when there is no “fishing expedition” or “p-hacking” and the research hypothesis was posited ahead of time. Working paper. New York: Columbia University; 2013. 10. Sampson CJ, Wrightson T. Model registration: a call to action. PharmacoEconomics Open. 2017;1(2):73–7. 11. Health Economics Journals Editors. Editorial statement on negative findings. Health Econ Policy Law. 2015;10(3):241. 12. Blanco-Perez C, Brodeur A. Publication bias and editorial statement on negative findings. Working paper. Ottawa: University of Ottawa; 2019. 13. Chambers C. What’s next for registered reports? Nature. 2019;573(7773):187–9. 14. American Psychological Association. A reproducibility crisis? 2019. https://www.apa.org/monitor/2015/10/share-reproducibility. Accessed 20 Nov 2019. 
15. Aarts AA, et al. Estimating the reproducibility of psychological science. Science. 2015;349(6251):aac4716. 16. The Nobel Prize. Press release: the Prize in Economic Sciences 2019. 2019. https://www.nobelprize.org/prizes/economic-sciences/2019/press-release/. Accessed 20 Nov 2019. 17. Smith N. Bloomberg: Why economics is having a replication crisis. 2018. https://www.bloomberg.com/opinion/articles/2018-09-17/economics-gets-it-wrong-because-research-is-hard-to-replicate. Accessed 20 Nov 2019. 18. Organizing Committee of The Replication Network. The Replication Network homepage. 2019. https://replicationnetwork.com/. Accessed 20 Nov 2019. 19. Munafò MR. Improving the efficiency of grant and journal peer review: registered reports funding. Nicotine Tob Res. 2017;19(7):773. 20. Devlin NJ, et al. Valuing health-related quality of life: an EQ-5D-5L value set for England. Health Econ. 2018;27(1):7–22. 21. Hernández-Alava M, Pudney S, Wailoo A. Quality review of a proposed EQ-5D-5L value set for England. EEPRU Research Report 060. Sheffield: Policy Research Unit in Economic Evaluation of Health and Care Interventions; 2018. 22. Aitkenhead D. Peter Higgs interview: ‘I have this kind of underlying incompetence’. The Guardian. 2013 Dec 6. https://www.theguardian.com/science/2013/dec/06/peter-higgs-interview-underlying-incompetence. Accessed 20 Nov 2019.

Acknowledgements PC and JB were partially funded by the National Institute for Health Research (NIHR) Oxford Biomedical Research Centre (BRC; grant number NIHR-BRC-1215-20008). The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health and Social Care.

Ethics declarations Conflict of interest The authors declare that they have no conflict of interest.

Electronic supplementary material Supplementary material 1 (XLSX 11 kb)

Rights and permissions Open Access This article is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, which permits any non-commercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc/4.0/.