A Community Effort Towards Reproducible Management Science
Whether we can really trust scientific studies depends on one important factor: reproducibility. Scientific results can only be reproduced by other researchers if the underlying data and code are documented transparently. A team including Ben Greiner, Miloš Fišar, and Christoph Huber from the WU Institute for Markets and Strategy has investigated the extent to which this applies to management science. They summarize their most important findings here.
Research in business and management aims to provide sound and credible evidence upon which business and policy leaders can base their decisions. But to what extent can we trust the scientific results? The answer depends on whether the results are transparently documented (reproducible) and whether they are robust and broadly applicable (replicable). While replicability is ultimately an empirical question, to be explored in further studies, reproducibility is a matter of scientific rigor and provides the groundwork for replicability. In our recent article, "Reproducibility in Management Science," we take a closer look and estimate, for the first time, the reproducibility of a broad sample of almost 500 studies published in Management Science, a leading academic journal in business and management.
To enable verification of scientific results, Management Science introduced a policy in 2019 that requires authors to provide their study materials (that is, their data, code, and everything else needed for the empirical or computational analyses), with some exceptions. In the Management Science Reproducibility Project, we directed the collaborative effort of a community of more than 700 experts from relevant fields of research who attempted to reproduce a large and representative sample of articles published before and after this policy change. The findings of this project, reported in our article, describe the current state of affairs, highlight the critical role of disclosure policies in scholarly research, and allow us to put forward suggestions for improving the reliability of research results.
Figure 1 shows the main results of our endeavor: the percentage of studies that can be fully or largely reproduced, both before and since the introduction of the disclosure policy.
Consider the initial situation before the introduction of the policy, when providing code and data was voluntary. Because the materials needed to reproduce a study's results were not made available for 88% of articles, only 7% of the 332 studies from this period could be reproduced (see panel (a) in Figure 1). Among the 40 studies for which the authors did voluntarily provide materials, the reproduction rate was 55% (see panel (b) in Figure 1).
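These numbers are mutually consistent; as a quick back-of-the-envelope check (our own arithmetic here, not a separate figure from the article): 40 of 332 studies is roughly 12%, matching the 88% of articles without materials, and 55% of those 40 studies corresponds to about 22 reproduced studies, or 22/332 ≈ 7% of the full pre-policy sample, which is the overall rate shown in panel (a).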
In our sample of 419 studies published after the introduction of the policy and until January 2023, we observe a remarkable improvement: reproducibility climbed to almost 68% (see panel (c) in Figure 1). When, in addition, all data were available to the assessors and they could meet the software and hardware requirements, 95% of articles could be reproduced (see panel (d) in Figure 1).
These results reveal that the largest remaining challenge to reproducibility since the introduction of the disclosure policy is data accessibility. For a significant number of studies in our sample, data were not available to the assessors: the datasets may have been under a non-disclosure agreement, withheld for privacy reasons, or drawn from subscription databases or other commercial sources to which the assessors did not have access.
Figure 2 displays the main reasons for limited reproducibility. Besides data accessibility, obstacles to reproduction include missing or incorrect code, insufficient documentation, and demanding technical requirements.
Our findings emphasize the critical importance of data and code disclosure policies at academic journals. Such policies seem not only to be associated with a considerably higher rate of reproducibility but also to encourage a culture of openness and integrity in academic publishing. They are essential for producing reliable and trustworthy research, which in turn informs sound decision-making in practice.
Several concrete steps can be taken to raise reproducibility rates further. First, data availability can be enhanced through various means, such as including de-identified data in replication packages, forming agreements with subscription databases for data access, or providing data through specialized infrastructures that restrict use to specific purposes. Second, the review process for code and data can be refined. This might involve making the acceptance of papers conditional on the approval of replication packages and integrating code and data review as an essential step in the manuscript review process at academic journals. Third, code and data review can be professionalized, either in-house at journals or publishers, or by delegating reproducibility certification to specialized third-party agencies.
Such institutional reforms, along with collaborative effort and awareness across the academic community, are key to enhancing the robustness and reliability of results published in academic journals in business and economics. Management Science has already come a long way, from a time when hardly any study materials were available to an enforced disclosure policy that requires each article to provide its study materials (even if it allows for exceptions). However, this journey needs to continue, with sufficient resources made available by the publisher(s), to bring reproducibility to 100%.
Reproducibility is an essential feature of reliable research results, but it cannot guarantee replicability: it does not imply that redoing a study in a different context, or with different data, analyses, or research designs, will yield the same outcomes and conclusions. Reproducibility does, however, lay the foundation by ensuring the validity of reported results and the provision of materials that enable replication attempts and robustness checks, thus supporting our aspiration to reliable and credible scientific evidence.
Detailed research results and further information
Fišar, M., Greiner, B., Huber, C., Katok, E., Ozkes, A., & Management Science Reproducibility Collaboration. Reproducibility in Management Science. Management Science.