Investigating the reproducibility of the social and behavioural sciences | Nature

AI Legal Analyst, April 1, 2026

## Summary
This Nature paper assesses computational reproducibility in a stratified random sample of 600 social and behavioural science papers published from 2009 to 2018 in 62 journals. The authors of 144 papers (24.0%) made data available, and source data were obtained to reconstruct datasets for 38 more. Across 551 claims from 143 assessed datasets, 53.6% of papers were rated precisely reproducible and 73.5% at least approximately reproducible. All shareable data, code and documentation are in a living OSF repository (https://doi.org/10.17605/osf.io/ed8pj), with a registered, archived version frozen at publication (https://doi.org/10.17605/osf.io/kmvst).

## Article Content
Subjects: Scientific community; Social sciences

Abstract
Published claims should be reproducible, yielding the same result when the same analysis is applied to the same data [1,2]. Here we assess reproducibility in a stratified random sample of 600 papers published from 2009 to 2018 in 62 journals spanning the social and behavioural sciences. The authors of 144 (24.0%, 95% confidence interval (CI) = 20.8–27.6%) papers made data available to assess reproducibility and, for 38 others, we obtained source data to reconstruct the dataset. We assessed 143 out of the 182 available datasets and found that 76.6 (53.6%, 95% CI = 45.8–60.7%) papers were rated as precisely reproducible and 105.0 (73.5%, 95% CI = 66.4–80.0%) were rated as at least approximately reproducible (within 15% of the original effects or within 0.05 of the original P values) after inverse weighting each of the 551 claims by the number of claims per paper. We observed higher reproducibility for papers from political science and economics compared with other fields, for more recent papers compared with older papers, and for papers from journals that require data sharing. Implementation of measures to verify that research is reproducible is needed to support trustworthiness in the complex enterprise of knowledge production [3,4].
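The headline rates above are computed per claim and then inverse-weighted so that each paper counts once regardless of how many claims it contributes. A minimal Python sketch of that bookkeeping (illustrative only, not the authors' code; the classification thresholds are taken from the abstract and the function names are hypothetical):

```python
# Illustrative sketch, not the authors' code: classify one claim's
# reproduction outcome using the abstract's thresholds, then compute a
# rate that inverse-weights claims by the number of claims per paper.
from collections import Counter


def classify(original_effect, reproduced_effect, original_p, reproduced_p):
    """Return 'precise', 'approximate' or 'not_reproduced' for one claim.

    'approximate' uses the abstract's stated thresholds: within 15% of
    the original effect, or within 0.05 of the original P value.
    """
    if reproduced_effect == original_effect:
        return "precise"
    within_effect = abs(reproduced_effect - original_effect) <= 0.15 * abs(original_effect)
    within_p = abs(reproduced_p - original_p) <= 0.05
    return "approximate" if (within_effect or within_p) else "not_reproduced"


def weighted_rate(claims, levels):
    """claims: iterable of (paper_id, outcome) pairs. Each claim gets
    weight 1 / (claims in its paper), so every paper contributes equally;
    returns the weighted share of claims whose outcome is in `levels`."""
    claims = list(claims)
    per_paper = Counter(paper for paper, _ in claims)
    total = sum(1 / per_paper[p] for p, _ in claims)  # equals the number of papers
    hits = sum(1 / per_paper[p] for p, outcome in claims if outcome in levels)
    return hits / total


claims = [
    ("paper_A", "precise"),
    ("paper_A", "approximate"),
    ("paper_B", "not_reproduced"),
]
print(weighted_rate(claims, {"precise", "approximate"}))  # paper_A counts once overall
```

Without the inverse weighting, papers making many claims would dominate the rate; the weighting makes it a per-paper statistic, which is why the abstract reports fractional paper counts such as 76.6.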
Figures (captions only):
- Fig. 1: Data and code availability by year of publication.
- Fig. 2: Data and code availability by field.
- Fig. 3: Reproducibility by data and code availability.
- Fig. 4: Reproducibility by year of publication.
- Fig. 5: Reproducibility by field.
- Fig. 6: The percentage of 62 journals with data sharing, code sharing and reproducibility check requirements from 2003 to 2025.
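The availability and reproducibility proportions in this paper are reported with 95% confidence intervals. As a sanity check, the abstract's interval for data availability (144/600 = 24.0%, CI 20.8–27.6%) matches a Wilson score interval; the sketch below assumes that method (the paper's Methods section would confirm the exact procedure used):

```python
# Wilson score interval for a binomial proportion; assuming (not confirmed
# by the text shown here) that this is the interval the paper reports.
from math import sqrt


def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval (z = 1.96 gives a 95% interval)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half


lo, hi = wilson_ci(144, 600)
print(f"{lo:.1%} to {hi:.1%}")  # 20.8% to 27.6%, matching the abstract
```

The Wilson interval is preferred over the simple normal approximation for proportions because it stays inside [0, 1] and behaves better for small samples or extreme proportions.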
Data availability
Data, materials and code associated with this research that can be shared without restriction are publicly available in a living OSF repository (https://doi.org/10.17605/osf.io/ed8pj). The living repository incorporates improvements, fixes and additions made after publication. Readers can also access a registered, archived version of this repository containing precisely the data, code and documentation as they existed on publication of this paper (https://doi.org/10.17605/osf.io/kmvst). The repository includes all available documentation for reproduction attempts, regardless of whether they were completed. This covers most of the data and code from the individual reproduction attempts, except for data that are proprietary or protected, or for which analyst teams were uncertain or unable to confirm that they were allowed to share secondary data. It is possible that some data, materials or code that could be shared openly are not available at the time of publication. Readers are encouraged to contact the corresponding author or the authors of the relevant subproject (Supplementary Table 2) to ask whether more research content can be shared in the living repository. This paper is part of a collection of papers reporting on the SCORE program. Documentation, data and code for the entire program are available at the OSF (https://doi.org/10.17605/osf.io/dtzx4).
Code availability
Code for individual reproduction projects is available alongside data and materials for each project in the OSF repository (https://doi.org/10.17605/osf.io/ed8pj). This includes a push-button package with all code and data used to produce all statistics, figures and tables, and code that populates them directly into the manuscript from a template. A registered, archived version of the repository containing precisely the data, code and documentation used to generate the outcomes reported in this paper is also available at OSF (https://doi.org/10.17605/osf.io/kmvst).
References
1. Dreber, A. & Johannesson, M. A framework for evaluating reproducibility and replicability in economics. Econ. Inq. 63, 338–356 (2025).
2. National Academies of Sciences, Engineering, and Medicine. Reproducibility and Replicability in Science (The National Academies Press, 2019).
3. Culina, A., van den Berg, I., Evans, S. & Sánchez-Tójar, A. Low availability of code in ecology: a call for urgent action. PLoS Biol. 18, e3000763 (2020).
4. Stodden, V., Seiler, J. & Ma, Z. An empirical analysis of journal policy effectiveness for computational reproducibility. Pro

---

## Expert Analysis

### Merits
- Large, stratified random sample: 600 papers from 2009 to 2018 across 62 journals spanning the social and behavioural sciences.
- Transparent workflow: a living OSF repository plus a registered, archived version frozen at publication, including a push-button package that regenerates all statistics, figures and tables.

### Areas for Consideration
- Only 24.0% of authors made data available, so reproducibility could be assessed directly for a minority of the sampled papers, and results may not generalize to papers whose data were withheld.
- "Approximate" reproducibility relies on thresholds (within 15% of the original effects or within 0.05 of the original P values) that involve judgement calls.

### Implications
- Reproducibility was higher for political science and economics, for more recent papers, and for journals that require data sharing, suggesting that journal data-sharing policies are an effective lever.
- The authors argue that implementing measures to verify that research is reproducible is needed to support trustworthiness in knowledge production.
