The Reproducibility Project: Psychology (RPP) was a large-scale collaboration among more than 300 faculty and students working to estimate the rate of reproducibility in psychological science. Initiated by Brian Nosek, the project strove to embody open ideals and practices. To accomplish these goals, replication teams from around the world each chose an experiment to reproduce from a sample of psychology papers. Teams worked with original authors and other volunteers to obtain original materials, identify the key findings for replication, plan and conduct analyses, recruit participants, and collect data. Reports on each replication were made publicly available and shared with original authors for comment. A summative report of the collaboration's collective findings was published in Science. The findings suggest that reproducibility is relatively uncommon in psychology: most studies could not be replicated.
Reproducibility, the ability to mimic an existing experiment's protocol and analyses to obtain the same result, is fraught with controversy and nuance. Exact definitions of reproducibility differ between and within fields, and some research methods may lend themselves to replication more readily than others. Opinions differ as well on whether replication is necessary to the scientific process, or at least on how frequently it is necessary. Furthermore, the factors that influence reproducibility are still largely unknown and likely to be both numerous and contextual. It is clear, though, that "error" may stem from the replicator or the replicatee (or both!). These issues meant that the RPP attracted considerable media and professional attention.
I spent three years as a project coordinator, managing the RPP with Mallory Kidwell and the support of COS's metascience team. As a coordinator, I doubled the number of researchers involved, created a public-facing online home for our work, wrote dozens of reports, managed data, tracked progress, communicated with scores of participants, presented at several workshops, and assisted with writing our final manuscript. This work was conducted using platforms like the Open Science Framework and Google Drive to increase transparency and inclusivity.
In this position I came to appreciate the backstage processes of scientific research and saw the struggles inherent in revealing them. I remain interested in how technology may alter, reveal, or conceal research practices. The contentiousness of reproducibility and replication efforts has piqued my interest in perceptions of scientific values and the ways they may be inscribed in our technology or otherwise structure our work.