Review of
"Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: An observational study"
Submitted by Alfredo_Sanchez-Tojar  

Dec. 2, 2021, 9:33 a.m.

Lead reviewer


Review Body


Did you manage to reproduce it?
Fully Reproducible
Reproducibility rating
How much of the paper did you manage to reproduce?
10 / 10
Briefly describe the procedure followed/tools used to reproduce it

I simply opened the paper's Code Ocean container, read the "" file, and ran "manuscript.Rmd" using Code Ocean's built-in run function. This produced a PDF which, content-wise, is presumably equivalent to the published paper. I compared all numerical estimates, tables, and figures presented in the published article (excluding the supplementary material) against this PDF. Everything matched.

Briefly describe your familiarity with the procedure/tools used by the paper.

Although I have a decent amount of experience with R and R Markdown, I had never used Code Ocean before. Code Ocean required signing up and logging in, but otherwise it was straightforward to run the code and obtain all the results.

Which type of operating system were you working in?
Windows Operating System
What additional software did you need to install?


What software did you use

I simply logged into the Code Ocean container.

What were the main challenges you ran into (if any)?


What were the positive features of this approach?


Any other comments/suggestions on the reproducibility approach?


Documentation rating
How well was the material documented?
10 / 10
How could the documentation be improved?

The documentation provided in the OSF project was clear enough to get me to the Code Ocean container. Once in the container, the simple documentation provided in the "" file was enough for me to reproduce the entire PDF in two clicks.

What do you like about the documentation?

Simple and to the point.

After attempting to reproduce, how familiar do you feel with the code and methods used in the paper?
5 / 10
Any suggestions on how the analysis could be made more transparent?

Using a Code Ocean container is a great approach for reproducing the results; however, it also means that someone who simply reproduces the results without looking at any of the provided files (data, scripts, etc.) would gain no understanding of what happens behind the scenes. That said, the authors provided all the necessary files, and the Rmd file, with all the text and inline comments, makes everything clear and transparent.


Reusability rating
Rate the project on reusability of the material
10 / 10
Permissive Data license included:  
Permissive Code license included:  

Any suggestions on how the project could be more reusable?

The MIT License for the code is missing the year and the copyright holder's full name, which makes me wonder about the code's full reusability.

Any final comments

Although I downloaded "manuscript.Rmd" to my laptop and explored it superficially, I did not attempt to run it locally. However, since the authors use an Rmd file containing the paper's text and plenty of inline comments, code reusability seems pretty high. In my opinion, the authors have done a great job of making their study reproducible, and I would like to congratulate them on this effort.