Review of
"Investigating the replicability of preclinical cancer biology"


Submitted by mjbeyeler  

June 27, 2024, 8:09 a.m.

Lead reviewer


Review Body


Did you manage to reproduce it?
Partially Reproducible
Reproducibility rating
How much of the paper did you manage to reproduce?
4 / 10
Briefly describe the procedure followed/tools used to reproduce it

The authors shared their code on a dedicated OSF website, which you can find here:

We browsed this website and found individual scripts reproducing the results of individual papers, as well as a shared GitHub repository that produces the central meta-analysis table of the paper.

We first tried to get the shared scripts working on Google Colab, to make the work more accessible to interested researchers. Due to Colab limitations, we then moved to our in-house servers using RStudio Server, and finally switched to rocker (Docker images for R) in order to work with a compatible old R version.

Briefly describe your familiarity with the procedure/tools used by the paper.

I'm not very familiar with cancer research, but this meta-analysis is mostly an aggregation of statistical effect sizes, which I am familiar with, and it is done in R, a language I am also familiar with.

Which type of operating system were you working in?
Linux/FreeBSD or other Open Source Operating system
What additional software did you need to install?
  • Docker container with a compatible R version.
What software did you use

R inside a rocker Docker container, with R version 4.0.1 installed in it.

What were the main challenges you ran into (if any)?
  • dealing with dependencies to get the scripts working
  • the Docker container with the old R version no longer working, which forced us to switch to rocker
  • making system-wide installs work inside the Docker container
What were the positive features of this approach?

Sharing a Docker container with all the necessary dependencies pre-installed would greatly increase the reproducibility of the work.

It would then be much easier for researchers to inspect the figures, make slight changes to the methods, and see how that affects the results.

It took us almost a full day to get some key scripts working; with a functional Docker container this would be immediate.

Any other comments/suggestions on the reproducibility approach?

The researchers should have shared a Docker container or something similar. Their R lockfile was not enough, because it is not compatible with the current R version.
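As a sketch of what such a shared container could look like: the Dockerfile below pins the old R version via a rocker base image and restores the project's dependencies from its lockfile. The image tag, the system libraries, and the assumption that the project ships an `renv.lock` are illustrative, not taken from the authors' repository.

```dockerfile
# Sketch only: pin the R version the scripts expect via a rocker base image.
FROM rocker/r-ver:4.0.1

# System libraries commonly needed to compile R packages from source
# (curl, SSL, and XML headers); adjust to the project's actual needs.
RUN apt-get update && apt-get install -y --no-install-recommends \
        libcurl4-openssl-dev libssl-dev libxml2-dev \
    && rm -rf /var/lib/apt/lists/*

# Copy the project (assumed to include an renv.lock) and restore dependencies.
WORKDIR /project
COPY . /project
RUN R -e "install.packages('renv'); renv::restore()"

CMD ["R"]
```

Building this once (`docker build -t repro-cancer .`) and publishing the image would let readers run the analysis scripts without fighting dependency and R-version mismatches.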


Documentation rating
How well was the material documented?
7 / 10
How could the documentation be improved?

Provide a step-by-step guide for getting the scripts working from scratch, and test that the guide actually works.

What do you like about the documentation?

All the scripts and data are findable on the OSF repository; this is often not the case, and is a big positive!

After attempting to reproduce, how familiar do you feel with the code and methods used in the paper?
8 / 10
Any suggestions on how the analysis could be made more transparent?

Line-by-line comments on the code (ChatGPT could help generate these) :)


Reusability rating
Rate the project on reusability of the material
7 / 10
Permissive Data license included:  
Permissive Code license included:  

Any suggestions on how the project could be more reusable?

The repository doesn't have a license at all, which means all rights reserved: the project cannot legally be reused or built upon!

Any final comments

This ReproHack was a great idea. I hope it will be continued.