The results of this paper have been used in multiple subsequent studies as a benchmark against which other methods of performing the same calculation have been tested. Other groups have challenged the results as suffering from finite-size effects, in particular the calculations on mixtures of cubic and hexagonal ice. Should there be time during the event, participants could check this by performing calculations on larger unit cells. Each individual calculation should converge adequately within 96 hours, making it amenable to an HPC ReproHack. Given modern HPC hardware, many such calculations could be run concurrently on a single HPC node.
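As a rough illustration of that last point, below is a minimal Python sketch of launching several independent supercell calculations in parallel on one node. The `dft_code` executable, its flags, and the supercell sizes are placeholders, not the actual setup from the paper; on a real cluster a scheduler job array would be the more typical route.

```python
from concurrent.futures import ProcessPoolExecutor
import subprocess

# Hypothetical supercell sizes for probing finite-size effects.
SUPERCELLS = ["2x2x2", "3x3x3", "4x4x4"]

def run_calculation(cell):
    """Launch one independent calculation; the command line is a placeholder."""
    result = subprocess.run(
        ["dft_code", "--supercell", cell],
        capture_output=True, text=True,
    )
    return cell, result.returncode

# One worker per calculation, all sharing the node's cores.
with ProcessPoolExecutor(max_workers=len(SUPERCELLS)) as pool:
    for cell, status in pool.map(run_calculation, SUPERCELLS):
        print(f"{cell}: exit status {status}")
```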
Metadata annotation is key to reproducibility in sequencing experiments. Reproducing this research using the scripts provided will also reveal how annotation levels have changed in the years since the paper was published in 2015.
The current code is written in Torch, which is no longer actively maintained. Since deep learning in nanophotonics is an area of active interest (e.g. for the design of new metamaterials), it is important to update the code to use a more modern deep learning library such as TensorFlow/Keras.
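As a starting point, porting a simple fully connected Torch network to Keras might look something like the sketch below; the layer sizes, activation, and loss are illustrative assumptions, not the architecture from the paper.

```python
from tensorflow import keras

def build_model(n_inputs, n_outputs, hidden_units=256, n_hidden=4):
    """Fully connected regression network; all sizes here are placeholders."""
    model = keras.Sequential()
    model.add(keras.Input(shape=(n_inputs,)))
    for _ in range(n_hidden):
        model.add(keras.layers.Dense(hidden_units, activation="relu"))
    model.add(keras.layers.Dense(n_outputs))
    model.compile(optimizer="adam", loss="mse")
    return model

# e.g. mapping a handful of geometry parameters to a sampled spectrum
model = build_model(n_inputs=8, n_outputs=200)
model.summary()
```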
If all goes well, the analysis should be fully reproducible without the need to make any adjustments. The paper aims to find optimal locations for new parkruns, but we were not entirely sure how 'optimal' should be defined. We provide a few examples, but the code is meant to be flexible enough to allow potential decision makers to specify their own, alternative objectives. The spatial data set is also quite interesting and fun to play around with. Caveat: the full analysis takes a while to run (30+ minutes) and may require at least 8 GB of RAM.
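To give a flavour of what specifying an alternative objective could look like, here is a hypothetical Python sketch of ranking candidate sites under a user-supplied scoring function; the function names, data fields, and numbers are invented for illustration and are not the actual interface of the paper's scripts.

```python
def rank_candidates(candidates, objective):
    """Rank candidate locations by a user-supplied objective function."""
    return sorted(candidates, key=objective, reverse=True)

# Two illustrative objectives a decision maker might swap in:
def population_served(site):
    """Maximise residents within easy reach of the new event."""
    return site["pop_within_5km"]

def deprivation_weighted(site):
    """Weight coverage towards more deprived areas."""
    return site["pop_within_5km"] * site["deprivation_index"]

candidates = [
    {"name": "Site A", "pop_within_5km": 41_000, "deprivation_index": 0.62},
    {"name": "Site B", "pop_within_5km": 58_000, "deprivation_index": 0.35},
]

for site in rank_candidates(candidates, deprivation_weighted):
    print(site["name"])
```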