This page evaluates the extent to which the author-published research artefacts meet the criteria of badges related to reproducibility from various organisations and journals.
*Caveat: Please note that these criteria are based on available information about each badge online. Moreover, we focus only on reproduction of the discrete-event simulation, and not on other aspects of the article. We cannot guarantee that the badges below would have been awarded in practice by these journals.*
## Criteria
```python
import pandas as pd

# Criteria and their definitions
criteria = {
    'archive': 'Artefacts are archived in a repository that is: (a) public (b) guarantees persistence (c) gives a unique identifier (e.g. DOI)',
    'docs1': 'Documents (a) how code is used (b) how it relates to article (c) software, systems, packages and versions',
    'docs2': 'Documents (a) inventory of artefacts (b) sufficient description for artefacts to be exercised',
    'relevant': 'Artefacts relevant to paper',
    'execute': 'Scripts can be successfully executed',
    'careful': 'Artefacts are carefully documented and well-structured to the extent that reuse and repurposing is facilitated, adhering to norms and standards',
    'reproduce': 'Reproduced results (assuming (a) acceptably similar (b) reasonable time frame (c) only minor troubleshooting)',
    'readme': 'README file with step-by-step instructions to run analysis',
}
```
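As a sketch of how a criteria dictionary like this could feed a per-study evaluation table, the snippet below pairs each criterion's definition with a met/not-met assessment. The `assessment` values here are illustrative assumptions, not results from this page:

```python
import pandas as pd

# Criteria and their definitions (subset of the page's dictionary)
criteria = {
    'archive': 'Artefacts are archived in a repository that is: (a) public (b) guarantees persistence (c) gives a unique identifier (e.g. DOI)',
    'relevant': 'Artefacts relevant to paper',
    'execute': 'Scripts can be successfully executed',
}

# Hypothetical assessment for one study: True = criterion met
assessment = {'archive': True, 'relevant': True, 'execute': False}

# Build a table aligning definitions with the assessment by criterion key
df = pd.DataFrame({
    'definition': pd.Series(criteria),
    'met': pd.Series(assessment),
})
print(df[['met']])
```

Because both columns are built from `pd.Series` with the same keys, pandas aligns them on the criterion names, so the table stays correct even if the dictionaries are written in different orders.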
Revisited and revised the badge criteria to (a) make them up-to-date, and (b) make sure they are *specific* to the descriptions from each badge. Hence, redoing evaluations for all eight studies.
Notes:

* Reproduction - no. We added the reasonable assumption that results would be reproduced within a reasonable time frame (e.g. a few hours) and with only minor troubleshooting, but this reproduction required a large time investment and extensive troubleshooting (e.g. writing code).