The IEEE Access Reproducibility Initiative
IEEE Access is committed to enabling reproducible research through transparency and the availability and potential reuse of code associated with its published research. We are piloting post-publication peer review of code, whereby authors of published IEEE Access articles submit the code associated with their article for review. Once the code is peer reviewed, the article can earn a reproducibility badge, which helps make the published code more visible and credible.
As part of these efforts, IEEE Access offers the following badges:
Code Available
The code, including any associated data and documentation, provided by the authors is reasonable and complete and can potentially be used to support reproducibility of the published results.
Code Reviewed
The code, including any associated data and documentation, provided by the authors is reasonable and complete, runs to produce the outputs described, and can support reproducibility of the published results.
This call aims to attract authors who have already published in IEEE Access and seek to improve the reproducibility of their published research. Please read the IEEE Access Reproducibility Author Instructions below.
If awarded, the reproducibility badge will appear with the article in the IEEE Xplore digital library.
Please see this published article for an example of an IEEE Access code reviewed badge.
Call for Reproducibility Artifacts: Submission Instructions
We are inviting submissions of reproducibility artifacts from authors who have an article published in, or accepted for publication in, IEEE Access, with the associated code published in a repository that provides a persistent DOI and versioning, such as CodeOcean, a reproducibility platform that offers a containerized approach to running artifacts on demand within its own resources. Your submission should consist primarily of your accepted or published IEEE Access article and all elements required to reproduce the experiments in the article. The artifacts can include software, datasets, environment configuration, mechanized proofs, benchmarks, test suites with scripts, etc. Instructions or documentation describing the contents and how to use them are also required. The submission must include everything our reviewers need to compile and execute the code and reproduce the results. A link to the artifact on a hosting platform and its DOI should also be included.
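As a rough illustration of the completeness requirement, a submitter might run a small self-check over the artifact directory before uploading. The script below is a minimal sketch; the file and directory names in it are hypothetical examples, not an IEEE Access requirement:

```python
# Hypothetical pre-submission self-check: verifies that an artifact
# directory contains the kinds of elements listed above. The entry
# names are illustrative only.
from pathlib import Path

REQUIRED = [
    "README.md",        # documentation describing the contents and usage
    "code",             # software needed to reproduce the experiments
    "data",             # input datasets (or scripts that generate them)
    "environment.yml",  # environment configuration
]

def check_artifact(root: str) -> list:
    """Return the required entries missing from the artifact directory."""
    base = Path(root)
    return [name for name in REQUIRED if not (base / name).exists()]

missing = check_artifact("my-artifact")
if missing:
    print("Artifact incomplete; missing:", ", ".join(missing))
else:
    print("Artifact contains all required top-level elements.")
```

A check like this is cheap to run and catches the most common cause of review delays: an artifact that cannot be executed because a piece is missing.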
Please submit your computational artifacts and associated documentation to the IEEE Access artifact review system.
Artifact Description and Structure
The artifact description must be included in a README file along with the artifact, and it must include the following aspects:
- Artifact Identification: Including (i) the article’s title, (ii) the authors’ names and affiliations, and (iii) an abstract describing the main contributions of the article and the role the artifact plays in these contributions. The abstract may include a software architecture or data models, with a description, to help the reader understand the artifact, and a clear description of the extent to which the artifact contributes to the reproducibility of the experiments in the article.
- Artifact Dependencies and Requirements: Including (i) a description of the hardware resources required, (ii) a description of the operating systems required, (iii) the software libraries needed, (iv) the input dataset needed to execute the code, or an indication of when the input data is generated, and (v) optionally, any other dependencies or requirements. To keep these descriptions easy to follow, best practice is to omit unnecessary dependencies and requirements from the artifact.
- Artifact Installation and Deployment Process: Including (i) a description of the process to install and compile the libraries and the code, and (ii) a description of the process to deploy the code on the resources. These descriptions should include estimates of the installation, compilation, and deployment times. When any of these times is unreasonably long, authors should provide some way to reduce the effort required of the artifact's recipients; for instance, capsules with pre-compiled code, or a simplified input dataset that reduces the overall experimental execution time. Conversely, best practice indicates that, whenever possible, the actual code of software dependencies (libraries) should not be included in the artifact; instead, scripts should be provided that download them from a repository and perform the installation.
- Reproducibility of Experiments: Including (i) a complete description of the experiment workflow that the code can execute, (ii) an estimate of the time needed to execute the experiment workflow, (iii) a complete description of the expected results and an evaluation of them, and, most importantly, (iv) how the expected results from the experiment workflow relate to the results reported in the article. To make the scope of the reproducibility easy to assess, best practice is to produce the expected results in the same format as those in the article. For instance, when the results in the article are depicted in a graph, the execution of the code should ideally produce a similar figure (open-source tools such as gnuplot can be used for this purpose). It is critical that authors devote effort to these aspects of the reproducibility of experiments, to minimize the time needed for their understanding and verification.
- Other notes: Including other related aspects that may be important and were not addressed in the previous points.
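To make the "Reproducibility of Experiments" requirements above concrete, the sketch below shows the general shape of a top-level reproduction script: a fixed random seed, a self-contained experiment workflow, output in a tabular format a reviewer can compare against the article, and a reported runtime. All names, parameters, and file names here are hypothetical, not part of the IEEE Access instructions:

```python
# Hypothetical top-level reproduction script. It fixes the random seed,
# runs a toy "experiment", and writes results in a tabular format so a
# reviewer can compare them against the article's tables or figures.
import csv
import random
import time

def run_experiment(trials: int, seed: int = 42) -> list:
    """Toy experiment workflow: returns (trial, measurement) pairs."""
    rng = random.Random(seed)  # fixed seed so reruns give identical results
    return [(t, rng.gauss(0.0, 1.0)) for t in range(trials)]

def main() -> None:
    start = time.time()
    results = run_experiment(trials=100)
    with open("results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["trial", "measurement"])  # mirrors the article's table header
        writer.writerows(results)
    # Report the runtime so reviewers can gauge the expected execution time.
    print(f"Wrote {len(results)} rows to results.csv in {time.time() - start:.2f}s")

if __name__ == "__main__":
    main()
```

A single entry point of this kind, documented in the README with its expected runtime and outputs, lets a reviewer verify the artifact without reverse-engineering the experiment workflow.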
The Code Reproducibility initiative of IEEE Access follows the steps taken by the IEEE Transactions on Parallel and Distributed Systems (TPDS) journal; more detailed guidelines can be found by clicking here.