IEEE Access Reproducibility Initiative

Reproducibility is at the core of solid scientific and technical research. Consistent with its broad mission to advance technology for the benefit of humanity, IEEE is committed to the principles of open science. For more information on IEEE’s open research initiative, please visit the IEEE Author Center.

IEEE Access is committed to enabling reproducible research through transparency and through the availability and potential reuse of the code associated with its published research. IEEE Access is piloting post-publication peer review of code, whereby authors of published IEEE Access articles submit the code associated with their article for review. Once the code has been peer reviewed, the article can earn a reproducibility badge, which makes the published code more visible and credible.

IEEE Access offers the following badges:

  • Code Available: The code, including any associated data and documentation, provided by the authors is reasonable and complete and can potentially be used to support reproducibility of the published results.
  • Code Reviewed: The code, including any associated data and documentation, provided by the authors is reasonable and complete, runs to produce the outputs described, and can support reproducibility of the published results.

This call is aimed at authors who have already published in IEEE Access and who wish to improve the reproducibility of their published research. Please read the IEEE Access Reproducibility Author Instructions below.

If awarded, the reproducibility badge will appear with the article in the IEEE Xplore digital library. 

IEEE Access Reproducibility Editors:

Manish Parashar, University of Utah

Porfirio Tramontana, Università degli Studi di Napoli Federico II

Information for Reproducibility Reviewers

As a reviewer, you have a crucial role in supporting research integrity in the peer review and publishing process. Please see our website for guidelines on reviewing for IEEE Access, as well as general best practices for reviewers.

Reviewers will assess the research artifact according to the following criteria:

  1. Documentation: Assess whether the artifact is sufficiently documented to enable readers of the paper to exercise it. In particular, consider the accessibility of the code, the code's dependencies and requirements, the description of the installation and deployment processes, and the description of the experiments that the artifact implements.
  2. Completeness: Check that the artifact includes all the key components described in the article, and assess to what extent the artifact supports reproducing the experiments in the article.
  3. Exercisability: Examine whether the submitted artifact includes the scripts and data needed to execute the experiments described in the article, and whether the software can be successfully executed (a short smoke test along the lines of the sketch below is often a useful first check).
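
For illustration only, an exercisability check often starts with a short smoke test, along the lines of the hypothetical Python sketch below. The entry point (run_experiments.py) and the expected output path (results/table2.csv) are placeholders; a reviewer would substitute whatever the artifact's README actually documents.

    # Hypothetical smoke test for the "Exercisability" criterion.
    # "run_experiments.py" and "results/table2.csv" are placeholders for the
    # entry point and output that the artifact's README actually documents.
    import subprocess
    from pathlib import Path

    def smoke_test() -> bool:
        # Run the artifact's documented entry point; fail on a non-zero exit.
        result = subprocess.run(["python", "run_experiments.py", "--quick"])
        if result.returncode != 0:
            return False
        # Confirm that the output promised by the README was produced.
        return Path("results/table2.csv").exists()

    if __name__ == "__main__":
        print("artifact runs and produces output:", smoke_test())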

Call for Reproducibility Artifacts: Submission Instructions 

We invite submissions of reproducibility artifacts from authors who have an article published in, or accepted for publication in, IEEE Access and who have the associated code published in a repository that provides a persistent DOI and versioning, such as CodeOcean, a reproducibility platform that provides a containerized approach to running artifacts on demand within CodeOcean resources. Your submission should consist primarily of your accepted or published IEEE Access article and all elements required to reproduce the experiments in the article. The artifacts can include software, datasets, environment configuration, mechanized proofs, benchmarks, test suites with scripts, and so on; instructions or documentation describing the contents and how to use them are also required. The submission must include everything our reviewers need to compile and execute the code and reproduce the results; a hypothetical layout is sketched below. A link to the artifact on a hosting platform, together with its DOI, should also be included.
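
For illustration only, a submission might be organized along the lines of the hypothetical layout below; the directory names are placeholders, and any clearly documented structure is acceptable:

    article/       the accepted or published IEEE Access article
    code/          source code implementing the experiments
    data/          input datasets, or scripts that download or generate them
    environment/   dependency specification (e.g., an environment file or container recipe)
    scripts/       scripts that run each experiment end to end
    results/       expected outputs, for comparison with reproduced runs
    README         the artifact description (see "Artifact Description and Structure" below)
    LICENSE        terms of reuse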

Please submit your computational artifacts and associated documentation to the IEEE Access artifact review system.

Artifact Description and Structure

The artifact description must be provided in a README file distributed with the artifact, and it must cover the following aspects:

  1. Artifact Identification: Including (i) the article’s title, (ii) the authors’ names and affiliations, and (iii) an abstract describing the main contributions of the article and the role of the artifact in these contributions. The abstract may include a software architecture or data model, together with its description, to help the reader understand the artifact, and a clear statement of the extent to which the artifact contributes to the reproducibility of the experiments in the article.
  2. Artifact Dependencies and Requirements: Including (i) a description of the hardware resources required, (ii) a description of the operating systems required, (iii) the software libraries needed, (iv) the input dataset needed to execute the code, or a description of how the input data are generated, and (v) optionally, any other dependencies or requirements. To keep these descriptions easy to follow, best practice is to omit unnecessary dependencies and requirements from the artifact (a minimal environment check along the lines of the first sketch after this list can help reviewers verify the requirements).
  3. Artifact Installation and Deployment Process: Including (i) a description of the process for installing and compiling the libraries and the code, and (ii) a description of the process for deploying the code on the target resources. These descriptions should include estimates of the installation, compilation, and deployment times. When any of these times is unreasonably long, authors should provide some way to reduce the effort required of the artifact’s recipients; for instance, capsules with precompiled code can be provided, or a simplified input dataset that shortens the overall experimental execution time. Conversely, best practice is, whenever possible, not to include the actual code of software dependencies (libraries) in the artifact, but to provide scripts that download them from a repository and perform the installation.
  4. Reproducibility of Experiments: Including (i) a complete description of the experiment workflow that the code can execute, (ii) an estimate of the time needed to execute that workflow, (iii) a complete description of the expected results and an evaluation of them, and, most importantly, (iv) how the expected results of the experiment workflow relate to the results reported in the article. To make the scope of the reproducibility easy to understand, best practice is to present the artifact’s expected results in the same format as those in the article. For instance, when the article presents results as a figure, the execution of the code should ideally produce a similar figure; open-source plotting tools such as gnuplot can be used for this purpose (see the second sketch after this list). It is critical that authors devote their efforts to these aspects of the reproducibility of experiments, to minimize the time needed for their understanding and verification.
  5. Other notes: Including any other related aspects that may be important and were not addressed in the previous points.
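
For illustration, a minimal environment check supporting point 2 might look like the hypothetical Python sketch below; the package list and minimum Python version are placeholders for whatever your README actually declares.

    # Hypothetical check_env.py supporting "Artifact Dependencies and
    # Requirements": fail early, with a clear message, if a declared
    # dependency is missing. The names and versions below are placeholders.
    import sys
    import importlib.metadata as md

    REQUIRED = ["numpy", "scipy", "matplotlib"]  # placeholder dependency list

    def check_environment() -> None:
        if sys.version_info < (3, 9):  # placeholder minimum version
            sys.exit("This artifact was tested with Python >= 3.9")
        for pkg in REQUIRED:
            try:
                print(f"{pkg} {md.version(pkg)} found")
            except md.PackageNotFoundError:
                sys.exit(f"Missing dependency: {pkg}; see the README")

    if __name__ == "__main__":
        check_environment()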
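
Likewise, for point 4, regenerating an article figure in the same format makes the comparison with the published results direct. The sketch below is a hypothetical Python example using the open-source matplotlib library; the input file, column names, and figure numbering are placeholders, not taken from any specific article.

    # Hypothetical script supporting "Reproducibility of Experiments":
    # regenerate an article figure from the experiment's output so reviewers
    # can compare it with the published one. All file and column names are
    # placeholders.
    import csv
    import matplotlib
    matplotlib.use("Agg")  # headless backend, so the script also runs on servers
    import matplotlib.pyplot as plt

    xs, ys = [], []
    with open("results/throughput.csv") as f:
        for row in csv.DictReader(f):
            xs.append(float(row["threads"]))
            ys.append(float(row["throughput"]))

    plt.plot(xs, ys, marker="o")
    plt.xlabel("Threads")
    plt.ylabel("Throughput (ops/s)")
    plt.title("Figure 5 (reproduced)")  # match the article's figure numbering
    plt.savefig("figure5_reproduced.png", dpi=150)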

Please see the following article for an example of an IEEE Access Code Reviewed badge.

The code reproducibility initiative of IEEE Access follows the steps taken by the IEEE Transactions on Parallel and Distributed Systems (TPDS) journal; more detailed guidelines are available from TPDS.