
Editorial Policies

Peer Review Process

Software Metapapers

Two general principles guide the reviewing process at the Journal of Open Research Software:

  1. We are reviewing the accuracy and quality of the metadata rather than the software itself; however, a minimum level of software quality is required so that review is possible.
  2. We expect all metapapers to be able to pass review after revisions, unless the software is not openly available and/or is extremely difficult to reuse.

After submission, a software metapaper is sent to multiple independent reviewers.

Each reviewer attempts to download the software based on the information in the submitted paper and:

  • checks that the software behaves as described
  • checks that the information in the metapaper is correct, in particular the contributors, license, and limitations
  • writes up a review

Reviews are combined to provide a decision (accept, accept after minor revisions, re-review after major revisions, reject), and a checklist of revisions and suggestions is sent to the author, along with the reviews. Reviewers are also encouraged, but not required, to sign their reviews.

All JORS software papers are peer reviewed according to the following standardised review form:

 The paper contents
  1. Is the title of the paper descriptive and objective?
  2. Does the Abstract give an indication of the software's functionality, and where it would be used?
  3. Do the keywords enable a reader to search for the software?
  4. Does the Introduction give enough background information to understand the context of the software's development and use?
  5. Does the Implementation and Architecture section give enough information to get an idea of how the software is designed, and any constraints that may be placed on its use?
  6. Does the Quality Control section adequately explain how the software results can be trusted?
  7. Does the Reuse section provide concrete and useful suggestions for reuse of the software, for instance: other potential applications, ways of extending or modifying the software, integration with other software?
  8. Are figures and diagrams used to enhance the description? Are they clear and meaningful?
  9. Do you believe that another researcher could take the software and use it, or take the software and build on it?
 The deposited software
  1. Is the software in a suitable repository? (see http://openresearchsoftware.metajnl.com/about/editorialPolicies#custom-0 for more information)
  2. Does the software have a suitable open licence? (see our FAQs for more information)
  3. If the Archive section is filled out, is the link in the form of a persistent identifier, e.g. a DOI? Can you download the software from this link?
  4. If the Code Repository section is filled out, does the identifier link to the appropriate place to download the source code? Can you download the source code from this link?
  5. Is the software licence included in the software in the repository? Is it included in the source code?
  6. Is sample input and output data provided with the software?
  7. Is the code adequately documented? Can a reader understand how to build/deploy/install/run the software, and identify whether the software is operating as expected?
  8. Does the software run on the systems specified? (If you do not have access to a system with the required prerequisites, please let us know.)
  9. Is it obvious what the support mechanisms for the software are?

 

Issues in Research Software

The following review criteria relate to long-form articles submitted to the Issues in Research Software section. Articles should be no longer than 3,000-4,000 words, with a constrained list of references. The article can be structured with subheadings as the author sees fit, though an abstract is required. Please see below for more detailed criteria.

General Peer-review Criteria

  • Does the paper provide information that is useful for the community?
  • Is the paper clear to read, and does it define unfamiliar concepts?
  • Does the paper demonstrate an understanding of, and recognise, other efforts in the area in order to frame the discussion?
  • Does the paper appear credible and trustworthy?
  • Does the paper summarise the experiences of the authors, including reflection on the importance of the arguments and conclusions?
  • Is the paper in scope?
  • Is the paper written for a general research audience?

Criteria for position papers:

  • Is the viewpoint expressed clearly and in a well-organised way?
  • Does the paper provide a foundation for others to resolve challenges in the area?
  • Does the paper present a unique, though biased, solution or a unique approach to solving a problem?
  • Does the paper demonstrate a command of the issues and research behind them?
  • Does the paper provide useful ideas which are backed by personal experience?
  • Does the paper show that supporting evidence for both sides of an argument has been considered?

Criteria for survey papers:

  • Does the paper present a concise but broad summary of an area that is accessible to a general reader?
  • Does the paper cover a sufficiently large proportion of the area being surveyed?
  • Does the author present a deep knowledge of the field, including main directions and controversies?
  • Does the author provide a summary of the main challenges that are being addressed by the papers / methods being surveyed?
  • Does the paper provide a critical analysis of each paper or method included?
  • Does the paper convey not just a list of papers/methods/results, but also increase the understanding of the structure and direction of the area?

Criteria for experience reports:

  • Does the paper clearly state the research problem to which the authors' software-related efforts were applied?
  • Does the paper summarise the challenges faced and the decision process for choosing solutions to these challenges?
  • Does the paper distill the authors' experiences in a way that can be understood by readers in the same or related fields?
  • Does the paper distill the authors' experiences in a way that can be understood by readers undertaking similar types of work?
  • Does the paper present clear recommendations, based on the authors' experiences, that can be applied by other readers?

Section Policies

Software Metapapers

  • Open Submissions
  • Indexed
  • Peer Reviewed

Issues in Research Software

  • Open Submissions
  • Indexed
  • Peer Reviewed

Correction

  • Open Submissions
  • Indexed
  • Peer Reviewed
