MSR 2019 Mining Challenge Reviewing Guidelines

The following instructions cover the main aspects of the MSR 2019 Mining Challenge review process. We provide these instructions to create a common ground for decision making, limit the number of stalled discussions, and help reach a quality threshold consensus for accepted papers.

Review Criteria

While we do not enforce a structured review format, reviewers may still find the following assessment criteria useful when evaluating the contributions of a paper:

  • Soundness: How well are the paper’s contributions supported by rigorous application of appropriate research methods?
  • Significance: To what extent are the paper’s contributions novel, original, and important with respect to the existing body of knowledge?
  • Verifiability: To what degree does the paper adhere to the open science policy described in the call for papers? Do the authors provide sufficient information to support independent verification or replication of the paper’s claimed contributions? Please note that adherence to the open science policy is required for a paper to be nominated for the best paper award.
  • Presentation Quality: Does the paper’s quality of writing meet the standards of a scientific research paper, including clear descriptions and explanations, adequate use of the English language, absence of major ambiguity, clearly readable figures and tables, and adherence to the formatting instructions?
  • Scope: How well does the paper fit the scope of the mining challenge, as outlined in the call for papers and the challenge proposal? In particular, do the authors use features specific to SOTorrent, or do they rely only on data from the official Stack Overflow dump?

The proposed criteria support the methodical evaluation of submissions along (relatively) decoupled dimensions. However, there is a natural progression of importance between the dimensions. If a submission is clearly shown to be unsound, the other criteria become irrelevant. Significance and Verifiability increase the acceptability of a submission that describes a sound contribution. Having presentation problems or being out of scope decreases the acceptability of a submission.

Review Authorship

We ask reviewers to personally author each review. In particular, we ask reviewers to avoid the “contact subreviewer” feature or “subreviewer” fields because, in practice, this reassigns authorship to a different person. Reviewers can ask students and/or colleagues for feedback, but their input should be integrated into the reviewer’s own review.

Interpretation of the Reviews

We ask everyone involved in the evaluation process to take into account the following principles when authoring or interpreting reviews:

  • A review is not a vote: The review form requires the input of an overall recommendation that ranges from “reject” to “accept”. The importance of these recommendations should be weighed in proportion to the strength of the argumentation provided in the corresponding review. We wish to avoid directly using the numerical score that aggregates the review ratings as part of the decision process. Also, in making final decisions, we wish to avoid arguments whose sole basis is the number of recommendations for or against a paper. Ultimately, it is the content of the reviews and discussions that should matter.
  • Subjectivity is noise: General arguments based on personal feelings, positive or negative, about a research topic, research method, application domain, types of stakeholders, etc., should not be a factor in the decision process.
  • Sympathy without indulgence: The evaluation of a submission should be respectful and appreciative of the work submitted and report the strengths of a submission as precisely and extensively as possible. At the same time, appreciation for the work described in a submission should not be cause for indulgence. We wish to select submissions by appreciating their strengths while acknowledging their weaknesses; we do not wish to accept submissions by overlooking their weaknesses.

Other Recommendations

  • Depersonalizing reviews: We ask reviewers to comment on the work, not the authors. This should naturally lead to more constructive reviews. We recommend avoiding the use of the second person (“you”).
  • Constructive reviews: We encourage reviewers to look for reasons to accept papers rather than reasons to reject them. Of course, reviewers can and should point to issues and make suggestions for improving a paper.
  • Elaborate reviews: Reviews should usually be longer than just a few sentences. Please acknowledge that authors put effort into their work and hence deserve detailed feedback.
  • Proofreading reviews: All versions of all reviews will be retained and visible in the conference management system, so we recommend taking a minute to proofread reviews before submitting them.
  • Reporting ethical issues: Any evidence of professional misconduct related to a submission or its evaluation (plagiarism, concurrent submissions, conflict of interest, etc.) should be reported directly to the program co-chairs.

These guidelines are based on the ICSE 2017 reviewing guidelines, the ICSE 2019 call for papers, and the MSR 2019 mining challenge call for papers.