Minisymposium Presentation

Studying Artifact Evaluation to Improve Reproducibility and Usability

Monday, June 3, 2024
15:30
-
16:00
CEST
Climate, Weather and Earth Sciences
Chemistry and Materials
Computer Science and Applied Mathematics
Humanities and Social Sciences
Engineering
Life Sciences
Physics

Description

Researchers support reproducibility and rigorous science by sharing and reviewing research artifacts—the documentation and code necessary to replicate a computational study. Creating quality research artifacts and reviewing them for conferences and journals are both widely considered time-consuming and poorly rewarded activities. To simplify these scholarly tasks, we studied the work of artifact evaluation (i.e., artifact reviewing) for a recent ACM conference. Through analysis of reviewers’ comments and their responses to three surveys distributed throughout the evaluation process, we identified common issues reviewers faced and the features of high-quality artifacts. To reduce the time and difficulty of artifact creation and evaluation, we derive design implications for infrastructures such as the testbeds used to execute research artifacts. By applying the knowledge gleaned from our study, we hope to improve the usability of research infrastructure and, consequently, the reproducibility of research artifacts.

Authors