
Incorporating reproducibility #12

Closed · 6 tasks · cassgvp opened this issue Nov 19, 2023 · 16 comments

@cassgvp (Collaborator) commented Nov 19, 2023

Summary Sentence

The organising committee are keen to emphasise the importance of reproducibility in our community.

The following activities would provide opportunities to celebrate and build capacity for reproducibility in this meeting.

  • Reproducibility co-chairs
  • Training (repro 101 before/after the meeting?)
  • Repro challenge (e.g. as in 2023)
  • Specific awards/mentions/tracks in the programme
  • Badges on submissions (via Cambridge University Press, publishers of the proceedings)
  • Any others?

See also:

What needs to be done?

Org committee to decide what activities to include and nominate leads for each.

Update after the issue was opened

[Add details]

@acocac (Member) commented Nov 27, 2023

I'd suggest we try the awards route. We could add an additional checklist during abstract submission: if the authors would like to participate in the prize, they are asked to share code/data. I'm happy to champion reproducibility at the 2024 CI conference with @dorchard. We should build upon existing resources, e.g. the CI2023 Reproducibility Challenge, the AGILE Reproducibility Checklist and others.
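Purely as an illustration of what such an opt-in checklist could look like on the submission form (none of these fields exist yet; the wording is hypothetical):

```markdown
- [ ] I/we would like this submission to be considered for the reproducibility prize
- [ ] Code is publicly available (URL provided in the submission)
- [ ] Data are publicly available, or access restrictions are documented
- [ ] A README describes how to reproduce the main results
```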

@MarionBWeinzierl (Collaborator) commented Nov 28, 2023

Happy to get involved in this discussion :)

@dorchard (Collaborator) commented Dec 7, 2023

Sorry for being slow on the uptake; it has been a very busy month. In short, perhaps it's best to have a team of reproducibility champions! I would be interested in leading efforts around reproducibility in the peer review process, e.g. having an additional artefact evaluation committee that reviews artefacts submitted alongside accepted papers. I couldn't organise it on my own, though!

@dorchard (Collaborator) commented Dec 7, 2023

My postdoc Roly Perera would be able to help too, I think, and would be a great addition. He and I have both been involved in artefact evaluation in the past (both as committee members and as punters being evaluated!)

@acocac (Member) commented Dec 8, 2023

> My postdoc Roly Perera would be able to help too, I think, and would be a great addition. He and I have both been involved in artefact evaluation in the past (both as committee members and as punters being evaluated!)

@rolyp would be a great addition. Roly, please feel free to confirm your interest here (:

@rolyp (Collaborator) commented Dec 8, 2023

Count me in – sounds great.

@geo-rao (Collaborator) commented Dec 8, 2023

@dorchard the reproducibility review is interesting, but thinking about the composition of submissions, we often have many more abstracts than papers. The reproducibility review would probably apply only to the full paper submissions.

@dorchard (Collaborator) commented Dec 9, 2023

Yes, I was thinking just of full paper submissions, and only those that are accepted; i.e., this wouldn't be about accepting/rejecting based on the artefact, but about assessing its reproducibility and trying to encourage and enable authors to have available and reusable artefacts.

@cassgvp (Collaborator, Author) commented Dec 12, 2023

Hi @dorchard @rolyp @acocac @MarionBWeinzierl @geo-rao - Love where this discussion is going!

We'll need to include a note on this in the submission guidelines (which are going live asap!) if the full paper submissions are going to be assessed on this, even informally. Could you have a look at the submission info PR and make any suggestions on the file? I'll add you all to the repo now so you have write access (let me know if you can't access it!). Given the timelines, I suggest you work on something quite light-touch (or non-specific!) :)

@MarionBWeinzierl (Collaborator) commented

Here is the description of the AE/AD evaluation at SC23, including what they ask for to earn the badges: https://sc23.supercomputing.org/program/papers/reproducibility-appendices-badges/

@rolyp (Collaborator) commented Dec 12, 2023

Hi @cassgvp, I’m off this afternoon and most of tomorrow but will earmark some time on Thurs to take a look. Thanks.

@rolyp changed the title from "incorporating reproducibility" to "Incorporating reproducibility" on Feb 2, 2024
@rolyp (Collaborator) commented Feb 2, 2024

A few random notes/questions in advance of today’s meeting:

  • Had a look at the AD/AE process for SC23. Impressions:
    • Looks comprehensive (perhaps too heavyweight to emulate for CI 2024, but we could borrow from it)
    • The AD (Artifact Description) stage seems mandatory (which I assume we wouldn't want)
  • What resources do we expect to have for the AE process?
    • An AEC with ~n committee members?
    • How many papers do we anticipate accepting, and what therefore is the likely expected workload per reviewer?
  • Useful How-to Guide for AE submitters (written by PL people, so we could perhaps adapt it)
  • Require persistent DOIs (e.g. via Zenodo)? See the badge sketch after this list.
  • ACM seems to use variants of the following badges; maybe we could use similar:
    • Artifact Available
    • Artifact Evaluated – Functional
    • Artifact Evaluated – Reproducible
  • Is CUP/EDS on board with the idea of adding badges to certain papers?
  • Are there (non-branded) badges we can reuse from another conference?
  • What is the timeframe for AE (in relation to publication in EDS etc.)?
  • Reproducibility [Co-]Chairs sounds better than Reproducibility Champions (more serious :)
  • Should/could we require a standard platform for submitting artefacts (e.g. Code Ocean)?
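On the persistent-DOI bullet above: Zenodo's GitHub integration can archive a tagged release and mint a DOI, which authors could then embed in their repository README as a badge. A sketch, with a placeholder in place of a real DOI:

```markdown
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.XXXXXXX.svg)](https://doi.org/10.5281/zenodo.XXXXXXX)
```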

@rolyp (Collaborator) commented Feb 15, 2024

@acocac @cassgvp Quick note in advance of today’s meeting. I noticed issue #3. Is it worth now creating a separate Artifact Evaluation Committee issue to track the following:

  • Potential channels to find committee members (e.g. Cambridge/ICCS RSE, Turing REG)
  • People we’ve invited/have expressed interest/have accepted
  • What’s involved in being an AEC member
    • Expected workload (I think @dorchard did a v. quick estimate of ~2 reviews pp based on 20 submissions and an AEC of 20 people, which works out if each artifact gets two reviews: 20 × 2 / 20 = 2)
    • Training provided? (if I understood @cassgvp correctly)
    • Credit and benefits
      • AEC listed on conference website
      • “Validation” CRediT for any accepted papers whose artifact you reviewed (Andrew mentioned this)
      • Free book from CUP?
  • AEC guidelines for submitters (although perhaps that deserves an issue of its own!)

@dorchard (Collaborator) commented

Sorry, I have not yet done my homework of writing an overview of the approach we will use.

@acocac (Member) commented Feb 15, 2024

Sharing some areas of work and questions discussed in today's meeting. Additional inputs or resources would be welcome 🙏

- Reviewers: discussed strategies to recruit volunteers. Should we start with the existing list of reviewers for abstract/full paper submissions? Shall we have a separate call for reviewers of computational artefacts?

- Checklist: which elements are essential? Will we evaluate reproducibility according to a tier-based system such as ACM artefact review badging (a rough sketch follows after these notes)? Which other components should we adopt/adapt from existing checklists: SC, AGILE, Machine Learning, ReproHack?

- Communication: how do we ensure a safe and healthy communication space between authors and reviewers (and moderators)? Are we adhering to an existing code of conduct, e.g. EDS or Climate Informatics, or do we need a custom one?

- Badges: if using an existing badging system, whose permission do we require, e.g. ACM's? Do we prefer badges successfully implemented in the wider informatics/computer science community, e.g. ACM badges at SPLASH 2024, or emerging ones in other domains, e.g. CodeCheck?

- Platforms: do we prefer simple platforms (Slack, email, Google Forms) or paid ones?

- Computational artefacts: can we anticipate or investigate which types of artefacts are commonly used in computational climate science? Will the reproducibility assessment change according to the type of computational artefact?

- Activities:

  • Panel (CI2024): we agreed to host a panel for CI2024 focused on the assessment of computational artefacts and their relevance to improving the reproducibility, replicability and extensibility of climate science research. The panel would benefit from lightning talks.
  • Post-conference sessions: focused on improving skills to review computational artefacts.
  • Others: a blog post that provides some context on existing initiatives. The blog post should be highlighted on the CI2024 website.
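To make the checklist discussion above concrete, here is a rough sketch of what a tier-based review form could look like, loosely modelled on the ACM-style badge levels mentioned earlier in this thread. All tier names and wording are illustrative, not a settled design:

```markdown
## Artefact review (illustrative draft)

### Tier 1 – Available
- [ ] Code and data are in a publicly accessible repository
- [ ] The repository has a persistent identifier (e.g. a Zenodo DOI)

### Tier 2 – Functional
- [ ] Dependencies are documented (e.g. environment.yml / requirements.txt)
- [ ] A README explains how to install and run the artefact
- [ ] The artefact runs end-to-end in a clean environment

### Tier 3 – Reproducible
- [ ] Scripts regenerate the paper's key figures/tables
- [ ] Regenerated outputs match the reported ones within stated tolerances
```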

@rolyp (Collaborator) commented May 19, 2024

@acocac @cassgvp Closing this as “completed” – in the sense that the Artifact Evaluation process is well under way at https://github.com/orgs/alan-turing-institute/projects/246.

@rolyp closed this as completed May 19, 2024