
[REVIEW]: Module on dust aerosol detection, monitoring and forecasting #200

Open
21 of 44 tasks
whedon opened this issue Mar 7, 2023 · 58 comments

@whedon

whedon commented Mar 7, 2023

Submitting author: @jwagemann (Julia Wagemann)
Repository: https://gitlab.eumetsat.int/eumetlab/atmosphere/dust-monitoring
Branch with paper.md (empty if default branch):
Version: v0.1
Editor: @yabellini
Reviewers: @RomiNahir, @cosimameyer, @yabellini
Archive: Pending
Paper kind: learning module

Status


Status badge code:

HTML: <a href="https://jose.theoj.org/papers/52505bf5ea349268151066953d284b0d"><img src="https://jose.theoj.org/papers/52505bf5ea349268151066953d284b0d/status.svg"></a>
Markdown: [![status](https://jose.theoj.org/papers/52505bf5ea349268151066953d284b0d/status.svg)](https://jose.theoj.org/papers/52505bf5ea349268151066953d284b0d)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@RomiNahir & @s-m-e, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/jose-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @yabellini know.

Please start your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Review checklist for @RomiNahir

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release (v0.1)?
  • Authorship: Has the submitting author (@jwagemann) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

Review checklist for @sbanchero

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release (v0.1)?
  • Authorship: Has the submitting author (@jwagemann) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
@whedon
Author

whedon commented Mar 7, 2023

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @RomiNahir, @s-m-e it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/jose-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, with GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this, do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/jose-reviews
  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon
Author

whedon commented Mar 7, 2023

Wordcount for paper.md is 1640

@whedon
Author

whedon commented Mar 7, 2023

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@whedon
Author

whedon commented Mar 7, 2023

Software report (experimental):

github.com/AlDanial/cloc v 1.88  T=0.44 s (64.1 files/s, 273242.9 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Jupyter Notebook                24              0         117214           1717
Markdown                         2             84              0            205
YAML                             1              0              0             45
TeX                              1              0              0             37
-------------------------------------------------------------------------------
SUM:                            28             84         117214           2004
-------------------------------------------------------------------------------


Statistical information for the repository 'c644813d48822785d482fb19' was
gathered on 2023/03/07.
The following historical commit information, by author, was found:

Author                     Commits    Insertions      Deletions    % of changes
Julia Wagemann                   7         28169          28265           99.59
jwagemann                        2           163             67            0.41

Below are the number of rows from each author that have survived and are still
intact in the current revision:

Author                     Rows      Stability          Age       % in comments

@whedon
Author

whedon commented Mar 7, 2023

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- None

MISSING DOIs

- None

INVALID DOIs

- None

@RomiNahir
Collaborator

Overall, this article is an excellent guide for learning and implementing dust monitoring and detection, with clear objectives and comprehensible modules. The learning platform is easy to access, with high-quality exercises and examples. I think adapting these algorithms to other world regions won't be complicated.

@whedon
Author

whedon commented Mar 21, 2023

👋 @s-m-e, please update us on how your review is going (this is an automated reminder).

@whedon
Author

whedon commented Mar 21, 2023

👋 @RomiNahir, please update us on how your review is going (this is an automated reminder).

@yabellini
Member

@jwagemann, @RomiNahir finished her review and mentioned that you need to address this point:

  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support
    Please work on it while I look for a new second reviewer.

@yabellini
Member

Hi @s-m-e, since we have not heard from you in several weeks, we are now looking for a new reviewer. Thank you for your original willingness to contribute a review.

@yabellini
Member

@whedon remove @s-m-e as reviewer

@whedon assigned RomiNahir and yabellini and unassigned yabellini and RomiNahir on Apr 24, 2023
@whedon
Author

whedon commented Apr 24, 2023

OK, @s-m-e is no longer a reviewer

@yabellini
Member

Hi @yxqd, you volunteered to review for JOSE. Would you be willing to review this submission: Module on dust aerosol detection, monitoring and forecasting?

@yabellini
Member

Hi @sbanchero, thanks for agreeing to review this work :-)

@yabellini
Member

@whedon add @sbanchero as reviewer

@whedon
Author

whedon commented May 29, 2023

OK, @sbanchero is now a reviewer

@yabellini
Member

@cosimameyer thank you so much for your review.

@yabellini
Member

@jwagemann the reviewers finished their work and left some comments for you. Please let us know when you have worked on these observations.

@jwagemann

Hi @cosimameyer and @RomiNahir, thank you for your great reviews.
I'll work on the refinements as suggested and hope to respond to this thread by the end of January.

@yabellini: the review checklist for @cosimameyer is slightly different from the one defined above. Have the review checklists changed, or is there some confusion with the checklist from JOSS?

@yabellini
Member

@yabellini: the review checklist for @cosimameyer is slightly different from the one defined above. Have the review checklists changed, or is there some confusion with the checklist from JOSS?

I know the bot changed between when the review started and when Cosima became a reviewer, so perhaps the checklist also changed, but I need to ask to be sure.

@labarba
Member

labarba commented Jan 23, 2024

@openjournals/dev We have a little mystery here: the review checklist added for the second reviewer (post change to editorialbot) looks like a software checklist, not a learning module checklist. JOSE has two article types, with slightly different checklists. Is editorialbot able to generate the right checklist, depending on article type?

@xuanxu
Member

xuanxu commented Jan 24, 2024

@labarba In order to decide which checklist to use, editorialbot reads the headers in the body of the issue, looking for the paper kind info (see for example here). It looks like this review was missing it, so the bot added the default checklist (software). I'm going to add the paper kind header with the correct value.
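(For concreteness, the header in question is a plain line in the issue body of the form Paper kind: learning module, as now shown near the top of this review issue.)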

@xuanxu
Member

xuanxu commented Jan 24, 2024

Added paper kind header with learning module as value.
Re-running the checklist command should add the right one. The old one can be deleted.

@yabellini
Member

yabellini commented Jan 24, 2024

Thanks @xuanxu! I would not delete the previous one because it already has the review.

@yabellini
Member

@editorialbot generate my checklist

@editorialbot
Collaborator

@yabellini I can't do that because you are not a reviewer

@yabellini
Member

@editorialbot add @yabellini as reviewer

@editorialbot
Collaborator

@yabellini added to the reviewers list!

@yabellini
Member

yabellini commented Feb 12, 2024

Review checklist for @yabellini

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at https://gitlab.eumetsat.int/eumetlab/atmosphere/dust-monitoring?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release?
  • Authorship: Has the submitting author (@jwagemann) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

@yabellini
Member

I will transfer @cosimameyer's review into the JOSE format using the checklist I created for myself.

@yabellini
Member

@cosimameyer Because the bot created the checklist for JOSS papers and not for JOSE, we still need to review some aspects of the paper related to the pedagogical points. Can you review and comment on those? Based on your checklist and comments, I created the right checklist and already completed the points I could. I'm sorry for this inconvenience.

@cosimameyer

No worries 🤗 I don't seem to be able to tick any boxes in your review, @yabellini. I'll try to generate a new one for myself.

@cosimameyer

cosimameyer commented Feb 12, 2024

Review checklist for @cosimameyer

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at https://gitlab.eumetsat.int/eumetlab/atmosphere/dust-monitoring?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release?
  • Authorship: Has the submitting author (@jwagemann) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

@cosimameyer

@yabellini @jwagemann please find the correct reviewer checklist above ☺️ I adjusted my comments slightly to fit the new checklist (if they have already been addressed, please dismiss them):

General Checks

Documentation

  • Usage: While I really appreciate that the content is also hosted on a platform, it looks like you have to sign up for a user account to access the exercises. In this case, I would like to see a (short) tutorial from the authors that explains to new users how to use the open-source materials. For the hosted content: what are the login requirements for the standalone website/platform (e.g., is the account free? Is account creation limited to a certain group of people, such as academics or students of institute X?)? For the material hosted on GitLab: what steps does a user need to follow to use the Jupyter notebooks? What version of Python is required? What libraries are needed, and how does one get them? The authors are using conda and an environment.yml, so I'd recommend a short tutorial on how to set things up, including how to set up conda itself (or at least some helpful links to tutorials that explain how to set it up on different operating systems), since this may otherwise significantly raise the entry barrier for new users; a minimal sketch of what I mean follows this list. Alternatively, as a low-barrier approach, adding a requirements.txt file and explaining how to use it might be sufficient.
  • Community guidelines: Similar to Romi, I would like to see some community guidelines ("Clear guidelines for third parties who wish to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support"). Maybe I haven't seen them (which might also indicate that they need to be made more visible).
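To make the setup suggestion concrete, here is a minimal sketch of the kind of environment file and walk-through I have in mind. The package names below are placeholders chosen for illustration; they are not the module's actual environment.yml:

# Hypothetical environment.yml -- the dependency list is illustrative only,
# not the module's actual environment file.
name: dust-monitoring
channels:
  - conda-forge
dependencies:
  - python=3.10
  - jupyterlab
  - xarray
  - netcdf4
  - matplotlib
  - cartopy

# A README section could then walk users through the standard conda workflow:
#   conda env create -f environment.yml
#   conda activate dust-monitoring
#   jupyter lab

A few sentences along the lines of these comments (create the environment from the file, activate it, start JupyterLab) would already lower the entry barrier considerably for users who have never worked with conda.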

JOSE paper

  • General idea (only if it is possible to implement, but I believe it will help a lot!): It wasn't always clear to me which model/data needs to be downloaded to perform each exercise. I am not a topical expert, which may complicate the decision, but I wondered whether it would make sense (unless there are pedagogical reasons or permission issues) to host the data somewhere else while still referring to the original data source. If not, I recommend adding a specific description of which file needs to be downloaded (ideally, if possible, with a direct link).

Again, I'm very much looking forward to seeing your work published 🙌

@yabellini
Member

Thank you so much @cosimameyer!
@jwagemann now we have the two reviews ready.

@jwagemann

@whedon generate pdf

@editorialbot
Collaborator

My name is now @editorialbot

@jwagemann

@editorialbot generate pdf

1 similar comment

@editorialbot
Collaborator

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

1 similar comment

@jwagemann

jwagemann commented May 7, 2024

Hi @cosimameyer , @RomiNahir and @yabellini ,
thank you for reviewing our submission and improving our content with your valuable comments. I finally got around to integrating your comments.

I made the following changes:

  • I fixed the broken links under https://gitlab.eumetsat.int/eumetlab/atmosphere/dust-monitoring/-/blob/main/03_practical_case_study/01_exercise.ipynb
  • In the GitLab README as well as the 00_index.ipynb notebook, I added information on usage and community guidelines:
    • there is a header How to use the content in which I give detailed explanations on how to use the content via the Jupyterhub-based platform or how it can be cloned and set up locally
    • I also added a Contact & Support section, in which we invite learners to get in touch with the EUMETSAT training team if they have questions, want to report an issue or want to provide feedback.
  • regarding the point about integrating more details on how the data can be accessed: I elaborated in more detail in each notebook on which data file we downloaded from the respective data system. Some notebooks already had data access as an optional notebook part. For the others, where this part cannot be integrated into the notebook, I explain in detail in the How to access the data section how we made the data selection. The comment also proposed mentioning this in more detail in the paper. I thought about it, and it might be too much detail for the overview paper. For this reason, I added these details to the notebooks directly.

I also used the occasion to review and update the material and make sure that the content is up to date.

Hope I was able to address all comments and concerns. Please let me know if you have any further questions. I look forward to seeing this work published.

Best regards,
Julia

@cosimameyer

Thanks so much, @jwagemann! This addresses the points I raised. I am very happy to see these additions and believe they make your submission even more accessible to users who will benefit from the content.

I don't have any additional comments from my side and I'm very happy to see it published! 🎉

@jwagemann

Thank you @cosimameyer for your valuable inputs and your time reviewing our submission! Highly appreciated.

@jwagemann

@yabellini Could you advise me on the next steps, now that revisions have been accepted? Thanks.

@yabellini
Member

@RomiNahir, can you please confirm you are ok with the changes @jwagemann did based on the review recommendations? Thank you!
