
Include information on language support, transferable skills, contributors, installation, and purpose of codes #92

Open
ltalirz opened this issue Sep 23, 2021 · 1 comment
Labels: enhancement (New feature or request), policy (Scope, organisation, etc.)

Comments

ltalirz commented Sep 23, 2021

Comment provided by a reviewer of the article:

The listed metrics of citation count and license are pretty primitive and, without supporting metrics, will drive novice users to the most popular packages. Examples of such metrics:

  1. Integration with other languages for scripting (e.g., Python)
  2. The ability to curate skills that are transferable to more industries (e.g., not Fortran or domain-specific language inputs)
  3. Number of contributors and stars on a VCS (e.g., GitHub) for OSS codes
  4. Ease of access (e.g., do you compile it yourself or pull from conda?)
  5. What is the purpose of the code? Research, production, development?
ltalirz commented Sep 23, 2021

I sincerely thank the reviewer for these suggestions.

Re 1.: I agree that information on the presence of APIs would be useful to visitors of the site and may be a reason for selecting one code over another. I believe a great place to start would be to document support for API standards used in multiple codes. Following this suggestion, I have added a new “APIs” column (hidden by default so far) that documents which codes support the QCSchema standard (are there more?).
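
For concreteness, here is a minimal sketch of what such an entry could look like; the actual data layout used by atomistic.software may well differ, and the code name and field names are purely illustrative.

```python
# Purely illustrative sketch of an "APIs" entry for one code;
# the real data schema of atomistic.software may differ.
example_entry = {
    "name": "Psi4",          # illustrative code name
    "apis": ["QCSchema"],    # API standards documented for this code
}
```
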
One could extend the APIs column to record the presence of code-specific Python/C/... interfaces, but this would muddy the waters a bit. For example, LAMMPS has a Python interface that simply takes a regular LAMMPS input file and runs it (to be fair, it can also parse the output file to some extent). Other codes have Python interfaces that allow for very detailed control of the execution (or may even embed Python inside the regular input file, as NWChem does). I have opened issue #86 to discuss this question further.
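
To illustrate the distinction, here is a minimal sketch of the LAMMPS Python interface, assuming a LAMMPS build that ships the `lammps` Python module; the input file name is a placeholder.

```python
# Sketch of the LAMMPS Python interface (requires a LAMMPS build that
# provides the `lammps` Python module).
from lammps import lammps

lmp = lammps()

# "Thin" usage: run a regular LAMMPS input script as-is.
lmp.file("in.melt")  # placeholder input file name

# Finer-grained control is also possible, e.g. issuing individual
# commands and pulling quantities back into Python.
lmp.command("run 100")
natoms = lmp.get_natoms()
print("number of atoms:", natoms)
```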

Re 2.: While I have some idea of where the reviewer is coming from, I do not see how to represent this in an objective metric that can be applied to all codes. Suggestions welcome!

Re 3.: Following this suggestion, I have added information on contributors to the sublist of open-source codes mentioned in the article. In order to be fair towards younger codes, I have counted only the contributors during the year 2020 (also, early contributors are often lost in migrations between version control systems). Although I agree that it would be nice to have this information in the interactive list as well, I currently see no low-maintenance way of keeping this information up to date (besides the fact that I won’t be able to obtain this information for most commercial codes) and am therefore not going to add it.
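
For reference, here is a hedged sketch of how such a count could be obtained from the GitHub REST API; this is not necessarily how the numbers in the article were produced, and the repository name, pagination and authentication details are assumptions.

```python
# Hedged sketch: count unique GitHub accounts that authored commits in 2020.
# Unauthenticated requests are heavily rate-limited; in practice a token
# would be needed, and commits without a linked GitHub account are skipped.
import requests

def contributors_in_2020(owner: str, repo: str) -> int:
    authors = set()
    url = f"https://api.github.com/repos/{owner}/{repo}/commits"
    params = {
        "since": "2020-01-01T00:00:00Z",
        "until": "2021-01-01T00:00:00Z",
        "per_page": 100,
        "page": 1,
    }
    while True:
        response = requests.get(url, params=params)
        response.raise_for_status()
        commits = response.json()
        if not commits:
            break
        for commit in commits:
            author = commit.get("author")
            if author is not None:
                authors.add(author["login"])
        params["page"] += 1
    return len(authors)

print(contributors_in_2020("lammps", "lammps"))  # illustrative repository
```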

Re 4.: I did have some reservations about adding information on the software distribution channels, since I feared that these might change frequently and thus incur a significant maintenance burden. On the other hand, I fully agree that this information can provide a valuable filter criterion for users searching for a code. Following the reviewer's prompt, I have therefore now gone through the process of collecting the distribution channels for every code in the list (to the best of my knowledge) and will display the corresponding column by default. I will reevaluate in one year whether this turns out to be useful and how large the fluctuation in this column is.
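
As one example of what “ease of access” can mean in practice, here is a hedged sketch of checking whether a package is available on conda-forge via the anaconda.org API; the endpoint and package names are assumptions, and the list itself was compiled by hand rather than by such a script.

```python
# Hedged sketch: query anaconda.org to see whether a conda-forge package
# with the given name exists. Package names below are illustrative only.
import requests

def on_conda_forge(package: str) -> bool:
    response = requests.get(f"https://api.anaconda.org/package/conda-forge/{package}")
    return response.status_code == 200

for pkg in ("lammps", "cp2k"):
    print(pkg, "available on conda-forge:", on_conda_forge(pkg))
```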

Re 5.: In my experience, there is a broad spectrum in the purpose of simulation codes that is not easily categorized. The terms “research, production, development” used by the reviewer seem to suggest that the question was mainly about the maturity of the software. What I can say is that all of the codes on the list have at least 100 citations per year, i.e. they are used by more than just one research group, which implies a certain level of maturity. Any specific label beyond that should be based on objective criteria (since it would be seen as a measure of quality), and I think performing such an assessment goes beyond the scope of atomistic.software.

Todo:

  • reevaluate usefulness/maintenance effort for installation routes in 2022

ltalirz added the policy and enhancement labels on Sep 23, 2021