Paraview for ICON on Santis #93

Draft · wants to merge 11 commits into base: main

Conversation

@albestro (Contributor) commented May 10, 2024

This PR is a work in progress to get Paraview built for ICON. I'm not sure it is the most maintainable solution (there is no reference to icon-wcp, I just copied the folder); if anyone has a suggestion, please tell me.

TODO:

Main features enabled in Paraview:

  • mpi
  • fortran
  • python
  • tbb
  • nvindex
  • CDI
  • raytracing
  • libcatalyst

paraview@5.12.0~adios2~advanced_debug~catalyst+cdi+cuda+development_files+egl~examples~eyedomelighting+fortran+hdf5~ipo+kits+libcatalyst+mpi+nvindex+opengl2~openpmd~osmesa~pagosa+python~qt+raytracing~rocm+shared+tbb~visitbridge build_edition=canonical build_system=cmake build_type=Release cuda_arch=90 generator=ninja patches=02253c7,f45d9e9 use_vtkm=default
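For context, a minimal sketch of how a root spec along these lines might be declared in the recipe's environments.yaml; the environment name and surrounding layout are assumptions, not taken from this PR, and only variants already visible in the concretized spec above are carried over:

```yaml
# Hypothetical environments.yaml fragment (layout and environment name assumed):
# paraview declared as a root spec, carrying the variants listed above.
icon-paraview:
  specs:
  - paraview@5.12 +mpi +fortran +python +tbb +nvindex +cdi +raytracing +libcatalyst +cuda cuda_arch=90 ~qt
```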

FYI @jfavre

Alberto Invernizzi and others added 8 commits May 8, 2024 18:53
still no CDI and no raytracing
release 0.21 does not have the latest required updates to:
- egl
- glew
- opengl
- paraview
since the environment prefers +cuda, the concretizer was selecting an older
version of LLVM (i.e. 14) that does not support the new CUDA architecture.
older ones had some build problems
- patchelf%gcc
- gmake%gcc
- paraview@5.12%gcc +mpi +tbb +egl ~qt +fortran +python +hdf5 +nvindex +cdi +raytracing
- llvm~cuda
Contributor Author (albestro):

This is a workaround for the cuda variant, which has been marked obsolete starting from llvm@15 by means of a conflict:

https://github.com/spack/spack/blob/e9149cfc3cfc94bd66aa0e0901ce562cf8b75713/var/spack/repos/builtin/packages/llvm/package.py#L380

This was constraining the concretization to llvm@14, which does not support the new CUDA architecture on Santis.
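For illustration, this is roughly how the workaround could look in the recipe's spec list (the specs themselves are the ones quoted in the commit message above; the surrounding YAML key is an assumption):

```yaml
# Pinning llvm~cuda as an explicit root sidesteps the "+cuda conflicts with
# llvm@15:" rule, so the concretizer can pick a newer llvm instead of llvm@14.
specs:
- patchelf%gcc
- gmake%gcc
- paraview@5.12%gcc +mpi +tbb +egl ~qt +fortran +python +hdf5 +nvindex +cdi +raytracing
- llvm~cuda
```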

Reply:

My cmake-based compilation does not need or use any llvm. So there must be something unnecessary in the way your spack compilation goes. John Biddiscombe would know more; please ping him.

Contributor Author (albestro):

llvm is a dependency of ispc, which is itself a dependency of ospray and openimagedenoise.

Reply:

oh yeah, I forgot that one.

specify it as root so that I can enable fortran, python and mpi
just for it and not for every package supporting them (sketched below)
it requires a new version with a fix
e.g. mpi4py would not have been accessible without this change
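A sketch of what the "specify it as root" commit message means, assuming the recipe supports both a global variants list and per-root variants (key names follow the stackinator environments.yaml convention, which is an assumption here):

```yaml
# Alternative that was avoided: global variants apply to every package
# that supports them, e.g. everything would be built +mpi +python +fortran.
variants:
- +mpi
- +python
- +fortran

# Approach described in the commit: put the variants on the paraview root
# spec only, so other packages keep their default variants.
specs:
- paraview +mpi +python +fortran
```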
@@ -0,0 +1,11 @@
# ICON PE

A programming environment for development of both the Fortran+OpenACC and DSL versions of ICON.
Contributor Author (albestro):

note-to-self: update this

- gmake
views:
default:
link: run
Contributor Author (albestro):

This has been changed to run so that all dependencies are exposed in the view.

In particular, without this paraview would not get all of its dependencies in the view, e.g. the mpi4py python package would not be exposed (while it should be).

Contributor Author (albestro):

@bcumming apparently you are the author of the ICON image I started from.

Do you think this change is problematic? I can try thinking about alternatives, but I don't have the full view of the thing (e.g. should we have a separate view just for paraview?!)

Member:

link: run is problematic because it breaks many command-line utilities: they will find dependencies in /user-environment/env/$envname/lib{64} when they were linked against the versions in /usr/lib etc.

The preferred approach is to add all of the specs that you need to expose as roots.
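A sketch of that suggestion, assuming the view configuration maps onto Spack's environment view options (link: roots|run|all); py-mpi4py as an extra root is an illustrative assumption based on the mpi4py example discussed above:

```yaml
# Hypothetical: keep the default roots-only linking and expose runtime
# dependencies such as mpi4py by listing them explicitly as roots,
# instead of switching the whole view to link: run.
specs:
- paraview +mpi +python
- py-mpi4py
views:
  default:
    link: roots
```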

Member:

A bigger question is why we are building an environment with both paraview and ICON dependencies together.

The workflow of creating a separate tool environment in /user-tools is designed for this type of use case.

Reply:

We use python and netcdf from the ICON image (and perhaps other things I forgot). I don't understand enough about the setup to tell if things must be re-built independently. But for sure, the ICON-ParaView interface would only be used by ICON users, while ParaView will be used by multiple communities. ParaView must cater to many use cases (other netcdf users without ICON); in-situ is another example, where the need to have mpi4py exposed forced us to use link: run.

Member:

Because Paraview should be used by multiple communities, it makes sense to make a separate paraview uenv.
The image can provide its own netcdf etc.
