
installing pumi with spack #367

Open
Thomas-Ulrich opened this issue Jul 19, 2022 · 9 comments
@Thomas-Ulrich (Contributor)

Hi,
I'm trying to install pumi with:

spack install pumi@master +int64 simmodsuite=kernels +zoltan ~fortran ~simmodsuite_version_check %intel@21.4.0 ^intel-mpi@2019.12.320

For that, I updated this line
https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/pumi/package.py#L97
to mpi_id = 'mpich3'
(otherwise Spack looks for SimPartitionedMesh-intel-mpi-2019.12.320).

~/.spack/packages.yaml looks like this:

packages:
  simmetrix-simmodsuite:
    externals:
    - spec: simmetrix-simmodsuite@15.0-210220
      prefix: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/

Then I got the following errors when linking:

/dss/lrzsys/sys/spack/release/22.2.1/opt/skylake_avx512/intel-mpi/2019.12.320-intel-asahktg/compilers_and_libraries_2020.4.320/linux/mpi/intel64/bin/mpiicpc  -O2 -g  -O2 -g -DNDEBUG -rdynamic CMakeFiles/repartition.dir/repartition.cc.o -o repartition   -L/hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/acisKrnl  -L/hppfs/work/pr63qo/di73yeq4/myLibs/spack-packages/linux-sles15-skylake_avx512/zoltan/3.83-intel-21.4.0-bcsuxbm/lib  -Wl,-rpath,/hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/psKrnl:/hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/acisKrnl:/hppfs/work/pr63qo/di73yeq4/myLibs/spack-packages/linux-sles15-skylake_avx512/zoltan/3.83-intel-21.4.0-bcsuxbm/lib:/dss/lrzsys/sys/spack/release/22.2.1/opt/skylake_avx512/parmetis/4.0.3-intel-ucz5it6/lib:/dss/lrzsys/sys/spack/release/22.2.1/opt/skylake_avx512/metis/5.1.0-intel-4gr6lep/lib::::::::::::::::::::::::: ../pumi/libpumi.a ../crv/libcrv.a ../spr/libspr.a ../ree/libree.a ../phasta/libph.a ../apf_sim/libapf_sim.a ../gmi_sim/libgmi_sim.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimPartitionedMesh-mpi.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimDiscrete.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimAcis2020.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimParasolid320.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/psKrnl/libpskernel.so -lSpaACIS /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimField.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimAdvMeshing.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimPartitionedMesh-mpi.a 
/hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimDiscrete.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimAcis2020.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimParasolid320.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/psKrnl/libpskernel.so -lSpaACIS /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimField.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimAdvMeshing.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimMeshing.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimMeshTools.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimPartitionWrapper-mpich3.a ../ma/libma.a ../mds/libmds.a ../parma/libparma.a ../zoltan/libapf_zoltan.a -lzoltan /dss/lrzsys/sys/spack/release/22.2.1/opt/skylake_avx512/parmetis/4.0.3-intel-ucz5it6/lib/libparmetis.so /dss/lrzsys/sys/spack/release/22.2.1/opt/skylake_avx512/metis/5.1.0-intel-4gr6lep/lib/libmetis.so ../sam/libsam.a ../apf/libapf.a ../gmi/libgmi.a ../lion/liblion.a ../mth/libmth.a ../pcu/libpcu.a
ld: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a(SXDRBuf.o): in function `SXDRBuf::makeMem(int)':
SXDRBuf.cc:(.text+0x8e2): undefined reference to `xdrmem_create'
ld: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a(SXDRBuf.o): in function `SXDRBuf::SXDRBuf(_IO_FILE*, SSBuf*)':
SXDRBuf.cc:(.text+0x9e0): undefined reference to `xdrstdio_create'
ld: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a(SXDRBuf.o): in function `SXDRBuf::get(char&, int)':
SXDRBuf.cc:(.text+0xb7c): undefined reference to `xdr_char'
ld: SXDRBuf.cc:(.text+0xbb1): undefined reference to `xdr_char'
ld: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a(SXDRBuf.o): in function `SXDRBuf::get(int&)':
SXDRBuf.cc:(.text+0xc08): undefined reference to `xdr_int'
ld: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a(SXDRBuf.o): in function `SXDRBuf::get(unsigned int&)':
SXDRBuf.cc:(.text+0xc58): undefined reference to `xdr_u_int'
ld: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a(SXDRBuf.o): in function `SXDRBuf::get(long&)':
SXDRBuf.cc:(.text+0xca8): undefined reference to `xdr_int64_t'
ld: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/15.0-210220/lib/x64_rhel7_gcc48/libSimModel.a(SXDRBuf.o): in function `SXDRBuf::get(unsigned long&)':
SXDRBuf.cc:(.text+0xcf8): undefined reference to `xdr_uint64_t'

Any idea what could be the cause?
(I tried adding depends_on('libtirpc') to pumi, but this does not fix the problem.)

Thomas.

@KennethEJansen (Contributor)

@cwsmith do we have any evidence of a successful build of this toolchain with the Intel compilers? Every successful build I have been involved with used the GNU compilers.

@cwsmith (Contributor) commented Jul 19, 2022

Hi @KennethEJansen. Good question. I just successfully built the exGenMesh.cc example from SimModSuite 18.0-220605dev with Intel 19 on a RHEL7 system using the following commands:

mkdir -p obj/x64_rhel7_gcc48
icc -O2 -std=c++11 -Isim//include -c exGenMesh.cc -o obj/x64_rhel7_gcc48/exGenMesh.o
mkdir -p bin/x64_rhel7_gcc48
icc  obj/x64_rhel7_gcc48/exGenMesh.o -o exGenMesh -Lsim//lib/x64_rhel7_gcc48 -lSimMeshing -lSimMeshTools -lSimModel -lpthread -lm 

and it appears to run without any obvious errors.

Hi @Thomas-Ulrich.

If I understand correctly, SimModSuite was installed manually (without using Spack) with MPICH support, and you hit a linking error when building PUMI via Spack with the Intel compilers and Intel MPI against that existing SimModSuite install. Is that correct?

Does manually building PUMI with this config (Intel Compiler, Intel MPI, existing SimModSuite install) hit the same error?

libSimModel.a appears to be the only SimModSuite library that is looking for the xdr* symbols. On our RHEL7 system these symbols are defined in the /lib64/libc.so library. In one of the example Makefiles provided with the SimModSuite release they have the following logic to deal with RHEL8 systems:

ifneq ($(findstring x64_rhel8,$(PLATFORM)),)
  LIBS := $(LIBS) -ltirpc
endif

Based on your comment about libtirpc, I'm guessing that you are building on a RHEL8 system. Is that correct? If so, we'll have to add logic to the SimModSuite Spack package (and possibly to PUMI's CMake FindSimModSuite.cmake) to support both RHEL7 and RHEL8 (see the discussion in spack/spack#8730)

https://github.com/spack/spack/blob/43673fee808f9e02efcb4330c6a7fa2c9b80c14c/var/spack/repos/builtin/packages/simmetrix-simmodsuite/package.py#L239

and add a linking dependency on libtirpc when RHEL8 is enabled. Fortunately, Spack already defines a package for libtirpc. Once that was fixed I think we'd be able to reliably support a GCC build on PUMI with SimModSuite on a RHEL8 system. Supporting linking with the Intel compilers may take additional work.
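One possible shape for that logic on the CMake side is sketched below. This is a hypothetical snippet, not the actual FindSimModSuite.cmake change: `SIMMODSUITE_LIBS` stands in for whatever list variable the find module actually populates, and the probe simply checks whether glibc still provides the xdr* symbols before falling back to libtirpc.

```cmake
# Sketch: detect whether the Sun RPC xdr* symbols are still in libc;
# if not, locate libtirpc and append it to the SimModSuite link line.
include(CheckSymbolExists)
check_symbol_exists(xdrmem_create "rpc/xdr.h" HAVE_GLIBC_XDR)
if(NOT HAVE_GLIBC_XDR)
  find_library(TIRPC_LIBRARY tirpc)
  if(TIRPC_LIBRARY)
    list(APPEND SIMMODSUITE_LIBS ${TIRPC_LIBRARY})
  else()
    message(FATAL_ERROR
      "libSimModel.a needs xdr* symbols; please install libtirpc")
  endif()
endif()
```

On glibc older than 2.26 the symbol check succeeds and nothing changes; on newer systems the tirpc library gets linked exactly where libSimModel.a needs it.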

I'll work on the Spack SimModSuite package changes shortly. Update: going to wait for confirmation on the use of RHEL8.

@Thomas-Ulrich (Contributor, Author)

Hi,
SimModSuite was not built; I just used the precompiled binaries (which used to work before; I don't know what changed that broke the install).
Yes, building PUMI manually with this config hits the same error.

di73yeq4@login03:~> lsb_release -a
LSB Version:    n/a
Distributor ID: SUSE
Description:    SUSE Linux Enterprise Server 15 SP3
Release:        15.3
Codename:       n/a

@cwsmith (Contributor) commented Jul 20, 2022

Thanks for the OS info.

It looks like Sun RPC, which provides the xdr API/types, was removed from glibc in version 2.32 and was optional starting with version 2.26. In Spack, it will hopefully be easier to detect the glibc version than to maintain a list of operating system versions that are known to work. I'll try adding a dependency on rpc from the SimModSuite package, as is done in the hdf package (libtirpc appears to be the only provider).

Running ldd --version on your system is one way to determine which glibc version you have (there are other, possibly more robust, ways). As expected, on my Arch Linux system that does not have the linking problem when compiling a SimModSuite example code I get ldd (GNU libc) 2.35 and on a SCOREC RHEL7 workstation that has the linking problem it returns ldd (GNU libc) 2.17.
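That glibc cutoff can also be checked programmatically. The sketch below is an illustration, not part of any Spack package: the function name `needs_libtirpc` is made up, the 2.26/2.32 thresholds come from the glibc history described above, and `None` signals "can't tell" (musl, non-Linux, or a glibc built in the 2.26–2.31 window where the symbols were a build-time option).

```python
import platform

def needs_libtirpc():
    """Heuristic: does this system's libc lack the Sun RPC xdr* symbols?

    Returns True if glibc >= 2.32 (xdr* removed), False if glibc < 2.26
    (xdr* always present), and None when it cannot be determined.
    """
    libc, version = platform.libc_ver()
    if libc != "glibc" or not version:
        return None  # musl, non-Linux, or detection failure
    major, minor = (int(p) for p in version.split(".")[:2])
    if (major, minor) >= (2, 32):
        return True   # xdr* definitely gone from glibc
    if (major, minor) >= (2, 26):
        return None   # build-time option; must probe the library itself
    return False      # old glibc: symbols live in libc.so
```

For the 2.26–2.31 gray zone a real implementation would still have to probe the library (e.g. a link test against xdrmem_create) rather than trust the version number alone.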

> SimModSuite was not built, I just used the precompiled binaries (which used to work before, I don't know what changed that messed the install up).

Right, Simmetrix does not provide source code for their libs. When I said 'manual install' I meant the install was done without using Spack. Spack can extract the distributed tarballs for SimModSuite, build the MPI wrapper library, install things in places that downstream libraries like PUMI expect, and provide Lua module files.

@cwsmith cwsmith self-assigned this Jul 20, 2022
@Thomas-Ulrich (Contributor, Author)

di73yeq4@login03:~> ldd --version
ldd (GNU libc) 2.31

Yes, when I tried building PUMI manually I got a similar issue.

@cwsmith (Contributor) commented Jul 20, 2022

@Thomas-Ulrich I just pushed a branch to spack (https://github.com/spack/spack/tree/cws/simmodsuiteRpc) that adds the rpc dependency. This doesn't yet resolve the issue as PUMI needs to know to look for the RPC lib. Ideally, this would be embedded into something provided by SimModSuite, but I don't see an obvious option besides having Spack install a cmake config file.

@cwsmith (Contributor) commented Jul 28, 2022

@Thomas-Ulrich An initial version of a CMake build system for SimModSuite is here:

https://github.com/SCOREC/simmodsuiteCmake

It currently installs a CMake config file that can be used to successfully compile and link the example in the example directory of that repo. Note, it currently does not handle detecting when the tirpc library dependency is required; if it finds tirpc it will use it. This FindXDR.cmake script may give us a way to detect if it is required.

This will eventually be used by Spack to install SimModSuite so packages like PUMI can more easily resolve its dependencies.

@Thomas-Ulrich (Contributor, Author)

Hi,
Thank you!
So I tried running the script to install a CMake config file, but got:

-- The libtirpc library was not found.  It defines xdr symbols (e.g., xdrmem_create) that are need by SimModSuite on systems using glibc newer than 2.32.  Note, glibc starting with 2.26 could optionally have been built without the xdr symbols.

Here is the full log:

di73yeq4@login02:/hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib> cmake -S simmodsuiteCmake -B buildSimModSuite -DCMAKE_PREFIX_PATH=/hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/16.0-220101/ -DSIM_MPI=mpich3 -DSIM_ARCHOS=x64_rhel8_gcc83 -DCMAKE_INSTALL_PREFIX=/hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/16.0-220101/
-- SIM_ARCHOS x64_rhel8_gcc83
-- SIM_MPI mpich3
-- SIMMODSUITE_INCLUDE_DIR /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/16.0-220101/include/
-- simVersion /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/16.0-220101/include/SimModSuiteVersion.h
-- minor 220101
-- major 16.0
-- dot 16.0.220101
-- lib SimModel
-- lib SimDiscrete
-- lib SimField
-- lib SimAdvMeshing
-- lib SimPartitionedMesh-mpi
-- lib SimMeshing
-- lib SimMeshTools
-- lib SimPartitionWrapper-mpich3
-- The libtirpc library was not found.  It defines xdr symbols (e.g., xdrmem_create) that are need by SimModSuite on systems using glibc newer than 2.32.  Note, glibc starting with 2.26 could optionally have been built without the xdr symbols.
-- Configuring done
-- Generating done
-- Build files have been written to: /hppfs/work/pr63qo/di73yeq4/myLibs/SimModelerLib/buildSimModSuite

I also tried copying FindXDR.cmake into simmodsuiteCmake
(should I rename FindXDR.cmake to Findtirpc.cmake?), but then got:
-- Cannot find RPC headers (rpc/types.h). Persistence will be disabled!
Any idea how to fix that?

@cwsmith cwsmith added the v2.2.8 label Sep 26, 2022
@sebwolf-de (Contributor)

Hi @Thomas-Ulrich, it seems to me that rpc is not installed on our cluster ;-)
What I did:
1. Installed libtirpc manually into my home directory, following https://www.linuxfromscratch.org/blfs/view/svn/basicnet/libtirpc.html
2. find test -name link.txt | xargs -n1 sed -i 's/$/ \/path\/to\/lib\/libtirpc.so/g' 🤓
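That sed trick appends libtirpc.so to every CMake-generated link line. Here is the same idea demonstrated on a throwaway directory (the `demo` directory and the `/path/to/lib/libtirpc.so` path are placeholders; the real command would run against the PUMI build tree and the actual install path of libtirpc):

```shell
# Create a fake CMake link.txt and append libtirpc.so to its link line,
# mimicking the workaround above. A blunt hack, not a proper fix.
mkdir -p demo
printf 'gcc -o app app.o\n' > demo/link.txt
find demo -name link.txt -print0 | \
  xargs -0 sed -i 's|$| /path/to/lib/libtirpc.so|'
cat demo/link.txt
```

Using `|` as the sed delimiter avoids having to escape every slash in the library path, and `-print0`/`-0` keeps the pipeline safe for paths with spaces.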

@cwsmith cwsmith added v2.2.9 and removed v2.2.8 labels Sep 18, 2023
cwsmith added a commit to SCOREC/rhel9-spack-config that referenced this issue Oct 26, 2023
there is a problem installing pumi - see SCOREC/core#367