
make test failure in SVD test #144

Open
TiborGY opened this issue Apr 15, 2022 · 3 comments


TiborGY commented Apr 15, 2022

Single-threaded test. Fully Intel build (icc + icpc + ifort + Intel MPI + MKL), with compiler versions 2021.5.0 from the oneAPI bundles.

export CC=icc
export FC=ifort
./configure CXX="mpiicpc -cxx=icpc" --build-scalapack --build-hptt LINKFLAGS="-lifcore" LDFLAGS="-lifcore"

Testing result: 55/56 passed

Testing QR with m = 36 n = 6:
{ A = QR and Q^TQ = I } passed
Testing SVD with m = 36 n = 7 k=7:
SVD orthogonality check returned 1 (0.000001, 0.000001), residual check 0 (4.831810)
{ A = USVT and U^TU = I } failed
Testing symmetric eigensolve n = 37:
{ AX = XD and X^HX = I } passed

Multithreaded tests typically hang on one test or the other.

solomonik (Collaborator) commented:

Is this with 1 MPI process or more? I think this may be associated with a problem in MKL ScaLAPACK, which may be flag/architecture dependent. I spent a long time debugging CTF SVD test failures with a particular compiler/arch combo a couple of years ago, and concluded the bug was on the MKL side. The CTF SVD code has not changed in quite some time and has been tested in a variety of contexts and applications.

It may help to build with -no-ipo, or to build ScaLAPACK separately and configure accordingly (CTF can try to do this for you if configured with --build-scalapack). Unfortunately, from what I recall, I was not able to resolve the problem the last time I investigated it. We were also further constrained then, as we were trying to resolve the MKL issue as part of the Python version of CTF, which is trickier because it requires dynamic linking to MKL.
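[Editor's note: a minimal sketch of the first workaround suggested above, reconfiguring with interprocedural optimization disabled. Passing -no-ipo via CXXFLAGS is an assumption; adapt to your own flags and paths.]

```shell
# Reconfigure CTF with IPO disabled, keeping the reporter's original
# Intel toolchain setup; -no-ipo is passed via CXXFLAGS here (assumed
# to be the right mechanism for this configure script).
export CC=icc
export FC=ifort
./configure CXX="mpiicpc -cxx=icpc" --build-scalapack --build-hptt \
    CXXFLAGS="-no-ipo" LINKFLAGS="-lifcore" LDFLAGS="-lifcore"
```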


TiborGY commented Apr 15, 2022

It was make test, which should be 1 MPI process AFAIK.
Now that you mention it, the ./configure script does not seem to detect the presence of Intel's ScaLAPACK build, so this build is actually using netlib ScaLAPACK built with ifort, MKL BLAS, MKL LAPACK, and Intel MPI. See below for the ./configure output when I leave --build-scalapack out of the options:

~/nfs_zpool/ctf$ ./configure CXX="mpiicpc -cxx=icpc" --build-hptt LINKFLAGS="-lifcore" LDFLAGS="-lifcore"
Checking compiler type/version... Using Intel compilers.
Checking whether __APPLE__ is defined... no.
Checking compiler (CXX)... successful.
Checking flags (CXXFLAGS)... successful.
Checking availability of C++11... successful.
Checking for MPI... MPI works.
Checking for OpenMP... OpenMP works.
Checking for static BLAS library... detected that -mkl works, speculatively using -mkl.
Checking for availability of static batched gemm... available, will build with -DUSE_BATCH_GEMM.
Checking for dynamic BLAS library... detected that -mkl works, speculatively using -mkl.
Checking for availability of dynamic batched gemm... Checking for static LAPACK library... static LAPACK found.
Checking for dynamic LAPACK library... dynamic LAPACK found.
Checking for sparse MKL routines... sparse MKL found.
Checking for static ScaLAPACK...   static ScaLAPACK not found, some functionality and tests will be unavailable,
  to fix reconfigure and add --with-scalapack and the appropriate library path
  (LIB_PATH/LIBS for static, LD_LIB_PATH/LD_LIBS for dynamic) or configure with
  --build-scalapack to attempt to automatically download and build scalapack
Checking for dynamic ScaLAPACK...   dynamic ScaLAPACK not found, some functionality and tests will be unavailable,
  to fix reconfigure and add --with-scalapack and the appropriate library path
  (LD_LIB_PATH/LD_LIBS for dynamic, LD_LD_LIB_PATH/LD_LD_LIBS for dynamic)
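[Editor's note: per the configure output above, a separately built ScaLAPACK can be supplied via --with-scalapack and the library path variables. A hedged sketch; the /opt/scalapack install prefix is hypothetical.]

```shell
# Point CTF at a separately built static ScaLAPACK (libscalapack.a),
# following the hint printed by ./configure above.
# /opt/scalapack is a hypothetical install prefix.
./configure CXX="mpiicpc -cxx=icpc" --build-hptt --with-scalapack \
    LIB_PATH="-L/opt/scalapack/lib" LIBS="-lscalapack" \
    LINKFLAGS="-lifcore" LDFLAGS="-lifcore"
```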


TiborGY commented Apr 15, 2022

FYI, ScaLAPACK 2.2.0 was released this February, but ./configure still downloads 2.1.0. I will try building ScaLAPACK 2.2.0 separately with gfortran and do a full GNU+OpenMPI+OpenBLAS build, to see whether that is still problematic.
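[Editor's note: one way to build netlib ScaLAPACK 2.2.0 separately with the GNU toolchain, as planned above. A sketch: the install prefix is arbitrary, and the BLAS/LAPACK variables assume OpenBLAS provides both.]

```shell
# Fetch and build netlib ScaLAPACK 2.2.0 with gfortran/OpenMPI/OpenBLAS.
wget https://www.netlib.org/scalapack/scalapack-2.2.0.tgz
tar xzf scalapack-2.2.0.tgz
cd scalapack-2.2.0
mkdir build && cd build
cmake .. \
    -DCMAKE_INSTALL_PREFIX="$HOME/scalapack-2.2.0-install" \
    -DCMAKE_Fortran_COMPILER=mpif90 \
    -DCMAKE_C_COMPILER=mpicc \
    -DBLAS_LIBRARIES="-lopenblas" \
    -DLAPACK_LIBRARIES="-lopenblas"
make -j
make install
```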
