
Error: Symbol 'mpi_complex' at (1) has no IMPLICIT type #415

Open

barracuda156 opened this issue Aug 12, 2023 · 14 comments

@barracuda156

This appears to be a new error. A build from master on 2023-07-12 was fine; a build from master on 2023-08-10 fails:

[ 64%] Building CXX object ElmerGUI/Application/CMakeFiles/ElmerGUI.dir/src/summaryeditor.cpp.o
cd /opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/build/ElmerGUI/Application && /opt/local/bin/g++-mp-12 -DEG_PLUGIN -DEG_QWT -DHAVE_EXECUTECOMMANDLINE -DQT_CORE_LIB -DQT_GUI_LIB -DQT_NO_DEBUG -DQT_OPENGL_LIB -DQT_SCRIPT_LIB -DQT_XML_LIB -DUSE_ARPACK -DUSE_ISO_C_BINDINGS -I/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/build/ElmerGUI/Application -I/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/ElmerGUI/Application -I/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/build/ElmerGUI/Application/ElmerGUI_autogen/include -I/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/ElmerGUI/netgen/libsrc/interface -I/opt/local/libexec/qt4/include/qwt -isystem /opt/local/libexec/qt4/include -isystem /opt/local/libexec/qt4/include/QtOpenGL -isystem /opt/local/libexec/qt4/include/QtScript -isystem /opt/local/libexec/qt4/include/QtGui -isystem /opt/local/libexec/qt4/include/QtXml -isystem /opt/local/libexec/qt4/include/QtCore -pipe -Os -DNDEBUG -isystem/opt/local/include/LegacySupport -I/opt/local/include -fopenmp -arch ppc -mmacosx-version-min=10.6 -fPIE   -DCONTIG= -framework OpenGL -framework GLU -MD -MT ElmerGUI/Application/CMakeFiles/ElmerGUI.dir/src/summaryeditor.cpp.o -MF CMakeFiles/ElmerGUI.dir/src/summaryeditor.cpp.o.d -o CMakeFiles/ElmerGUI.dir/src/summaryeditor.cpp.o -c /opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/ElmerGUI/Application/src/summaryeditor.cpp
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:5019:51:

 5019 |      CALL MPI_ALLREDUCE( ssum, tsum, 1, MPI_COMPLEX, &
      |                                                   1
Error: Symbol 'mpi_complex' at (1) has no IMPLICIT type
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:4989:25:

 4989 |      CALL MPI_ALLREDUCE( isum, tsum, 1, MPI_INTEGER, MPI_SUM, comm, ierr )
      |                         1
......
 5135 |   CALL MPI_ALLREDUCE( dsum, dres, 1, MPI_DOUBLE_COMPLEX, &
      |                      2   
Warning: Type mismatch between actual argument at (1) and actual argument at (2) (INTEGER(4)/COMPLEX(8)).
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:4989:31:

 4989 |      CALL MPI_ALLREDUCE( isum, tsum, 1, MPI_INTEGER, MPI_SUM, comm, ierr )
      |                               1
......
 5135 |   CALL MPI_ALLREDUCE( dsum, dres, 1, MPI_DOUBLE_COMPLEX, &
      |                            2   
Warning: Type mismatch between actual argument at (1) and actual argument at (2) (INTEGER(4)/COMPLEX(8)).
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:4711:19:

 4711 |     CALL MPI_BSEND(L(nj),1,MPI_INTEGER,j,6000,ELMER_COMM_WORLD, ierr)
      |                   1
......
 4785 |        CALL MPI_BSEND( VecL(nj) % rbuf, L(nj), MPI_DOUBLE_PRECISION, &
      |                       2
Warning: Type mismatch between actual argument at (1) and actual argument at (2) (INTEGER(4)/REAL(8)).
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:4657:23:

 4657 |         CALL MPI_RECV( DPBuffer, VecLen, MPI_DOUBLE_PRECISION, &
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Type mismatch between actual argument at (1) and actual argument at (2) (REAL(8)/INTEGER(4)).
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:4485:21:

 4485 |       CALL MPI_RECV( Gindices, veclen, MPI_INTEGER, sproc, &
      |                     1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2 
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:4325:24:

 4272 |     CALL MPI_iRECV( recv_size(i), 1, MPI_INTEGER, &
      |                    2    
......
 4325 |         CALL MPI_iRECV( recv_buf(i) % vec, datalen, MPI_DOUBLE_PRECISION, &
      |                        1
Warning: Type mismatch between actual argument at (1) and actual argument at (2) (REAL(8)/INTEGER(4)).
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:2204:23:

 2204 |         CALL MPI_RECV( gindices, 3*DataSize, MPI_INTEGER, i-1, 400, ELMER_COMM_WORLD, status, ierr )
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:2306:23:

 2306 |         CALL MPI_RECV( IntArray, DataSize, MPI_INTEGER, i-1, 950, ELMER_COMM_WORLD, status, ierr )
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:2368:23:

 2368 |         CALL MPI_RECV( IntArray, DataSize, MPI_INTEGER, i-1, 1800, ELMER_COMM_WORLD, status, ierr )
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:2451:51:

 2451 |         CALL MPI_FINALIZE( ELMER_COMM_WORLD, ierr )
      |                                                   1
Warning: More actual than formal arguments in procedure call at (1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:1578:23:

 1578 |         CALL MPI_RECV( gindices, 4*DataSize, MPI_INTEGER, &
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:1621:22:

 1621 |        CALL MPI_RECV( gindices, DataSize, MPI_INTEGER, &
      |                      1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2  
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:1821:23:

 1821 |         CALL MPI_RECV( buf, DataSize*2, MPI_INTEGER, &
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:949:24:

  949 |          CALL MPI_RECV( gindices, 3*DataSize, MPI_INTEGER, &
      |                        1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2    
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:998:21:

  998 |       CALL MPI_RECV( gindices, DataSize, MPI_INTEGER, &
      |                     1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2 
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:1095:24:

 1095 |          CALL MPI_RECV( gindices, DataSize, MPI_INTEGER, &
      |                        1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2    
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:1287:23:

 1287 |         CALL MPI_RECV( buf, DataSize*2, MPI_INTEGER, &
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:363:23:

  363 |         CALL MPI_RECV( Active, n, MPI_INTEGER, MinActive, &
      |                       1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2   
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:384:25:

  384 |           CALL MPI_RECV( Active, n, MPI_INTEGER, proc, &
      |                         1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2     
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:478:67:

  478 |         CALL MPI_BSEND(j,1,MPI_INTEGER,i-1, 20000,ELMER_COMM_WORLD,status,ierr)
      |                                                                   1
......
 4786 |                 Neigh(nj), 6001, ELMER_COMM_WORLD, Ierr )
      |                                                   2                
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:478:79:

  478 |         CALL MPI_BSEND(j,1,MPI_INTEGER,i-1, 20000,ELMER_COMM_WORLD,status,ierr)
      |                                                                               1
Warning: More actual than formal arguments in procedure call at (1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:479:77:

  479 |         IF (j>0) CALL MPI_BSEND(buf,j,MPI_INTEGER,i-1,20001,ELMER_COMM_WORLD,status,ierr)
      |                                                                             1
......
 4786 |                 Neigh(nj), 6001, ELMER_COMM_WORLD, Ierr )
      |                                                   2                          
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:479:89:

  479 |         IF (j>0) CALL MPI_BSEND(buf,j,MPI_INTEGER,i-1,20001,ELMER_COMM_WORLD,status,ierr)
      |                                                                                         1
Warning: More actual than formal arguments in procedure call at (1)
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-d1d49839b70a6a71336a55f512d3cc03b1d2fc78/fem/src/SParIterComm.F90:489:25:

  489 |           CALL MPI_RECV( buf,sz,MPI_INTEGER,proc,20001,ELMER_COMM_WORLD,status,ierr)
      |                         1
......
 4813 |      CALL MPI_RECV( Veclen, 1, MPI_INTEGER, neigh(i), &
      |                    2     
Warning: Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)
make[2]: *** [fem/src/CMakeFiles/elmersolver.dir/SParIterComm.o] Error 1
make[2]: Leaving directory `/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/build'
make[1]: *** [fem/src/CMakeFiles/elmersolver.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
@barracuda156

Not sure why it even uses MPICH when it is disabled for this port. The settings can be seen here: https://github.com/macports/macports-ports/blob/master/science/elmerfem/Portfile

@richb2k

richb2k commented Aug 12, 2023

Also, see this forum post with a similar error:

http://www.elmerfem.org/forum/viewtopic.php?p=28919#p28919

Building Elmer without MPI on Windows 10 using yesterday's devel branch gives the same error.

Rich.

@barracuda156

@richb2k I see, thank you for the info. It looks like the code has actually been broken (I have not had time to go through the commit history and git blame yet, but the culprit should be identifiable).
If the decision has been made that MPICH is no longer optional, CMake should enforce that and fail if MPICH is not found.

I can’t recall why I set it to off in the port, though. Maybe there was a strong reason (a failure with GCC or Clang, or dependencies that did not support MPICH), or maybe I just decided to keep the port simple initially. I can try enabling MPICH; normally it works.

@richb2k

richb2k commented Aug 12, 2023

Here is the error message when compiling without MPI on Windows 10:

C:/Elmer/elmerfem-Copy/fem/src/SParIterComm.F90:5019:51:

 5019 |      CALL MPI_ALLREDUCE( ssum, tsum, 1, MPI_COMPLEX, &
      |                                                   1
Error: Symbol 'mpi_complex' at (1) has no IMPLICIT type

Looks like it may be related to a commit on 3 August 2023.

Also, the last successful Windows binary installer build with 'no-mpi' is dated 3 August 2023.

@barracuda156

@raback Could you please take a look? It seems that 3442c3c has broken builds without MPI on multiple platforms.

@richb2k

richb2k commented Aug 12, 2023

Looking in the file mpif_stub.h, there is a macro for 'MPI_DOUBLE_COMPLEX' but not for 'MPI_COMPLEX'. Making that change (using MPI_DOUBLE_COMPLEX instead of MPI_COMPLEX) in the three places where it occurs allows compilation to finish without an error. Whether that is the proper fix is another question.
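
For illustration, a minimal sketch of that substitution at line 5019, assuming the trailing arguments follow the pattern of the neighbouring MPI_ALLREDUCE calls shown in the log (MPI_SUM, comm and ierr are assumptions here, not taken from the actual source):

      ! before: CALL MPI_ALLREDUCE( ssum, tsum, 1, MPI_COMPLEX, MPI_SUM, comm, ierr )
      ! after, using the constant that mpif_stub.h already defines:
      CALL MPI_ALLREDUCE( ssum, tsum, 1, MPI_DOUBLE_COMPLEX, MPI_SUM, comm, ierr )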

Rich.

@juharu

juharu commented Aug 12, 2023 via email

@richb2k

richb2k commented Aug 12, 2023

The other way to fix this issue would be to add an entry for MPI_COMPLEX into mpif_stub.h.
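
For illustration, a minimal sketch of what such an entry might look like, assuming mpif_stub.h declares the MPI type constants as Fortran integer parameters mirroring the existing MPI_DOUBLE_COMPLEX entry (the actual form and value in the real header may differ; the value below is only a placeholder):

      ! hypothetical addition to fem/src/mpif_stub.h, alongside the existing type constants
      INTEGER, PARAMETER :: MPI_COMPLEX = 23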

@raback

raback commented Aug 12, 2023

Thanx Rich! I just added the suggested fix. Hope it works.

@barracuda156

> Thanx Rich! I just added the suggested fix. Hope it works.

I will test on macOS now.

@barracuda156

@raback @richb2k Looks like we got another instance of the same problem:

/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-79e6d51eb81b6316c87b816190c127fa34439271/fem/src/modules/SaveGridData.F90:1045:53:

 1045 |             CALL MPI_BCAST(WorkChar, 1, MPI_CHARACTER, 0, ELMER_COMM_WORLD, ierr)
      |                                                     1
Error: Symbol 'mpi_character' at (1) has no IMPLICIT type
/opt/local/var/macports/build/_opt_PPCSnowLeopardPorts_science_elmerfem/elmerfem/work/elmerfem-79e6d51eb81b6316c87b816190c127fa34439271/fem/src/modules/SaveGridData.F90:1169:60:

 1169 |             CALL MPI_REDUCE(Array,PArray,nx*ny*nz,MPI_DOUBLE,MPI_MAX,0,ELMER_COMM_WORLD, ierr)
      |                                                            1
Error: Symbol 'mpi_double' at (1) has no IMPLICIT type

@juharu

juharu commented Aug 14, 2023 via email

@barracuda156

@juharu Where should those be changed? Just replace the names across the source?

@juharu

juharu commented Aug 14, 2023 via email
