problem with recent updates on cluster #31165
Issue or question about MOOSE

Dear all,

(Optional) code in question / simulation log / errors

Framework Information:
MOOSE Version: git commit 41140f2b24 on 2025-07-29
LibMesh Version: f0a69d16b43414eba6d1c1bfe8482ccd6bda220b
PETSc Version: 3.23.0
SLEPc Version: 3.23.0
Current Time: Wed Jul 30 14:21:31 2025
Executable Timestamp: Wed Jul 30 14:14:21 2025
....
Time Step 3, time = 70000, dt = 40000
Performing automatic scaling calculation
 0 Nonlinear |R| = 3.440295e-02
 1 Nonlinear |R| = 2.615456e-03
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Assembling ADD_VALUES, but rank 4 requested INSERT_VALUES
[0]PETSC ERROR: WARNING! There are unused option(s) set! Could be the program crashed before usage or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-i value: input.i source: command line
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: PETSc Development Git Revision: v3.23.0-9-g95934b0d393 Git Date: 2025-04-02 04:27:25 +0000
[0]PETSC ERROR: /home/cacace/projects/golem_petsc_3_23_update/golem-opt with 10 MPI process(es) and PETSC_ARCH on node007 by cacace Wed Jul 30 14:21:19 2025
[0]PETSC ERROR: Configure options: --with-64-bit-indices --with-cxx-dialect=C++17 --with-debugging=no --with-fortran-bindings=0 --with-mpi=1 --with-openmp=1 --with-strict-petscerrorcode=1 --with-shared-libraries=1 --with-sowing=0 --download-fblaslapack=1 --download-hpddm=1 --download-hypre=1 --download-metis=1 --download-mumps=1 --download-ptscotch=1 --download-parmetis=1 --download-scalapack=1 --download-slepc=1 --download-strumpack=1 --download-superlu_dist=1 --download-kokkos=1 --download-kokkos-kernels=1 --download-libceed=1 --with-make-np=6 --download-hdf5=1 --with-hdf5-fortran-bindings=0 --download-hdf5-configure-arguments='--with-zlib' --prefix=/home/cacace/moose-compilers/petsc-3.23_update --prefix=/home/cacace/moose-compilers/petsc-3.23_update
[0]PETSC ERROR: #1 MatStashScatterGetMesg_BTS() at /storage/vast-gfz-hpc-01/home/cacace/projects/moose_petsc_3_23_update/petsc/src/mat/utils/matstash.c:955
[0]PETSC ERROR: #2 MatStashScatterGetMesg_Private() at /storage/vast-gfz-hpc-01/home/cacace/projects/moose_petsc_3_23_update/petsc/src/mat/utils/matstash.c:623
[0]PETSC ERROR: #3 MatAssemblyEnd_MPIAIJ() at /storage/vast-gfz-hpc-01/home/cacace/projects/moose_petsc_3_23_update/petsc/src/mat/impls/aij/mpi/mpiaij.c:788
[0]PETSC ERROR: #4 MatAssemblyEnd() at /storage/vast-gfz-hpc-01/home/cacace/projects/moose_petsc_3_23_update/petsc/src/mat/interface/matrix.c:5905
*** ERROR ***
The following error occurred in the Problem 'MOOSE Problem' of type FEProblem.
A libMesh::PetscSolverException was raised during FEProblemBase::computeJacobianTags
Assembling ADD_VALUES, but rank 4 requested INSERT_VALUES
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[warn] Epoll MOD(1) on fd 30 failed. Old events were 6; read change was 0 (none); write change was 2 (del); close change was 0 (none): Bad file descriptor
Replies: 1 comment 1 reply
Hello,
You could install a newer Python using conda and run the tests.
Do they test the MPI they installed?
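For reference, the conda route suggested above looks roughly like this; the channel and package names follow the public MOOSE install instructions at the time of writing, and the paths and job counts are illustrative, so check the current docs for your cluster:

```sh
# Add the INL conda channel and create a fresh environment with the MOOSE toolchain
conda config --add channels https://conda.software.inl.gov/public
conda create -n moose moose-dev
conda activate moose

# Rebuild against the conda toolchain, then run the MOOSE test suite
cd ~/projects/moose/test
make -j 6
./run_tests -j 6
```

Running the test suite from inside the conda environment also exercises the MPI stack that conda installed, which is the point of the question above.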
@GiudGiud my feeling was right: it is indeed something wrong with the Open MPI wrapper on the cluster. I just installed everything via conda and it works smoothly.
That said, I will close the issue for now.
Thanks for the help anyway!
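For anyone who hits the same assembly error, a quick sanity check of which MPI wrapper and runtime are actually being picked up (cluster modules vs. the conda environment) might look like this; Open MPI is assumed here and the commands are only a sketch:

```sh
# Which wrapper and launcher are on PATH, and where do they come from?
which mpicc mpirun

# What does the Open MPI wrapper actually invoke?
mpicc -showme

# Which Open MPI build is the runtime using?
mpirun --version
ompi_info | head
```

If the wrapper resolves to a different installation than the launcher, mixed MPI libraries at run time can produce exactly this kind of inconsistent assembly state across ranks.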