
So you're saying that if you use

[Preconditioning]
  [Direct]
    type = SMP
    full = true
    petsc_options = '-ksp_converged_reason'
    petsc_options_iname = '-pc_type -pc_factor_mat_solver_type -pc_factor_shift_type -mat_mumps_icntl_7 -mat_mumps_icntl_24'
    petsc_options_value = 'lu       mumps                      NONZERO               7                  1'
  []
[]

then you get something like FACTOR_OUTMEMORY when you run in parallel? If that is the case, you might additionally add the option -mat_mumps_icntl_14 300, which increases the MUMPS working-space estimate by 300%. You can look at the MATSOLVERMUMPS page in the PETSc manual for all of the MUMPS options.
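
For reference, the block above with that extra memory option tacked on might look like the sketch below; the value 300 is just the percentage bump mentioned above, so tune it to what your problem actually needs:

[Preconditioning]
  [Direct]
    type = SMP
    full = true
    petsc_options = '-ksp_converged_reason'
    # -mat_mumps_icntl_14 increases the MUMPS working-space estimate by the given percentage
    petsc_options_iname = '-pc_type -pc_factor_mat_solver_type -pc_factor_shift_type -mat_mumps_icntl_7 -mat_mumps_icntl_24 -mat_mumps_icntl_14'
    petsc_options_value = 'lu       mumps                      NONZERO               7                  1                   300'
  []
[]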
