Parallelization via MPI is available in Trixi; however, one would need to implement `calc_mpi_interface_flux!` and `calc_mpi_mortars_flux!` functions for systems with nonconservative terms in the file `src/solvers/dgsem_p4est/dg_2d_parallel.jl`. These routines already exist for the 3D `P4estMesh` in Trixi, so such implementations would be straightforward; one just needs a use case.
Also, when moving to MPI simulations, one may need to adapt the `sqrt` function, similar to the strategy taken in Trixi's `math.jl`, to avoid a deadlock if one process crashes (e.g., due to a negative water height) while the others do not.
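The idea behind that `sqrt` adaptation is to return `NaN` for negative arguments instead of throwing a `DomainError`, so a rank that encounters a negative water height keeps running (producing `NaN`s that an error check can detect) rather than aborting while the other ranks wait in a matched MPI call. A minimal sketch of this pattern, assuming a standalone helper (the name `safe_sqrt` is hypothetical; Trixi itself overloads `sqrt`):

```julia
# Hedged sketch: NaN-returning square root, modeled on the strategy in
# Trixi's math.jl. Returning NaN instead of throwing keeps all MPI ranks
# alive, so collective communication calls stay matched across ranks.
@inline function safe_sqrt(x::Real)
    # For negative input (e.g., a negative water height), return NaN of the
    # appropriate floating-point type instead of raising a DomainError.
    x < zero(x) ? oftype(sqrt(abs(x)), NaN) : sqrt(x)
end
```

A simulation-wide check can then test for `NaN`s (e.g., via an MPI reduction) and terminate all ranks in a coordinated fashion instead of leaving survivors deadlocked.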