PhaseFieldPet is an open-source phase-field simulation software for heterogeneous architectures (CPUs and GPUs). It is built on top of PETSc, and hence a wide variety of numerical solvers is available at run time. PhaseFieldPet simulates multiphase-field models (pfe_mpfl or pfe_mpf) and multi-order-parameter models, also called continuum-field models (pfe_mop).
In its current version (v1.0.0), PhaseFieldPet comes with three gradient energy terms (grad_dot, grad_weighted, grad_interpolated), five potential energy terms (pot_toth, pot_moelans, pot_garcke, pot_nestler, pot_steinbach), and three bulk driving forces (bulk_b0, bulk_b1, bulk_b2). Not all combinations give a physically sound simulation, so the user has to experiment with combinations and reason about why a given one does or does not work. For a brief description of PhaseFieldPet, see the associated paper, Chota et al. (2025). For further details and explanations regarding the gradient energy terms, the potential energy terms, and the phase-field equation as presented in PhaseFieldPet, we encourage readers to see Daubner et al. (2023). For explanations and expressions of the bulk driving forces as used in PhaseFieldPet, see Hoffrogge et al. (2025).
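Schematically (following the notation of Daubner et al. (2023); the exact discrete forms live in the code), all of these models evolve $N$ phase fields $\phi_\alpha$ derived from a free-energy functional that combines the three ingredients above:

$$
\mathcal{F}[\boldsymbol{\phi}] \;=\; \int_\Omega \Big( \epsilon\, a(\boldsymbol{\phi}, \nabla \boldsymbol{\phi}) \;+\; \tfrac{1}{\epsilon}\, w(\boldsymbol{\phi}) \;+\; f_{\mathrm{bulk}}(\boldsymbol{\phi}) \Big)\, \mathrm{d}V,
$$

where $a$ is the gradient energy density (grad_*), $w$ the potential (pot_*), $f_{\mathrm{bulk}}$ the bulk driving contribution (bulk_*), and $\epsilon$ sets the interfacial width. The multi-order-parameter model (pfe_mop) relaxes each $\phi_\alpha$ by an Allen-Cahn-type equation, $\partial_t \phi_\alpha = -L\, \delta \mathcal{F} / \delta \phi_\alpha$, while the multiphase-field variants additionally enforce $\sum_\alpha \phi_\alpha = 1$, for instance via a Lagrange multiplier in pfe_mpfl.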
PhaseFieldPet runs on a wide variety of hardware. It supports:
- Multicore and multinode CPUs via MPI (distributed-memory computing by default).
- GPUs (NVIDIA, AMD) via CUDA, HIP, Kokkos, or OpenCL.
- Multicore CPUs via OpenMP (using third-party package integration with PETSc).
- Pthreads.
One can take the source code PhaseFieldPet.c, compile and run it, and visualize the results (by default in VTK format) using visualization software such as ParaView. The steps to download and install vary from system to system, but general directions are given here.
There are many ways to install PETSc. See PETSc Installation.
On Debian-based distributions such as Ubuntu, for instance:

- sudo apt install petsc-dev

Steps to install from source vary based on what software is available on the machine you are connected to. For instance, on a system where gcc, g++, and gfortran are available, but no MPI implementation or LAPACK, PETSc will download and install them for you as follows:
- git clone -b release https://gitlab.com/petsc/petsc.git petsc
- cd petsc
- ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack
- make all check

If in addition you have a CUDA-capable NVIDIA GPU, add --with-cuda to the ./configure line above. Do similarly for Kokkos, OpenMP, or Pthreads.
If your system has MPI installed (or available, say via module load openmpi on HPC machines), find the directory where it is installed (e.g., with which mpiexec) and, optionally adding optimization flags, configure PETSc as follows:
- ./configure PETSC_ARCH=arch-optimized --with-debugging=0 COPTFLAGS='-O3 -march=native -mtune=native' CXXOPTFLAGS='-O3 -mtune=native' FOPTFLAGS='-O3 -march=native -mtune=native' --download-fblaslapack --with-mpi-dir=/Path/to/your/MPI/Dir

Note that if you installed an MPI implementation together with PETSc, mpiexec and mpirun will be available in the directory $PETSC_DIR/$PETSC_ARCH/bin/. You can export PETSC_DIR and PETSC_ARCH in a startup file (such as ~/.bashrc) and make an alias to mpiexec, or put that directory in your $PATH variable, so that it is available in every shell you open.
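For example, with placeholder paths (adjust them to where you cloned and configured PETSc), you could append the following to ~/.bashrc:

- export PETSC_DIR=$HOME/petsc
- export PETSC_ARCH=arch-optimized
- export PATH=$PETSC_DIR/$PETSC_ARCH/bin:$PATH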
If the makefile (Makefile) and PhaseFieldPet.c are in the same directory, compile with
- make PhaseFieldPet

The general run format (using mpiexec) is
- mpiexec -n <np> ./PhaseFieldPet [options]

where <np> is the number of MPI processes. Unless explicitly overridden with options, PhaseFieldPet simulates the benchmark case introduced by Daubner et al. (2023), a stationary triple junction problem. The default model configuration uses the dot gradient term (grad_dot), the well potential of Toth (pot_toth), and the Lagrange-multiplier-based multiphase-field equation (MPFL). The default time-stepping solver is the adaptive Runge-Kutta implicit-explicit (ARKIMEX) method, in which the stiff part of the differential equation is treated implicitly.
- The boundary condition in the x direction is pinned.
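For instance, assuming the option names listed above map one-to-one to command-line flags, the default run should be equivalent to spelling the configuration out explicitly:

- mpiexec -n 4 ./PhaseFieldPet -grad_dot -pot_toth -pfe_mpfl -ts_type arkimex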
Add the option -ts_monitor to print information about each time step to the console:
- mpiexec -n 4 ./PhaseFieldPet -ts_monitor

Restrict the solution at each grid point to the Gibbs simplex (as needed, e.g., for obstacle potentials):
- mpiexec -n 4 ./PhaseFieldPet -simplex

Use the obstacle potential due to Steinbach:
- mpiexec -n 4 ./PhaseFieldPet -pot_steinbach -simplex

Save output results approximately every 10 seconds (the default is 100 seconds):
- mpiexec -n 4 ./PhaseFieldPet -pot_steinbach -simplex -twrite 10

Use the multiphase-field model (MPF):
- mpiexec -n 4 ./PhaseFieldPet -pot_steinbach -pfe_mpf -simplex -ts_monitor

Change the underlying nonlinear (Newton) solver to a single iteration:
- mpiexec -n 4 ./PhaseFieldPet -snes_type ksponly

Use the multi-order-parameter model (MOP):
- mpiexec -n 4 ./PhaseFieldPet -grad_interpolated -pot_moelans -pfe_mop -snes_type ksponly -ts_monitor

Use a matrix-free nonlinear solver:
- mpiexec -n 4 ./PhaseFieldPet -snes_mf

Use the fully implicit, adaptive backward differentiation formula (BDF) time stepper:
- mpiexec -n 4 ./PhaseFieldPet -ts_type bdf

Increase the grid to 256 x 256 points (with the 3 phase fields at each point):
- mpiexec -n 80 ./PhaseFieldPet -simplex -ts_type bdf -da_grid_x 256 -da_grid_y 256

The example simulation result given in the associated paper is obtained with this execution. See the whole time-evolution animation in the Results folder. To run the same case on an NVIDIA GPU (with a PETSc build configured --with-cuda), switch the matrix and vector types to their CUDA back ends:
- mpiexec -n 1 ./PhaseFieldPet -simplex -ts_type bdf -da_grid_x 256 -da_grid_y 256 -dm_mat_type aijcusparse -dm_vec_type cuda

- The boundary condition in the x direction is set to homogeneous Neumann:
- mpiexec -n 4 ./PhaseFieldPet -bcx_neumann -snes_type ksponly -ts_monitor

All other options experimented with for the static triple junction can also be used here. Add a bulk driving force (here the bulk_b2 variant):
- mpiexec -n 4 ./PhaseFieldPet -bulk_b2

All other options experimented with for the static triple junction can also be used here as well. Note that, keeping the gradient and potential energy contributions fixed, the bulk term is what drives various other applications (thermal, chemical, mechanical, ...).
Strong scalability of PhaseFieldPet is achieved for simulations of static triple junctions, for the example simulation result in the paper. See the image below for a log-log plot of execution time versus the number of MPI processes. The result agrees very well with the ideal expectation: under perfect strong scaling the execution time is proportional to 1/N for N processes, so the log-log plot is a straight line with slope -1.

This result was obtained on the Meggie cluster at NHR@FAU by running up to 80 MPI processes on four compute nodes, where each node has two Intel Xeon E5-2630v4 “Broadwell” chips (10 cores per chip) running at 2.2 GHz, with 25 MB shared cache per chip and 64 GB of RAM.
If you have your own energy expressions (gradient, potential, or bulk driving term), you can add them to PhaseFieldPet easily by including the respective case clause among the stiff terms in IRHSLocal() and/or the non-stiff terms in RHSLocal() in PhaseFieldPet.c, as sketched below. You can also change the InitialMicrostructure() function in PhaseFieldPet.c to suit simulations other than the example triple junction application described here, or to read initial phase-field data produced by other software or obtained from experiments.
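As an illustration only, the sketch below shows what such a case clause might look like for a hypothetical new potential named pot_mywell. The enum value, function name, and data layout here are assumptions made for this sketch; mirror the existing pot_* case clauses in PhaseFieldPet.c when adding your own.

```c
#include <petsc.h>

/* Hypothetical sketch: wiring a new potential term into the stiff
 * right-hand side evaluated in IRHSLocal(). All identifiers below
 * (PotentialType, POT_MYWELL, AddPotentialTerm) are illustrative,
 * not the actual names used in PhaseFieldPet.c. */
typedef enum {POT_TOTH, POT_MOELANS, POT_MYWELL /* your addition */} PotentialType;

static void AddPotentialTerm(PotentialType pot, PetscInt N, PetscReal eps,
                             const PetscReal *phi, PetscReal *rhs)
{
  switch (pot) {
  case POT_MYWELL: /* hypothetical multi-well potential */
    for (PetscInt a = 0; a < N; ++a) {
      /* subtract dw/dphi_a for w = (9/eps) * sum_a phi_a^2 (1 - phi_a)^2 */
      rhs[a] -= (18.0 / eps) * phi[a] * (1.0 - phi[a]) * (1.0 - 2.0 * phi[a]);
    }
    break;
  default: /* the shipped potentials keep their existing case clauses */
    break;
  }
}
```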
- Include your own gradient and/or potential terms and solve the triple junction problem. Contact us if you need help.
PhaseFieldPet is distributed under a 2-clause BSD license (as is PETSc):
" Copyright (c) 1991-2025, UChicago Argonne, LLC and the PETSc Developers and Contributors All rights reserved. PETSc is distributed under a 2-clause BSD license which allows for free use, modification, and distribution of the software. For full details, see the PETSc license. "