Parallel simulations using MPI.jl#141
marinlauber wants to merge 45 commits into WaterLily-jl:master
Conversation
OUTDATED COMMENT @weymouth @b-fg @TzuYaoHuang My initial idea is to incorporate this into the main solver as an extension and use the custom `MPIArray` type; the flow will be constructed using this type.
```diff
  end
- function vtkWriter(fname="WaterLily";attrib=default_attrib(),dir="vtk_data",T=Float32)
+ function vtkWriter(fname="WaterLily";attrib=default_attrib(),dir="vtk_data",T=Float32,extents=[(1:1,1:1)])
  !isdir(dir) && mkdir(dir)
```
This can actually be an issue if all ranks try to create the dir.
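A common fix would be to let only the root rank create the directory and make the other ranks wait at a barrier before writing. A minimal sketch, where `make_dir_once` is a hypothetical helper and `rank`/`barrier` stand in for `MPI.Comm_rank(comm)` and `() -> MPI.Barrier(comm)` so the logic also runs without MPI:

```julia
# Sketch only (not the PR's actual code): rank 0 creates the directory,
# every other rank waits at the barrier until it exists.
function make_dir_once(dir, rank, barrier)
    rank == 0 && !isdir(dir) && mkdir(dir)
    barrier()   # no rank proceeds until the directory exists
    return dir
end
```

Under MPI this would be called as `make_dir_once(dir, MPI.Comm_rank(comm), () -> MPI.Barrier(comm))`.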
This pull request is a work in progress. I am opening it now to discuss how we can add parallel capabilities to `WaterLily.jl` efficiently, keeping in mind that ultimately we want to be able to run multi-CPU/GPU simulations. I have run most of the `examples/TwoD_*` files with the double ghost cells again, and they work in serial. This pull request changes/adds many files, and I will briefly describe the changes made.
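For context on the double ghost cells: the interior of every array is offset by two cells on each side instead of one. A sketch of what the adjusted index helper could look like (the actual `inside` in `src/util.jl` may differ):

```julia
# Hypothetical sketch: interior indices of an array with a double ghost layer,
# i.e. two buffer cells on every side (the PR's implementation may differ).
inside_double(a::AbstractArray) =
    CartesianIndices(map(ax -> first(ax)+2:last(ax)-2, axes(a)))
```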
**Changed files**

- `src/WaterLily.jl`: enables passing the type of the Poisson solver to the simulation (mainly to simplify my testing); preliminary MPI extension (not used, to be discussed).
- `src/Flow.jl`: implements double ghost cells and removes the special QUICK/CD scheme on the boundaries, as these are no longer needed.
- `src/MultiLevelPoisson.jl`: implements downsampling for double ghost arrays and changes all utility functions accordingly. Explicitly defines the dot product functions so they can be overloaded with the MPI functions later on. Also changes the `solver!` function, as the `PoissonMPI.jl` test did not converge properly with the `Linfty` criteria.
- `src/Poisson.jl`: adds a `perBC!` call in Jacobi (not needed, I think) and adjusts the solver.
- `src/util.jl`: adjusts all the `inside` functions and `loc` to account for the double ghost cells, and adjusts the `BC!`, `perBC!` and `exitBC!` functions accordingly. This also introduces a custom array type `MPIArray` that allocates send and receive buffers to avoid allocating them at every `mpi_swap` call; the new array type allows type dispatch within the extension.
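The buffered array idea can be sketched as follows; the field names, the constructor, and the buffer sizing are assumptions for illustration, not the PR's actual `MPIArray`:

```julia
# Hypothetical sketch of an array type carrying preallocated halo buffers,
# so mpi_swap-style exchanges need not allocate (names are assumptions).
struct MPIArray{T,N} <: AbstractArray{T,N}
    A::Array{T,N}             # the data, including double ghost cells
    send::Vector{Vector{T}}   # one send buffer per face
    recv::Vector{Vector{T}}   # one receive buffer per face
end
function MPIArray{T}(dims::Vararg{Int,N}) where {T,N}
    nface = (prod(dims) ÷ minimum(dims)) * 2   # generous per-face upper bound
    MPIArray{T,N}(zeros(T, dims),
                  [Vector{T}(undef, nface) for _ in 1:2N],
                  [Vector{T}(undef, nface) for _ in 1:2N])
end
Base.size(a::MPIArray) = size(a.A)
Base.getindex(a::MPIArray, i...) = a.A[i...]
Base.setindex!(a::MPIArray, v, i...) = (a.A[i...] = v)
```

Because `MPIArray <: AbstractArray`, the extension can dispatch on it while serial code treats it as a plain array.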
**New files**

- `examples/TwoD_CircleMPI.jl`: script to simulate the flow around a 2D circle using MPI.
- `examples/TwoD_CircleMPIArray.jl`: the classical flow around a 2D circle, but using the custom `MPIArray` type to demonstrate that the flow solver works fine in serial (if MPI is not loaded).
- `ext/WaterLilyMPIExt.jl`: MPI extension that uses type dispatch to define new methods for the WaterLily functions that are now parallel.
- `test/test_mpi.jl`: initial MPI test; should be changed to use the extension instead.
- `WaterLilyMPI.jl`: contains all the function overloads needed to perform parallel WaterLily simulations. Defines an `MPIGrid` type that stores information about the decomposition (`global` for now) and the `mpi_swap` function that performs the message passing, together with some MPI utils.
- `MPIArray.jl`: a custom array type that also allocates send and receive buffers to avoid allocating them at every `mpi_swap` call. This is an idea for the final implementation and has not been tested yet.
- `FlowSolverMPI.jl`: tests for some critical parts of the flow solver, from `sdf` measures to `sim_step`. Use with `vis_mpiwaterlily.jl` to plot the results on the different ranks.
- `PoisonMPI.jl`: parallel Poisson solver test on an analytical solution. Use with `vis_mpiwaterlily.jl` to plot the results on the different ranks.
- `diffusion_2D_mpi.jl`: initial test of the MPI functions, deprecated.
- `vis_diffusion.jl`: used to see the results of `diffusion_2D_mpi.jl`, deprecated.
- `test/poisson.jl`: a simple Poisson test, will be removed.
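The halo exchange behind `mpi_swap` boils down to packing the owned slabs next to a ghost layer, exchanging them with the neighbouring rank, and unpacking into the ghosts. A sketch of one face of that pattern, where the injected `exchange!` stands in for `MPI.Sendrecv!` with the neighbour ranks so the packing logic runs without MPI (names are assumptions, not the PR's code):

```julia
# Hedged sketch of one face of a double-ghost halo swap. Under MPI,
# `exchange!` would wrap MPI.Sendrecv! with the neighbouring ranks.
function swap_lower_halo!(A::AbstractMatrix, exchange!::Function)
    send = A[3:4, :]          # the two owned slabs next to the lower ghost layer
    recv = similar(send)
    exchange!(recv, send)     # message passing happens here
    A[1:2, :] .= recv         # unpack into the double ghost cells
    return A
end
```

In serial testing, passing `(r, s) -> r .= s` turns the exchange into a copy; under MPI the same packing/unpacking applies unchanged.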
**Things that remain to do**

- `AllReduce` in `Poisson.residuals!`
- `@views()` for the send-receive buffers. This could be avoided if we allocate the send and receive buffers together with the arrays, using something similar to what is in the file `MPIArray.jl`.
- `VTK` extension to enable the writing of parallel files.

Some of the results from `FlowSolverMPI.jl`:

basic rank and sdf check

zeroth kernel moment vector with and without halos

full

`sim_step` check