Very basic question about MFIter #4779
This is almost silly, but I was going through the guided tutorials for the Heat Equation (explicit parabolic solver):

```cpp
for (int step = 1; step <= nsteps; ++step)
{
    // fill periodic ghost cells
    // Ok cool so this is how we do periodic boundary conditions in AMReX.
    phi_old.FillBoundary(geom.periodicity());

    // new_phi = old_phi + dt * Laplacian(old_phi)
    // loop over boxes
    // One question I have is:
    // We are iterating over the phi_old field here, but how do we ensure that
    // phi_new and phi_old correspond to the same box/grid?
    // i.e. how do we ensure that all the variables are in perfect sync?
    for ( amrex::MFIter mfi(phi_old); mfi.isValid(); ++mfi )
    {
        const amrex::Box& bx = mfi.validbox();

        // grab the Array4's for old and new phi
        const amrex::Array4<amrex::Real>& phiOld = phi_old.array(mfi);
        const amrex::Array4<amrex::Real>& phiNew = phi_new.array(mfi);

        // advance the data by dt
        amrex::ParallelFor(bx, [=] AMREX_GPU_DEVICE (int i, int j, int k)
        {
            // **********************************
            // EVOLVE VALUES FOR EACH CELL
            // **********************************
            // I find division scary, so I always recommend storing dxi2 = 1/(dx[i]*dx[i]) etc. first,
            // but that's just me lol - Sahaj Jain
            // Alternatively, one could precompute coeff = dt/(dx*dx) outside the loop too.
            // And store the operators A_east, A_west, A_north, A_south, A_top, A_bottom etc. in a MultiFab!
            // This would mean marching as Phi_current = A_East*Phi_East_old + A_West*Phi_West_old + ...
            //                                         - A_current*Phi_current_old
            phiNew(i,j,k) = phiOld(i,j,k) + dt *
                ( (phiOld(i+1,j,k) - 2.*phiOld(i,j,k) + phiOld(i-1,j,k)) / (dx[0]*dx[0])
                 +(phiOld(i,j+1,k) - 2.*phiOld(i,j,k) + phiOld(i,j-1,k)) / (dx[1]*dx[1])
                 +(phiOld(i,j,k+1) - 2.*phiOld(i,j,k) + phiOld(i,j,k-1)) / (dx[2]*dx[2])
                );
        });
    }
}
```

Here we have two fields (MultiFabs), corresponding to phi_old and phi_new.
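As an aside on the coefficient comment inside the kernel above: here is a minimal sketch of hoisting the 1/(dx*dx) factors out of the loop, assuming the same phi_old, phi_new, geom, and dt as in the snippet (the names dxi0/dxi1/dxi2 are made up for illustration). Geometry::InvCellSize(d) returns 1/dx[d], so the kernel multiplies instead of divides:

```cpp
// Precompute the inverse squared cell sizes once, outside the box loop.
const amrex::Real dxi0 = geom.InvCellSize(0) * geom.InvCellSize(0);
const amrex::Real dxi1 = geom.InvCellSize(1) * geom.InvCellSize(1);
const amrex::Real dxi2 = geom.InvCellSize(2) * geom.InvCellSize(2);

for ( amrex::MFIter mfi(phi_old); mfi.isValid(); ++mfi )
{
    const amrex::Box& bx = mfi.validbox();
    const amrex::Array4<amrex::Real>& phiOld = phi_old.array(mfi);
    const amrex::Array4<amrex::Real>& phiNew = phi_new.array(mfi);

    amrex::ParallelFor(bx, [=] AMREX_GPU_DEVICE (int i, int j, int k)
    {
        // Same 7-point Laplacian as above, now using the precomputed
        // inverse squared spacings captured by value into the lambda.
        phiNew(i,j,k) = phiOld(i,j,k) + dt *
            ( (phiOld(i+1,j,k) - 2.*phiOld(i,j,k) + phiOld(i-1,j,k)) * dxi0
             +(phiOld(i,j+1,k) - 2.*phiOld(i,j,k) + phiOld(i,j-1,k)) * dxi1
             +(phiOld(i,j,k+1) - 2.*phiOld(i,j,k) + phiOld(i,j,k-1)) * dxi2
            );
    });
}
```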
Replies: 1 comment
@SahajSJain -- because you defined both MultiFabs using the same BoxArray and DistributionMapping, you know that they have the same boxes, and that for each MultiFab, box "i" is on the same MPI rank. The way you know you are accessing the same box at the same time is these lines:

```cpp
const amrex::Array4<amrex::Real>& phiOld = phi_old.array(mfi);
const amrex::Array4<amrex::Real>& phiNew = phi_new.array(mfi);
```

which tell you that you are accessing the data in phiOld and phiNew that is stored in box "i", where "i" = mfi.index(). Writing phi_old.array(mfi) is the same as saying "I want a pointer to the data of phi_old that is stored in box mfi.index()".
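To make that concrete, here is a minimal sketch of the setup the answer assumes: both MultiFabs are built from one BoxArray and one DistributionMapping, so mfi.index() refers to the same box in each (the domain size, max box size, and variable names are illustrative):

```cpp
#include <AMReX_MultiFab.H>

// One BoxArray and one DistributionMapping shared by both fields:
amrex::Box domain(amrex::IntVect(0), amrex::IntVect(63));
amrex::BoxArray ba(domain);
ba.maxSize(32);                        // chop the domain into boxes
amrex::DistributionMapping dm(ba);     // assign each box to an MPI rank

int ncomp = 1, nghost = 1;
amrex::MultiFab phi_old(ba, dm, ncomp, nghost);
amrex::MultiFab phi_new(ba, dm, ncomp, nghost);

// Because ba and dm are shared, box mfi.index() in phi_old and box
// mfi.index() in phi_new are the same box on the same rank:
for (amrex::MFIter mfi(phi_old); mfi.isValid(); ++mfi)
{
    int i = mfi.index();               // global index of the current box
    amrex::ignore_unused(i);
    // phi_old.array(mfi) and phi_new.array(mfi) both view box "i".
}
```

This is also why the tutorial can end each time step by exchanging the two fields (e.g. with std::swap(phi_old, phi_new)): both keep the same BoxArray and DistributionMapping, so the box-to-rank correspondence is never broken.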