Adding material depletion function #3420
Conversation
Correct me if I'm wrong, but isn't this what IndependentOperator does?
It does wrap and build upon that functionality: it uses MicroXS.from_multigroup_flux, deplete.IndependentOperator, and PredictorIntegrator. Calling those functions directly in a Python script is inefficient, however, because the cross sections are reloaded for every change in material, flux, or timesteps. This PR brings them together into an openmc.lib call so that multiple materials can be irradiated while loading the cross sections only once, which makes it fast like the other inventory codes out there.
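The amortization described above can be sketched with a generic caching pattern. This is plain Python with made-up data; load_cross_sections and collapse are hypothetical stand-ins for illustration, not the actual OpenMC internals:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def load_cross_sections(chain_file):
    """Hypothetical stand-in for the expensive one-time load of cross
    section data (the dominant cost in a pure-Python loop)."""
    print(f"loading {chain_file}")  # happens once per chain file
    return {"Ni58": [1.0, 2.0], "Ni60": [0.5, 1.5]}

def collapse(chain_file, material):
    """Per-material work reuses the cached data instead of re-reading it."""
    xs = load_cross_sections(chain_file)
    return sum(xs[material])

# Three material evaluations, but the cross sections are loaded only once:
results = [collapse("chain_ni.xml", m) for m in ["Ni58", "Ni60", "Ni58"]]
print(results)  # [3.0, 2.0, 3.0]
```

The same idea applies whether the caching lives in a decorator or, as in this PR, in a single openmc.lib session shared across materials.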
This function has been used to produce these comparison results between CoNDERC experimental data and FISPACT simulations: https://shimwell.github.io/openmc_activator
Looks great! Thanks @shimwell. Would it be worth having the option for the user to store the intermediate data like the collapsed cross section?
Taking this PR back to the drawing board; it looks like there are some improvements to be made. I agree @jbae11, we should just return a list of MicroXS instances. I also noticed that I'm currently doing the collapse for all nuclides and not just the ones in the material. I shall tidy up, retest on the @jbae11 openmc-activator repo, and modify this PR soon.
To mimic openmc_activator, FISPACT, and other inventory codes, the usage of this function is now like this. Feedback welcome.

```python
import openmc
from pathlib import Path
import numpy as np
from openmc.deplete import get_microxs_from_multigroup
from openmc.mgxs import GROUP_STRUCTURES

mat_ni58 = openmc.Material()
mat_ni58.add_nuclide("Ni58", 0.95)
mat_ni58.set_density("g/cm3", 7.87)
mat_ni58.depletable = True
mat_ni58.temperature = 293.6

mat_ni60 = openmc.Material()
mat_ni60.add_nuclide("Ni60", 1.0)
mat_ni60.set_density("g/cm3", 11.34)
mat_ni60.depletable = True
mat_ni60.temperature = 293.6

chain_file_path = Path(__file__).parents[1] / "chain_ni.xml"
multigroup_flux = np.array([0.5e11] * 1102)  # gets normalized in function

all_micro_xs = get_microxs_from_multigroup(  # <----- this PR adds this function here
    materials=openmc.Materials([mat_ni58, mat_ni60]),
    multigroup_fluxes=[multigroup_flux, multigroup_flux],
    energy_group_structures=[
        GROUP_STRUCTURES["UKAEA-1102"],
        "UKAEA-1102",
    ],
    chain_file=chain_file_path,
    reactions=None,
)

for micro_xs, material in zip(all_micro_xs, [mat_ni58, mat_ni60]):
    print('next material')
    material.volume = 1
    operator = openmc.deplete.IndependentOperator(
        materials=openmc.Materials([material]),
        fluxes=[material.volume],
        micros=[micro_xs],
        normalization_mode="source-rate",
        reduce_chain_level=5,
        chain_file=chain_file_path,
    )
    integrator = openmc.deplete.PredictorIntegrator(
        operator=operator,
        timesteps=[1., 10., 10.],
        source_rates=[1e19, 0, 0],
        timestep_units='s',
    )
    integrator.integrate(path="depletion_results.h5")
    results = openmc.deplete.ResultsList.from_hdf5(
        filename="depletion_results.h5"
    )
    for result in results:
        mat = result.get_material(str(material.id))
        print(mat)
```

Updated to use the UKAEA-1102 group structure.
MicahGale left a comment
This is an interesting capability. It seems like you are adding utility functions to make 0-D transmutation simpler, à la COUPLE-ORIGEN. Is that right?
I dug into how the multi-group XS collapse is happening, and I see you are using reaction::collapse_rate. From reading the code, it doesn't seem to assume any a-priori flux shape. I'm not sure this would be appropriate for relatively sparse energy-group structures (< 1000 groups) like the ones it seems you are trying to use. Have you looked into whether this is an appropriate collapse treatment?
Yes, just like ALARA, FISPACT, ORIGEN, ACAB, etc.
Yes, I think you are right: more groups is better. I have updated the example to use the UKAEA-1102 group structure. I have kept the test to a low number of groups to help keep it quick. We have been able to produce results that are close to FISPACT and experimental results using 709 groups, so I think the collapse is reasonable.
I think this kind of functionality is great, allowing for simple, straightforward activation.
paulromano left a comment
@shimwell Thanks for this PR. I really like the concept, but I do see that get_microxs_from_multigroup duplicates quite a bit of code from the from_multigroup_flux classmethod. I have an idea for how to eliminate this duplication with a new context manager, but it might require expanding the scope of this PR, so I think it's probably better left as a separate PR. Are you OK with me submitting a new PR separately, and then we can come back to this PR and refactor?
Yes please, I would be keen to make use of a new context manager. I shall look out for your PR and refactor this branch after your context manager gets merged in. Many thanks.
I have now made use of the temporary session context manager. This also allowed me to remove a lot of code introduced earlier in the PR.
paulromano left a comment
@shimwell Thanks for this PR and sorry for the delay on it. I've made some updates on your branch. Namely, IndependentOperator can work with multiple materials (and better expose parallelism), so rather than creating a separate operator for each one, now we pass all materials to IndependentOperator with a list of micros and fluxes. One notable change with that is that I removed the nuclides=material.nuclides() argument; the reason being that if you only compute cross sections for the nuclides in the material, you can't predict reaction rates in any other nuclide (hence any multi-step reactions in depletion won't happen).
If you're good with these changes, I think we can go ahead and merge.
Thanks Paul, that is some really nice tidying up. I particularly appreciate the speed increase and the nuclide change. Yes, very happy to merge. (Auto-merge now enabled)
Description
This PR adds a function that can be used to deplete multiple materials, each with its own neutron flux and timesteps.
Minimal example
@jbae11 and I have been working on this for a while, and I think we have finally brought everything together in the last few weeks.
We have been using the IAEA CoNDERC benchmarks, which contain FNS experimental data (decay heat) together with FISPACT simulation inputs and outputs for the same irradiations. This benchmark suite is part of the reason the function API is so flexible, as it covers irradiation of different materials with different irradiation schedules, source rates, and timesteps.
This needed a few additions to PyPact to get the FISPACT inputs parsed nicely.
@jbae11 and I have been working on this since Dec 2023, when we started trying to use the deplete functions: fusion-energy/neutronics-workshop#257
It was only recently that @jbae11 was able to iron out the main differences between FISPACT and OpenMC by making a customized chain file, which can be found in the openmc-activator repo.
I've worked through openmc-activator, converting functions to use PyPact, and recently added the same sort of function as I'm adding in this PR. @jbae11, the main differences between this one and the one I added to your repo are that this one is more generic, as it returns depleted materials and does a lot of input checking. The one I made for the openmc-activator repo was super useful for testing and produced these comparison plots, but I think this one is better placed in OpenMC, as we discussed. If it is merged, we could make use of it in the openmc-activator repo to continue developing that repo into the main OpenMC deplete verification and validation repo 🚀.
Checklist