
feat: Initialize FitRecipe with a results file or object #166

Open

cadenmyers13 wants to merge 13 commits into diffpy:v3.3.0 from cadenmyers13:init-w-results

Conversation

@cadenmyers13 (Contributor)

No description provided.

@cadenmyers13 (Contributor Author)

@sbillinge ready for review

if hasattr(results, "print_results"):
    params_dict = utils.get_dict_from_results_object(results)
elif isinstance(results, (str, Path)):
    params_dict = utils.get_dict_from_results_file(results)
@cadenmyers13 (Contributor Author):
Handles either a results object or a path to a results file.
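The duck-typing dispatch above can be sketched in isolation. Note that `load_params`, `FakeResults`, and the returned dicts below are hypothetical stand-ins for illustration, not the actual diffpy helpers:

```python
from pathlib import Path


def load_params(results):
    # A FitResults-like object is detected by duck typing on print_results();
    # strings and Path objects are treated as paths to a results file.
    if hasattr(results, "print_results"):
        return {"source": "object"}
    elif isinstance(results, (str, Path)):
        return {"source": "file", "path": str(results)}
    raise TypeError(f"unsupported results type: {type(results).__name__}")


class FakeResults:
    def print_results(self):
        pass


print(load_params(FakeResults())["source"])  # object
print(load_params("fit.res")["source"])      # file
```

One advantage of the `hasattr` check over `isinstance(results, FitResults)` is that any object exposing the expected method works, including test doubles.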

set_parameters_dict = {
    param.name: param.getValue()
    for param in self._parameters.values()
}
self._pretty_print_results_dict(set_parameters_dict)
@cadenmyers13 (Contributor Author):
Prints the parameters found in the results and the parameters set.
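A minimal sketch of what such a pretty-printer might look like. `format_results_dict` and the fixed-width, 5-decimal layout are illustrative assumptions, not the actual diffpy implementation:

```python
def format_results_dict(params):
    # Hypothetical helper: left-align names in a fixed-width column,
    # print values with 5 decimal places, sorted by parameter name.
    width = max(len(name) for name in params)
    return [
        f"{name:<{width}}  {value:.5f}"
        for name, value in sorted(params.items())
    ]


for line in format_results_dict({"amplitude": 1.0, "wave_number": 2.99987}):
    print(line)
```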

    recipe.add_variable(contribution.wave_number, 3)
    recipe.add_variable(contribution.phase_shift, 2)
    return recipe

@cadenmyers13 (Contributor Author):
I had to wrap this in a second function because I realized that calling the same fixture twice after the first one was refined led to the initial values in the second call being the values of the previously refined recipe.

@sbillinge (Contributor):
I think you can simply change the scope of the fixture to remove this behavior.

@cadenmyers13 (Contributor Author):
@sbillinge Unfortunately that doesn't work here. It was already set to scope="function", which is the lowest level, so to speak.

@sbillinge (Contributor):
OK, but I am confused. In this case it would reset every time through a pytest.mark.parametrize. Are we not initializing something correctly? I am only banging on about this because it may suggest something is wrong with our tests, which would not be good.

@cadenmyers13 (Contributor Author), Feb 27, 2026:

@sbillinge In the current version, the test fixture is not being called twice when you do:

recipe1 = build_recipe_one_contribution
recipe2 = build_recipe_one_contribution

What is happening here is that the same fixture value is assigned to two variable names, so recipe1 == recipe2 would return True even if one was refined and the other wasn't.

When we wrap it in another function, as in the incoming version, a fresh recipe object can be created for each call.
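This aliasing follows from pytest caching each fixture value once per test: within one test, every reference to a function-scoped fixture is bound to the same cached object. A plain-Python illustration of the two behaviors (no pytest required; `make_recipe` is a stand-in for the real recipe construction):

```python
def make_recipe():
    # Stand-in for building a fresh FitRecipe.
    return {"wave_number": 3, "phase_shift": 2}


# What a function-scoped fixture does within ONE test: the value is
# created once and cached, so both names alias the same object.
cached = make_recipe()
recipe1 = cached
recipe2 = cached
assert recipe1 is recipe2        # "refining" one mutates the other

# Wrapping creation in a function yields independent objects instead.
recipe1 = make_recipe()
recipe2 = make_recipe()
assert recipe1 == recipe2 and recipe1 is not recipe2
```

Fixture scope controls caching *across* tests; it cannot produce two distinct objects from one fixture *inside* a single test, which is why lowering the scope did not help here.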

@sbillinge (Contributor):
This may be indicating a weakness in either the test or the code itself. Is it fixed if you instantiate recipe1 and recipe2 both at the top of the testing function?

@cadenmyers13 (Contributor Author), Feb 27, 2026:
@sbillinge I tested this without the conftest fixture and built the recipes manually, like so:

def test_initialize_recipe_from_results_object():
    # Case: User initializes a FitRecipe from a FitResults object
    # expected: recipe is initialized with variables from previous fit
    profile1 = Profile()
    x = linspace(0, pi, 10)
    y = sin(x)
    profile1.set_observed_profile(x, y)
    contribution1 = FitContribution("c1")
    contribution1.set_profile(profile1)
    contribution1.set_equation("amplitude*sin(wave_number*x + phase_shift)")
    recipe1 = FitRecipe()
    recipe1.add_contribution(contribution1)
    recipe1.add_variable(contribution1.amplitude, 4)
    recipe1.add_variable(contribution1.wave_number, 3)
    recipe1.add_variable(contribution1.phase_shift, 2)
    optimize_recipe(recipe1)
    results1 = FitResults(recipe1)
    expected_values = np.round(results1.varvals, 5)
    expected_names = results1.varnames

    profile2 = Profile()
    x = linspace(0, pi, 10)
    y = sin(x)
    profile2.set_observed_profile(x, y)
    contribution2 = FitContribution("c2")
    contribution2.set_profile(profile2)
    contribution2.set_equation("amplitude*sin(wave_number*x + phase_shift)")
    recipe2 = FitRecipe()
    recipe2.add_contribution(contribution2)
    recipe2.add_variable(contribution2.amplitude, 4)
    recipe2.add_variable(contribution2.wave_number, 3)
    recipe2.add_variable(contribution2.phase_shift, 2)
    recipe2.create_new_variable(
        "extra_var", 5
    )  # should be included in the initialized recipe
    actual_values_before_init = [val for val in recipe2.get_values()]
    actual_names_before_init = recipe2.get_names()
    expected_names_before_init = [
        "amplitude",
        "extra_var",
        "phase_shift",
        "wave_number",
    ]
    expected_values_before_init = [
        4,
        3,
        2,
        5,
    ]  # the three variables + the extra_var

    assert actual_values_before_init == expected_values_before_init
    assert sorted(actual_names_before_init) == sorted(
        expected_names_before_init
    )

    recipe2.initialize_recipe_with_results(results1)
    optimize_recipe(recipe2)
    results2 = FitResults(recipe2)
    actual_values = np.round(results2.varvals, 5)
    actual_names = results2.varnames

    expected_names = expected_names + [
        "extra_var"
    ]  # add the new variable name to expected names
    expected_values = list(expected_values) + [
        5
    ]  # add the value of the new variable to expected values
    assert sorted(expected_names) == sorted(actual_names)
    assert sorted(expected_values) == sorted(list(actual_values))

Doing this passes the test, meaning it's a fixture-related thing and not a code-related thing. What we could do is build the recipes like this for this specific test (and I think one other) and revert the conftest fixture back to the original.
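For what it's worth, one standard way to get fresh objects per call while keeping the setup in conftest is pytest's "factory as fixture" idiom: the fixture returns a builder function instead of the built object. Sketched below without the decorator so it runs standalone; in conftest.py the outer function would carry @pytest.fixture, and the recipe dict is a stand-in for the real FitRecipe construction:

```python
def build_recipe_factory():
    # In conftest.py this outer function would be decorated with
    # @pytest.fixture; tests request it and call the inner builder.
    def _build():
        # Stand-in for the real recipe construction.
        return {"amplitude": 4, "wave_number": 3, "phase_shift": 2}
    return _build


build_recipe = build_recipe_factory()
recipe1 = build_recipe()
recipe2 = build_recipe()

recipe1["amplitude"] = 3.99999    # "refine" the first recipe
assert recipe2["amplitude"] == 4  # the second is unaffected
```

pytest still caches the fixture value, but the cached value is the factory, so each call inside the test produces an independent recipe.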

@cadenmyers13 (Contributor Author):
> Is it fixed if you instantiate recipe1 and recipe2 both at the top of the testing function?

@sbillinge No, this doesn't fix it.

@sbillinge (Contributor) left a review:
This looks very good. Please see a couple of comments.

    The recipe from which the results were generated.

cov : numpy.ndarray or None
    Covariance matrix of the refined variables. None if unavailable.
@sbillinge (Contributor):
We seem to have lost a bunch of "The"s.

@cadenmyers13 (Contributor Author)

@sbillinge ready for review


codecov bot commented Feb 27, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 72.76%. Comparing base (cee67de) to head (73704f4).
⚠️ Report is 1 commit behind head on v3.3.0.

Additional details and impacted files
@@            Coverage Diff             @@
##           v3.3.0     #166      +/-   ##
==========================================
+ Coverage   72.36%   72.76%   +0.39%     
==========================================
  Files          25       25              
  Lines        3832     3888      +56     
==========================================
+ Hits         2773     2829      +56     
  Misses       1059     1059              
Files with missing lines Coverage Δ
tests/conftest.py 92.00% <100.00%> (+0.10%) ⬆️
tests/test_fitrecipe.py 99.85% <100.00%> (+0.01%) ⬆️
tests/test_fitresults.py 99.35% <100.00%> (+<0.01%) ⬆️

@cadenmyers13 (Contributor Author)

@sbillinge ready for review

@sbillinge (Contributor) left a review:
One comment. I am still a bit concerned about our testing workaround; I worry it is indicating a weakness in either our code or our test. I commented on that.

Also, I am not sure whether my other comment to always use an odd number for npoints in linspace actually went through, so let me say it here.


def _build_recipe():
    profile = Profile()
    x = linspace(0, pi, 10)
@sbillinge (Contributor):
A good habit is to use odd numbers in linspace.
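The rationale, as I understand it: with an odd number of points on [0, pi], the spacing divides the interval into an even number of steps, so round values such as the midpoint pi/2 land exactly on a sample. A quick numpy check:

```python
import numpy as np

x_odd = np.linspace(0, np.pi, 11)   # step pi/10: samples include pi/2
x_even = np.linspace(0, np.pi, 10)  # step pi/9: midpoint falls between samples

print(np.isclose(x_odd, np.pi / 2).any())   # True
print(np.isclose(x_even, np.pi / 2).any())  # False
```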

@cadenmyers13 (Contributor Author)

@sbillinge Ready for review

> One comment. I am still a bit concerned about our testing workaround; I worry it is indicating a weakness in either our code or our test. I commented on that.
> Also, I am not sure whether my other comment to always use an odd number for npoints in linspace actually went through, so let me say it here.

It didn't, but now I see it and got it fixed 👍

@sbillinge (Contributor)

I am having a hard time understanding this. Let's maybe jump on a call when we can and discuss it?
