
Conversation

@VolodyaCO
Collaborator

When using pseudo_kl_divergence_loss, it would be nice to get the spinstrings sampled inside this function so that one can perform other calculations with those samples.
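
For illustration, a minimal self-contained sketch of the idea (the function name, arguments, and loss formula below are toy placeholders, not the actual API):

```python
import numpy as np

def toy_pseudo_kl_loss(model_probs, num_samples, rng, return_samples=False):
    # Toy stand-in for pseudo_kl_divergence_loss: sample +/-1 spinstrings
    # internally and optionally hand them back so the caller can reuse them.
    spins = np.where(rng.random((num_samples, model_probs.size)) < model_probs, 1, -1)
    # Placeholder "loss": squared gap between empirical and target magnetizations.
    loss = float(np.mean((spins.mean(axis=0) - (2 * model_probs - 1)) ** 2))
    return (loss, spins) if return_samples else loss

rng = np.random.default_rng(0)
loss, samples = toy_pseudo_kl_loss(np.array([0.2, 0.8, 0.5]), 100, rng, return_samples=True)
print(loss, samples.mean(axis=0))  # the same samples are available for other calculations
```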

@VolodyaCO VolodyaCO self-assigned this Nov 7, 2025
@VolodyaCO VolodyaCO added the enhancement New feature or request label Nov 7, 2025
Be able to pass spinstring samples to pseudo_kl_divergence_loss

Test return samples in pseudo_kl_divergence_loss

Return samples in pseudo_kl_divergence_loss
@VolodyaCO VolodyaCO force-pushed the pseudo_kl_loss-returns-samples branch from 812fb5c to 19508b3 on November 19, 2025 at 18:59
Collaborator

@kevinchern kevinchern left a comment


This pull request looks good to me.

--

My suggestions are:

  1. Make samples a required argument. The loss function should be decoupled from sampling; no sampling should be done in this loss function.
  2. Make prefactor a required argument. In other parts of the code base where prefactor is an argument, it is defined as a mandatory keyword argument. The motivation was to mitigate bugs introduced by using an incorrect prefactor. We should be consistent and enforce that here as well (see the sketch below).
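
To illustrate both points, a minimal sketch of what the suggested signature could look like (the parameter names and the loss body are placeholders, not the real formula):

```python
import numpy as np

def pseudo_kl_divergence_loss_sketch(h, J, *, samples, prefactor):
    # `samples` and `prefactor` are keyword-only with no defaults, so the caller
    # must sample outside this function and state the prefactor explicitly.
    energies = samples @ h + np.einsum("bi,ij,bj->b", samples, J, samples)
    return prefactor * float(energies.mean())  # placeholder loss, not the real formula

# Omitting samples= or prefactor= raises a TypeError, which is the point of
# making them required keyword-only arguments.
```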

--

@thisac should these suggestions be addressed in a separate PR?

Comment on lines +4 to +5
The ``pseudo_kl_divergence_loss`` got spinstring samples from the GRBM. Now,
it is possible to pass spinstrings to the ``pseudo_kl_divergence_loss``. This

Clarify the difference between

The pseudo_kl_divergence_loss got spinstring samples from the GRBM.

and

Now, it is possible to pass spinstrings to the pseudo_kl_divergence_loss

