
Conversation

@haison19952013

What?

  • Supported and tested backend: tensorflow
  • Details:
    • Provide a way to change the loss weights dynamically via callbacks, without needing to recompile the model
    • Currently works only for non-gradient-based adaptive loss weight schemes

Why?

  • Motivation: help DeepXDE users formulate their own non-gradient-based adaptive loss weight schemes

How?

  • In model.py:
    • Store loss_weights as an instance attribute so it can be updated after compilation (tensorflow backend only)
  • In callbacks.py, provide examples of how to define a callback that changes the loss_weights:
    • Add ManualDynamicLossWeight: changes the loss weight at a specified index
    • Add PrintLossWeight: displays the loss weights at a specified period
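The two callbacks described above could be sketched roughly as follows. The class names mirror those in the PR, but the base class and the `model.epoch` / `model.loss_weights` attributes are stand-ins here so the example is self-contained; the real versions subclass DeepXDE's `Callback` and hook into its training loop.

```python
class Callback:
    """Minimal stand-in for deepxde.callbacks.Callback (assumption)."""

    def __init__(self):
        self.model = None

    def set_model(self, model):
        self.model = model

    def on_epoch_end(self):
        pass


class ManualDynamicLossWeight(Callback):
    """Set a new value for one loss weight at a chosen epoch (names assumed)."""

    def __init__(self, index, new_weight, epoch):
        super().__init__()
        self.index = index            # which loss term to update
        self.new_weight = new_weight  # value to assign
        self.epoch = epoch            # epoch at which to apply the change

    def on_epoch_end(self):
        if self.model.epoch == self.epoch:
            self.model.loss_weights[self.index] = self.new_weight


class PrintLossWeight(Callback):
    """Print the current loss weights every `period` epochs."""

    def __init__(self, period):
        super().__init__()
        self.period = period

    def on_epoch_end(self):
        if self.model.epoch % self.period == 0:
            print("epoch", self.model.epoch, "loss weights:", self.model.loss_weights)
```

Because the callback mutates `model.loss_weights` in place, the model never has to be recompiled when a weight changes.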

Testing?

  • A working example is given in deepxde/examples/pinn_inverse/elliptic_inverse_field_manual_dynamic_loss_weights.py

Future work

  • Work on gradient-based adaptive loss weight scheme

@lululxvi
Owner

Format the code via black https://github.com/psf/black

@haison19952013
Author

> Format the code via black https://github.com/psf/black

Updated

self.opt_name = optimizer
loss_fn = losses_module.get(loss)
self.loss_weights = loss_weights
self.loss_weights = tf.convert_to_tensor(
Owner

@lululxvi Mar 3, 2024


  • How about loss weights is None?
  • Using tf here will break other backends.
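One way to address both review points is to guard the conversion on `None` and dispatch on the active backend instead of calling `tf` unconditionally. This is only a sketch: the `backend_name` strings follow DeepXDE's backend naming, but the helper itself and its placement are assumptions.

```python
def as_backend_tensor(loss_weights, backend_name):
    """Convert loss weights to a backend tensor, if a backend needs it.

    Hypothetical helper sketching the review feedback: do nothing for None,
    and never import tensorflow when another backend is active.
    """
    if loss_weights is None:
        # No weighting requested: keep None so downstream code can skip it.
        return None
    if backend_name in ("tensorflow", "tensorflow.compat.v1"):
        import tensorflow as tf
        return tf.convert_to_tensor(loss_weights, dtype=tf.float32)
    if backend_name == "pytorch":
        import torch
        return torch.as_tensor(loss_weights, dtype=torch.float32)
    # Other backends (jax, paddle, ...): leave the plain Python list alone.
    return loss_weights
```

The lazy imports inside each branch are what keep one backend's conversion from breaking installations that only have another backend available.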

@pescap
Contributor

pescap commented Jun 5, 2024

@haison19952013, do you plan to keep working on this PR? If not, I will continue the work.

