Trying to find the correct weights for BCs, seems not working #690
mercury-4600
started this conversation in General
Replies: 1 comment
-
Here is the code of Laplace equation on a disk: https://deepxde.readthedocs.io/en/latest/demos/pinn_forward/laplace.disk.html
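(Not part of the reply, but for orientation: a minimal sketch of a DeepXDE setup for the Laplace equation on a disk. This is a generic Cartesian version, not the linked demo's exact code, which uses a polar-coordinate formulation; the boundary function, layer sizes, and point counts are placeholders, and the module paths `dde.icbc` / `dde.nn` follow recent DeepXDE releases, which older versions spell differently.)

```python
import deepxde as dde

# Laplace equation u_xx + u_yy = 0 on the unit disk (Cartesian form).
def pde(x, y):
    dy_xx = dde.grad.hessian(y, x, i=0, j=0)
    dy_yy = dde.grad.hessian(y, x, i=1, j=1)
    return dy_xx + dy_yy

geom = dde.geometry.Disk([0, 0], 1)

# Dirichlet condition on the circle; the boundary value is a placeholder.
bc = dde.icbc.DirichletBC(geom, lambda x: 0, lambda x, on_boundary: on_boundary)

data = dde.data.PDE(geom, pde, bc, num_domain=1200, num_boundary=120)
net = dde.nn.FNN([2] + [50] * 3 + [1], "tanh", "Glorot uniform")
model = dde.Model(data, net)
model.compile("adam", lr=1e-3)
losshistory, train_state = model.train(epochs=5000)
```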
-
Hi Dr. Lu,
First, thanks for your amazing contribution.
I would like to use DeepXDE to solve a Laplace equation on a disk. I have set up all of its boundary conditions (there are 3 BCs in the code) and used specific training points. Training runs normally without reporting any errors, but it still fails to reach the correct solution. So far, I have tried some of the methods from the issues to reduce my error, such as adding weights as you mentioned in
1D simple transient diffusion PDE with 2 Neumann BCs #40
The weights I added are:

```python
model.compile("adam", lr=1e-3, metrics=["l2 relative error"],
              loss_weights=[3, 1.3e-1, 1.3e-1, 1.3e-1, 1.3e-1, 7.8e-2, 7.8e-2])
```

The solution is still unsatisfactory. I thought it was because I had not set a correct distribution of weights across the loss terms, so I added a simple traversal search method to help me find a suitable distribution.
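(An explanatory aside, not part of the original post: in DeepXDE, `loss_weights` is positional, one weight per loss term, with the PDE residual(s) first and then the BCs in the order they are passed to `dde.data.PDE`. A sketch of that correspondence, reusing `geom` and `pde` from the sketch above and two hypothetical BC names:)

```python
# Hypothetical BC names, for illustration only.
bc_dirichlet = dde.icbc.DirichletBC(geom, lambda x: 1,
                                    lambda x, on_boundary: on_boundary)
bc_neumann = dde.icbc.NeumannBC(geom, lambda x: 0,
                                lambda x, on_boundary: on_boundary)

data = dde.data.PDE(geom, pde, [bc_dirichlet, bc_neumann],
                    num_domain=1200, num_boundary=120)
net = dde.nn.FNN([2] + [50] * 3 + [1], "tanh", "Glorot uniform")
model = dde.Model(data, net)

# One weight per loss term, in order: [PDE residual, bc_dirichlet, bc_neumann].
model.compile("adam", lr=1e-3, loss_weights=[1, 10, 10])
```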
The simple search method I added is a nested for loop:
```python
# error_max and s are defined elsewhere in the (unshown) script.
for w1 in range(10, 100, 10):
    for w2 in range(10, 100, 10):
        for w3 in range(10, 100, 10):
            model.compile("adam", lr=1e-3, metrics=["l2 relative error"],
                          loss_weights=[w1, w2, w2, w2, w2, w3, w3])
            losshistory, train_state = model.train(epochs=5000)
            if error_max <= s:
                print("Error ends at:", error_max)
                print("W ends at:", w1, w2, w2, w2, w2, w3, w3)
```

Here `s` is an acceptable tolerance on the difference from the correct solution. After a few days of trying, it does not seem to be working: the error I get in each run is similar, i.e. stepping `w1`, `w2`, `w3` through their ranges has an imperceptible effect on the error. So I wonder:
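(An added observation, not from the thread: one likely confound in the loop above is that it re-compiles but never rebuilds the network, so each trial continues training from the parameters left by the previous trial. A hedged sketch of a per-trial rebuild, reusing `data` from the setup above and assuming it was built with a reference `solution` so the l2-relative-error metric is available; `train_state.best_metrics` holds the metric values at the best recorded step:)

```python
import itertools

def build_model(loss_weights):
    # Fresh network per trial so every weight combination starts from
    # newly initialized parameters; reuses `data` from the setup above.
    net = dde.nn.FNN([2] + [50] * 3 + [1], "tanh", "Glorot uniform")
    model = dde.Model(data, net)
    model.compile("adam", lr=1e-3, metrics=["l2 relative error"],
                  loss_weights=loss_weights)
    return model

best_err, best_w = float("inf"), None
for w1, w2, w3 in itertools.product(range(10, 100, 10), repeat=3):
    model = build_model([w1, w2, w2, w2, w2, w3, w3])
    losshistory, train_state = model.train(epochs=5000)
    err = train_state.best_metrics[0]  # l2 relative error at the best step
    if err < best_err:
        best_err, best_w = err, (w1, w2, w3)
print("Best l2 relative error:", best_err, "with (w1, w2, w3):", best_w)
```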