
Commit b202996

Merge branch 'dev'
2 parents ac61391 + fe08767 commit b202996

17 files changed: +35284 additions, -15 deletions


docs/conf.py

Lines changed: 2 additions & 2 deletions
@@ -18,11 +18,11 @@
 # -- Project information -----------------------------------------------------

 project = "idrlnet"
-copyright = "2021, IDRL"
+copyright = "2023, IDRL"
 author = "IDRL"

 # The full version, including alpha/beta/rc tags
-release = "0.1.0"
+release = "0.0.2-rc3"

 # -- General configuration ---------------------------------------------------

docs/user/get_started/10_deepritz.md

Lines changed: 123 additions & 0 deletions
@@ -0,0 +1,123 @@
# Deepritz

This section reproduces the DeepRitz method presented by [Weinan E and Bing Yu](https://link.springer.com/article/10.1007/s40304-018-0127-z).

Consider the 2D Poisson equation:

$$
\begin{equation}
\begin{aligned}
-\Delta u=f, & \text { in } \Omega \\
u=0, & \text { on } \partial \Omega
\end{aligned}
\end{equation}
$$

Multiplying both sides by a test function $v \in H_0^1$ (a function that vanishes on the boundary) and integrating by parts gives the weak form

$$
\int f v=-\int v \Delta u=(\nabla u, \nabla v)
$$

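Explicitly, integration by parts (Green's first identity) gives

$$
-\int_{\Omega} v \Delta u=\int_{\Omega} \nabla u \cdot \nabla v-\int_{\partial \Omega} v \frac{\partial u}{\partial n}=\int_{\Omega} \nabla u \cdot \nabla v,
$$

where the boundary integral vanishes because $v=0$ on $\partial \Omega$.
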
The above equation holds for any $v \in H_0^1$. The right-hand side is symmetric and bilinear in $u$ and $v$, which defines the bilinear form

$$
a(u, v)=\int \nabla u \cdot \nabla v
$$

By the Poincaré inequality, $a(\cdot, \cdot)$ is positive definite (coercive), i.e. there exists $\alpha > 0$ such that

$$
a(u, u) \geq \alpha\|u\|^2, \quad \forall u \in H_0^1
$$

The remaining term is a linear functional of $v$, denoted $l(v)$, which yields the equation

$$
a(u, v) = l(v)
$$

Discretizing $u$ and $v$ in the same finite-dimensional subspace yields a symmetric positive definite linear system; this is the family of Galerkin methods. Alternatively, the problem can be recast as a variational minimization problem.

The goal is to find $u$ satisfying

$$
a(u, v) = l(v), \quad \forall v \in H_0^1
$$

For a symmetric positive definite $a$, this is equivalent to the variational minimization problem: find $u$ that minimizes

$$
J(u) = \frac{1}{2} a(u, u) - l(u)
$$

Specifically,

$$
\min _{u \in H_0^1} J(u)=\frac{1}{2} \int\|\nabla u\|_2^2-\int f u
$$

The DeepRitz method is similar to the PINN approach: $u$ is replaced by a neural network $\hat{u}$, the domain is sampled, and the resulting objective is minimized with an optimizer such as Adam. The discretized objective reads

$$
\begin{equation}
\min _{\left.\hat{u}\right|_{\partial \Omega}=0} \hat{J}(\hat{u})=\frac{1}{2} \frac{S_{\Omega}}{N_{\Omega}} \sum\left\|\nabla \hat{u}\left(x_i, y_i\right)\right\|_2^2-\frac{S_{\Omega}}{N_{\Omega}} \sum f\left(x_i, y_i\right) \hat{u}\left(x_i, y_i\right)
\end{equation}
$$

where $S_{\Omega}$ is the area of $\Omega$ and $N_{\Omega}$ is the number of sampled interior points $(x_i, y_i)$.

Note that the original constraint $\hat{u}|_{\partial \Omega}=0$ (zero on the boundary) is removed by adding a boundary penalty term, which turns the problem into an unconstrained one:

$$
\begin{equation}
\min \hat{J}(\hat{u})=\frac{1}{2} \frac{S_{\Omega}}{N_{\Omega}} \sum\left\|\nabla \hat{u}\left(x_i, y_i\right)\right\|_2^2-\frac{S_{\Omega}}{N_{\Omega}} \sum f\left(x_i, y_i\right) \hat{u}\left(x_i, y_i\right)+\beta \frac{S_{\partial \Omega}}{N_{\partial \Omega}} \sum \hat{u}^2\left(x_i, y_i\right)
\end{equation}
$$

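For intuition, the objective above can be written out directly in a few lines. The following is a rough PyTorch sketch of this Monte Carlo estimate, independent of the IDRLnet implementation below; the network `net`, the sample sizes, and the penalty weight `beta` are illustrative assumptions.

```python
import torch

def ritz_loss(net, n_int=1000, n_bnd=100, beta=500.0):
    # interior term: S_Omega / N_Omega * sum( 0.5 * |grad u|^2 - f * u ), with S_Omega = 4
    x_int = (torch.rand(n_int, 2) * 2 - 1).requires_grad_(True)   # uniform samples on [-1, 1]^2
    u = net(x_int)                                                # assumed to map (N, 2) -> (N, 1)
    grad_u = torch.autograd.grad(u.sum(), x_int, create_graph=True)[0]
    f = 2 * torch.pi ** 2 * torch.sin(torch.pi * x_int[:, :1]) * torch.sin(torch.pi * x_int[:, 1:])
    energy = 4.0 * (0.5 * (grad_u ** 2).sum(dim=1, keepdim=True) - f * u).mean()

    # boundary penalty: beta * S_boundary / N_boundary * sum( u^2 ), with S_boundary = 8
    s = torch.rand(n_bnd, 1) * 2 - 1
    ones = torch.ones_like(s)
    edges = torch.cat([
        torch.cat([s, -ones], dim=1), torch.cat([s, ones], dim=1),
        torch.cat([-ones, s], dim=1), torch.cat([ones, s], dim=1),
    ])
    penalty = 8.0 * (net(edges) ** 2).mean()
    return energy + beta * penalty
```
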
Consider the 2D Poisson equation defined on $\Omega=[-1,1]\times[-1,1]$ with $f=2 \pi^2 \sin (\pi x) \sin (\pi y)$; its exact solution is $u=\sin (\pi x) \sin (\pi y)$.

### Define Sampling Methods and Constraints

For this problem, a boundary condition and a PDE constraint are defined. The interior constraint uses the `Identity` loss, so the sampled energy itself is minimized rather than a squared residual.

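The snippets below assume that the geometry and symbols have already been defined. A minimal sketch of such a preamble follows; the import style and the square geometry mirror the other examples, but are assumptions here:

```python
import sympy as sp
from sympy import Symbol
import idrlnet.shortcut as sc

x, y = Symbol('x'), Symbol('y')
u = sp.Function('u')(x, y)
geo = sc.Rectangle((-1., -1.), (1., 1.))  # the square domain Omega = [-1, 1] x [-1, 1]
```
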
```python
@sc.datanode(sigma=1000.0)  # sigma weights the boundary loss and plays the role of the penalty beta
class Boundary(sc.SampleDomain):
    def __init__(self):
        self.points = geo.sample_boundary(100)
        self.constraints = {"u": 0.}

    def sampling(self, *args, **kwargs):
        return self.points, self.constraints


@sc.datanode(loss_fn="Identity")  # the sampled energy itself is the loss
class Interior(sc.SampleDomain):
    def __init__(self):
        self.points = geo.sample_interior(1000)
        self.constraints = {"integral_dxdy": 0}

    def sampling(self, *args, **kwargs):
        return self.points, self.constraints
```

### Define Neural Networks and PDEs

In the PDE definition part, following the DeepRitz formulation, we add two kinds of nodes: an expression node for the pointwise energy density $\frac{1}{2}\|\nabla u\|^2 - fu$, and an integral node that accumulates it over the domain:

```python
def f(x, y):
    return 2 * sp.pi ** 2 * sp.sin(sp.pi * x) * sp.sin(sp.pi * y)


dx_exp = sc.ExpressionNode(
    expression=0.5 * (u.diff(x) ** 2 + u.diff(y) ** 2) - u * f(x, y), name="dxdy"
)
net = sc.get_net_node(inputs=("x", "y"), outputs=("u",), name="net", arch=sc.Arch.mlp)

integral = sc.ICNode("dxdy", dim=2, time=False)
```

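Finally, the data nodes, the network node, and the PDE nodes can be assembled into a solver in the same way as the other IDRLnet examples. A minimal sketch; the output directory and iteration count are illustrative assumptions:

```python
s = sc.Solver(
    sample_domains=(Boundary(), Interior()),
    netnodes=[net],
    pdes=[dx_exp, integral],
    network_dir='network_dir_deepritz',
    max_iter=10000,
)
s.solve()
```
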
The result is shown as follows:

![deepritz](https://github.com/xiangzixuebit/picture/raw/3d73005f3642f10400975659479e856fb99f6518/deepritz.png)

docs/user/get_started/9_navier_stokes_equation.md

Lines changed: 227 additions & 0 deletions
@@ -0,0 +1,227 @@
# Navier-Stokes equations

This section reproduces the robust PINN method presented by [Peng et al.](https://deepai.org/publication/robust-regression-with-highly-corrupted-data-via-physics-informed-neural-networks).

## Steady 2D NS equations

The prototype problem of incompressible flow past a circular cylinder is considered.

![image](https://github.com/xiangzixuebit/picture/raw/3d73005f3642f10400975659479e856fb99f6518/NS1.png)

The velocity vector is set to zero at all walls and the pressure is set to $p = 0$ at the outlet. The fluid density is taken as $\rho = 1\,kg/m^3$ and the dynamic viscosity as $\mu = 2 \cdot 10^{-2}\,kg/(m \cdot s)$. The velocity profile at the inlet is set as $u(0, y)=4 \frac{U_M}{H^2}(H-y) y$ with $U_M = 1\,m/s$ and $H = 0.41\,m$, i.e. a parabolic profile that vanishes at $y=0$ and $y=H$ and peaks at $U_M$ in the channel center.

The two-dimensional steady-state Navier-Stokes equations are equivalently transformed into the following system:

$$
\begin{equation}
\begin{aligned}
\sigma^{11} &=-p+2 \mu u_x \\
\sigma^{22} &=-p+2 \mu v_y \\
\sigma^{12} &=\mu\left(u_y+v_x\right) \\
p &=-\frac{1}{2}\left(\sigma^{11}+\sigma^{22}\right) \\
\left(u u_x+v u_y\right) &=\mu\left(\sigma_x^{11}+\sigma_y^{12}\right) \\
\left(u v_x+v v_y\right) &=\mu\left(\sigma_x^{12}+\sigma_y^{22}\right)
\end{aligned}
\end{equation}
$$

We construct a neural network with six outputs to satisfy the PDE constraints above:

$$
\begin{equation}
u, v, p, \sigma^{11}, \sigma^{12}, \sigma^{22}=net(x, y)
\end{equation}
$$

### Define Symbols and Geometric Objects

For the 2D problem, we define two coordinate symbols `x` and `y`, together with the six variables $u, v, p, \sigma^{11}, \sigma^{12}, \sigma^{22}$.

The geometry object is a rectangle with a circle cut out via the operator `-`:

```python
import pandas as pd
import sympy as sp
from sympy import Symbol
import idrlnet.shortcut as sc

x = Symbol('x')
y = Symbol('y')
rec = sc.Rectangle((0., 0.), (1.1, 0.41))
cir = sc.Circle((0.2, 0.2), 0.05)
geo = rec - cir  # rectangular channel with the cylinder removed
u = sp.Function('u')(x, y)
v = sp.Function('v')(x, y)
p = sp.Function('p')(x, y)
s11 = sp.Function('s11')(x, y)
s22 = sp.Function('s22')(x, y)
s12 = sp.Function('s12')(x, y)
nu = 2e-2  # dynamic viscosity from the problem statement
```

### Define Sampling Methods and Constraints

For this problem, three boundary conditions, a PDE constraint, and external measurement data are defined. We use the robust-PINN model inspired by the classical LAD (Least Absolute Deviation) approach, in which an L1 loss replaces the squared L2 data loss.

```python
@sc.datanode
class Inlet(sc.SampleDomain):
    def sampling(self, *args, **kwargs):
        points = rec.sample_boundary(1000, sieve=(sp.Eq(x, 0.)))
        constraints = sc.Variables({'u': 4 * (0.41 - y) * y / (0.41 * 0.41)})
        return points, constraints


@sc.datanode
class Outlet(sc.SampleDomain):
    def sampling(self, *args, **kwargs):
        points = geo.sample_boundary(1000, sieve=(sp.Eq(x, 1.1)))
        constraints = sc.Variables({'p': 0.})
        return points, constraints


@sc.datanode
class Wall(sc.SampleDomain):
    def sampling(self, *args, **kwargs):
        points = geo.sample_boundary(1000, sieve=((x > 0.) & (x < 1.1)))
        constraints = sc.Variables({'u': 0., 'v': 0.})
        return points, constraints


@sc.datanode(name='NS_external')
class Interior_domain(sc.SampleDomain):
    def __init__(self):
        self.density = 2000

    def sampling(self, *args, **kwargs):
        points = geo.sample_interior(2000)
        constraints = {'f_s11': 0., 'f_s22': 0., 'f_s12': 0., 'f_u': 0., 'f_v': 0., 'f_p': 0.}
        return points, constraints


@sc.datanode(name='NS_domain', loss_fn='L1')
class NSExternal(sc.SampleDomain):
    def __init__(self):
        points = pd.read_csv('NSexternel_sample.csv')
        self.points = {col: points[col].to_numpy().reshape(-1, 1) for col in points.columns}
        self.constraints = {'u': self.points.pop('u'), 'v': self.points.pop('v'), 'p': self.points.pop('p')}

    def sampling(self, *args, **kwargs):
        return self.points, self.constraints
```

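The file `NSexternel_sample.csv` is expected to provide the coordinates together with the measured fields. A hypothetical snippet that writes a file with the column layout implied by the code above; the values here are placeholders, not real measurement data:

```python
import numpy as np
import pandas as pd

# Hypothetical placeholder data: the x, y columns are fed as network inputs,
# while u, v, p are popped off and used as supervision targets.
df = pd.DataFrame({
    'x': np.random.uniform(0., 1.1, 100),
    'y': np.random.uniform(0., 0.41, 100),
    'u': np.zeros(100),
    'v': np.zeros(100),
    'p': np.zeros(100),
})
df.to_csv('NSexternel_sample.csv', index=False)
```
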
### Define Neural Networks and PDEs

In the PDE definition part, we add these PDE nodes:

```python
net = sc.MLP([2, 40, 40, 40, 40, 40, 40, 40, 40, 6], activation=sc.Activation.tanh)
net = sc.get_net_node(inputs=('x', 'y'), outputs=('u', 'v', 'p', 's11', 's22', 's12'), name='net', arch=sc.Arch.mlp)
pde1 = sc.ExpressionNode(name='f_s11', expression=-p + 2 * nu * u.diff(x) - s11)
pde2 = sc.ExpressionNode(name='f_s22', expression=-p + 2 * nu * v.diff(y) - s22)
pde3 = sc.ExpressionNode(name='f_s12', expression=nu * (u.diff(y) + v.diff(x)) - s12)
pde4 = sc.ExpressionNode(name='f_u', expression=u * u.diff(x) + v * u.diff(y) - nu * (s11.diff(x) + s12.diff(y)))
pde5 = sc.ExpressionNode(name='f_v', expression=u * v.diff(x) + v * v.diff(y) - nu * (s12.diff(x) + s22.diff(y)))
pde6 = sc.ExpressionNode(name='f_p', expression=p + (s11 + s22) / 2)
```

### Define A Solver

Direct use of Adam optimization alone is less effective here, so the LBFGS optimizer, or a combination of the two (Adam pre-training followed by LBFGS), is used for training:

```python
s = sc.Solver(sample_domains=(Inlet(), Outlet(), Wall(), Interior_domain(), NSExternal()),
              netnodes=[net],
              init_network_dirs=['network_dir_adam'],  # start from weights saved by an Adam run
              pdes=[pde1, pde2, pde3, pde4, pde5, pde6],
              max_iter=300,
              opt_config=dict(optimizer='LBFGS', lr=1))
```

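The directory `network_dir_adam` referenced above is assumed to come from an earlier Adam stage. A minimal sketch of such a pre-training run; the iteration count and learning rate are illustrative assumptions:

```python
# Adam pre-training stage that produces 'network_dir_adam';
# the LBFGS solver above then restores these weights via init_network_dirs.
s_adam = sc.Solver(sample_domains=(Inlet(), Outlet(), Wall(), Interior_domain(), NSExternal()),
                   netnodes=[net],
                   pdes=[pde1, pde2, pde3, pde4, pde5, pde6],
                   network_dir='network_dir_adam',
                   max_iter=2000,
                   opt_config=dict(optimizer='Adam', lr=1e-3))
s_adam.solve()
```
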
The result is shown as follows:

![image](https://github.com/xiangzixuebit/picture/raw/3d73005f3642f10400975659479e856fb99f6518/NS11.png)

## Unsteady 2D N-S equations with unknown parameters

A two-dimensional incompressible flow with dynamic vortex shedding past a circular cylinder is numerically simulated. The Reynolds number of the flow is $Re = 100$, the kinematic viscosity is $\nu = 0.01$, and the cylinder diameter $D$ is 1. The simulation domain is $[-15,25] \times [-8,8]$, while the computational (training) domain is much smaller: $[1,8] \times [-2,2]$ in space and $[0,20]$ in time.

![image](https://github.com/xiangzixuebit/picture/raw/3d73005f3642f10400975659479e856fb99f6518/NS2.png)

The governing equations are

$$
\begin{equation}
\begin{aligned}
&u_t+\lambda_1\left(u u_x+v u_y\right)=-p_x+\lambda_2\left(u_{x x}+u_{y y}\right) \\
&v_t+\lambda_1\left(u v_x+v v_y\right)=-p_y+\lambda_2\left(v_{x x}+v_{y y}\right)
\end{aligned}
\end{equation}
$$

where $\lambda_1$ and $\lambda_2$ are two unknown parameters to be recovered. We assume that

$$
u=\psi_y, \quad v=-\psi_x
$$

for some stream function $\psi(t, x, y)$. Under this assumption, the continuity equation $u_x+v_y=\psi_{yx}-\psi_{xy}=0$ is automatically satisfied. The following architecture is used in this example:

$$
\begin{equation}
\psi, p=net\left(t, x, y, \lambda_1, \lambda_2\right)
\end{equation}
$$

### Define Symbols and Geometric Objects

We define three coordinate symbols `x`, `y` and `t`, together with the three variables $u, v, p$.

```python
x = Symbol('x')
y = Symbol('y')
t = Symbol('t')
geo = sc.Rectangle((1., -2.), (8., 2.))
u = sp.Function('u')(x, y, t)
v = sp.Function('v')(x, y, t)
p = sp.Function('p')(x, y, t)
time_range = {t: (0, 20)}
```

### Define Sampling Methods and Constraints

This example has only two equation constraints, whereas the previous one has six. We again use the LAD-PINN model; the PDE-constrained optimization problem is formulated as

$$
\min _{\theta, \lambda} \frac{1}{\# \mathbf{D}_u} \sum_{\left(t_i, x_i, u_i\right) \in \mathbf{D}_u}\left|u_i-u_\theta\left(t_i, x_i ; \lambda\right)\right|+\omega \cdot L_{p d e} .
$$

```python
@sc.datanode(name='NS_domain', loss_fn='L1')
class NSExternal(sc.SampleDomain):
    def __init__(self):
        points = pd.read_csv('NSexternel_sample.csv')
        self.points = {col: points[col].to_numpy().reshape(-1, 1) for col in points.columns}
        self.constraints = {'u': self.points.pop('u'), 'v': self.points.pop('v'), 'p': self.points.pop('p')}

    def sampling(self, *args, **kwargs):
        return self.points, self.constraints


@sc.datanode(name='NS_external')
class NSEq(sc.SampleDomain):
    def sampling(self, *args, **kwargs):
        points = geo.sample_interior(density=2000, param_ranges=time_range)
        constraints = {'continuity': 0, 'momentum_x': 0, 'momentum_y': 0}
        return points, constraints
```

### Define Neural Networks and PDEs

IDRLnet defines a separate network node (`sc.Arch.single_var`) to represent the unknown parameters:

```python
net = sc.MLP([3, 20, 20, 20, 20, 20, 20, 20, 20, 3], activation=sc.Activation.tanh)
net = sc.get_net_node(inputs=('x', 'y', 't'), outputs=('u', 'v', 'p'), name='net', arch=sc.Arch.mlp)
var_nr = sc.get_net_node(inputs=('x', 'y'), outputs=('nu', 'rho'), arch=sc.Arch.single_var)
pde = sc.NavierStokesNode(nu='nu', rho='rho', dim=2, time=True, u='u', v='v', p='p')
```

### Define A Solver

The two network nodes are trained together:

```python
s = sc.Solver(sample_domains=(NSExternal(), NSEq()),
              netnodes=[net, var_nr],
              pdes=[pde],
              network_dir='network_dir',
              max_iter=10000)
```

Finally, the true velocity and pressure fields at $t = 10s$ are compared with the predicted results:

![image](https://github.com/xiangzixuebit/picture/raw/3d73005f3642f10400975659479e856fb99f6518/NS22.png)

docs/user/get_started/tutorial.rst

Lines changed: 4 additions & 0 deletions
@@ -14,6 +14,8 @@ To make full use of IDRLnet. We strongly suggest following the following example
 6. :ref:`Parameterized poisson equation <Parameterized Poisson>`. The example introduces how to train a surrogate with parameters.
 7. :ref:`Variational Minimization <Variational Minimization>`. The example introduces how to solve variational minimization problems.
 8. :ref:`Volterra integral differential equation <Volterra Integral Differential Equation>`. The example introduces the way to solve IDEs.
+9. :ref:`Navier-Stokes equation <Navier-Stokes equations>`. The example introduces how to use the LBFGS optimizer.
+10. :ref:`Deepritz method <Deepritz>`. The example introduces the way to solve PDEs with the Deepritz method.



@@ -28,3 +30,5 @@ To make full use of IDRLnet. We strongly suggest following the following example
    6_parameterized_poisson
    7_minimal_surface
    8_volterra_ide
+   9_navier_stokes_equation
+   10_deepritz
