* Refine problems and add direct prob
* Update docs and remove pkg loading
* Clean dispatch for koopman
* Clean up tests
* Scalarize parameters
* Update docs
* Add more tests for problem
* More koopman tests
`docs/src/prob_and_solve.md`
As can be seen from the [introduction examples](@id Quickstart), [DataDrivenDiffEq.jl](https://github.com/SciML/DataDrivenDiffEq.jl) structures its workflow in a similar fashion to the other [SciML](https://sciml.ai/) packages: you define a [`DataDrivenProblem`](@ref) and dispatch on the `solve` command to return a [`DataDrivenSolution`](@ref).
A problem in the sense of identification, estimation or inference is defined by the data describing it. This data contains at least measurements of the states `X`, which would be sufficient to describe a `DiscreteDataDrivenProblem` with unit time steps, similar to the [first example on dynamic mode decomposition](@ref Linear-Systems-via-Dynamic-Mode-Decomposition). Of course, we can extend this to include time points `t`, control signals `U` or a function describing those `u(x,p,t)`. Additionally, any parameters `p` known a priori can be included in the problem. In practice, this looks like
```julia
problem = DiscreteDataDrivenProblem(X)
problem = DiscreteDataDrivenProblem(X, t)
problem = DiscreteDataDrivenProblem(X, t, U)
problem = DiscreteDataDrivenProblem(X, t, U, p = p)
problem = DiscreteDataDrivenProblem(X, t, (x,p,t)->u(x,p,t))
```
Similarly, a `ContinuousDataDrivenProblem` would need at least measurements and time derivatives (`X` and `DX`) or measurements, time information and a way to derive the time derivatives (`X`, `t` and a [Collocation](@ref) method). Again, this can be extended by including a control input as measurements or a function and possible parameters.
```julia
problem = ContinuousDataDrivenProblem(X, DX)
problem = ContinuousDataDrivenProblem(X, t, DX)
problem = ContinuousDataDrivenProblem(X, t, DX, U, p = p)
problem = ContinuousDataDrivenProblem(X, t, DX, (x,p,t)->u(x,p,t))

# Using collocation
problem = ContinuousDataDrivenProblem(X, t, InterpolationMethod())
problem = ContinuousDataDrivenProblem(X, t, GaussianKernel())
problem = ContinuousDataDrivenProblem(X, t, U, InterpolationMethod())
problem = ContinuousDataDrivenProblem(X, t, U, GaussianKernel(), p = p)
```
You can also directly use a `DESolution` as an input to your [`DataDrivenProblem`](@ref):

```julia
problem = DataDrivenProblem(sol; kwargs...)
```
which evaluates the function at the specific timepoints `t` using the parameters `p` of the original problem instead of using the interpolation. If you want to use the interpolated data, add the additional keyword `use_interpolation = true`.
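As a minimal sketch of both variants (the linear system `f` and the solver choice here are illustrative, not taken from these docs):

```julia
using OrdinaryDiffEq, DataDrivenDiffEq

# Illustrative system: simple linear decay
f(u, p, t) = -0.5 .* u
ode = ODEProblem(f, [1.0; 2.0], (0.0, 10.0))
sol = solve(ode, Tsit5(), saveat = 0.1)

# Evaluates f with the stored parameters at the saved timepoints
problem = DataDrivenProblem(sol)

# Uses the solution's interpolation for the data instead
problem = DataDrivenProblem(sol; use_interpolation = true)
```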
An additional type of problem is the `DirectDataDrivenProblem`, which does not assume any kind of causal relationship. It is defined by `X` and an observed output `Y` in addition to the usual arguments:

```julia
problem = DirectDataDrivenProblem(X, Y)
problem = DirectDataDrivenProblem(X, t, Y)
problem = DirectDataDrivenProblem(X, t, Y, U)
problem = DirectDataDrivenProblem(X, t, Y, p = p)
problem = DirectDataDrivenProblem(X, t, Y, (x,p,t)->u(x,p,t), p = p)
```
Next up, we choose a method to `solve` the [`DataDrivenProblem`](@ref). Depending on the input arguments and the type of problem, the function will return a result derived via [`Koopman`](@ref) or [`Sparse Optimization`](@ref) methods. A [`Basis`](@ref) used for lifting the measurements can be provided, along with options controlling rounding, normalization or the progress bar, depending on the inference method. Possible options are provided [below](@ref optional_arguments).
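For instance, a Koopman-based call and a sparse-regression call might look like the following sketch (`DMDSVD` and `digits` appear elsewhere in these docs; the `STLSQ` threshold is illustrative):

```julia
# Koopman-based estimation of a (discrete) problem,
# rounding the resulting operator to one digit
res = solve(prob, DMDSVD(), digits = 1)

# Sparse regression over a custom basis of candidate functions
res = solve(prob, basis, STLSQ(1e-1))
```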
To estimate the underlying operator in the states ``u_1, u_2``, we simply define a discrete [`DataDrivenProblem`](@ref) from the solution and `solve` the estimation problem using the [`DMDSVD`](@ref) algorithm for approximating the operator.
```@example 4
prob = DiscreteDataDrivenProblem(sol)

res = solve(prob, DMDSVD(), digits = 1)
system = result(res)
```
```julia
using ModelingToolkit
using OrdinaryDiffEq
using Plots
using Random
using Symbolics: scalarize

Random.seed!(1111) # Due to the noise
```
and returns a Pareto-optimal solution of the underlying [`sparse_regression!`](@ref).
```@example 1
@variables u[1:2] c[1:1]
@parameters w[1:2]
u = scalarize(u)
c = scalarize(c)
w = scalarize(w)

h = Num[sin.(w[1] .* u[1]); cos.(w[2] .* u[1]); polynomial_basis(u, 5); c]

basis = Basis(h, u, parameters = w, controls = c)
```