
Commit 6f14b0b

hotfix - missing pip installs in new notebooks for FE
1 parent 659d554 commit 6f14b0b

File tree

2 files changed: +28 -4 lines changed


examples/function_encoder/Part_1_Intro_to_Function_Encoders.ipynb

Lines changed: 14 additions & 2 deletions
@@ -1,8 +1,8 @@
 {
 "cells": [
 {
-"cell_type": "markdown",
 "metadata": {},
+"cell_type": "markdown",
 "source": [
 "# Function Encoders: A principled approach to transfer and online adaptation\n",
 "\n",
@@ -36,9 +36,21 @@
 "\\mathcal{H} = \\{f \\; | \\; f(x) = ax^2 + bx + c \\quad \\quad a,b,c \\in \\mathbb{R}\\}\n",
 "$$\n",
 "\n",
-"The first thing we will do is define the datasets. The following code segment defines the function $f(x) = ax^2 + bx + c$. Then it samples 100 values each of a,b, and c. Lastly, it samples data from (-1,1) and computes the output for each function. "
+"The first thing we will do is define the datasets. The following code segment defines the function $f(x) = ax^2 + bx + c$. Then it samples 100 values each of a,b, and c. Lastly, it samples data from (-1,1) and computes the output for each function."
 ]
 },
+{
+"metadata": {},
+"cell_type": "markdown",
+"source": "### Install Neuromancer (Colab only)"
+},
+{
+"metadata": {},
+"cell_type": "code",
+"outputs": [],
+"execution_count": null,
+"source": "!pip install neuromancer"
+},
 {
 "cell_type": "code",
 "execution_count": 3,
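The dataset-generation step described in the notebook text above (sample 100 values each of a, b, and c, sample inputs from (-1, 1), and evaluate f(x) = ax² + bx + c) can be sketched as follows. This is a minimal sketch, not the notebook's actual code: the sampling ranges for a, b, c and the number of input points per function are assumptions.

```python
import numpy as np

n_functions = 100  # 100 sampled (a, b, c) triples, per the notebook text
n_points = 1000    # input samples per function (assumed; not stated above)

rng = np.random.default_rng(0)
# Sample 100 values each of a, b, and c (the range is an assumption).
a = rng.uniform(-3, 3, size=(n_functions, 1))
b = rng.uniform(-3, 3, size=(n_functions, 1))
c = rng.uniform(-3, 3, size=(n_functions, 1))

# Sample inputs from (-1, 1) and compute the output of each function,
# broadcasting the coefficients across the input samples.
x = rng.uniform(-1, 1, size=(n_functions, n_points))
y = a * x**2 + b * x + c
```

Each row of `x` and `y` is then one function's dataset, giving 100 quadratics drawn from the hypothesis class H.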

examples/function_encoder/Part_2_Function_Encoder_Neural_ODE.ipynb

Lines changed: 14 additions & 2 deletions
@@ -1,8 +1,8 @@
 {
 "cells": [
 {
-"cell_type": "markdown",
 "metadata": {},
+"cell_type": "markdown",
 "source": [
 "# Function encoders with neural ODE basis functions\n",
 "The function encoder is agnostic to the basis function, so long as it is differentiable. Consequently, the basis function architecture can be modified to fit the problem at hand. For example, for continuous-time systems, the best models are typically neural ODEs. Neural ODEs output the gradient of the system, and then integrate it to get the next state, rather than outputing the next state directly:\n",
@@ -13,9 +13,21 @@
 "\n",
 "This is extremely accurate because the inductive bias of this model matches the continuous-time nature of the dynamical system. However, neural ODEs are slow to train, and cannot adapt to new data at runtime. We can fix this by combining neural ODEs with function encoders. As we saw in the [first example](Part_1_Intro_to_Function_Encoders.ipynb), function encoders can adapt their model estimates based on small amounts of data. By combining the two approaches, we get the best of both worlds: Accurate, continuous-time estimates AND efficient, online model updates. All this requires is to define the basis functions as neural ODEs instead of neural networks. The algorithm is unchanged.\n",
 "\n",
-"We will demonstrate this on a simple Van Der Pol system, modeled with a simple RK4 integrator. "
+"We will demonstrate this on a simple Van Der Pol system, modeled with a simple RK4 integrator."
 ]
 },
+{
+"metadata": {},
+"cell_type": "markdown",
+"source": "### Install Neuromancer (Colab only)"
+},
+{
+"metadata": {},
+"cell_type": "code",
+"outputs": [],
+"execution_count": null,
+"source": "!pip install neuromancer"
+},
 {
 "cell_type": "code",
 "execution_count": 1,
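The second notebook's setup (a Van der Pol system integrated with RK4: the model outputs the gradient, which is then integrated to get the next state) can be sketched as below. This is an illustrative sketch, not the notebook's code; the parameter `mu`, the step size, and the initial state are assumptions.

```python
import numpy as np

def van_der_pol(state, mu=1.0):
    # Van der Pol dynamics: x' = y, y' = mu * (1 - x^2) * y - x
    # (mu = 1.0 is an assumed stiffness parameter).
    x, y = state
    return np.array([y, mu * (1.0 - x**2) * y - x])

def rk4_step(f, state, dt):
    # One classical RK4 step: evaluate the gradient four times and
    # combine, integrating the dynamics to get the next state.
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Roll the system forward from an assumed initial condition.
state = np.array([1.0, 0.0])
for _ in range(1000):
    state = rk4_step(van_der_pol, state, dt=0.01)
```

In the function-encoder setting, `van_der_pol` is the unknown dynamics; each basis function is a neural network playing its role inside `rk4_step`, so the gradient is learned while the integration scheme stays fixed.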
