Commit d232d11

Implement nn_kernel function and visualize samples
Added Python code for neural network kernel and Gaussian process prior samples visualization.
1 parent 23d7a5a commit d232d11

File tree

1 file changed

+28
-0
lines changed


chapter_gaussian-processes/gp-priors.md

Lines changed: 28 additions & 0 deletions
@@ -148,6 +148,34 @@ In some cases, we can essentially evaluate this covariance function in closed fo
The RBF kernel is _stationary_, meaning that it is _translation invariant_, and therefore can be written as a function of $\tau = x-x'$. Intuitively, stationarity means that the high-level properties of the function, such as rate of variation, do not change as we move in input space. The neural network kernel, however, is _non-stationary_. Below, we show sample functions from a Gaussian process with this kernel. We can see that the function looks qualitatively different near the origin.

```{.python .input}
def nn_kernel(x1, x2):
    x1 = x1.flatten()
    x2 = x2.flatten()
    N, M = len(x1), len(x2)
    cov_matrix = np.zeros((N, M))
    for i in range(N):
        for j in range(M):
            # Augment each input with a bias feature: x -> (1, x)
            tilde_x_i = np.array([1, x1[i]])
            tilde_x_j = np.array([1, x2[j]])
            numerator = 2 * np.dot(tilde_x_i, tilde_x_j)
            term_i = 1 + 2 * np.dot(tilde_x_i, tilde_x_i)
            term_j = 1 + 2 * np.dot(tilde_x_j, tilde_x_j)
            # Clip to [-1, 1] to guard against floating-point error in arcsin
            arg = np.clip(numerator / np.sqrt(term_i * term_j), -1.0, 1.0)
            cov_matrix[i, j] = (2 / np.pi) * np.arcsin(arg)
    return cov_matrix

x_points = np.linspace(-5, 5, 100)
meanvec = np.zeros(len(x_points))
covmat = nn_kernel(x_points, x_points)
# Draw 5 sample functions from the GP prior with the neural network kernel
prior_samples = np.random.multivariate_normal(meanvec, covmat, size=5)

d2l.plt.plot(x_points, prior_samples.T, alpha=0.7)
d2l.plt.show()
```
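To make the non-stationarity concrete, we can compare kernel values for two input pairs that share the same separation $\tau = x - x'$. The sketch below uses illustrative scalar helpers (`rbf_kernel` and `nn_kernel_scalar` are names introduced here, not part of the chapter): translating both inputs by the same amount leaves the RBF value unchanged but changes the neural network kernel value.

```{.python .input}
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0):
    # Stationary: depends only on the difference tau = x1 - x2
    return np.exp(-0.5 * (x1 - x2) ** 2 / lengthscale ** 2)

def nn_kernel_scalar(x1, x2):
    # Scalar version of the arcsine (neural network) kernel above
    t1 = np.array([1.0, x1])
    t2 = np.array([1.0, x2])
    num = 2 * np.dot(t1, t2)
    den = np.sqrt((1 + 2 * np.dot(t1, t1)) * (1 + 2 * np.dot(t2, t2)))
    return (2 / np.pi) * np.arcsin(np.clip(num / den, -1.0, 1.0))

# Both pairs have separation tau = 1
print(rbf_kernel(0.0, 1.0), rbf_kernel(3.0, 4.0))          # equal
print(nn_kernel_scalar(0.0, 1.0), nn_kernel_scalar(3.0, 4.0))  # differ
```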
## Summary
