Hi @wil-j-wil,
I randomly stumbled upon your work while researching temporal GPs and found this cool package. (Thanks for the awesome work behind it!)
I am running into the following issue when using the package:
- whenever I use a more complex kernel, all predictions outside the range of X_train are NaN.
For example, on the airline passenger dataset with a mostly default setup:
from bayesnewton.kernels import QuasiPeriodicMatern32
from bayesnewton.likelihoods import Gaussian
from bayesnewton.models import MarkovVariationalGP

kernel = QuasiPeriodicMatern32()
likelihood = Gaussian()
model = MarkovVariationalGP(kernel=kernel, likelihood=likelihood, X=X_train, Y=y_train)
- Again, whenever I predict a value within X_train, it works well:
mean_in, var_in = model.predict(X=X_train[-1])  # inside X_train, E[f] is perfect
- Whenever I try to extrapolate, I get NaNs:
mean_out, var_out = model.predict(X=X_test[0])  # outside X_train, E[f] = nan
I also get NaN for the first observation of X_train (see the plots below).
- This does not happen with basic kernels such as:
kernel = Matern12()
- Specifically, it does happen whenever I use a Sum of kernels or any of the combination kernels.
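For what it's worth, the all-NaN extrapolation looks consistent with a single NaN entering the filter state at the edge of the training range. This is just an illustrative numpy sketch (not bayesnewton's actual code, and the transition matrix A and process noise Q below are made up): Markov GPs extrapolate by iterating the linear Kalman prediction step, so one NaN in the state poisons every subsequent step.

```python
import numpy as np

def kalman_extrapolate(m, P, A, Q, n_steps):
    """Iterate the linear state-space prediction step:
    m <- A m,  P <- A P A^T + Q (repeated n_steps times)."""
    means = []
    for _ in range(n_steps):
        m = A @ m
        P = A @ P @ A.T + Q
        means.append(m.copy())
    return means

dim = 2
A = np.array([[1.0, 0.1], [0.0, 0.9]])  # hypothetical transition matrix
Q = 0.01 * np.eye(dim)                  # hypothetical process noise

# Healthy state: extrapolated means stay finite.
m_ok = np.ones(dim)
P_ok = np.eye(dim)
print(np.isfinite(kalman_extrapolate(m_ok, P_ok, A, Q, 5)[-1]).all())  # True

# One NaN in the state (e.g. from a numerically bad covariance in a
# combination kernel) contaminates every later prediction step.
m_bad = np.array([1.0, np.nan])
print(np.isnan(kalman_extrapolate(m_bad, P_ok, A, Q, 5)[-1]).all())    # True
```

So if something in the combination-kernel state-space model produces even one NaN, all extrapolated means would come out NaN, which matches the symptom.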
Am I perhaps misunderstanding the purpose of the model, or doing something wrong? (Thanks in advance for your help :). I am just a beginner GP enthusiast looking into what these models are capable of.)
P.S. I installed bayesnewton (hopefully) according to the requirements:
[tool.poetry.dependencies]
python = "3.10.11"
tensorflow-macos = "2.13.0"
tensorflow-probability = "^0.21.0"
numba = "^0.58.0"
gpflow = "^2.9.0"
scipy = "^1.11.2"
pandas = "^2.1.1"
jax = "0.4.2"
jaxlib = "0.4.2"
objax = "1.6.0"
ipykernel = "^6.25.2"
plotly = "^5.17.0"
seaborn = "^0.12.2"
nbformat = "^5.9.2"
scikit-learn = "^1.3.1"
convertbng = "^0.6.43"