volterra
========

Estimate Volterra series via Gaussian process regression using polynomial kernels.

We provide the Matlab package poly_reg and the Python package preg for the same purpose.

The preg package (Python)
-------------------------

preg is a Python module for doing Gaussian process regression [1] using a polynomial covariance function. It can be used to find a Volterra or Wiener series expansion of an unknown system where only pairs of vector-valued inputs and scalar outputs are given [2].

The package provides the following functionalities:

  • Automatic model selection according to one of three possible criteria:

    a. the log-likelihood of the observed data [1];
    b. Geisser's surrogate predictive probability [3];
    c. the analytically computed leave-one-out error on the training set [4].

  • Computation of the predictive distribution (i.e., predictive mean and variance) of the output for a new, previously unseen input.

  • Computation of the explicit nth-order Volterra operator from the implicitly given polynomial expansion (see [2]).
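The predictive distribution follows the standard Gaussian process regression equations [1]. The following self-contained NumPy sketch illustrates the textbook algorithm with the inhomogeneous polynomial kernel; it is an illustration only, not preg's internal code, and all names are ours:

```python
import numpy as np

def poly_kernel(A, B, p):
    # inhomogeneous polynomial kernel matrix, k(x, y) = (x'y + 1)^p
    return (A @ B.T + 1.0) ** p

def gp_predict(X, y, Xs, p, noise_var):
    # predictive mean and variance of a GP (Rasmussen & Williams [1], Alg. 2.1)
    K = poly_kernel(X, X, p) + noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = poly_kernel(X, Xs, p)                 # train/test cross-covariances
    mean = Ks.T @ alpha                        # predictive mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(poly_kernel(Xs, Xs, p)) - np.sum(v * v, axis=0)  # predictive variance
    return mean, var
```

For example, training on the three points (0, 0), (1, 1), (2, 4) of the parabola y = x^2 with a degree-2 kernel recovers the parabola, so the predictive mean at x = 3 is close to 9.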

The available polynomial covariance functions of order p are:

1. the inhomogeneous polynomial kernel: k(x, y) = (x'y + 1)^p
2. the summed polynomial kernel: k(x, y) = sum_{i=1}^p (x'y)^i
3. the adaptive polynomial kernel: k(x, y) = sum_{i=1}^p w_i (x'y)^i, where each
   degree of nonlinearity receives an individual weight w_i that is found during
   model selection.
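For concreteness, the three covariance functions can be written down directly. A minimal NumPy sketch (the function names are ours, not part of preg's API):

```python
import numpy as np

def inhomogeneous_poly_kernel(x, y, p):
    # k(x, y) = (x'y + 1)^p
    return (np.dot(x, y) + 1.0) ** p

def summed_poly_kernel(x, y, p):
    # k(x, y) = sum_{i=1}^p (x'y)^i
    s = np.dot(x, y)
    return sum(s ** i for i in range(1, p + 1))

def adaptive_poly_kernel(x, y, p, w):
    # k(x, y) = sum_{i=1}^p w_i (x'y)^i, one weight per degree of nonlinearity
    s = np.dot(x, y)
    return sum(w[i - 1] * s ** i for i in range(1, p + 1))
```

Note that all three are functions of the inner product x'y only, which is what makes the implicit polynomial feature expansion tractable.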

Example of use::

    from preg import Preg, Logger

    with Logger('linear') as lg:

        # init Gaussian process object
        gp = Preg(lg, covariance, degree, hyperparameters)

        # do automatic model selection for polynomial degrees 1 to 4 and training
        gp.amsd(Xtrain, ytrain, [1, 2, 3, 4], model_selection_method,
                number_of_iterations)

        # predict on test data
        predicted_test_outputs = gp.predict(test_inputs)

        # estimate the third-order Volterra kernel
        # (i.e., a third-order tensor of coefficients)
        volterra_kernel_3 = gp.volt(3)

A simple 1D toy example showing the basic regression functionality is given in the accompanying programming example 'sinc_example.py'. Further examples can be found in the test file 'test_preg.py' and in the notebook 'Volterra System Identification.ipynb'.
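The idea behind extracting explicit Volterra operators can be sketched for the inhomogeneous kernel: by the binomial theorem, (x_i'x + 1)^p = sum_n C(p, n) (x_i'x)^n, so the estimate f(x) = sum_i alpha_i k(x_i, x) decomposes into operators of orders 0 to p, and the nth-order coefficient tensor is a weighted sum of n-fold outer products of the training inputs (see [2]). A NumPy sketch under this assumption, with our own function names rather than preg's:

```python
import numpy as np
from math import comb

def volterra_operator_value(alpha, X, x, n, p):
    # value of the nth-order Volterra operator of the implicit expansion
    # f(x) = sum_i alpha_i (x_i'x + 1)^p, i.e. H_n(x) = C(p,n) sum_i alpha_i (x_i'x)^n
    return comb(p, n) * float(np.sum(alpha * (X @ x) ** n))

def volterra_kernel_tensor(alpha, X, n, p):
    # explicit nth-order Volterra kernel: an n-way coefficient tensor
    # H_n = C(p,n) * sum_i alpha_i * (x_i outer ... outer x_i), with n factors
    d = X.shape[1]
    T = np.zeros((d,) * n)
    for a_i, x_i in zip(alpha, X):
        factor = np.array(a_i)
        for _ in range(n):
            factor = np.multiply.outer(factor, x_i)
        T += comb(p, n) * factor
    return T
```

Summing the operator values over n = 0, ..., p reproduces the kernel estimate f(x) exactly, which is a useful sanity check for any such implementation.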

preg can be used together with scikit-learn: it implements the scikit-learn API functions ``set_params()``, ``get_params()``, ``fit()``, and ``predict()``, so that it can be included in cross-validation estimators and pipelines. Console output is fed to a logger based on the standard Python module ``logging``.

To install, unpack the Python distribution, then execute::

    python setup.py install

The module is available as::

    import preg

The tests can be run from the installation directory (nose must be installed) by typing::

    nosetests -v

The documentation can be generated in the subdirectory ``doc`` by typing::

    make html

The automatic model selection uses a minimization routine based on Carl Rasmussen's minimize.m MATLAB script (see copyright notice in the function code) and its Python adaptation by R. Memisevic (2008).
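The role of the minimizer can be illustrated with a simplified stand-in: optimizing a single noise hyperparameter by minimizing the negative log marginal likelihood [1], here with ``scipy.optimize`` instead of the conjugate-gradient ``minimize.m``. This is a sketch on made-up toy data, not preg's actual model selection code:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal_likelihood(log_noise, X, y, p):
    # negative log marginal likelihood of a GP with the inhomogeneous
    # polynomial kernel k(x, y) = (x'y + 1)^p and noise variance exp(log_noise)
    K = (X @ X.T + 1.0) ** p + np.exp(log_noise[0]) * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (0.5 * y @ alpha + np.sum(np.log(np.diag(L)))
            + 0.5 * len(X) * np.log(2.0 * np.pi))

# toy data: a quadratic system observed with noise
rng = np.random.RandomState(0)
X = rng.randn(30, 2)
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.3 * rng.randn(30)

x0 = np.array([0.0])  # initial log noise variance
res = minimize(neg_log_marginal_likelihood, x0, args=(X, y, 2),
               method='Nelder-Mead')
```

In preg the same objective (or one of the alternative criteria listed above) is minimized over all kernel hyperparameters at once.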

Further documentation on preg can be found here.

The poly_reg package (Matlab)
-----------------------------

Authors: Matthias O. Franz and Peter V. Gehler

Poly_reg is a Matlab package for doing Gaussian process regression [1] using a polynomial covariance function. It can be used to find a Volterra or Wiener series expansion of an unknown system where only pairs of vector-valued inputs and scalar outputs are given [2]. The package provides the following functionalities:

  1. Automatic model selection according to one of three possible criteria:

     a. the log-likelihood of the observed data [1];
     b. Geisser's surrogate predictive probability [3];
     c. the analytically computed leave-one-out error on the training set [4].

  2. Computation of the predictive distribution (i.e., predictive mean and variance) of the output for a new, previously unseen input.
  3. Computation of the explicit nth-order Volterra operator from the implicitly given polynomial expansion (see [2]).

The available polynomial covariance functions of order p are

  1. the inhomogeneous polynomial kernel:
         k(x, y) = (x'*y + 1)^p
  2. the adaptive polynomial kernel:
         k(x, y) = sum_{i=1}^p w_i (x'*y)^i,

where each degree of nonlinearity receives an individual weight w_i that is found during model selection.

For installing the package, please download all files in the directory poly_reg. A simple 1D toy example showing the basic regression functionality is given in the programming example sinc_test.m (together with the plotting routine plot_predict.m). We have also included a system identification example (volterra_system_identification.m) in which first- and second-order Volterra operators are estimated from a time series generated by a dynamical system.

Further documentation on poly_reg can be found here.

References
----------

[1] Rasmussen, C. E., & Williams, C. K. I. (2006). Gaussian processes for machine learning. Cambridge, MA: MIT Press.

[2] Franz, M. O., & Schölkopf, B. (2006). A unifying view of Wiener and Volterra theory and polynomial kernel regression. Neural Computation, 18, 3097-3118.

[3] Sundararajan, S., & Keerthi, S. S. (2001). Predictive approaches for choosing hyperparameters in Gaussian processes. Neural Computation, 13, 1103-1118.

[4] Vapnik, V. (1982). Estimation of dependences based on empirical data. New York: Springer.
