
Commit 0e1acae
add Contents.m
1 parent 62279de

3 files changed (+126, -3 lines)

Contents.m (+124)

@@ -0,0 +1,124 @@
% CHAPTER01
% condEntropy - Compute conditional entropy z=H(x|y) of two discrete variables x and y.
% entropy - Compute entropy z=H(x) of a discrete variable x.
% jointEntropy - Compute joint entropy z=H(x,y) of two discrete variables x and y.
% mutInfo - Compute mutual information I(x,y) of two discrete variables x and y.
% nmi - Compute normalized mutual information I(x,y)/sqrt(H(x)*H(y)) of two discrete variables x and y.
% nvi - Compute normalized variation of information z=(1-I(x,y)/H(x,y)) of two discrete variables x and y.
% relatEntropy - Compute relative entropy (a.k.a. KL divergence) z=KL(p(x)||p(y)) of two discrete variables x and y.
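As a rough illustration of the information-theoretic quantities listed above, here is a minimal Python sketch (the toolbox itself is MATLAB; these snake_case names are hypothetical, and entropies are in nats). It relies on the identities H(x|y)=H(x,y)-H(y) and I(x,y)=H(x)+H(y)-H(x,y):

```python
from collections import Counter
from math import log

def entropy(x):
    """H(x) in nats for a sequence of discrete symbols."""
    n = len(x)
    return -sum(c / n * log(c / n) for c in Counter(x).values())

def joint_entropy(x, y):
    """H(x,y), computed by treating each aligned pair as one symbol."""
    return entropy(list(zip(x, y)))

def cond_entropy(x, y):
    """H(x|y) = H(x,y) - H(y)."""
    return joint_entropy(x, y) - entropy(y)

def mut_info(x, y):
    """I(x,y) = H(x) + H(y) - H(x,y)."""
    return entropy(x) + entropy(y) - joint_entropy(x, y)
```

For identical sequences, mutual information reduces to the entropy itself; for independent ones it is zero.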
% CHAPTER02
% logDirichlet - Compute log pdf of a Dirichlet distribution.
% logGauss - Compute log pdf of a Gaussian distribution.
% logKde - Compute log pdf of a kernel density estimator.
% logMn - Compute log pdf of a multinomial distribution.
% logMvGamma - Compute logarithm of the multivariate Gamma function.
% logSt - Compute log pdf of a Student's t distribution.
% logVmf - Compute log pdf of a von Mises-Fisher distribution.
% logWishart - Compute log pdf of a Wishart distribution.
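The log-pdf routines above all follow the same pattern: work in the log domain and factor the covariance for stability. A minimal NumPy sketch of the Gaussian case (hypothetical Python name; d x n column layout as in the toolbox, Cholesky factor instead of an explicit inverse):

```python
import numpy as np

def log_gauss(X, mu, Sigma):
    """Log pdf of N(mu, Sigma) at each column of X (d x n)."""
    d, n = X.shape
    L = np.linalg.cholesky(Sigma)              # Sigma = L L'
    Q = np.linalg.solve(L, X - mu[:, None])    # whitened residuals
    quad = np.sum(Q * Q, axis=0)               # (x-mu)' Sigma^{-1} (x-mu)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)
```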
% CHAPTER03
% linReg - Fit linear regression model y=w'x+w0.
% linRegFp - Fit empirical Bayesian linear model with Mackay's fixed-point method (p.168).
% linRegPred - Compute linear regression model response y=w'*X+w0 and likelihood.
% linRnd - Generate data from a linear model p(t|w,x)=G(w'x+w0,sigma), sigma=sqrt(1/beta).
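The basic linReg fit of y=w'x+w0 can be sketched in a few lines of NumPy (hypothetical Python name, not the MATLAB source): center the data, solve the normal equations for w, then recover the bias w0 from the means.

```python
import numpy as np

def lin_reg(X, t):
    """Least-squares fit of t = w'x + w0; X is d x n, t is length n."""
    xbar = X.mean(axis=1, keepdims=True)
    tbar = t.mean()
    Xc, tc = X - xbar, t - tbar
    w = np.linalg.solve(Xc @ Xc.T, Xc @ tc)   # normal equations on centered data
    w0 = tbar - w @ xbar.ravel()              # bias from the means
    return w, w0
```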
% CHAPTER04
% binPlot - Plot binary classification result for 2d data.
% fda - Fisher (linear) discriminant analysis.
% logitBin - Logistic regression for binary classification optimized by Newton-Raphson method.
% logitBinPred - Prediction of binary logistic regression model.
% logitMn - Multinomial regression for multiclass problem (multinomial likelihood).
% logitMnPred - Prediction of multiclass (multinomial) logistic regression model.
% sigmoid - Sigmoid function.
% softmax - Softmax function.
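The sigmoid and softmax helpers listed above are usually written with overflow guards (splitting the sigmoid by sign, subtracting the max before exponentiating in softmax). A NumPy sketch of both, with hypothetical Python names:

```python
import numpy as np

def sigmoid(a):
    """Numerically stable logistic function 1/(1+exp(-a))."""
    out = np.empty_like(a, dtype=float)
    pos = a >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-a[pos]))
    e = np.exp(a[~pos])                # exp of negative values cannot overflow
    out[~pos] = e / (1.0 + e)
    return out

def softmax(A, axis=0):
    """Softmax with max-subtraction to avoid overflow."""
    A = A - A.max(axis=axis, keepdims=True)
    e = np.exp(A)
    return e / e.sum(axis=axis, keepdims=True)
```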
% CHAPTER05
% mlpReg - Train a multilayer perceptron neural network.
% mlpRegPred - Multilayer perceptron prediction.
% CHAPTER06
% kn2sd - Transform a kernel matrix (or inner product matrix) to a squared distance matrix.
% knCenter - Center the data in the kernel space.
% knGauss - Gaussian (RBF) kernel K = exp(-|x-y|/(2s)).
% knKmeans - Perform kernel kmeans clustering.
% knKmeansPred - Prediction for kernel kmeans clustering.
% knLin - Linear kernel (inner product).
% knPca - Kernel PCA.
% knPcaPred - Prediction for kernel PCA.
% knPoly - Polynomial kernel k(x,y)=(x'y+c)^o.
% knReg - Gaussian process (kernel) regression.
% knRegPred - Prediction for Gaussian process (kernel) regression model.
% sd2kn - Transform a squared distance matrix to a kernel matrix.
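Two of the kernel utilities above are easy to sketch in NumPy (hypothetical Python names). Note the Gaussian kernel here uses the common squared-exponential form exp(-|x-y|^2/(2s^2)); the repository's knGauss comment suggests its exact parameterization may differ. The kn2sd direction uses the identity d^2(x,y) = k(x,x) + k(y,y) - 2k(x,y):

```python
import numpy as np

def kn_gauss(X, Y, s=1.0):
    """Squared-exponential kernel matrix between columns of X (d x n) and Y (d x m)."""
    D2 = (np.sum(X**2, axis=0)[:, None]
          + np.sum(Y**2, axis=0)[None, :]
          - 2.0 * X.T @ Y)                    # pairwise squared distances
    return np.exp(-D2 / (2.0 * s**2))

def kn2sd(K):
    """Squared-distance matrix from a kernel (inner-product) matrix."""
    d = np.diag(K)
    return d[:, None] + d[None, :] - 2.0 * K
```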
% CHAPTER07
% rvmBinFp - Relevance Vector Machine (ARD sparse prior) for binary classification.
% rvmBinPred - Predict the label for binary logistic regression model.
% rvmRegFp - Relevance Vector Machine (ARD sparse prior) for regression.
% rvmRegPred - Compute RVM regression model response y=w'*X+w0 and likelihood.
% rvmRegSeq - Sparse Bayesian regression (RVM) using sequential algorithm.
% CHAPTER08
% MRF
% mrfBethe - Compute Bethe energy.
% mrfBp - Undirected graph belief propagation for MRF.
% mrfGibbs - Compute Gibbs energy.
% mrfIsGa - Construct a latent Ising MRF with Gaussian observation.
% mrfMf - Mean field for MRF.
% NaiveBayes
% nbBern - Naive Bayes classifier with independent Bernoulli.
% nbBernPred - Prediction of naive Bayes classifier with independent Bernoulli.
% nbGauss - Naive Bayes classifier with independent Gaussian.
% nbGaussPred - Prediction of naive Bayes classifier with independent Gaussian.
% CHAPTER09
% kmeans - Perform kmeans clustering.
% kmeansPred - Prediction for kmeans clustering.
% kmeansRnd - Generate samples from a Gaussian mixture distribution with common variances (kmeans model).
% kmedoids - Perform k-medoids clustering.
% kseeds - Perform kmeans++ seeding.
% linRegEm - Fit empirical Bayesian linear regression model with EM (p.448, chapter 9.3.4).
% mixBernEm - Perform EM algorithm for fitting the Bernoulli mixture model.
% mixBernRnd - Generate samples from a Bernoulli mixture distribution.
% mixGaussEm - Perform EM algorithm for fitting the Gaussian mixture model.
% mixGaussPred - Predict label and responsibility for Gaussian mixture model.
% mixGaussRnd - Generate samples from a Gaussian mixture model.
% rvmBinEm - Relevance Vector Machine (ARD sparse prior) for binary classification.
% rvmRegEm - Relevance Vector Machine (ARD sparse prior) for regression.
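The kmeans routine above alternates assignment and mean-update steps (Lloyd's algorithm). A minimal NumPy sketch of that loop (hypothetical Python name; assumes no cluster becomes empty, which the toolbox would need to handle):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Lloyd iterations on columns of X (d x n); returns labels and means."""
    rng = np.random.default_rng(seed)
    mu = X[:, rng.choice(X.shape[1], size=k, replace=False)]  # init from data
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - mu[:, :, None]) ** 2).sum(axis=0)  # k x n
        z = d2.argmin(axis=0)                                     # assignments
        mu_new = np.stack([X[:, z == j].mean(axis=1) for j in range(k)], axis=1)
        if np.allclose(mu_new, mu):
            break
        mu = mu_new
    return z, mu
```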
% CHAPTER10
% linRegVb - Variational Bayesian inference for linear regression.
% mixGaussEvidence - Variational lower bound of the model evidence (log of marginal likelihood).
% mixGaussVb - Variational Bayesian inference for Gaussian mixture.
% mixGaussVbPred - Predict label and responsibility for Gaussian mixture model trained by VB.
% rvmRegVb - Variational Bayesian inference for RVM regression.
% CHAPTER11
% dirichletRnd - Generate samples from a Dirichlet distribution.
% discreteRnd - Generate samples from a discrete distribution (multinomial).
% Gauss - Class for Gaussian distribution used by Dirichlet process.
% gaussRnd - Generate samples from a Gaussian distribution.
% GaussWishart - Class for Gaussian-Wishart distribution used by Dirichlet process.
% mixDpGb - Collapsed Gibbs sampling for Dirichlet process (infinite) mixture model.
% mixDpGbOl - Online collapsed Gibbs sampling for Dirichlet process (infinite) mixture model.
% mixGaussGb - Collapsed Gibbs sampling for Dirichlet process (infinite) Gaussian mixture model (a.k.a. DPGM).
% mixGaussSample - Generate samples from a Gaussian mixture model with Gaussian-Wishart prior.
% CHAPTER12
% fa - Perform EM algorithm for factor analysis model.
% pca - Principal component analysis.
% pcaEm - Perform EM-like algorithm for PCA (by Sam Roweis).
% pcaEmC - Perform constrained EM-like algorithm for PCA.
% ppcaEm - Perform EM algorithm to maximize likelihood of probabilistic PCA model.
% ppcaRnd - Generate data from probabilistic PCA model.
% ppcaVb - Perform variational Bayesian inference for probabilistic PCA model.
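The plain pca routine above can be sketched via an SVD of the centered data (hypothetical Python name; d x n column layout, q retained components):

```python
import numpy as np

def pca(X, q):
    """Principal components of columns of X (d x n) via SVD of centered data."""
    Xc = X - X.mean(axis=1, keepdims=True)
    U, S, _ = np.linalg.svd(Xc, full_matrices=False)
    W = U[:, :q]                   # principal directions (d x q)
    Z = W.T @ Xc                   # projected coordinates (q x n)
    var = (S**2) / X.shape[1]      # variance captured by each component
    return W, Z, var[:q]
```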
% CHAPTER13
% HMM
% hmmEm - EM algorithm to fit the parameters of an HMM (a.k.a. Baum-Welch algorithm).
% hmmFilter - HMM forward filtering algorithm.
% hmmRnd - Generate a data sequence from a hidden Markov model.
% hmmSmoother - HMM smoothing algorithm (normalized forward-backward or normalized alpha-beta algorithm).
% hmmViterbi - Viterbi algorithm (calculated in log scale to improve numerical stability).
% LDS
% kalmanFilter - Kalman filter (forward algorithm for linear dynamic system).
% kalmanSmoother - Kalman smoother (forward-backward algorithm for linear dynamic system).
% ldsEm - EM algorithm for parameter estimation of linear dynamic system.
% ldsPca - Subspace method for learning linear dynamic system.
% ldsRnd - Generate a data sequence from linear dynamic system.
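The hmmViterbi entry notes that the recursion is run in log scale for stability; a minimal NumPy sketch of that log-domain Viterbi recursion (hypothetical Python name; log_B holds per-state emission log-likelihoods for each time step):

```python
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most likely state path for an HMM, computed entirely in the log domain.

    log_pi: length-k initial log-probabilities
    log_A:  k x k transition log-probabilities (from-state x to-state)
    log_B:  k x T emission log-likelihoods
    """
    k, T = log_B.shape
    delta = log_pi + log_B[:, 0]
    psi = np.zeros((T, k), dtype=int)          # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_A        # k x k candidate scores
        psi[t] = scores.argmax(axis=0)         # best predecessor per state
        delta = scores.max(axis=0) + log_B[:, t]
    path = np.empty(T, dtype=int)
    path[-1] = delta.argmax()
    for t in range(T - 2, -1, -1):             # backtrack
        path[t] = psi[t + 1, path[t + 1]]
    return path
```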
% CHAPTER14
% adaboostBin - Adaboost for binary classification (weak learner: kmeans).
% adaboostBinPred - Prediction of binary Adaboost.
% mixLinPred - Prediction function for mixture of linear regression.
% mixLinReg - Mixture of linear regression.
% mixLinRnd - Generate data from mixture of linear model.
% mixLogitBin - Mixture of logistic regression model for binary classification optimized by Newton-Raphson method.
% mixLogitBinPred - Prediction function for mixture of logistic regression.

chapter07/rvmRegSeq.m (-1)

@@ -1,5 +1,4 @@
 function [model, llh] = rvmRegSeq(X, t)
-% TODO: beta is not updated.
 % Sparse Bayesian Regression (RVM) using sequential algorithm
 % Input:
 %   X: d x n data

chapter08/NaiveBayes/nbGauss.m (+2, -2)

@@ -1,6 +1,6 @@
 function model = nbGauss(X, t)
-% Naive bayes classifier with indepenet Gaussian, each dimension of data is
-% assumed from a 1d Gaussian distribution with independent mean and variance.
+% Naive Bayes classifier with independent Gaussian
+% Each dimension of the data is assumed to come from a 1d Gaussian distribution with independent mean and variance.
 % Input:
 %   X: d x n data matrix
 %   t: 1 x n label (1~k)
