Commit

refined doc of ch01
sth4nth committed Feb 19, 2016
1 parent 19796f1 commit 46f2a90
Showing 8 changed files with 34 additions and 14 deletions.
1 change: 0 additions & 1 deletion TODO.txt
Original file line number Diff line number Diff line change
@@ -1,5 +1,4 @@
TODO:
chapter10: compute bound terms (entropy) inside each factors
chapter10/12: prediction functions for VB
chapter05: MLP
chapter08: BP, EP
7 changes: 5 additions & 2 deletions chapter01/condEntropy.m
@@ -1,6 +1,9 @@
function z = condEntropy (x, y)
% Compute conditional entropy H(x|y) of two discrete variables x and y.
% x, y: two vectors of integers of the same length
% Compute conditional entropy z=H(x|y) of two discrete variables x and y.
% Input:
% x, y: two vectors of integers of the same length
% Output:
% z: conditional entropy z=H(x|y)
% Written by Mo Chen ([email protected]).
assert(numel(x) == numel(y));
n = numel(x);
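The diff shows only the refreshed header comment of condEntropy.m; the MATLAB body stays collapsed. As an illustration of the documented contract H(x|y), here is a minimal plug-in estimate in Python (the name `cond_entropy` and the natural-log convention are assumptions for this sketch, not the repository's code):

```python
from collections import Counter
from math import log

def cond_entropy(x, y):
    # Empirical conditional entropy H(x|y) = H(x,y) - H(y), in nats.
    assert len(x) == len(y)
    def H(labels):
        n = len(labels)
        return -sum(c / n * log(c / n) for c in Counter(labels).values())
    return H(list(zip(x, y))) - H(y)
```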
7 changes: 5 additions & 2 deletions chapter01/entropy.m
@@ -1,6 +1,9 @@
function z = entropy(x)
% Compute entropy H(x) of a discrete variable x.
% x: a vectors of integers
% Compute entropy z=H(x) of a discrete variable x.
% Input:
% x: a vector of integers
% Output:
% z: entropy z=H(x)
% Written by Mo Chen ([email protected]).
n = numel(x);
x = reshape(x,1,n);
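The refined comment documents z=H(x) over a vector of integer labels; the implementation itself is collapsed. A short Python sketch of the empirical (plug-in) entropy the header describes (function name and natural-log units are assumptions here):

```python
from collections import Counter
from math import log

def entropy(x):
    # Empirical entropy H(x) in nats from a sequence of integer labels.
    n = len(x)
    return -sum(c / n * log(c / n) for c in Counter(x).values())
```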
7 changes: 5 additions & 2 deletions chapter01/jointEntropy.m
@@ -1,6 +1,9 @@
function z = jointEntropy(x, y)
% Compute joint entropy H(x,y) of two discrete variables x and y.
% x, y: two vectors of integers of the same length
% Compute joint entropy z=H(x,y) of two discrete variables x and y.
% Input:
% x, y: two vectors of integers of the same length
% Output:
% z: joint entropy z=H(x,y)
% Written by Mo Chen ([email protected]).
assert(numel(x) == numel(y));
n = numel(x);
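Per the header, jointEntropy.m returns z=H(x,y) for two equal-length integer vectors. One way to realize that contract, sketched in Python rather than the collapsed MATLAB (pairing the labels and taking the entropy of the pairs is an assumed, illustrative approach):

```python
from collections import Counter
from math import log

def joint_entropy(x, y):
    # Empirical joint entropy H(x,y): entropy of the paired labels, in nats.
    assert len(x) == len(y)
    n = len(x)
    return -sum(c / n * log(c / n) for c in Counter(zip(x, y)).values())
```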
5 changes: 4 additions & 1 deletion chapter01/mutInfo.m
@@ -1,6 +1,9 @@
function z = mutInfo(x, y)
% Compute mutual information I(x,y) of two discrete variables x and y.
% x, y: two vectors of integers of the same length
% Input:
% x, y: two vectors of integers of the same length
% Output:
% z: mutual information z=I(x,y)
% Written by Mo Chen ([email protected]).
assert(numel(x) == numel(y));
n = numel(x);
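The documented quantity is I(x,y), which satisfies the identity I(x,y) = H(x) + H(y) - H(x,y). A hedged Python sketch of that identity on empirical counts (not the repository's MATLAB implementation; names are illustrative):

```python
from collections import Counter
from math import log

def mut_info(x, y):
    # Empirical mutual information I(x,y) = H(x) + H(y) - H(x,y), in nats.
    assert len(x) == len(y)
    def H(labels):
        n = len(labels)
        return -sum(c / n * log(c / n) for c in Counter(labels).values())
    return H(x) + H(y) - H(list(zip(x, y)))
```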
7 changes: 5 additions & 2 deletions chapter01/nmi.m
@@ -1,6 +1,9 @@
function z = nmi(x, y)
% Compute normalized mutual information I(x,y)/sqrt(H(x)*H(y)).
% x, y: two vectors of integers of the same length
% Compute normalized mutual information I(x,y)/sqrt(H(x)*H(y)) of two discrete variables x and y.
% Input:
% x, y: two vectors of integers of the same length
% Output:
% z: normalized mutual information z=I(x,y)/sqrt(H(x)*H(y))
% Written by Mo Chen ([email protected]).
assert(numel(x) == numel(y));
n = numel(x);
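The header defines z=I(x,y)/sqrt(H(x)*H(y)), which equals 1 when the two labelings are identical. A Python sketch of that normalization (illustrative only; assumes neither labeling is constant, so the denominator is nonzero):

```python
from collections import Counter
from math import log, sqrt

def nmi(x, y):
    # Normalized mutual information I(x,y)/sqrt(H(x)*H(y)).
    assert len(x) == len(y)
    def H(labels):
        n = len(labels)
        return -sum(c / n * log(c / n) for c in Counter(labels).values())
    hx, hy = H(x), H(y)
    mi = hx + hy - H(list(zip(x, y)))
    return mi / sqrt(hx * hy)
```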
7 changes: 5 additions & 2 deletions chapter01/nvi.m
@@ -1,6 +1,9 @@
function z = nvi(x, y)
% Compute normalized variation information (1-I(x,y)/H(x,y)).
% x, y: two vectors of integers of the same length
% Compute normalized variation information z=(1-I(x,y)/H(x,y)) of two discrete variables x and y.
% Input:
% x, y: two vectors of integers of the same length
% Output:
% z: normalized variation information z=(1-I(x,y)/H(x,y))
% Written by Mo Chen ([email protected]).
assert(numel(x) == numel(y));
n = numel(x);
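The documented z=(1-I(x,y)/H(x,y)) is 0 for identical labelings and 1 for independent ones. A Python sketch of the same formula on empirical counts (an illustration of the header's contract, not the collapsed MATLAB body):

```python
from collections import Counter
from math import log

def nvi(x, y):
    # Normalized variation of information 1 - I(x,y)/H(x,y).
    assert len(x) == len(y)
    def H(labels):
        n = len(labels)
        return -sum(c / n * log(c / n) for c in Counter(labels).values())
    hxy = H(list(zip(x, y)))
    mi = H(x) + H(y) - hxy
    return 1 - mi / hxy
```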
7 changes: 5 additions & 2 deletions chapter01/relatEntropy.m
@@ -1,6 +1,9 @@
function z = relatEntropy (x, y)
% Compute relative entropy (a.k.a KL divergence) KL(p(x)||p(y)) of two discrete variables x and y.
% x, y: two vectors of integers of the same length
% Compute relative entropy (a.k.a. KL divergence) z=KL(p(x)||p(y)) of two discrete variables x and y.
% Input:
% x, y: two vectors of integers of the same length
% Output:
% z: relative entropy (a.k.a. KL divergence) z=KL(p(x)||p(y))
% Written by Mo Chen ([email protected]).
assert(numel(x) == numel(y));
n = numel(x);
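Here z=KL(p(x)||p(y)) compares the empirical distributions of the two label vectors. A hedged Python sketch of that plug-in estimate (names are assumptions; it also assumes every value of x occurs in y, otherwise the divergence is undefined):

```python
from collections import Counter
from math import log

def relat_entropy(x, y):
    # KL divergence KL(p||q) between the empirical distributions of x and y,
    # summed over the support of p; q must cover that support. In nats.
    assert len(x) == len(y)
    n = len(x)
    p, q = Counter(x), Counter(y)
    return sum((c / n) * log((c / n) / (q[k] / n)) for k, c in p.items())
```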
