refined doc of ch01
sth4nth committed Feb 19, 2016
1 parent 46f2a90 commit 2e0b3b8
Showing 7 changed files with 7 additions and 6 deletions.
2 changes: 1 addition & 1 deletion chapter01/condEntropy.m
@@ -1,7 +1,7 @@
 function z = condEntropy (x, y)
 % Compute conditional entropy z=H(x|y) of two discrete variables x and y.
 % Input:
-%   x, y: two vectors of integers of the same length
+%   x, y: two integer vector of the same length
 % Output:
 %   z: conditional entropy z=H(x|y)
 % Written by Mo Chen ([email protected]).
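The MATLAB body of condEntropy is not shown in this diff. As a rough illustration of what the documented function computes, here is a minimal NumPy sketch (the function names and the decomposition H(x|y) = H(x,y) - H(y) are my own framing, not the repository's code):

```python
import numpy as np

def entropy(labels):
    # Empirical entropy (in nats) of a discrete label vector.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def cond_entropy(x, y):
    # H(x|y) = H(x,y) - H(y), estimated from empirical pair counts.
    xy = np.stack([np.asarray(x), np.asarray(y)], axis=1)
    _, joint_counts = np.unique(xy, axis=0, return_counts=True)
    p = joint_counts / joint_counts.sum()
    h_xy = -np.sum(p * np.log(p))
    return h_xy - entropy(y)
```

When y determines x completely (e.g. x equals y), the result is 0; when x and y are independent, it equals H(x).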
2 changes: 1 addition & 1 deletion chapter01/jointEntropy.m
@@ -1,7 +1,7 @@
 function z = jointEntropy(x, y)
 % Compute joint entropy z=H(x,y) of two discrete variables x and y.
 % Input:
-%   x, y: two vectors of integers of the same length
+%   x, y: two integer vector of the same length
 % Output:
 %   z: joint entroy z=H(x,y)
 % Written by Mo Chen ([email protected]).
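For reference, the joint entropy that jointEntropy documents can be sketched in NumPy by treating each (x, y) pair as a single joint label (function name mine, not the repository's code):

```python
import numpy as np

def joint_entropy(x, y):
    # H(x,y): entropy of the empirical joint distribution of (x, y) pairs.
    xy = np.stack([np.asarray(x), np.asarray(y)], axis=1)
    _, counts = np.unique(xy, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))
```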
2 changes: 1 addition & 1 deletion chapter01/mutInfo.m
@@ -1,7 +1,7 @@
 function z = mutInfo(x, y)
 % Compute mutual information I(x,y) of two discrete variables x and y.
 % Input:
-%   x, y: two vectors of integers of the same length
+%   x, y: two integer vector of the same length
 % Output:
 %   z: mutual information z=I(x,y)
 % Written by Mo Chen ([email protected]).
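The quantity mutInfo documents can be sketched via the identity I(x,y) = H(x) + H(y) - H(x,y) (NumPy sketch with my own names, not the repository's implementation):

```python
import numpy as np

def entropy(labels):
    # Empirical entropy (in nats) of a discrete label vector.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mut_info(x, y):
    # I(x,y) = H(x) + H(y) - H(x,y), from empirical counts.
    xy = np.stack([np.asarray(x), np.asarray(y)], axis=1)
    _, jc = np.unique(xy, axis=0, return_counts=True)
    p = jc / jc.sum()
    h_xy = -np.sum(p * np.log(p))
    return entropy(x) + entropy(y) - h_xy
```

Identical vectors give I(x,y) = H(x); independent ones give (approximately) 0.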
2 changes: 1 addition & 1 deletion chapter01/nmi.m
@@ -1,7 +1,7 @@
 function z = nmi(x, y)
 % Compute normalized mutual information I(x,y)/sqrt(H(x)*H(y)) of two discrete variables x and y.
 % Input:
-%   x, y: two vectors of integers of the same length
+%   x, y: two integer vector of the same length
 % Ouput:
 %   z: normalized mutual information z=I(x,y)/sqrt(H(x)*H(y))
 % Written by Mo Chen ([email protected]).
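The normalization nmi documents, I(x,y)/sqrt(H(x)*H(y)), is commonly used to compare clusterings because it equals 1 for identical label assignments regardless of how the labels are named. A NumPy sketch (my own naming, not the repository's code):

```python
import numpy as np

def entropy(labels):
    # Empirical entropy (in nats) of a discrete label vector.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def nmi(x, y):
    # I(x,y) / sqrt(H(x) * H(y)); 1 when x and y are identical partitions.
    xy = np.stack([np.asarray(x), np.asarray(y)], axis=1)
    _, jc = np.unique(xy, axis=0, return_counts=True)
    p = jc / jc.sum()
    h_xy = -np.sum(p * np.log(p))
    hx, hy = entropy(x), entropy(y)
    return (hx + hy - h_xy) / np.sqrt(hx * hy)
```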
2 changes: 1 addition & 1 deletion chapter01/nvi.m
@@ -1,7 +1,7 @@
 function z = nvi(x, y)
 % Compute normalized variation information z=(1-I(x,y)/H(x,y)) of two discrete variables x and y.
 % Input:
-%   x, y: two vectors of integers of the same length
+%   x, y: two integer vector of the same length
 % Output:
 %   z: normalized variation information z=(1-I(x,y)/H(x,y))
 % Written by Mo Chen ([email protected]).
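The normalized variation of information that nvi documents is a metric-like distance between labelings: 0 for identical partitions, 1 for independent ones. A NumPy sketch (names and structure mine, not the repository's code):

```python
import numpy as np

def entropy(labels):
    # Empirical entropy (in nats) of a discrete label vector.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def nvi(x, y):
    # Normalized variation of information: 1 - I(x,y)/H(x,y).
    xy = np.stack([np.asarray(x), np.asarray(y)], axis=1)
    _, jc = np.unique(xy, axis=0, return_counts=True)
    p = jc / jc.sum()
    h_xy = -np.sum(p * np.log(p))
    mi = entropy(x) + entropy(y) - h_xy
    return 1.0 - mi / h_xy
```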
2 changes: 1 addition & 1 deletion chapter01/relatEntropy.m
@@ -1,7 +1,7 @@
 function z = relatEntropy (x, y)
 % Compute relative entropy (a.k.a KL divergence) z=KL(p(x)||p(y)) of two discrete variables x and y.
 % Input:
-%   x, y: two vectors of integers of the same length
+%   x, y: two integer vector of the same length
 % Output:
 %   z: relative entropy (a.k.a KL divergence) z=KL(p(x)||p(y))
 % Written by Mo Chen ([email protected]).
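The KL divergence relatEntropy documents compares the empirical marginal distributions of the two vectors. A NumPy sketch (my own naming; note q must be positive wherever p is, which the repository's contract of same-support integer vectors implies):

```python
import numpy as np

def relat_entropy(x, y):
    # KL(p || q) between the empirical distributions of x and y,
    # over the union of observed values.
    vals = np.union1d(x, y)
    p = np.array([np.sum(np.asarray(x) == v) for v in vals], float)
    q = np.array([np.sum(np.asarray(y) == v) for v in vals], float)
    p /= p.sum()
    q /= q.sum()
    mask = p > 0  # terms with p=0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))
```

For example, p uniform over {1,2} against q = (3/4, 1/4) gives 0.5*log(4/3).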
1 change: 1 addition & 0 deletions chapter02/logDirichlet.m
@@ -1,5 +1,6 @@
 function y = logDirichlet(X, a)
 % Compute log pdf of a Dirichlet distribution.
+% Input:
 %   X: d x n data matrix satifying (sum(X,1)==ones(1,n) && X>=0)
 %   a: d x k parameters
 %   y: k x n probability density
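The comment says logDirichlet evaluates the Dirichlet log pdf at each column of X for d x k parameter sets. A simplified NumPy sketch for a single length-d parameter vector (the vectorization over k parameter columns in the MATLAB version is omitted; names are mine):

```python
import numpy as np
from math import lgamma

def log_dirichlet(X, a):
    # Dirichlet log pdf at each column of X (d x n, columns on the simplex),
    # for one length-d parameter vector a.
    X = np.asarray(X, float)
    a = np.asarray(a, float)
    # log normalizer: log Gamma(sum a_i) - sum log Gamma(a_i)
    log_norm = lgamma(a.sum()) - sum(lgamma(ai) for ai in a)
    return log_norm + (a - 1) @ np.log(X)
```

With a = (1, 1) the distribution is uniform on the simplex, so the log pdf is 0 everywhere.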
