
Commit

Merge pull request scutan90#356 from CoderOverflow/patch-1
2.14.3 Two-class LDA algorithm principle: content revision
scutan90 authored Apr 17, 2019
2 parents 0cd90ba + ff1e6df commit 8f6b4a5
Showing 1 changed file with 2 additions and 2 deletions.
ch02_机器学习基础/第二章_机器学习基础.md (4 changes: 2 additions & 2 deletions)
@@ -774,9 +774,9 @@
$$
u_j = \frac{1}{N_j} \sum_{\boldsymbol x \in X_j} \boldsymbol x \quad (j=0,1), \\
\sum_j = \sum_{\boldsymbol x \in X_j} (\boldsymbol x - u_j)(\boldsymbol x - u_j)^T \quad (j=0,1)
$$
-	Suppose the projection line is the vector $\boldsymbol w$. For any sample $\boldsymbol x_i$, its projection onto the line $w$ is $w^tx_i$; the projections of the two class centers $u_0$, $u_1$ onto the line $w$ are $\boldsymbol w^Tu_0$ and $\boldsymbol w^Tu_1$, respectively.
+	Suppose the projection line is the vector $\boldsymbol w$. For any sample $\boldsymbol x_i$, its projection onto the line $w$ is $\boldsymbol w^Tx_i$; the projections of the two class centers $u_0$, $u_1$ onto the line $w$ are $\boldsymbol w^Tu_0$ and $\boldsymbol w^Tu_1$, respectively.

-	The goal of LDA is to make the distance between the projected class centers, $\| \boldsymbol w^Tu_0 - \boldsymbol w^Tu_1 \|^2_2$, as large as possible and, at the same time, to make the covariances of the projected same-class samples, $\boldsymbol w^T \sum_0 \boldsymbol w$ and $\boldsymbol w^T \sum_1 \boldsymbol w$, as small as possible, i.e., to minimize $\boldsymbol w^T \sum_0 \boldsymbolw - \boldsymbol w^T \sum_1 \boldsymbol w$.
+	The goal of LDA is to make the distance between the projected class centers, $\| \boldsymbol w^Tu_0 - \boldsymbol w^Tu_1 \|^2_2$, as large as possible and, at the same time, to make the covariances of the projected same-class samples, $\boldsymbol w^T \sum_0 \boldsymbol w$ and $\boldsymbol w^T \sum_1 \boldsymbol w$, as small as possible, i.e., to minimize $\boldsymbol w^T \sum_0 \boldsymbol w + \boldsymbol w^T \sum_1 \boldsymbol w$.
Define the within-class scatter matrix
$$
S_w = \sum_0 + \sum_1 = \sum_{\boldsymbol x \in X_0}(\boldsymbol x - u_0)(\boldsymbol x - u_0)^T + \sum_{\boldsymbol x \in X_1}(\boldsymbol x - u_1)(\boldsymbol x - u_1)^T
$$
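The hunk above defines the class means $u_j$ and scatter matrices $\sum_j$ and states the criterion LDA optimizes. As a minimal NumPy sketch of this two-class case (illustrative only, not code from the book; the names `lda_direction`, `fisher_criterion`, `X0`, `X1` are assumptions), the snippet below builds $u_0$, $u_1$ and $S_w = \sum_0 + \sum_1$, then uses the standard closed form $w \propto S_w^{-1}(u_0 - u_1)$ that maximizes the Fisher criterion:

```python
import numpy as np

def lda_direction(X0, X1):
    """Two-class LDA direction for samples X0, X1 of shape (N_j, d).

    Maximizes J(w) = ||w^T u_0 - w^T u_1||^2 / (w^T S_0 w + w^T S_1 w).
    """
    # Class means u_j = (1 / N_j) * sum of x over X_j
    u0, u1 = X0.mean(axis=0), X1.mean(axis=0)
    # Scatter matrices S_j = sum over X_j of (x - u_j)(x - u_j)^T
    S0 = (X0 - u0).T @ (X0 - u0)
    S1 = (X1 - u1).T @ (X1 - u1)
    # Within-class scatter S_w = S_0 + S_1; the maximizer of J is
    # w ∝ S_w^{-1} (u_0 - u_1), computed here by solving S_w w = u_0 - u_1.
    w = np.linalg.solve(S0 + S1, u0 - u1)
    return w / np.linalg.norm(w)

def fisher_criterion(w, X0, X1):
    """J(w): squared distance of projected means over summed projected scatter."""
    u0, u1 = X0.mean(axis=0), X1.mean(axis=0)
    between = (w @ (u0 - u1)) ** 2
    within = w @ ((X0 - u0).T @ (X0 - u0) + (X1 - u1).T @ (X1 - u1)) @ w
    return between / within

# Example: two Gaussian blobs; the LDA direction separates their projections.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))
X1 = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(100, 2))
w = lda_direction(X0, X1)
print(w, fisher_criterion(w, X0, X1))
```

Solving the linear system $S_w w = u_0 - u_1$ rather than forming $S_w^{-1}$ explicitly is the usual numerically safer choice.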
