diff --git a/docs/01-Introduction/1.1-Introduction.md b/docs/01-Introduction/1.1-Introduction.md index 17dbf4324f..69b9d186e7 100644 --- a/docs/01-Introduction/1.1-Introduction.md +++ b/docs/01-Introduction/1.1-Introduction.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=20) ---|--- 翻译 | szcf-weiya -时间 | 2016-07-26 + 发布 | 2016-09-30 统计学习在科学、经济和工业的许多领域都扮演着重要角色.下面是学习问题中的一些例子. diff --git a/docs/02-Overview-of-Supervised-Learning/2.1-Introduction.md b/docs/02-Overview-of-Supervised-Learning/2.1-Introduction.md index 3efb0426ad..c4b1ad6358 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.1-Introduction.md +++ b/docs/02-Overview-of-Supervised-Learning/2.1-Introduction.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=28) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2018-08-21 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/2.2-Variable-Types-and-Terminology.md b/docs/02-Overview-of-Supervised-Learning/2.2-Variable-Types-and-Terminology.md index 6186a35080..ef28be6999 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.2-Variable-Types-and-Terminology.md +++ b/docs/02-Overview-of-Supervised-Learning/2.2-Variable-Types-and-Terminology.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=28) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2018-08-21 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/2.3-Two-Simple-Approaches-to-Prediction.md b/docs/02-Overview-of-Supervised-Learning/2.3-Two-Simple-Approaches-to-Prediction.md index 19e3bc92ea..a8cd7fcafe 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.3-Two-Simple-Approaches-to-Prediction.md +++ b/docs/02-Overview-of-Supervised-Learning/2.3-Two-Simple-Approaches-to-Prediction.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=30) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2018-08-21, 2018-10-15 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/2.4-Statistical-Decision-Theory.md b/docs/02-Overview-of-Supervised-Learning/2.4-Statistical-Decision-Theory.md index 61fc4298b9..bee81ada9f 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.4-Statistical-Decision-Theory.md +++ b/docs/02-Overview-of-Supervised-Learning/2.4-Statistical-Decision-Theory.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=37) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2018-08-22 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/2.5-Local-Methods-in-High-Dimensions.md b/docs/02-Overview-of-Supervised-Learning/2.5-Local-Methods-in-High-Dimensions.md index 8ac79b7248..8181a8ea76 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.5-Local-Methods-in-High-Dimensions.md +++ b/docs/02-Overview-of-Supervised-Learning/2.5-Local-Methods-in-High-Dimensions.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=41) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 校订 | 2017-09-10 至今为止我们已经仔细讨论了两个关于预测的学习方法:稳定但是有偏差的线性模型和不稳定但显然偏差较小的 $k$-最近邻估计.当有充分大的训练数据,我们似乎总会选择 
$k$-最近邻平均来近似理论上的最优条件期望,因为我们能够找到一个相当大的离 $x$ 近的观测构成的邻域并且平均里面的观测值.在高维情形下这种方法以及我们的直觉都没有用,而且这种现象通常被称作 **维度的诅咒 (curse of dimensionality)** (Bellman, 1961[^1]).关于这个问题有很多的证明,我们将要仔细讨论一些. diff --git a/docs/02-Overview-of-Supervised-Learning/2.6-Statistical-Models-Supervised-Learning-and-Function-Approximation.md b/docs/02-Overview-of-Supervised-Learning/2.6-Statistical-Models-Supervised-Learning-and-Function-Approximation.md index aab2a6e3d4..a046c3c478 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.6-Statistical-Models-Supervised-Learning-and-Function-Approximation.md +++ b/docs/02-Overview-of-Supervised-Learning/2.6-Statistical-Models-Supervised-Learning-and-Function-Approximation.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=47) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2019-11-02 22:28:35 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/2.7-Structured-Regression-Models.md b/docs/02-Overview-of-Supervised-Learning/2.7-Structured-Regression-Models.md index e9f612cae7..e4cccc0706 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.7-Structured-Regression-Models.md +++ b/docs/02-Overview-of-Supervised-Learning/2.7-Structured-Regression-Models.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=51) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2019-04-02 19:56:12 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/2.8-Classes-of-Restricted-Estimators.md b/docs/02-Overview-of-Supervised-Learning/2.8-Classes-of-Restricted-Estimators.md index 374d588698..fe87e57896 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.8-Classes-of-Restricted-Estimators.md +++ b/docs/02-Overview-of-Supervised-Learning/2.8-Classes-of-Restricted-Estimators.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=52) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2018-02-14, 2018-08-31 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/2.9-Model-Selection-and-the-Bias-Variance-Tradeoff.md b/docs/02-Overview-of-Supervised-Learning/2.9-Model-Selection-and-the-Bias-Variance-Tradeoff.md index 3e08585e9a..b1cc05f133 100644 --- a/docs/02-Overview-of-Supervised-Learning/2.9-Model-Selection-and-the-Bias-Variance-Tradeoff.md +++ b/docs/02-Overview-of-Supervised-Learning/2.9-Model-Selection-and-the-Bias-Variance-Tradeoff.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=56) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2018-02-14; 2018-02-21 状态 | Done diff --git a/docs/02-Overview-of-Supervised-Learning/Bibliographic-Notes.md b/docs/02-Overview-of-Supervised-Learning/Bibliographic-Notes.md index 8cce698a63..a69326d649 100644 --- a/docs/02-Overview-of-Supervised-Learning/Bibliographic-Notes.md +++ b/docs/02-Overview-of-Supervised-Learning/Bibliographic-Notes.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=58) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-01 + 发布 | 2016-09-30 更新 | 2018-02-21 状态 | Done diff --git a/docs/03-Linear-Methods-for-Regression/3.1-Introduction.md b/docs/03-Linear-Methods-for-Regression/3.1-Introduction.md index 9de87a3226..161f93d9fc 100644 --- 
a/docs/03-Linear-Methods-for-Regression/3.1-Introduction.md +++ b/docs/03-Linear-Methods-for-Regression/3.1-Introduction.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=62) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-02 + 发布 | 2016-09-30 更新 | 2017-10-31; 2018-02-22 状态 | Done diff --git a/docs/03-Linear-Methods-for-Regression/3.2-Linear-Regression-Models-and-Least-Squares.md b/docs/03-Linear-Methods-for-Regression/3.2-Linear-Regression-Models-and-Least-Squares.md index b0fda7aa70..2eb490a32c 100644 --- a/docs/03-Linear-Methods-for-Regression/3.2-Linear-Regression-Models-and-Least-Squares.md +++ b/docs/03-Linear-Methods-for-Regression/3.2-Linear-Regression-Models-and-Least-Squares.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=63) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-08-03 | +| 发布 | 2016-09-30 | |更新|2019-02-24 21:25:28| |状态|Done| diff --git a/docs/03-Linear-Methods-for-Regression/3.3-Subset-Selection.md b/docs/03-Linear-Methods-for-Regression/3.3-Subset-Selection.md index 0169ad7af1..a0681824a2 100644 --- a/docs/03-Linear-Methods-for-Regression/3.3-Subset-Selection.md +++ b/docs/03-Linear-Methods-for-Regression/3.3-Subset-Selection.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=76) ---|--- 翻译 | szcf-weiya -时间 | 2016-08-05 + 发布 | 2016-09-30 更新 | 2019-02-24 22:22:02 状态 | Done diff --git a/docs/03-Linear-Methods-for-Regression/3.4-Shrinkage-Methods.md b/docs/03-Linear-Methods-for-Regression/3.4-Shrinkage-Methods.md index 4dc5e9952f..bc3e8c8108 100644 --- a/docs/03-Linear-Methods-for-Regression/3.4-Shrinkage-Methods.md +++ b/docs/03-Linear-Methods-for-Regression/3.4-Shrinkage-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=80) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-09-30:2016-10-14 | +| 发布 | 2016-09-30 | |更新| 2018-03-22, 2018-03-23, 2018-03-24| |状态|Done| diff --git a/docs/03-Linear-Methods-for-Regression/3.5-Methods-Using-Derived-Input-Directions.md b/docs/03-Linear-Methods-for-Regression/3.5-Methods-Using-Derived-Input-Directions.md index a84c55e813..489bbf654b 100644 --- a/docs/03-Linear-Methods-for-Regression/3.5-Methods-Using-Derived-Input-Directions.md +++ b/docs/03-Linear-Methods-for-Regression/3.5-Methods-Using-Derived-Input-Directions.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-10-14:2016-10-21 | +| 发布 | 2016-10-14 | | 更新 | 2018-03-24, 2018-04-24, 2018-04-25| | 状态 |Done| |备注| Exercise(2/4)| diff --git a/docs/03-Linear-Methods-for-Regression/3.6-A-Comparison-of-the-Selection-and-Shrinkage-Methods.md b/docs/03-Linear-Methods-for-Regression/3.6-A-Comparison-of-the-Selection-and-Shrinkage-Methods.md index edc270c2ae..93b4a3f10a 100644 --- a/docs/03-Linear-Methods-for-Regression/3.6-A-Comparison-of-the-Selection-and-Shrinkage-Methods.md +++ b/docs/03-Linear-Methods-for-Regression/3.6-A-Comparison-of-the-Selection-and-Shrinkage-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical 
Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-10-21:2016-10-21 | +| 发布 | 2016-10-21 | | 更新 | 2018-03-25| |状态|Done| |备注| [1 simulation](../notes/linear-reg/sim-3-18/index.html) | diff --git a/docs/03-Linear-Methods-for-Regression/3.8-More-on-the-Lasso-and-Related-Path-Algorithms.md b/docs/03-Linear-Methods-for-Regression/3.8-More-on-the-Lasso-and-Related-Path-Algorithms.md index 05a9337d3d..611e8a23e5 100644 --- a/docs/03-Linear-Methods-for-Regression/3.8-More-on-the-Lasso-and-Related-Path-Algorithms.md +++ b/docs/03-Linear-Methods-for-Regression/3.8-More-on-the-Lasso-and-Related-Path-Algorithms.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=105) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-10-21 | +| 发布 | 2016-10-21 | |更新|2019-02-18 16:25:57| |状态 |Done| diff --git a/docs/03-Linear-Methods-for-Regression/3.9-Computational-Considerations.md b/docs/03-Linear-Methods-for-Regression/3.9-Computational-Considerations.md index 57a0074e1e..3665e137f9 100644 --- a/docs/03-Linear-Methods-for-Regression/3.9-Computational-Considerations.md +++ b/docs/03-Linear-Methods-for-Regression/3.9-Computational-Considerations.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=112) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-11-15 | +| 发布 | 2017-11-15 | | 更新 | 2019-01-31| |状态 | Done| diff --git a/docs/03-Linear-Methods-for-Regression/Bibliographic-Notes.md b/docs/03-Linear-Methods-for-Regression/Bibliographic-Notes.md index 3a0d311374..ac0af9de7f 100644 --- a/docs/03-Linear-Methods-for-Regression/Bibliographic-Notes.md +++ b/docs/03-Linear-Methods-for-Regression/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-09 | +| 发布 | 2017-06-09 | 线性回归在很多统计教材中都有讨论,比如,Seber (1984)[^1], Weisberg (1980)[^2] 以及 Mardia et al. (1979)[^3].岭回归由 Hoerl and Kennard (1970)[^4]提出,而 lasso 由Tibshirani (1996)[^5]提出.几乎在同时,lasso形式的惩罚在信号处理中的 basis pursuit 方法中被提出(Chen et al., 1998)[^6].最小角回归过程由 Efron et al. (2004)[^7]等人提出;与这有关的是早期 Osborne et al. (2000a)[^8]和 Osborne et al. (2000b)[^9]的homotopy过程.他们的算法也利用了在 LAR/lasso 算法中的分段线性,但是缺少透明度 (transparency).向前逐步准则在 Hastie et al. (2007)[^10]中进行了讨论.Park and Hastie (2007)[^11] 发展了类似用于广义回归模型的最小角回归的路径算法.偏最小二乘由 Wold (1975)[^12]提出.收缩方法的比较或许可以在 Copas (1983)[^13] 和 Frank and Friedman (1993)[^14]中找到. 
diff --git a/docs/04-Linear-Methods-for-Classification/4.1-Introduction.md b/docs/04-Linear-Methods-for-Classification/4.1-Introduction.md index 98ab999f56..a22bec188e 100644 --- a/docs/04-Linear-Methods-for-Classification/4.1-Introduction.md +++ b/docs/04-Linear-Methods-for-Classification/4.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=120) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-06 | +| 发布 | 2016-11-23 | |更新|2019-07-16 15:14:59| |状态|Done| diff --git a/docs/04-Linear-Methods-for-Classification/4.2-Linear-Regression-of-an-Indicator-Matrix.md b/docs/04-Linear-Methods-for-Classification/4.2-Linear-Regression-of-an-Indicator-Matrix.md index f404dd2d0b..860e13a9d6 100644 --- a/docs/04-Linear-Methods-for-Classification/4.2-Linear-Regression-of-an-Indicator-Matrix.md +++ b/docs/04-Linear-Methods-for-Classification/4.2-Linear-Regression-of-an-Indicator-Matrix.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=122) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-06 | +| 发布 | 2016-12-16 | |更新|2019-07-16 15:54:18| |状态|Done| diff --git a/docs/04-Linear-Methods-for-Classification/4.3-Linear-Discriminant-Analysis.md b/docs/04-Linear-Methods-for-Classification/4.3-Linear-Discriminant-Analysis.md index 160ce050fc..637d38b3f8 100644 --- a/docs/04-Linear-Methods-for-Classification/4.3-Linear-Discriminant-Analysis.md +++ b/docs/04-Linear-Methods-for-Classification/4.3-Linear-Discriminant-Analysis.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=125) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-09, 2016-12-10 | +| 发布 | 2016-12-16 | |更新|2019-07-16 16:47:50| |状态|Done| diff --git a/docs/04-Linear-Methods-for-Classification/4.4-Logistic-Regression.md b/docs/04-Linear-Methods-for-Classification/4.4-Logistic-Regression.md index 5ef7b5f341..cae28fdbbb 100644 --- a/docs/04-Linear-Methods-for-Classification/4.4-Logistic-Regression.md +++ b/docs/04-Linear-Methods-for-Classification/4.4-Logistic-Regression.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=138) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-09:2016-12-15 | +| 发布 | 2016-12-16 | | 更新 | 2019-11-05 21:29:05| |状态|Done| diff --git a/docs/04-Linear-Methods-for-Classification/4.5-Separating-Hyperplanes.md b/docs/04-Linear-Methods-for-Classification/4.5-Separating-Hyperplanes.md index cf50c08a8d..169bf2f406 100644 --- a/docs/04-Linear-Methods-for-Classification/4.5-Separating-Hyperplanes.md +++ b/docs/04-Linear-Methods-for-Classification/4.5-Separating-Hyperplanes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=148) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-15:2016-12-15 | +| 发布 | 2016-12-16 | | 更新 | 2018-03-26| | 状态 | Done| diff --git a/docs/04-Linear-Methods-for-Classification/Bibliographic-Notes.md b/docs/04-Linear-Methods-for-Classification/Bibliographic-Notes.md index 8aa7a316c2..0acc336637 100644 --- 
a/docs/04-Linear-Methods-for-Classification/Bibliographic-Notes.md +++ b/docs/04-Linear-Methods-for-Classification/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-09 | +| 发布 | 2017-06-09 | 关于分类的一般性概述包括Duda et al. (2000)[^1],Hand (1981)[^2], McLachlan (1992)[^3]和Ripley (1996)[^4].Mardia et al. (1979)[^5]对线性判别分析进行了简要的讨论.Michie et al. (1994)[^6]在benchmark数据集上比较了许多著名的分类器.线性分离超平面在Vapnik (1996)[^7]中有讨论.我们关于感知器的学习算法参照Ripley (1996)[^4]. diff --git a/docs/05-Basis-Expansions-and-Regularization/5.1-Introduction.md b/docs/05-Basis-Expansions-and-Regularization/5.1-Introduction.md index 618096d24b..b1a19821c6 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.1-Introduction.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-02-08 | |更新| 2019-09-01 16:37:39| |状态|Done| diff --git a/docs/05-Basis-Expansions-and-Regularization/5.2-Piecewise-Polynomials-and-Splines.md b/docs/05-Basis-Expansions-and-Regularization/5.2-Piecewise-Polynomials-and-Splines.md index f651b50d0d..ce68e9176a 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.2-Piecewise-Polynomials-and-Splines.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.2-Piecewise-Polynomials-and-Splines.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=160) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-08:2017-02-16 | +| 发布 | 2017-02-08 | | 更新 | 2017-09-13& 2017-10-18 & 2018-01-04 | !!! 
note "更新笔记" diff --git a/docs/05-Basis-Expansions-and-Regularization/5.3-Filtering-and-Feature-Extraction.md b/docs/05-Basis-Expansions-and-Regularization/5.3-Filtering-and-Feature-Extraction.md index d4672c21c8..deb6ba090c 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.3-Filtering-and-Feature-Extraction.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.3-Filtering-and-Feature-Extraction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-16 | +| 发布 | 2017-02-17 | |更新|2018-01-04| diff --git a/docs/05-Basis-Expansions-and-Regularization/5.4-Smoothing-Splines.md b/docs/05-Basis-Expansions-and-Regularization/5.4-Smoothing-Splines.md index 6e8089c091..d068499a2f 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.4-Smoothing-Splines.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.4-Smoothing-Splines.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=170) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-16 | +| 发布 | 2017-02-17 | |更新|2018-01-04, 2018-03-19| |状态|Done| diff --git a/docs/05-Basis-Expansions-and-Regularization/5.5-Automatic-Selection-of-the-Smoothing-Parameters.md b/docs/05-Basis-Expansions-and-Regularization/5.5-Automatic-Selection-of-the-Smoothing-Parameters.md index 89f238303c..95dbd30a77 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.5-Automatic-Selection-of-the-Smoothing-Parameters.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.5-Automatic-Selection-of-the-Smoothing-Parameters.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=175) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-06 | +| 发布 | 2017-12-06 | | 更新| 2018-03-30| | 状态 |Done| |习题| [Ex. 5.10](https://github.com/szcf-weiya/ESL-CN/issues/111), [Ex. 
5.13](https://github.com/szcf-weiya/ESL-CN/issues/112) | diff --git a/docs/05-Basis-Expansions-and-Regularization/5.6-Nonparametric-Logistic-Regression.md b/docs/05-Basis-Expansions-and-Regularization/5.6-Nonparametric-Logistic-Regression.md index 11bfa50124..07cdd412b1 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.6-Nonparametric-Logistic-Regression.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.6-Nonparametric-Logistic-Regression.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-13 | +| 发布 | 2017-09-14 | | 更新 | 2018-03-26, 2018-07-10| | 状态 | Done| diff --git a/docs/05-Basis-Expansions-and-Regularization/5.7-Multidimensional-Splines.md b/docs/05-Basis-Expansions-and-Regularization/5.7-Multidimensional-Splines.md index c8764a1a76..8902dcf0c0 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.7-Multidimensional-Splines.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.7-Multidimensional-Splines.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=181) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-09 | +| 发布 | 2017-06-09 | | 状态 | Done | diff --git a/docs/05-Basis-Expansions-and-Regularization/5.8-Regularization-and-Reproducing-Kernel-Hibert-Spaces.md b/docs/05-Basis-Expansions-and-Regularization/5.8-Regularization-and-Reproducing-Kernel-Hibert-Spaces.md index f19a51dc02..b64e77ab69 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.8-Regularization-and-Reproducing-Kernel-Hibert-Spaces.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.8-Regularization-and-Reproducing-Kernel-Hibert-Spaces.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=188) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2018-02-03; 2018-02-09 | +| 发布 | 2017-06-09 | |更新| 2020-02-26 22:52:43| | 状态 | Done | diff --git a/docs/05-Basis-Expansions-and-Regularization/5.9-Wavelet-Smoothing.md b/docs/05-Basis-Expansions-and-Regularization/5.9-Wavelet-Smoothing.md index 28a7112529..3383ad5030 100644 --- a/docs/05-Basis-Expansions-and-Regularization/5.9-Wavelet-Smoothing.md +++ b/docs/05-Basis-Expansions-and-Regularization/5.9-Wavelet-Smoothing.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=193) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-07 | +| 发布 | 2017-12-07 | | 更新 | 2020-06-27 21:55:17| | 状态 | Done| diff --git a/docs/05-Basis-Expansions-and-Regularization/Appendix-Computations-for-B-splines.md b/docs/05-Basis-Expansions-and-Regularization/Appendix-Computations-for-B-splines.md index a703a9ca51..b657d5a281 100644 --- a/docs/05-Basis-Expansions-and-Regularization/Appendix-Computations-for-B-splines.md +++ b/docs/05-Basis-Expansions-and-Regularization/Appendix-Computations-for-B-splines.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-10-22 | +| 发布 | 2017-10-22 | | 更新 |2019-09-09 22:07:46| | 状态 | Done | diff --git 
a/docs/05-Basis-Expansions-and-Regularization/Bibliographic-Notes.md b/docs/05-Basis-Expansions-and-Regularization/Bibliographic-Notes.md index 0f792bd521..44176c51ab 100644 --- a/docs/05-Basis-Expansions-and-Regularization/Bibliographic-Notes.md +++ b/docs/05-Basis-Expansions-and-Regularization/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-10 | +| 发布 | 2017-06-09 | 样条和B样条在de Boor (1978)[^1]中有详细讨论.Green and Silverman (1994)[^2]和Wahba (1990)给出了光滑样条以及thin-plate样条的;后者也产生核Hilbert空间.关于采用RKHS方法的非参回归技巧的联系可以参见Girosi et al. (1995)[^3] 和Evgeniou et al. (2000)[^4].如5.2.3节所示,对函数数据建模,在Ramsay and Silverman (1997)[^5]中有详细介绍. diff --git a/docs/06-Kernel-Smoothing-Methods/6.0-Introduction.md b/docs/06-Kernel-Smoothing-Methods/6.0-Introduction.md index ef6d171cf9..abdecd4f10 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.0-Introduction.md +++ b/docs/06-Kernel-Smoothing-Methods/6.0-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-03-01 | | 更新 | 2018-07-18| |状态|Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.1-One-Dimensional-Kernel-Smoothers.md b/docs/06-Kernel-Smoothing-Methods/6.1-One-Dimensional-Kernel-Smoothers.md index 8af52deba3..dac9fd4632 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.1-One-Dimensional-Kernel-Smoothers.md +++ b/docs/06-Kernel-Smoothing-Methods/6.1-One-Dimensional-Kernel-Smoothers.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-27:2017-02-28 | +| 发布 | 2017-03-01 | | 更新 | 2018-07-18| | 状态 |Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.2-Selecting-the-Width-of-the-Kernel.md b/docs/06-Kernel-Smoothing-Methods/6.2-Selecting-the-Width-of-the-Kernel.md index b491980a87..eaf5fbc826 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.2-Selecting-the-Width-of-the-Kernel.md +++ b/docs/06-Kernel-Smoothing-Methods/6.2-Selecting-the-Width-of-the-Kernel.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-01:2017-03-01 | +| 发布 | 2017-03-01 | | 更新 | 2018-07-18| | 状态|Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.3-Local-Regression-in-Rp.md b/docs/06-Kernel-Smoothing-Methods/6.3-Local-Regression-in-Rp.md index 2369c06ca2..b2ba9be829 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.3-Local-Regression-in-Rp.md +++ b/docs/06-Kernel-Smoothing-Methods/6.3-Local-Regression-in-Rp.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=219) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-01:2017-03-02 | +| 发布 | 2017-03-01 | | 更新 | 2018-08-14| | 状态 | Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.4-Structured-Local-Regression-Models-in-Rp.md b/docs/06-Kernel-Smoothing-Methods/6.4-Structured-Local-Regression-Models-in-Rp.md index 3f860574a2..c228cb6b51 100644 --- 
a/docs/06-Kernel-Smoothing-Methods/6.4-Structured-Local-Regression-Models-in-Rp.md +++ b/docs/06-Kernel-Smoothing-Methods/6.4-Structured-Local-Regression-Models-in-Rp.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=220) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-03:2017-03-03 | +| 发布 | 2016-09-30 | | 更新|2019-07-30 09:44:04| | 状态| Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.5-Local-Likelihood-and-Other-Models.md b/docs/06-Kernel-Smoothing-Methods/6.5-Local-Likelihood-and-Other-Models.md index fa9705ea08..54dbeb8ed5 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.5-Local-Likelihood-and-Other-Models.md +++ b/docs/06-Kernel-Smoothing-Methods/6.5-Local-Likelihood-and-Other-Models.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-04:2017-03-04 | +| 发布 | 2017-03-04 | |更新 | 2018-03-05| |状态|Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.6-Kernel-Density-Estimation-and-Classification.md b/docs/06-Kernel-Smoothing-Methods/6.6-Kernel-Density-Estimation-and-Classification.md index 4a673739ca..e431a86e1c 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.6-Kernel-Density-Estimation-and-Classification.md +++ b/docs/06-Kernel-Smoothing-Methods/6.6-Kernel-Density-Estimation-and-Classification.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=227) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-04 | +| 发布 | 2017-03-04 | |更新|2019-04-16 21:01:35| | 状态 | Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.7-Radial-Basis-Functions-and-Kernels.md b/docs/06-Kernel-Smoothing-Methods/6.7-Radial-Basis-Functions-and-Kernels.md index aec90be757..eff2e85f8e 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.7-Radial-Basis-Functions-and-Kernels.md +++ b/docs/06-Kernel-Smoothing-Methods/6.7-Radial-Basis-Functions-and-Kernels.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=231) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-06:2017-03-06 | +| 发布 | 2017-03-09 | | 更新 | 2019-10-24 18:09:35| | 状态 | Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.8-Mixture-Models-for-Density-Estimation-and-Classification.md b/docs/06-Kernel-Smoothing-Methods/6.8-Mixture-Models-for-Density-Estimation-and-Classification.md index 72d49655b2..28b9dee314 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.8-Mixture-Models-for-Density-Estimation-and-Classification.md +++ b/docs/06-Kernel-Smoothing-Methods/6.8-Mixture-Models-for-Density-Estimation-and-Classification.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-29 | +| 发布 | 2017-12-29 | | 更新 | 2019-10-24 19:15:19| | 状态 | Done| diff --git a/docs/06-Kernel-Smoothing-Methods/6.9-Computational-Consoderations.md b/docs/06-Kernel-Smoothing-Methods/6.9-Computational-Consoderations.md index b5913cd85d..ecacacc47f 100644 --- a/docs/06-Kernel-Smoothing-Methods/6.9-Computational-Consoderations.md +++ 
b/docs/06-Kernel-Smoothing-Methods/6.9-Computational-Consoderations.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-29 | +| 发布 | 2017-12-29 | | 更新 | 2020-03-19 12:10:40| | 状态 | Done| diff --git a/docs/06-Kernel-Smoothing-Methods/Bibliographic-Notes.md b/docs/06-Kernel-Smoothing-Methods/Bibliographic-Notes.md index 35849574d6..ed7c711e0d 100644 --- a/docs/06-Kernel-Smoothing-Methods/Bibliographic-Notes.md +++ b/docs/06-Kernel-Smoothing-Methods/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2018-02-23 | +| 发布 | 2018-02-24 | | 状态 | Done | 关于核方法有大量的文献,我们这里不打算总结.而是理出一些非常好的文献,它们自身有非常多的参考文献.Loader (1999)[^1]对局部回归和概率似然介绍很完整,并且也描述了拟合这些模型的 **最先进的(state-of-the-art)** 软件.Fan and Gijbels (1996)[^2]从更理论的层面介绍了这些模型. Hastie and Tibshirani (1990)[^3]在可加模型中讨论了局部回归. 和 Scott (1992)[^5]一样,Silverman (1986)[^4]也很好地总结了密度估计. diff --git a/docs/07-Model-Assessment-and-Selection/7.1-Introduction.md b/docs/07-Model-Assessment-and-Selection/7.1-Introduction.md index ae12c9b741..d706986f6c 100644 --- a/docs/07-Model-Assessment-and-Selection/7.1-Introduction.md +++ b/docs/07-Model-Assessment-and-Selection/7.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=238) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-02-08 | | 更新 | 2019-03-28 15:49:35 | |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.10-Cross-Validation.md b/docs/07-Model-Assessment-and-Selection/7.10-Cross-Validation.md index cc02596d39..1d986ed1a5 100644 --- a/docs/07-Model-Assessment-and-Selection/7.10-Cross-Validation.md +++ b/docs/07-Model-Assessment-and-Selection/7.10-Cross-Validation.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-17:2017-02-18 | +| 发布 | 2016-09-30 | |更新 |2018-01-09, 2018-01-12, 2018-03-18, 2018-03-19| |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.11-Bootstrap-Methods.md b/docs/07-Model-Assessment-and-Selection/7.11-Bootstrap-Methods.md index e69fd337e2..52aeff3f9f 100644 --- a/docs/07-Model-Assessment-and-Selection/7.11-Bootstrap-Methods.md +++ b/docs/07-Model-Assessment-and-Selection/7.11-Bootstrap-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-19:2017-02-19 | +| 发布 | 2017-02-20 | | 更新 | 2019-07-27 22:07:23| |状态| Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.12-Conditional-or-Expected-Test-Error.md b/docs/07-Model-Assessment-and-Selection/7.12-Conditional-or-Expected-Test-Error.md index f87f0f7c5b..849f21241a 100644 --- a/docs/07-Model-Assessment-and-Selection/7.12-Conditional-or-Expected-Test-Error.md +++ b/docs/07-Model-Assessment-and-Selection/7.12-Conditional-or-Expected-Test-Error.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical 
Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-20:2017-02-20 | +| 发布 | 2017-02-20 | | 更新 | 2018-03-20| |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.2-Bias-Variance-and-Model-Complexity.md b/docs/07-Model-Assessment-and-Selection/7.2-Bias-Variance-and-Model-Complexity.md index 66275772c3..9a1952db06 100644 --- a/docs/07-Model-Assessment-and-Selection/7.2-Bias-Variance-and-Model-Complexity.md +++ b/docs/07-Model-Assessment-and-Selection/7.2-Bias-Variance-and-Model-Complexity.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-18:2017-02-18 | +| 发布 | 2016-09-30 | | 更新 | 2019-03-28 16:14:51| | 状态 | Done | diff --git a/docs/07-Model-Assessment-and-Selection/7.3-The-Bias-Variance-Decomposition.md b/docs/07-Model-Assessment-and-Selection/7.3-The-Bias-Variance-Decomposition.md index 8f80e27eb0..529d278ddf 100644 --- a/docs/07-Model-Assessment-and-Selection/7.3-The-Bias-Variance-Decomposition.md +++ b/docs/07-Model-Assessment-and-Selection/7.3-The-Bias-Variance-Decomposition.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=242) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-18:2017-02-18 | +| 发布 | 2016-09-30 | |更新|2019-03-28 16:53:59| |状态| Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.4-Optimism-of-the-Training-Error-Rate.md b/docs/07-Model-Assessment-and-Selection/7.4-Optimism-of-the-Training-Error-Rate.md index b233fd9baa..fa6ac8e47c 100644 --- a/docs/07-Model-Assessment-and-Selection/7.4-Optimism-of-the-Training-Error-Rate.md +++ b/docs/07-Model-Assessment-and-Selection/7.4-Optimism-of-the-Training-Error-Rate.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=247) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-18:2017-02-18 | +| 发布 | 2016-09-30 | |更新|2019-07-27 12:07:49| |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.5-Estimates-of-In-Sample-Prediction-Error.md b/docs/07-Model-Assessment-and-Selection/7.5-Estimates-of-In-Sample-Prediction-Error.md index b5ab06fe2e..761bce3dbb 100644 --- a/docs/07-Model-Assessment-and-Selection/7.5-Estimates-of-In-Sample-Prediction-Error.md +++ b/docs/07-Model-Assessment-and-Selection/7.5-Estimates-of-In-Sample-Prediction-Error.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-18:2017-02-18 | +| 发布 | 2016-09-30 | |更新|2019-07-27 12:30:55| !!! 
note "weiya 注:Recall" diff --git a/docs/07-Model-Assessment-and-Selection/7.6-The-Effective-Number-of-Parameters.md b/docs/07-Model-Assessment-and-Selection/7.6-The-Effective-Number-of-Parameters.md index f3525a5c01..b6922d81e5 100644 --- a/docs/07-Model-Assessment-and-Selection/7.6-The-Effective-Number-of-Parameters.md +++ b/docs/07-Model-Assessment-and-Selection/7.6-The-Effective-Number-of-Parameters.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-18:2017-02-18 | +| 发布 | 2016-09-30 | |更新|2019-07-27 18:07:50| |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.7-The-Bayesian-Approach-and-BIC.md b/docs/07-Model-Assessment-and-Selection/7.7-The-Bayesian-Approach-and-BIC.md index 2c1e2aadec..a03116a526 100644 --- a/docs/07-Model-Assessment-and-Selection/7.7-The-Bayesian-Approach-and-BIC.md +++ b/docs/07-Model-Assessment-and-Selection/7.7-The-Bayesian-Approach-and-BIC.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-18:2017-02-19 | +| 发布 | 2016-09-30 | | 更新 |2019-07-27 18:23:22| |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.8-Minimum-Description-Length.md b/docs/07-Model-Assessment-and-Selection/7.8-Minimum-Description-Length.md index d03596560f..2d3a1c0b88 100644 --- a/docs/07-Model-Assessment-and-Selection/7.8-Minimum-Description-Length.md +++ b/docs/07-Model-Assessment-and-Selection/7.8-Minimum-Description-Length.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-19:2017-02-19 | +| 发布 | 2016-09-30 | |更新 |2019-07-27 18:44:16| |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/7.9-Vapnik-Chervonenkis-Dimension.md b/docs/07-Model-Assessment-and-Selection/7.9-Vapnik-Chervonenkis-Dimension.md index 3fa19c2aec..64c37f06c3 100644 --- a/docs/07-Model-Assessment-and-Selection/7.9-Vapnik-Chervonenkis-Dimension.md +++ b/docs/07-Model-Assessment-and-Selection/7.9-Vapnik-Chervonenkis-Dimension.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=256) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-19:2017-02-19 | +| 发布 | 2016-09-30 | |更新|2019-07-27 19:02:08| |状态|Done| diff --git a/docs/07-Model-Assessment-and-Selection/Bibliographic-Notes.md b/docs/07-Model-Assessment-and-Selection/Bibliographic-Notes.md index 2c7c68d45f..d7f39720b5 100644 --- a/docs/07-Model-Assessment-and-Selection/Bibliographic-Notes.md +++ b/docs/07-Model-Assessment-and-Selection/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-20:2017-02-20 | +| 发布 | 2017-02-20 | 交叉验证的主要参考文献为Stone(1974)[^1],Stone(1977)[^2]和Allen(1974)[^3].AIC由Akaike(1973)[^4]提出,而BIC由Schwarz(1978)[^5]提出.Madigan and Raftery(1994)[^6]概述了贝叶斯模型选择.MDL准则归功于Rissanen(1983)[^7].Cover and 
Thomas(1991)[^8]包含编码理论和复杂性的很好的描述.VC维在Vapnik(1996)[^8]中有描述.Stone(1977)[^2]证明了AIC和舍一交叉验证渐进相等.一般的交叉验证由Golub et. al(1979)[^10]和Wahba(1980)[^11]描述.也可以参见Hastie and Tibshirani(1990)[^12]的第三章.自助法归功于Efron(1979)[^13];它的概述可以参见Efron and Tibshirani(1993)[^14].Efron(1983)[^15]提出一系列预测误差的自助法估计,包括乐观估计和.632估计.Efron(1986)[^16]比较CV和GCV以及误差率的自助法估计.Breiman and Spector(1992)[^17],Breiman(1992)[^18],Shao(1996)[^19],Zhang(1993)[^20]和Kohavi(1995)[^21]等人研究了模型选择的交叉验证和自助法..632+估计由Efron and Tibshirani(1997)[^22]提出. diff --git a/docs/08-Model-Inference-and-Averaging/8.1-Introduction.md b/docs/08-Model-Inference-and-Averaging/8.1-Introduction.md index 283bc0ca13..b224dce74d 100644 --- a/docs/08-Model-Inference-and-Averaging/8.1-Introduction.md +++ b/docs/08-Model-Inference-and-Averaging/8.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-02-08 | | 更新 | 2017-09-13 | 本书的大部分章节中,对于回归而言,模型的拟合(学习)通过最小化平方和实现;或对于分类而言,通过最小化交叉熵实现.事实上,这两种最小化都是用极大似然来拟合的实例. diff --git a/docs/08-Model-Inference-and-Averaging/8.2-The-Bootstrap-and-Maximum-Likelihood-Methods.md b/docs/08-Model-Inference-and-Averaging/8.2-The-Bootstrap-and-Maximum-Likelihood-Methods.md index 1ee1dc008d..5155c96c78 100644 --- a/docs/08-Model-Inference-and-Averaging/8.2-The-Bootstrap-and-Maximum-Likelihood-Methods.md +++ b/docs/08-Model-Inference-and-Averaging/8.2-The-Bootstrap-and-Maximum-Likelihood-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=280) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-31 | +| 发布 | 2017-02-08 | | 更新 | 2019-05-10 23:46:09 | | 状态| Done| diff --git a/docs/08-Model-Inference-and-Averaging/8.3-Bayesian-Methods.md b/docs/08-Model-Inference-and-Averaging/8.3-Bayesian-Methods.md index 38456c1b16..60d9d3e768 100644 --- a/docs/08-Model-Inference-and-Averaging/8.3-Bayesian-Methods.md +++ b/docs/08-Model-Inference-and-Averaging/8.3-Bayesian-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=286) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-01 | +| 发布 | 2017-02-08 | | 更新 | 2019-07-28 10:07:58 | | 状态 | Done| diff --git a/docs/08-Model-Inference-and-Averaging/8.4-Relationship-Between-the-Bootstrap-and-Bayesian-Inference.md b/docs/08-Model-Inference-and-Averaging/8.4-Relationship-Between-the-Bootstrap-and-Bayesian-Inference.md index 1c414dd3d0..5b70d468a5 100644 --- a/docs/08-Model-Inference-and-Averaging/8.4-Relationship-Between-the-Bootstrap-and-Bayesian-Inference.md +++ b/docs/08-Model-Inference-and-Averaging/8.4-Relationship-Between-the-Bootstrap-and-Bayesian-Inference.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-01 | +| 发布 | 2017-02-08 | | 更新 | 2019-07-28 10:33:24| | 状态 | Done| diff --git a/docs/08-Model-Inference-and-Averaging/8.5-The-EM-Algorithm.md b/docs/08-Model-Inference-and-Averaging/8.5-The-EM-Algorithm.md index 28b702da3d..11d15a5bcb 100644 --- a/docs/08-Model-Inference-and-Averaging/8.5-The-EM-Algorithm.md +++ 
b/docs/08-Model-Inference-and-Averaging/8.5-The-EM-Algorithm.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=291) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-20 & 2017-02-01:2017-02-03 | +| 发布 | 2017-02-08 | |更新|2018-04-29, 2018-10-04| |状态|Done| diff --git a/docs/08-Model-Inference-and-Averaging/8.6-MCMC-for-Sampling-from-the-Posterior.md b/docs/08-Model-Inference-and-Averaging/8.6-MCMC-for-Sampling-from-the-Posterior.md index 50d46542de..00eb65170a 100644 --- a/docs/08-Model-Inference-and-Averaging/8.6-MCMC-for-Sampling-from-the-Posterior.md +++ b/docs/08-Model-Inference-and-Averaging/8.6-MCMC-for-Sampling-from-the-Posterior.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=298) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-03 | +| 发布 | 2017-02-08 | |更新|2017-12-27, 2018-07-17| |状态|Done| diff --git a/docs/08-Model-Inference-and-Averaging/8.7-Bagging.md b/docs/08-Model-Inference-and-Averaging/8.7-Bagging.md index 4b841e1af3..6359d53397 100644 --- a/docs/08-Model-Inference-and-Averaging/8.7-Bagging.md +++ b/docs/08-Model-Inference-and-Averaging/8.7-Bagging.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=301) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-03 | +| 发布 | 2017-02-08 | | 更新 | 2020-02-22 23:10:28| |状态|Done| diff --git a/docs/08-Model-Inference-and-Averaging/8.8-Model-Averaging-and-Stacking.md b/docs/08-Model-Inference-and-Averaging/8.8-Model-Averaging-and-Stacking.md index 27de8734ee..6fee413768 100644 --- a/docs/08-Model-Inference-and-Averaging/8.8-Model-Averaging-and-Stacking.md +++ b/docs/08-Model-Inference-and-Averaging/8.8-Model-Averaging-and-Stacking.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=307) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-31 | +| 发布 | 2017-06-09 | | 更新 |{{ git_revision_date }} | 在 [8.4 节](8.4-Relationship-Between-the-Bootstrap-and-Bayesian-Inference/index.html)我们根据一种非参贝叶斯分析,将估计器的 bootstrap 值看成对应参数近似的后验值.从这个角度看,bagged 估计 \eqref{8.51} 是后验贝叶斯均值的近似.相反,训练样本估计量 $\hat f(x)$ 对应后验的**最大值 (mode)**.因为后验均值(不是最大值)最小化平方误差损失,所以 bagging 可以经常降低均方误差也不奇怪. 
diff --git a/docs/08-Model-Inference-and-Averaging/8.9-Stochastic-Search.md b/docs/08-Model-Inference-and-Averaging/8.9-Stochastic-Search.md index 1e3f62edd3..d2b7917dde 100644 --- a/docs/08-Model-Inference-and-Averaging/8.9-Stochastic-Search.md +++ b/docs/08-Model-Inference-and-Averaging/8.9-Stochastic-Search.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2018-01-02 | +| 发布 | 2017-06-09 | | 更新| 2019-08-01 19:31:48| |状态|Done| diff --git a/docs/08-Model-Inference-and-Averaging/Bibliographic-Notes.md b/docs/08-Model-Inference-and-Averaging/Bibliographic-Notes.md index 6d92194f3c..20f84b6c89 100644 --- a/docs/08-Model-Inference-and-Averaging/Bibliographic-Notes.md +++ b/docs/08-Model-Inference-and-Averaging/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-05 | +| 发布 | 2017-09-06 | | 更新| 2019-08-01 19:39:45| | 状态|Done| diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.0-Introduction.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.0-Introduction.md index dc6aa8df66..b82a948e6f 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.0-Introduction.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.0-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=314) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-02-08 | | 更新 | 2018-02-21, 2018-03-17 | | 状态 | Done | diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.1-Generalized-Additive-Models.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.1-Generalized-Additive-Models.md index 4c43fad3f7..dcf22d0b12 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.1-Generalized-Additive-Models.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.1-Generalized-Additive-Models.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=314) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-04 | +| 发布 | 2017-02-08 | | 更新 | 2019-08-19 09:10:05| | 状态 | Done | diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.2-Tree-Based-Methods.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.2-Tree-Based-Methods.md index e9763a11b6..693da39f8d 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.2-Tree-Based-Methods.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.2-Tree-Based-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-05 | +| 发布 | 2017-02-08 | | 更新 |2020-02-28 15:08:03| |状态|Done| diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.3-PRIM.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.3-PRIM.md index 3c4daf907c..dd1730009b 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.3-PRIM.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.3-PRIM.md @@ -3,7 +3,7 @@ | 原文 
| [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=336) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-12 | +| 发布 | 2018-03-17 | | 更新 |2018-03-17| |状态|Done| |备注| 1 note; 1 question; 1 simulation| diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.4-MARS.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.4-MARS.md index 259e3e9841..22bb3a538d 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.4-MARS.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.4-MARS.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=340) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-13 | +| 发布 | 2017-03-14 | | 更新 | 2019-05-10 23:14:29 | |状态|Done| diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.5-Hierarchical-Mixtures-of-Experts.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.5-Hierarchical-Mixtures-of-Experts.md index 1e36193a14..5e16c51183 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.5-Hierarchical-Mixtures-of-Experts.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.5-Hierarchical-Mixtures-of-Experts.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=348) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-13 | +| 发布 | 2017-03-14 | | 更新 | 2018-03-17| | 状态 |Done| diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.6-Missing-Data.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.6-Missing-Data.md index 458f1e973c..35fec6837e 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.6-Missing-Data.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.6-Missing-Data.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-13 | +| 发布 | 2017-03-14 | | 更新 | 2020-01-08 10:50:28 | | 状态 | Done| diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/9.7-Computational-Considerations.md b/docs/09-Additive-Models-Trees-and-Related-Methods/9.7-Computational-Considerations.md index ad58b54e6b..bfd8e1b759 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/9.7-Computational-Considerations.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/9.7-Computational-Considerations.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-13 | +| 发布 | 2017-03-14 | |更新 |2018-03-17| |状态|Done| diff --git a/docs/09-Additive-Models-Trees-and-Related-Methods/Bibliographic-Notes.md b/docs/09-Additive-Models-Trees-and-Related-Methods/Bibliographic-Notes.md index 66f9d489c5..d33767a96f 100644 --- a/docs/09-Additive-Models-Trees-and-Related-Methods/Bibliographic-Notes.md +++ b/docs/09-Additive-Models-Trees-and-Related-Methods/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-13 | 
+| 发布 | 2017-03-14 | 广义可加模型的最系统的资料是Hastie and Tibshirani(1990)[^1].这个工作在医学问题上的不同应用在Hastie et al. (1989)[^2]和Hastie and Herman (1990)[^3]中有讨论,而且在Chambers and Hastie (1991)[^3]中描述了Splus软件的实现.Green and Silverman (1994)[^4]讨论了在不同设定下惩罚和样条模型.Efron and Tibshirani(1991)[^5]对非数学读者,介绍了统计的现代发展(包括广义加性模型).分类和回归树至少追溯到Morgan and Sonquist(1963)[^6].我们已经采用Breiman et al. (1984)[^7]和Quinlan (1993)[^8]等人的现代方法.PRIM方法归功于Friedman and Fisher(1989)[^9].专家的系统混合由Jordan and Jacobs (1994)[^10]提出;也参见Jacobs et al. (1991)[^11]. diff --git a/docs/10-Boosting-and-Additive-Trees/10.1-Boosting-Methods.md b/docs/10-Boosting-and-Additive-Trees/10.1-Boosting-Methods.md index 820f8717b3..efcee5510a 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.1-Boosting-Methods.md +++ b/docs/10-Boosting-and-Additive-Trees/10.1-Boosting-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-06 | +| 发布 | 2017-02-08 | | 更新 | 2019-07-28 23:45:47| | 状态 | Done | diff --git a/docs/10-Boosting-and-Additive-Trees/10.10-Numerical-Optimization-via-Gradient-Boosting.md b/docs/10-Boosting-and-Additive-Trees/10.10-Numerical-Optimization-via-Gradient-Boosting.md index 8c0995658c..9bfd0dd1c4 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.10-Numerical-Optimization-via-Gradient-Boosting.md +++ b/docs/10-Boosting-and-Additive-Trees/10.10-Numerical-Optimization-via-Gradient-Boosting.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-27 | +| 发布 | 2017-08-27 | | 更新 | 2020-01-13 14:57:32 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.11-Right-Sized-Trees-for-Boosting.md b/docs/10-Boosting-and-Additive-Trees/10.11-Right-Sized-Trees-for-Boosting.md index 93f812abb2..b01d6a6da5 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.11-Right-Sized-Trees-for-Boosting.md +++ b/docs/10-Boosting-and-Additive-Trees/10.11-Right-Sized-Trees-for-Boosting.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-10-07 & 2017-10-08 & 2017-10-09 | +| 发布 | 2017-10-07 | | 更新 | 2020-01-14 18:36:02 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.12-Regularization.md b/docs/10-Boosting-and-Additive-Trees/10.12-Regularization.md index fc666830c6..ec6dda85c9 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.12-Regularization.md +++ b/docs/10-Boosting-and-Additive-Trees/10.12-Regularization.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-10-09 & 2017-10-15 & 2017-10-16 | +| 发布 | 2017-10-09 | | 更新 | 2020-01-10 11:30:03 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.13-Interpretation.md b/docs/10-Boosting-and-Additive-Trees/10.13-Interpretation.md index 511cf070d0..48be5ba78a 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.13-Interpretation.md +++ b/docs/10-Boosting-and-Additive-Trees/10.13-Interpretation.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical 
Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=386) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-10-09, 2018-12-28 | +| 发布 | 2018-12-28 | |更新|2020-01-14 18:22:34| |状态| Done diff --git a/docs/10-Boosting-and-Additive-Trees/10.14-Illustrations.md b/docs/10-Boosting-and-Additive-Trees/10.14-Illustrations.md index 6baaec5f4b..aa01ac2fec 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.14-Illustrations.md +++ b/docs/10-Boosting-and-Additive-Trees/10.14-Illustrations.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-10-09 | +| 发布 | 2018-12-30 | | 更新 | 2018-12-30| | 状态| Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.2-Boosting-Fits-an-Additive-Model.md b/docs/10-Boosting-and-Additive-Trees/10.2-Boosting-Fits-an-Additive-Model.md index ab3911cf4d..ec74c6fb86 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.2-Boosting-Fits-an-Additive-Model.md +++ b/docs/10-Boosting-and-Additive-Trees/10.2-Boosting-Fits-an-Additive-Model.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=360) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-06 | +| 发布 | 2017-02-08 | | 更新 | 2017-08-26, 2018-02-28 | |状态|Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.3-Forward-Stagewise-Additive-Modeling.md b/docs/10-Boosting-and-Additive-Trees/10.3-Forward-Stagewise-Additive-Modeling.md index ea5be21641..d2ca557938 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.3-Forward-Stagewise-Additive-Modeling.md +++ b/docs/10-Boosting-and-Additive-Trees/10.3-Forward-Stagewise-Additive-Modeling.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=361) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-06 | +| 发布 | 2017-02-08 | | 更新 | 2017-08-26, 2018-02-28 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.4-Exponential-Loss-and-AdaBoost.md b/docs/10-Boosting-and-Additive-Trees/10.4-Exponential-Loss-and-AdaBoost.md index 3b71cefa59..f9a1cf1704 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.4-Exponential-Loss-and-AdaBoost.md +++ b/docs/10-Boosting-and-Additive-Trees/10.4-Exponential-Loss-and-AdaBoost.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=362) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-06 | +| 发布 | 2018-02-28 | | 更新 | 2018-02-28, 2018-05-17@xinyu-intel,2018-06-09 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.5-Why-Exponential-Loss.md b/docs/10-Boosting-and-Additive-Trees/10.5-Why-Exponential-Loss.md index 666c1f526f..40b330b1c7 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.5-Why-Exponential-Loss.md +++ b/docs/10-Boosting-and-Additive-Trees/10.5-Why-Exponential-Loss.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=364) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-06 | +| 发布 | 2017-02-08 | | 更新 | 2018-02-28 | | 状态 | Done| diff --git 
a/docs/10-Boosting-and-Additive-Trees/10.6-Loss-Functions-and-Robustness.md b/docs/10-Boosting-and-Additive-Trees/10.6-Loss-Functions-and-Robustness.md index 11c2af8a08..18b2b36f69 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.6-Loss-Functions-and-Robustness.md +++ b/docs/10-Boosting-and-Additive-Trees/10.6-Loss-Functions-and-Robustness.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=365) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-06 | +| 发布 | 2018-03-01 | | 更新 | 2020-01-13 21:53:17 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.7-Off-the-Shelf-Procedures-for-Data-Mining.md b/docs/10-Boosting-and-Additive-Trees/10.7-Off-the-Shelf-Procedures-for-Data-Mining.md index a7581653e6..d30e41aa9a 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.7-Off-the-Shelf-Procedures-for-Data-Mining.md +++ b/docs/10-Boosting-and-Additive-Trees/10.7-Off-the-Shelf-Procedures-for-Data-Mining.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=369) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-27 | +| 发布 | 2018-03-01 | | 更新 | 2019-07-12 16:38:24 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.8-Spam-Data.md b/docs/10-Boosting-and-Additive-Trees/10.8-Spam-Data.md index cff432a8f2..9b0df02cf7 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.8-Spam-Data.md +++ b/docs/10-Boosting-and-Additive-Trees/10.8-Spam-Data.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2020-01-14 | +| 发布 | 2017-06-09 | |更新|2020-01-14 14:53:37| |状态|Done| diff --git a/docs/10-Boosting-and-Additive-Trees/10.9-Boosting-Trees.md b/docs/10-Boosting-and-Additive-Trees/10.9-Boosting-Trees.md index eb2ac9ade7..48e9bf324f 100644 --- a/docs/10-Boosting-and-Additive-Trees/10.9-Boosting-Trees.md +++ b/docs/10-Boosting-and-Additive-Trees/10.9-Boosting-Trees.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=172) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-27 | +| 发布 | 2018-03-01 | | 更新 | 2019-08-22 15:41:45 | | 状态 | Done| diff --git a/docs/10-Boosting-and-Additive-Trees/Bibliographic-Notes.md b/docs/10-Boosting-and-Additive-Trees/Bibliographic-Notes.md index 9be371b4d0..91b749c819 100644 --- a/docs/10-Boosting-and-Additive-Trees/Bibliographic-Notes.md +++ b/docs/10-Boosting-and-Additive-Trees/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-05 | +| 发布 | 2017-09-05 | Schapire (1990)[^1]在PAC学习框架(Valiant, 1984[^2]; Kearns and Vazirani, 1994[^3])下发展了第一个简单的boosting过程.Schapire证明了弱学习器(weak learner)总是可以通过在输入数据流的过滤版本上训练两个额外的分类器来提高效果.弱学习器是产生两类别分类器的算法,该分类器保证了效果比抛硬币猜测要显著地好.当在前$N$个训练点上学习出初始的分类器$G_1$, diff --git a/docs/11-Neural-Networks/11.1-Introduction.md b/docs/11-Neural-Networks/11.1-Introduction.md index c76dc1bd0d..f66969e964 100644 --- a/docs/11-Neural-Networks/11.1-Introduction.md +++ b/docs/11-Neural-Networks/11.1-Introduction.md
@@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-02-08 | |更新| 2017-12-27; 2018-04-29| |状态|Done| diff --git a/docs/11-Neural-Networks/11.2-Projection-Pursuit-Regression.md b/docs/11-Neural-Networks/11.2-Projection-Pursuit-Regression.md index e398423833..c2dd76630b 100644 --- a/docs/11-Neural-Networks/11.2-Projection-Pursuit-Regression.md +++ b/docs/11-Neural-Networks/11.2-Projection-Pursuit-Regression.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-07 | +| 发布 | 2017-02-08 | |更新| 2017-12-27; 2018-04-29| 在我们一般监督学习问题中,假设我们有 $p$ 个组分的输入向量 $X$,以及目标变量 $Y$.令 $\omega_m,m=1,2,\ldots, M$ 为未知参数的 $p$ 维单位向量.**投影寻踪回归 (PPR)** 模型有如下形式 diff --git a/docs/11-Neural-Networks/11.3-Neural-Networks.md b/docs/11-Neural-Networks/11.3-Neural-Networks.md index 149fdf0af7..adcb221728 100644 --- a/docs/11-Neural-Networks/11.3-Neural-Networks.md +++ b/docs/11-Neural-Networks/11.3-Neural-Networks.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=411) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-07 | +| 发布 | 2017-02-08 | |更新|2019-02-16 15:57:15| |状态|Done| diff --git a/docs/11-Neural-Networks/11.4-Fitting-Neural-Networks.md b/docs/11-Neural-Networks/11.4-Fitting-Neural-Networks.md index 08f2a918d0..aa115a7a64 100644 --- a/docs/11-Neural-Networks/11.4-Fitting-Neural-Networks.md +++ b/docs/11-Neural-Networks/11.4-Fitting-Neural-Networks.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=414) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-07 | +| 发布 | 2017-02-08 | |更新|2019-02-16 16:13:40| |状态|Done| diff --git a/docs/11-Neural-Networks/11.5-Some-Issues-in-Training-Neural-Networks.md b/docs/11-Neural-Networks/11.5-Some-Issues-in-Training-Neural-Networks.md index b7d32062a6..a3f866738a 100644 --- a/docs/11-Neural-Networks/11.5-Some-Issues-in-Training-Neural-Networks.md +++ b/docs/11-Neural-Networks/11.5-Some-Issues-in-Training-Neural-Networks.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-07 | +| 发布 | 2017-02-08 | |更新|2019-02-16 16:59:50| |状态|Done| diff --git a/docs/11-Neural-Networks/11.6-Example-of-Simulated-Data.md b/docs/11-Neural-Networks/11.6-Example-of-Simulated-Data.md index 92b749b785..76c394ecaf 100644 --- a/docs/11-Neural-Networks/11.6-Example-of-Simulated-Data.md +++ b/docs/11-Neural-Networks/11.6-Example-of-Simulated-Data.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-07 | +| 发布 | 2017-02-08 | |更新|2017-12-28| !!! 
note "更新笔记" diff --git a/docs/11-Neural-Networks/11.7-Example-ZIP-Code-Data.md b/docs/11-Neural-Networks/11.7-Example-ZIP-Code-Data.md index 4891ba49d6..c1ea216f87 100644 --- a/docs/11-Neural-Networks/11.7-Example-ZIP-Code-Data.md +++ b/docs/11-Neural-Networks/11.7-Example-ZIP-Code-Data.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-28 | +| 发布 | 2017-12-28 | 这个例子是字符识别任务:对手写数字分类.这个问题引起了机器学习和神经网络社区数年的关注,并且仍是该领域的基准问题.图11.9展示了标准化手写数字的一些例子,这是从美国邮政服务的信封中自动扫描得到的.原始的扫描数字是二值的且有不同的大小和朝向;这里展示的图象已经经过去倾斜 (deslanted) 处理且大小被标准化了,最终得到$16\times 16$的灰度图象(Le Cun et al., 1990[^1]).这256个像素值作为神经网络分类器的输入. diff --git a/docs/11-Neural-Networks/Bibliographic-Notes.md b/docs/11-Neural-Networks/Bibliographic-Notes.md index 310a4f2481..a63e251d2e 100644 --- a/docs/11-Neural-Networks/Bibliographic-Notes.md +++ b/docs/11-Neural-Networks/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-06 | +| 发布 | 2017-09-06 | 投影寻踪由Friedman and Tukey (1974)[^1]提出,并且由Friedman and Stuetzle (1981)[^2]具体化为回归.Huber (1985)[^3]给出了一个概述,并且Roosen and Hastie (1994)[^4]给出了采用光滑样条的一种表述.神经网络的动机追溯到McCulloch and Pitts (1943)[^5],Widrow and Hoff (1960)[^6](在Anderson and Rosenfeld (1988)[^7]中转载)以及Rosenblatt (1962)[^8].Hebb (1949)[^9]的工作深刻影响了学习算法的发展.神经网络的再度兴起是在1980s中期,归功于Werbos (1974)[^10],Parker (1985)[^11]和Rumelhart et al. (1986)[^12],后者提出了向后传播算法.今天这方面有很多书,Hertz et al. (1991)[^13],Bishop (1995)[^14]以及Ripley (1996)[^15]或许是信息量最大的.神经网络的贝叶斯学习在Neal (1996)[^16]中有描述.ZIP例子取自Le Cun (1989)[^17],同时参见Le Cun et al. (1990)[^18]以及Le Cun et al. (1998)[^19].
diff --git a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.1-Introduction.md b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.1-Introduction.md index a5b0b973d9..265cc6ec9f 100644 --- a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.1-Introduction.md +++ b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-09 | +| 发布 | 2016-12-16 | | 更新|2017.10.15; 2018.02.12 | |状态|Done| diff --git a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.2-The-Support-Vector-Classifier.md b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.2-The-Support-Vector-Classifier.md index f5a5a9a1b0..457084c0ab 100644 --- a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.2-The-Support-Vector-Classifier.md +++ b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.2-The-Support-Vector-Classifier.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-19:2016-12-20 | +| 发布 | 2016-12-16 | |更新|2020-01-03 11:29:06 | |状态|Done| diff --git a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.3-Support-Vector-Machines-and-Kernels.md b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.3-Support-Vector-Machines-and-Kernels.md index 5c0043f8f4..0d253b2ef5 100644 --- a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.3-Support-Vector-Machines-and-Kernels.md +++ b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.3-Support-Vector-Machines-and-Kernels.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=442) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2016-12-19:2016-12-20 | +| 发布 | 2016-09-30 | | 更新 |2019-05-31 13:59:00 | | 状态 |Done| diff --git a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.4-Generalizing-Linear-Discriminant-Analysis.md b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.4-Generalizing-Linear-Discriminant-Analysis.md index 367627223d..948bbbf90d 100644 --- a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.4-Generalizing-Linear-Discriminant-Analysis.md +++ b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.4-Generalizing-Linear-Discriminant-Analysis.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=457) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-15 | +| 发布 | 2018-07-21 | | 更新| 2018-07-20| | 状态 | Done| diff --git a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.5-Flexible-Disciminant-Analysis.md b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.5-Flexible-Disciminant-Analysis.md index ec76e05106..c6cf5c5650 100644 --- a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.5-Flexible-Disciminant-Analysis.md +++ b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.5-Flexible-Disciminant-Analysis.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical 
Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=459) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-15 | +| 发布 | 2016-09-30 | | 更新 | 2020-05-26 17:46:15| | 状态 | Done| diff --git a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.6-Penalized-Discriminant-Analysis.md b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.6-Penalized-Discriminant-Analysis.md index 27d5352bb8..8af26d0dea 100644 --- a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.6-Penalized-Discriminant-Analysis.md +++ b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/12.6-Penalized-Discriminant-Analysis.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=465) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-12-17 | +| 发布 | 2018-08-08 | | 更新 | 2020-05-27 10:29:44| |状态| Done| diff --git a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/Bibliographic-Notes.md b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/Bibliographic-Notes.md index 7784bcbbae..488bb1908e 100644 --- a/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/Bibliographic-Notes.md +++ b/docs/12-Support-Vector-Machines-and-Flexible-Discriminants/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-06 | +| 发布 | 2017-06-09 | 支持向量机背后的理论归功于 Vapnik 并且在 Vapnik (1996)[^1] 中进行了描述.有关 SVM 的新兴文献、由 Alex Smola 和 Bernhard Scholkopf 创造并维护的在线文献,可以在下面的网站中找到 diff --git a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.1-Introduction.md b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.1-Introduction.md index f2b5e3fe92..84a746c60b 100644 --- a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.1-Introduction.md +++ b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=478) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-02-08 | | 更新 | 2019-07-11 16:17:30 | | 状态 | Done| diff --git a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.2-Prototype-Methods.md b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.2-Prototype-Methods.md index 354a6aebc8..5e76ed4ec9 100644 --- a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.2-Prototype-Methods.md +++ b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.2-Prototype-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=478) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-28&2018-01-26 | +| 发布 | 2017-08-28 | | 状态 | Done | diff --git a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.3-k-Nearest-Neighbor-Classifiers.md b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.3-k-Nearest-Neighbor-Classifiers.md index 5695d6fc42..8b9a7c41dc 100644 --- a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.3-k-Nearest-Neighbor-Classifiers.md +++ b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.3-k-Nearest-Neighbor-Classifiers.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical 
Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=482) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-28&2018-01-26 | +| 发布 | 2017-08-28 | |更新|2021-05-06 22:01:56| |状态 | Done| diff --git a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.4-Adaptive-Nearest-Neighbor-Methods.md b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.4-Adaptive-Nearest-Neighbor-Methods.md index 1f75d73073..79dcfcee26 100644 --- a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.4-Adaptive-Nearest-Neighbor-Methods.md +++ b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.4-Adaptive-Nearest-Neighbor-Methods.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=494) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-29& 2018-01-26 | +| 发布 | 2017-08-29 | | 更新|2019-09-23 16:14:01| |状态|Done| diff --git a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.5-Computational-Considerations.md b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.5-Computational-Considerations.md index 13fa3a04d0..1df6fa020c 100644 --- a/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.5-Computational-Considerations.md +++ b/docs/13-Prototype-Methods-and-Nearest-Neighbors/13.5-Computational-Considerations.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-29&2018-01-26 | +| 发布 | 2017-08-29 | | 更新 | 2019-01-06| |状态| Done| diff --git a/docs/13-Prototype-Methods-and-Nearest-Neighbors/Bibliographic-Notes.md b/docs/13-Prototype-Methods-and-Nearest-Neighbors/Bibliographic-Notes.md index abfed3dc37..be42ca9a3b 100644 --- a/docs/13-Prototype-Methods-and-Nearest-Neighbors/Bibliographic-Notes.md +++ b/docs/13-Prototype-Methods-and-Nearest-Neighbors/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-07&2018-01-26 | +| 发布 | 2017-06-09 | |状态|Done| 最近邻方法至少追溯到Fix and Hodges (1951)[^1].Dasarathy (1991)[^2]综述了关于该话题的大量文献;Ripley (1996)[^3]的第六章有个很好的总结.$K$均值聚类归功于Lloyd (1957)[^4]和MacQueen (1967)[^5].Kohonen (1989)[^6]提出学习向量量化.切距离方法归功于Simard et al. (1993)[^7].Hastie and Tibshirani (1996a)[^8]提出了判别式自适应最近邻技巧. 
diff --git a/docs/14-Unsupervised-Learning/14.1-Introduction.md b/docs/14-Unsupervised-Learning/14.1-Introduction.md index b662da8124..df6b2652c4 100644 --- a/docs/14-Unsupervised-Learning/14.1-Introduction.md +++ b/docs/14-Unsupervised-Learning/14.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-20:2017-02-20 | +| 发布 | 2016-09-30 | |更新| 2017-09-10, 2018-03-20| |状态|Done| diff --git a/docs/14-Unsupervised-Learning/14.10-The-Google-PageRank-Algorithm.md b/docs/14-Unsupervised-Learning/14.10-The-Google-PageRank-Algorithm.md index 86703cdf5f..ba95418028 100644 --- a/docs/14-Unsupervised-Learning/14.10-The-Google-PageRank-Algorithm.md +++ b/docs/14-Unsupervised-Learning/14.10-The-Google-PageRank-Algorithm.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-04 | +| 发布 | 2016-09-30 | |更新|2018-01-23| |状态|Done| diff --git a/docs/14-Unsupervised-Learning/14.2-Association-Rules.md b/docs/14-Unsupervised-Learning/14.2-Association-Rules.md index ae274ed0e9..2c25812bca 100644 --- a/docs/14-Unsupervised-Learning/14.2-Association-Rules.md +++ b/docs/14-Unsupervised-Learning/14.2-Association-Rules.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=506) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-20:2017-02-22 | +| 发布 | 2016-09-30 | |更新| 2020-05-23 18:48:31| |状态| Done| diff --git a/docs/14-Unsupervised-Learning/14.3-Cluster-Analysis.md b/docs/14-Unsupervised-Learning/14.3-Cluster-Analysis.md index 681261f3ea..e1789895bb 100644 --- a/docs/14-Unsupervised-Learning/14.3-Cluster-Analysis.md +++ b/docs/14-Unsupervised-Learning/14.3-Cluster-Analysis.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-22:2017-02-23 | +| 发布 | 2016-09-30 | |更新|2019-10-03 09:57:02| |状态|Done| diff --git a/docs/14-Unsupervised-Learning/14.4-Self-Organizing-Maps.md b/docs/14-Unsupervised-Learning/14.4-Self-Organizing-Maps.md index f131dcfc42..10cc90d5b1 100644 --- a/docs/14-Unsupervised-Learning/14.4-Self-Organizing-Maps.md +++ b/docs/14-Unsupervised-Learning/14.4-Self-Organizing-Maps.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-03 | +| 发布 | 2016-09-30 | | 更新 | 2019-08-04 12:28:35| | 状态 | Done | diff --git a/docs/14-Unsupervised-Learning/14.5-Principal-Components-Curves-and-Surfaces.md b/docs/14-Unsupervised-Learning/14.5-Principal-Components-Curves-and-Surfaces.md index 26bf7ac8df..a996cff850 100644 --- a/docs/14-Unsupervised-Learning/14.5-Principal-Components-Curves-and-Surfaces.md +++ b/docs/14-Unsupervised-Learning/14.5-Principal-Components-Curves-and-Surfaces.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=553) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 
2016-10-21 | +| 发布 | 2016-11-23 | |更新|2020-05-15 15:46:43| |状态| Done| diff --git a/docs/14-Unsupervised-Learning/14.6-Non-negative-Matrix-Factorization.md b/docs/14-Unsupervised-Learning/14.6-Non-negative-Matrix-Factorization.md index c34eadb7fc..85c6ae979d 100644 --- a/docs/14-Unsupervised-Learning/14.6-Non-negative-Matrix-Factorization.md +++ b/docs/14-Unsupervised-Learning/14.6-Non-negative-Matrix-Factorization.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-03 | +| 发布 | 2016-09-30 | |更新|2019-09-01 23:58:46| |状态|Done| diff --git a/docs/14-Unsupervised-Learning/14.7-Independent-Component-Analysis-and-Exploratory-Projection-Pursuit.md b/docs/14-Unsupervised-Learning/14.7-Independent-Component-Analysis-and-Exploratory-Projection-Pursuit.md index 05e59acfa4..9f04f0dfa9 100644 --- a/docs/14-Unsupervised-Learning/14.7-Independent-Component-Analysis-and-Exploratory-Projection-Pursuit.md +++ b/docs/14-Unsupervised-Learning/14.7-Independent-Component-Analysis-and-Exploratory-Projection-Pursuit.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=576) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-03 | +| 发布 | 2016-09-30 | |更新 |2020-05-15 23:06:55| |状态|Done| diff --git a/docs/14-Unsupervised-Learning/14.8-Multidimensional-Scaling.md b/docs/14-Unsupervised-Learning/14.8-Multidimensional-Scaling.md index c3690f20b5..84e3102a68 100644 --- a/docs/14-Unsupervised-Learning/14.8-Multidimensional-Scaling.md +++ b/docs/14-Unsupervised-Learning/14.8-Multidimensional-Scaling.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=589) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-04 | +| 发布 | 2016-09-30 | |更新|2020-01-17 21:27:12| |状态|Done| diff --git a/docs/14-Unsupervised-Learning/14.9-Nonlinear-Dimension-Reduction-and-Local-Multidimensional-Scaling.md b/docs/14-Unsupervised-Learning/14.9-Nonlinear-Dimension-Reduction-and-Local-Multidimensional-Scaling.md index 36b066a973..7599e6fbfd 100644 --- a/docs/14-Unsupervised-Learning/14.9-Nonlinear-Dimension-Reduction-and-Local-Multidimensional-Scaling.md +++ b/docs/14-Unsupervised-Learning/14.9-Nonlinear-Dimension-Reduction-and-Local-Multidimensional-Scaling.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-04 | +| 发布 | 2016-09-30 | |更新| 2020-02-20 15:46:34| |状态|Done| diff --git a/docs/14-Unsupervised-Learning/Bibliographic-Notes.md b/docs/14-Unsupervised-Learning/Bibliographic-Notes.md index 366a83bf56..41cd265f9c 100644 --- a/docs/14-Unsupervised-Learning/Bibliographic-Notes.md +++ b/docs/14-Unsupervised-Learning/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-04 | +| 发布 | 2017-09-04 | |状态|Done| 1. 关于聚类的有很多书,包括Hartigan (1975)[^1], Gordon (1999)[^2] 和 Kaufman and Rousseeuw (1990)[^3]. 
diff --git a/docs/15-Random-Forests/15.1-Introduction.md b/docs/15-Random-Forests/15.1-Introduction.md index 48ddc718f1..741b29919f 100644 --- a/docs/15-Random-Forests/15.1-Introduction.md +++ b/docs/15-Random-Forests/15.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-28 | +| 发布 | 2017-02-08 | | 更新 | 2020-01-09 10:05:13| | 状态 | Done| diff --git a/docs/15-Random-Forests/15.2-Definition-of-Random-Forests.md b/docs/15-Random-Forests/15.2-Definition-of-Random-Forests.md index 9cefd1b52d..6b2bd545a7 100644 --- a/docs/15-Random-Forests/15.2-Definition-of-Random-Forests.md +++ b/docs/15-Random-Forests/15.2-Definition-of-Random-Forests.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=606) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-05-17 | +| 发布 | 2017-06-09 | |更新| 2019-04-19 20:38:52| |状态|Done| diff --git a/docs/15-Random-Forests/15.3-Details-of-Random-Forests.md b/docs/15-Random-Forests/15.3-Details-of-Random-Forests.md index 11732b04c4..c83b9dca4e 100644 --- a/docs/15-Random-Forests/15.3-Details-of-Random-Forests.md +++ b/docs/15-Random-Forests/15.3-Details-of-Random-Forests.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=611) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2018-01-02 | +| 发布 | 2017-06-09 | | 更新 | 2020-02-27 21:30:29| | 状态 | Done| diff --git a/docs/15-Random-Forests/15.4-Analysis-of-Random-Forests.md b/docs/15-Random-Forests/15.4-Analysis-of-Random-Forests.md index 6cd472a40d..aab666f239 100644 --- a/docs/15-Random-Forests/15.4-Analysis-of-Random-Forests.md +++ b/docs/15-Random-Forests/15.4-Analysis-of-Random-Forests.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2018-01-03 | +| 发布 | 2018-01-03 | 这一节我们分析随机森林所采用的额外随机化的作用机制.为了便于讨论,我们集中在回归和平方误差损失,因为这会得到主要结论,而0-1损失下的偏差和方差都会更复杂(见7.3.1节).另外,甚至在分类问题的情形中,我们可以将随机森林的平均看成是类别后验概率的估计,因为偏差和方差都是合适的描述指标. diff --git a/docs/15-Random-Forests/Bibliographic-Notes.md b/docs/15-Random-Forests/Bibliographic-Notes.md index 8b795c0e4e..6807e559b8 100644 --- a/docs/15-Random-Forests/Bibliographic-Notes.md +++ b/docs/15-Random-Forests/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-04 | +| 发布 | 2017-09-04 | 这里讨论的随机森林是Breiman(2001)[^1]提出来的,尽管很多想法很早以前就以不同的形式出现了.值得一提的是,Ho(1995)[^2]提出“random forest”的概念,并且采用了在随机的特征子空间中增长树的做法.采用随机排列和平均来避免过拟合是由Kleinberg (1990)[^3]提出来的,最后出现在Kleinberg (1996)[^4]中.Amit and Geman (1997)[^5]采用在图象特征中增长随机树来处理图象分类问题.Breiman (1996a)[^6]提出了bagging,这是他的随机森林的先驱.Dietterich (2000b)[^7]也提出采用额外的随机化来提高bagging的性能.他的方法是在每个结点处对前20个候选分割排序,接着随机从中选择.他通过仿真和实际例子展示了额外的随机化能够提高bagging的性能.Friedman and Hall (2007)[^8]证明了子采样(不放回)是bagging的一个有效的替代方案.他们证明在大小为$N/2$的样本上生长和平均树近似等于bagging(考虑偏差及方差),而采用更小的样本则能进一步降低方差(通过去相关处理).
diff --git a/docs/16-Ensemble-Learning/16.1-Introduction.md b/docs/16-Ensemble-Learning/16.1-Introduction.md index 2037cbb8c7..06b9eb2721 100644 --- a/docs/16-Ensemble-Learning/16.1-Introduction.md +++ b/docs/16-Ensemble-Learning/16.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=624) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-01-29 | +| 发布 | 2017-02-08 | | 更新 |2018-07-18, 2018-08-18| | 状态| Done| diff --git a/docs/16-Ensemble-Learning/16.2-Boosting-and-Regularization-Paths.md b/docs/16-Ensemble-Learning/16.2-Boosting-and-Regularization-Paths.md index fe61ebf633..08ef3c7122 100644 --- a/docs/16-Ensemble-Learning/16.2-Boosting-and-Regularization-Paths.md +++ b/docs/16-Ensemble-Learning/16.2-Boosting-and-Regularization-Paths.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=626) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-09 | +| 发布 | 2017-03-10 | | 更新 | 2018-07-19| |状态|Done| diff --git a/docs/16-Ensemble-Learning/16.3-Learning-Ensembles.md b/docs/16-Ensemble-Learning/16.3-Learning-Ensembles.md index c9b9c41fb5..043d75636f 100644 --- a/docs/16-Ensemble-Learning/16.3-Learning-Ensembles.md +++ b/docs/16-Ensemble-Learning/16.3-Learning-Ensembles.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=635) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-25 | +| 发布 | 2016-09-30 | | 更新2020-01-13 19:19:38| |状态|Done| diff --git a/docs/16-Ensemble-Learning/Bibliographic-Notes.md b/docs/16-Ensemble-Learning/Bibliographic-Notes.md index b25cbd89ca..1cb3dd0f6f 100644 --- a/docs/16-Ensemble-Learning/Bibliographic-Notes.md +++ b/docs/16-Ensemble-Learning/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-09-09 | +| 发布 | 2017-06-09 | |更新|2018-08-18| |状态|Done| diff --git a/docs/17-Undirected-Graphical-Models/17.1-Introduction.md b/docs/17-Undirected-Graphical-Models/17.1-Introduction.md index f489263a7f..d328563622 100644 --- a/docs/17-Undirected-Graphical-Models/17.1-Introduction.md +++ b/docs/17-Undirected-Graphical-Models/17.1-Introduction.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=644) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-24:2017-02-24 | +| 发布 | 2017-02-24 | | 更新 | 2017-08-26; 2018-04-30; 2018-06-10 | | 状态 | Done | diff --git a/docs/17-Undirected-Graphical-Models/17.2-Markov-Graphs-and-Their-Properties.md b/docs/17-Undirected-Graphical-Models/17.2-Markov-Graphs-and-Their-Properties.md index b082343883..4c296a7419 100644 --- a/docs/17-Undirected-Graphical-Models/17.2-Markov-Graphs-and-Their-Properties.md +++ b/docs/17-Undirected-Graphical-Models/17.2-Markov-Graphs-and-Their-Properties.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=646) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-24:2017-02-24 | +| 
发布 | 2016-09-30 | | 更新 |2018-06-11| |状态|Done| diff --git a/docs/17-Undirected-Graphical-Models/17.3-Undirected-Graphical-Models-for-Continuous-Variables.md b/docs/17-Undirected-Graphical-Models/17.3-Undirected-Graphical-Models-for-Continuous-Variables.md index fae24a11a4..9d53a08aa9 100644 --- a/docs/17-Undirected-Graphical-Models/17.3-Undirected-Graphical-Models-for-Continuous-Variables.md +++ b/docs/17-Undirected-Graphical-Models/17.3-Undirected-Graphical-Models-for-Continuous-Variables.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=649) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-24:2017-02-25 | +| 发布 | 2016-09-30 | |更新|2019-07-25 21:17:03| |状态|Done| diff --git a/docs/17-Undirected-Graphical-Models/17.4-Undirected-Graphical-Models-for-Discrete-Variables.md b/docs/17-Undirected-Graphical-Models/17.4-Undirected-Graphical-Models-for-Discrete-Variables.md index 8f27ce515f..883b2a4faa 100644 --- a/docs/17-Undirected-Graphical-Models/17.4-Undirected-Graphical-Models-for-Discrete-Variables.md +++ b/docs/17-Undirected-Graphical-Models/17.4-Undirected-Graphical-Models-for-Discrete-Variables.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=657) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-25:2017-02-26 | +| 发布 | 2016-09-30 | | 更新 | 2017-08-27, 2018-07-09| | 状态 |Done| diff --git a/docs/17-Undirected-Graphical-Models/Bibliographic-Notes.md b/docs/17-Undirected-Graphical-Models/Bibliographic-Notes.md index c9763f460e..0ba8e8ab78 100644 --- a/docs/17-Undirected-Graphical-Models/Bibliographic-Notes.md +++ b/docs/17-Undirected-Graphical-Models/Bibliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-02-26:2017-02-26 | +| 发布 | 2017-02-27 | 定义和理解图模型结构做了很多工作.对图模型系统的处理可以在Whittaker(1990)[^1],Lauritzen(1996)[^2],Cox and Wermuth(1996)[^3],Edwards(2000)[^4],Pearl(2000)[^5],Anderson(2003)[^6],Jordan(2004)[^7],以及Koller and Friedman(2007)[^8]中找到.Wasserman(2004)[^9]给了简短的介绍,而且Bishop(2006)[^10]的第8章给了更详细的概要.玻尔兹曼机在Ackley et al. (1985)[^11]中提出.Ripley(1996)[^12]有详细的一章来介绍与机器学习有关的图模型.我们发现这对于玻尔兹曼机的讨论特别有用. 
diff --git a/docs/18-High-Dimensional-Problems/18.1-When-p-is-Much-Bigger-than-N.md b/docs/18-High-Dimensional-Problems/18.1-When-p-is-Much-Bigger-than-N.md index a3d4ac6010..f00c2865a0 100644 --- a/docs/18-High-Dimensional-Problems/18.1-When-p-is-Much-Bigger-than-N.md +++ b/docs/18-High-Dimensional-Problems/18.1-When-p-is-Much-Bigger-than-N.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-14:2017-03-14 | +| 发布 | 2017-03-14 | | 更新 | 2019-08-17 10:52:02 | | 状态 | Done | diff --git a/docs/18-High-Dimensional-Problems/18.2-Diagonal-Linear-Discriminant-Analysis-and-Nearest-Shrunken-Centroids.md b/docs/18-High-Dimensional-Problems/18.2-Diagonal-Linear-Discriminant-Analysis-and-Nearest-Shrunken-Centroids.md index fe8d9ceb8d..c13fb82b41 100644 --- a/docs/18-High-Dimensional-Problems/18.2-Diagonal-Linear-Discriminant-Analysis-and-Nearest-Shrunken-Centroids.md +++ b/docs/18-High-Dimensional-Problems/18.2-Diagonal-Linear-Discriminant-Analysis-and-Nearest-Shrunken-Centroids.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-03-14:2017-03-14 | +| 发布 | 2017-03-14 | |更新|2019-08-17 11:41:10| |状态|Done| diff --git a/docs/18-High-Dimensional-Problems/18.3-Linear-Classifiers-with-Quadratic-Regularization.md b/docs/18-High-Dimensional-Problems/18.3-Linear-Classifiers-with-Quadratic-Regularization.md index bba4e5246b..1442416a9b 100644 --- a/docs/18-High-Dimensional-Problems/18.3-Linear-Classifiers-with-Quadratic-Regularization.md +++ b/docs/18-High-Dimensional-Problems/18.3-Linear-Classifiers-with-Quadratic-Regularization.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-04-27:2017-04-27 | +| 发布 | 2017-04-27 | |更新| 2019-08-17 17:45:41| |状态|Done| diff --git a/docs/18-High-Dimensional-Problems/18.4-Linear-Classifiers-with-L1-Regularization.md b/docs/18-High-Dimensional-Problems/18.4-Linear-Classifiers-with-L1-Regularization.md index 13d6966303..1468097b6d 100644 --- a/docs/18-High-Dimensional-Problems/18.4-Linear-Classifiers-with-L1-Regularization.md +++ b/docs/18-High-Dimensional-Problems/18.4-Linear-Classifiers-with-L1-Regularization.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf#page=680) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-04-27:2017-04-27 | +| 发布 | 2016-09-30 | |更新|2019-08-17 18:38:07| |状态|Done| diff --git a/docs/18-High-Dimensional-Problems/18.5-Classification-When-Features-are-Unavailable.md b/docs/18-High-Dimensional-Problems/18.5-Classification-When-Features-are-Unavailable.md index 3ecc9981f2..b74cf39247 100644 --- a/docs/18-High-Dimensional-Problems/18.5-Classification-When-Features-are-Unavailable.md +++ b/docs/18-High-Dimensional-Problems/18.5-Classification-When-Features-are-Unavailable.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-10 | +| 发布 | 
2017-08-10 | |更新|2019-08-05 21:00:43| |状态|Done| diff --git a/docs/18-High-Dimensional-Problems/18.6-High-Dimensional-Regression.md b/docs/18-High-Dimensional-Problems/18.6-High-Dimensional-Regression.md index caf523e2ba..dbb0b8517f 100644 --- a/docs/18-High-Dimensional-Problems/18.6-High-Dimensional-Regression.md +++ b/docs/18-High-Dimensional-Problems/18.6-High-Dimensional-Regression.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-17 | +| 发布 | 2017-08-17 | | 更新 | 2020-08-25 15:46:55 | |状态 |Done | diff --git a/docs/18-High-Dimensional-Problems/18.7-Feature-Assessment-and-the-Multiple-Testing-Problem.md b/docs/18-High-Dimensional-Problems/18.7-Feature-Assessment-and-the-Multiple-Testing-Problem.md index f402de3374..6721b15b46 100644 --- a/docs/18-High-Dimensional-Problems/18.7-Feature-Assessment-and-the-Multiple-Testing-Problem.md +++ b/docs/18-High-Dimensional-Problems/18.7-Feature-Assessment-and-the-Multiple-Testing-Problem.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-17:2017-08-19 | +| 发布 | 2017-06-09 | |更新|2017-12-29; 2018-05-17| |状态|Done| diff --git a/docs/18-High-Dimensional-Problems/Bioliographic-Notes.md b/docs/18-High-Dimensional-Problems/Bioliographic-Notes.md index fb03e432e9..90588b82b8 100644 --- a/docs/18-High-Dimensional-Problems/Bioliographic-Notes.md +++ b/docs/18-High-Dimensional-Problems/Bioliographic-Notes.md @@ -3,7 +3,7 @@ | 原文 | [The Elements of Statistical Learning](https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf) | | ---- | ---------------------------------------- | | 翻译 | szcf-weiya | -| 时间 | 2017-08-29 | +| 发布 | 2017-06-09 | 许多文献已经在本章的特定地方给出来了;这里我们在列出另外的一些文献.Dudoit et al. (2002a)[^1]给出了对基因表达数据的判别分析方法的概述及比较.Levina (2002)[^2] 做了一些数学分析在$p, N\rightarrow \infty, p>N$的情况下比较对角LDA和全LDA.她证明了在合理的假设下,对角LDA比全LDA有更低的极限误差率.Tibshirani et al. (2001a)[^3]和 Tibshirani et al. (2003)[^4] 提出了最近收缩中心分类器.Zhu and Hastie (2004)[^5]研究了正则化逻辑斯蒂回归.高维回归和lasso是非常活跃的研究领域,许多的文献在3.8.5节给出.Tibshirani et al. (2005)[^6]提出fused lasso,而Zou and Hastie (2005)[^7]提出弹性网.Bair and Tibshirani (2004)[^8]和 Bair et al. (2006)[^9] 中讨论了有监督的主成分.关于censored survival data分析的介绍,参见Kalbfleisch and Prentice (1980)[^10]. 
diff --git a/docs/Preface/2016-07-20-Preface-to-the-Second-Edition.md b/docs/Preface/2016-07-20-Preface-to-the-Second-Edition.md index 852d98f187..1d0bbd0368 100644 --- a/docs/Preface/2016-07-20-Preface-to-the-Second-Edition.md +++ b/docs/Preface/2016-07-20-Preface-to-the-Second-Edition.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](../book/The Elements of Statistical Learning.pdf) ---|--- 翻译 | szcf-weiya -时间 | 2016-07-20 + 发布 | 2016-09-30 更新 | 2018-02-14 状态| Done diff --git a/docs/Preface/2016-07-21-Preface-to-the-First-Edition.md b/docs/Preface/2016-07-21-Preface-to-the-First-Edition.md index 85102bd018..91ba60a191 100644 --- a/docs/Preface/2016-07-21-Preface-to-the-First-Edition.md +++ b/docs/Preface/2016-07-21-Preface-to-the-First-Edition.md @@ -3,7 +3,7 @@ 原文 | [The Elements of Statistical Learning](../book/The Elements of Statistical Learning.pdf) ---|--- 翻译 | szcf-weiya -时间 | 2016-07-21 + 发布 | 2016-09-30 更新 | 2018-02-14 状态| Done diff --git a/docs/eqtags.sh b/docs/eqtags.sh index 9b27b20ed1..7c80815f84 100644 --- a/docs/eqtags.sh +++ b/docs/eqtags.sh @@ -14,3 +14,9 @@ find . -regextype sed -regex "./[0-9]\{2\}-.*.md" | xargs sed -i "s/。/./g" # https://unix.stackexchange.com/questions/67192/find-command-with-regex-quantifier-e-g-1-2 find . -regextype egrep -regex "./[0-9]{2}-.*.md" find . -regextype egrep -regex "./[0-9]{2}-.*.md" | xargs sed -i "s\../book/The Elements of Statistical Learning.pdf\https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf\g" + +for file in $(find . -regex "./.*\.md"); do + first=$(git log --date=short --format=%ad --follow $file | tail -1) + echo $file $first + sed -i "s/^\(|*\)[ ]*时间[ ]*|[^|]*\(|*\)[ ]*$/\1 发布 | $first \2/g" $file +done \ No newline at end of file diff --git a/docs/notes/BS/bs.md b/docs/notes/BS/bs.md index 64fcebb293..09a22606a2 100644 --- a/docs/notes/BS/bs.md +++ b/docs/notes/BS/bs.md @@ -3,7 +3,7 @@ | 原文 | [第五章附录:B 样条的计算](../../05-Basis-Expansions-and-Regularization/Appendix-Computations-for-B-splines.md) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2020-05-14 16:25:33 | +| 发布 | 2020-05-14 | 在 Functional Data Analysis (FDA) 中,很重要的一步便是对原始数据进行光滑化,其中经常用到 B spline,而 FDA 有个很强大的 R package `fda`,里面就包含 B 样条的实现方法。 diff --git a/docs/notes/Graph/alg-17-1.md b/docs/notes/Graph/alg-17-1.md index a4ec6c81bd..98f9e6ece9 100644 --- a/docs/notes/Graph/alg-17-1.md +++ b/docs/notes/Graph/alg-17-1.md @@ -3,7 +3,7 @@ | 原文 | [17.3 连续变量的无向图模型](../../17-Undirected-Graphical-Models/17.3-Undirected-Graphical-Models-for-Continuous-Variables/index.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-06-15 | +| 发布 | 2018-06-15 | | 更新 | 2018-07-08| 这篇笔记记录了用 R 语言实现算法 17.1 并应用到实际数据的具体过程。 diff --git a/docs/notes/Graph/ex-17-7.md b/docs/notes/Graph/ex-17-7.md index 1a9a582169..76ff845f9e 100644 --- a/docs/notes/Graph/ex-17-7.md +++ b/docs/notes/Graph/ex-17-7.md @@ -3,7 +3,7 @@ | 原文 | [Issue 138: Ex. 17.7](https://github.com/szcf-weiya/ESL-CN/issues/138) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-07-08 | +| 发布 | 2018-07-08 | ## 问题描述 diff --git a/docs/notes/HighDim/sim18_1.md b/docs/notes/HighDim/sim18_1.md index 22e7f559bf..f0961b1d36 100644 --- a/docs/notes/HighDim/sim18_1.md +++ b/docs/notes/HighDim/sim18_1.md @@ -3,7 +3,7 @@ | R Notebook | [模拟:Fig. 
13.5](http://rmd.hohoweiya.xyz/ex18_1.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2017-12-29 | +| 发布 | 2017-12-29 | | 更新 | 2018-02-04 | 本笔记是[ESL18.1节](https://esl.hohoweiya.xyz/18%20High-Dimensional%20Problems/18.1%20When%20p%20is%20Much%20Bigger%20than%20N/index.html)例子的模拟。 diff --git a/docs/notes/ICA/sim14_42.md b/docs/notes/ICA/sim14_42.md index e36abcde66..de66dd9596 100644 --- a/docs/notes/ICA/sim14_42.md +++ b/docs/notes/ICA/sim14_42.md @@ -3,7 +3,7 @@ | R Notebook | [模拟:Fig. 14.42](http://rmd.hohoweiya.xyz/sim_14_42.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-01-22 | +| 发布 | 2018-01-23 | | 更新 | 2018-02-04 | 本笔记是[ESL14.7节](https://esl.hohoweiya.xyz/14%20Unsupervised%20Learning/14.7%20Independent%20Component%20Analysis%20and%20Exploratory%20Projection%20Pursuit/index.html)图14.42的模拟过程。第一部分将以`ProDenICA`法为例试图介绍ICA的整个计算过程;第二部分将比较`ProDenICA`、`FastICA`以及`KernelICA`这种方法,试图重现图14.42。 diff --git a/docs/notes/LDA/sim-4-3.md b/docs/notes/LDA/sim-4-3.md index 874798d20a..5b27a7662f 100644 --- a/docs/notes/LDA/sim-4-3.md +++ b/docs/notes/LDA/sim-4-3.md @@ -3,7 +3,7 @@ | 正文 | [4.3 线性判别分析](../../04-Linear-Methods-for-Classification/4.2-Linear-Regression-of-an-Indicator-Matrix/index.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-07-11 | +| 发布 | 2018-07-11 | ## 生成数据 diff --git a/docs/notes/Mixture-Gaussian.md b/docs/notes/Mixture-Gaussian.md index 8d8bb626e9..318e6d3d45 100644 --- a/docs/notes/Mixture-Gaussian.md +++ b/docs/notes/Mixture-Gaussian.md @@ -3,7 +3,7 @@ | 博客 | [cnblogs](http://www.cnblogs.com/szcf715/p/8127416.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2017-12-27 | +| 发布 | 2018-01-26 | |更新| 2018-02-04| 对于如下的两类别的高斯混合模型 diff --git a/docs/notes/ModelSelection/sim7_3.md b/docs/notes/ModelSelection/sim7_3.md index f6ff5ed32b..baf3542f94 100644 --- a/docs/notes/ModelSelection/sim7_3.md +++ b/docs/notes/ModelSelection/sim7_3.md @@ -3,7 +3,7 @@ | R Notebook | [模拟:Fig. 7.3](http://rmd.hohoweiya.xyz/sim7_3.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-01-05 | +| 发布 | 2018-01-06 | | 更新 | 2018-02-04 | 本笔记是[ESL7.3节](https://esl.hohoweiya.xyz/07%20Model%20Assessment%20and%20Selection/7.3%20The%20Bias-Variance%20Decomposition/index.html)图7.3的模拟。 diff --git a/docs/notes/ModelSelection/sim7_7.md b/docs/notes/ModelSelection/sim7_7.md index f56d51d322..0c0ac168f0 100644 --- a/docs/notes/ModelSelection/sim7_7.md +++ b/docs/notes/ModelSelection/sim7_7.md @@ -3,7 +3,7 @@ | R Notebook | [模拟:Fig. 7.7](http://rmd.hohoweiya.xyz/sim7_7.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-01-05 | +| 发布 | 2018-01-06 | | 更新 | 2018-02-04; 2018-02-21 | 本笔记是[ESL7.9节](https://esl.hohoweiya.xyz/07%20Model%20Assessment%20and%20Selection/7.9%20Vapnik-Chervonenkis%20Dimension/index.html)图7.9的模拟。 diff --git a/docs/notes/ModelSelection/sim7_9.md b/docs/notes/ModelSelection/sim7_9.md index 4eacaec7a4..d539bdc118 100644 --- a/docs/notes/ModelSelection/sim7_9.md +++ b/docs/notes/ModelSelection/sim7_9.md @@ -3,7 +3,7 @@ | 原文 | [7.10 交叉验证](../../07-Model-Assessment-and-Selection/7.10-Cross-Validation/index.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-12-14 | +| 发布 | 2018-12-14 | 本笔记是 [ESL 7.10 交叉验证](../../07-Model-Assessment-and-Selection/7.10-Cross-Validation/index.html) 中图 7.9 的模拟,这也是 [模拟:Fig. 
7.3](sim7_3/index.html) 的继续。 diff --git a/docs/notes/Prototype/sim13_5.md b/docs/notes/Prototype/sim13_5.md index dcfe666362..da9d3ff23b 100644 --- a/docs/notes/Prototype/sim13_5.md +++ b/docs/notes/Prototype/sim13_5.md @@ -3,7 +3,7 @@ | R Notebook | [模拟:Fig. 13.5](http://rmd.hohoweiya.xyz/sim13_5.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-02-04 | +| 发布 | 2018-02-03 | ## 问题重述 diff --git a/docs/notes/SVM/e1071.md b/docs/notes/SVM/e1071.md index 1c7d23a269..8083cd5bf2 100644 --- a/docs/notes/SVM/e1071.md +++ b/docs/notes/SVM/e1071.md @@ -3,7 +3,7 @@ | Stats Blog | [Illustrations of Support Vector Machines](https://stats.hohoweiya.xyz//2017/05/18/Support-Vector-Classifier/) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-02-12 | +| 发布 | 2018-02-12 | | 更新 | 2018-03-19 | diff --git a/docs/notes/linear-reg/sim-3-18.md b/docs/notes/linear-reg/sim-3-18.md index 7bcdf2591a..5ca517b89a 100644 --- a/docs/notes/linear-reg/sim-3-18.md +++ b/docs/notes/linear-reg/sim-3-18.md @@ -3,7 +3,7 @@ | 原文 | [3.6 选择和收缩方法的比较](../../03-Linear-Methods-for-Regression/3.6-A-Comparison-of-the-Selection-and-Shrinkage-Methods/index.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-03-25 | +| 发布 | 2018-03-25 | 首先生成数据,然后通过最小二乘、岭回归、lasso、主成分回归、偏最小二乘以及最优子集回归这六种方法计算各自的 $\beta_1$ 和 $\beta_2$,从而绘制曲线 diff --git a/docs/notes/manual.md b/docs/notes/manual.md index ed1417e9fc..c49a9a3939 100644 --- a/docs/notes/manual.md +++ b/docs/notes/manual.md @@ -2,7 +2,7 @@ | 作者 | szcf-weiya | | ---- | ---------------------------------------- | -| 时间 | 2018-04-28 | +| 发布 | 2018-04-28 | 习题及其解答都放在本项目的 Issue 中,不定期更新,欢迎批评指正。下面分别根据关键词和章节对这些习题解答进行索引。 diff --git a/docs/notes/missing-data/missing-data.md b/docs/notes/missing-data/missing-data.md index 754b7c46e4..f37c5d22db 100644 --- a/docs/notes/missing-data/missing-data.md +++ b/docs/notes/missing-data/missing-data.md @@ -3,7 +3,7 @@ | 原文 | [9.6 缺失数据](../../09-Additive-Models-Trees-and-Related-Methods/9.6-Missing-Data/index.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-03-17 | +| 发布 | 2018-03-17 | 本文主要参考 [R in Action](../../references/r-in-action-en.pdf) 一书中的第 15 章,限于篇幅,这里仅列出主要代码,以及作必要简短的说明,更多细节详见原书。 diff --git a/docs/notes/sim73.md b/docs/notes/sim73.md index 15a3d9f4bf..4faee2285b 100644 --- a/docs/notes/sim73.md +++ b/docs/notes/sim73.md @@ -8,7 +8,7 @@ | R Notebook | [模拟图7.3](http://rmd.hohoweiya.xyz/sim7_3.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2017-01-05 | +| 发布 | 2018-01-23 | | 更新 | 2018-02-04 | 本笔记是[ESL7.3节](https://esl.hohoweiya.xyz/07%20Model%20Assessment%20and%20Selection/7.3%20The%20Bias-Variance%20Decomposition/index.html)图7.3的模拟。 diff --git a/docs/notes/spline/sim-5-9.md b/docs/notes/spline/sim-5-9.md index 25530949a9..f4275aaab2 100644 --- a/docs/notes/spline/sim-5-9.md +++ b/docs/notes/spline/sim-5-9.md @@ -3,7 +3,7 @@ | 原文 | [5.5 光滑参数的自动选择](../../05-Basis-Expansions-and-Regularization/5.5-Automatic-Selection-of-the-Smoothing-Parameters/index.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-03-30 | +| 发布 | 2018-03-30 | ## 生成数据 diff --git a/docs/notes/tree/sim-9-7.md b/docs/notes/tree/sim-9-7.md index e4c969ed9c..4b14651707 100644 --- a/docs/notes/tree/sim-9-7.md +++ b/docs/notes/tree/sim-9-7.md @@ -3,7 +3,7 @@ | 原文 | [9.3 PRIM](../../09-Additive-Models-Trees-and-Related-Methods/9.3-PRIM/index.html) | | ---- | 
---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-03-17 | +| 发布 | 2018-03-17 | ## 生成数据 diff --git a/docs/notes/tree/various-classification-methods.md b/docs/notes/tree/various-classification-methods.md index 9837b3e6b7..7bda7b6840 100644 --- a/docs/notes/tree/various-classification-methods.md +++ b/docs/notes/tree/various-classification-methods.md @@ -3,7 +3,7 @@ | 原文 | [9.2 基于树的方法](../../09%20Additive%20Models,%20Trees,%20and%20Related%20Methods/9.2%20Tree-Based%20Methods(CART)/index.html) | | ---- | ---------------------------------------- | | 作者 | szcf-weiya | -| 时间 | 2018-03-16 | +| 发布 | 2018-03-15 | 构造分类问题的决策树有多种算法,对于 R 语言来说,如