From 17976fcbc351196c043ef2cc8d31ea641309863d Mon Sep 17 00:00:00 2001
From: Jingwen ZHENG
Date: Fri, 12 Jul 2019 23:26:30 +0200
Subject: [PATCH] Update table 2.12.8

---
 .../Chapter 2_TheBasisOfMachineLearning.md                      | 2 +-
 ...\231\250\345\255\246\344\271\240\345\237\272\347\241\200.md" | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/English version/ch02_MachineLearningFoundation/Chapter 2_TheBasisOfMachineLearning.md b/English version/ch02_MachineLearningFoundation/Chapter 2_TheBasisOfMachineLearning.md
index 4ec840d8..857150e9 100644
--- a/English version/ch02_MachineLearningFoundation/Chapter 2_TheBasisOfMachineLearning.md
+++ b/English version/ch02_MachineLearningFoundation/Chapter 2_TheBasisOfMachineLearning.md
@@ -719,7 +719,7 @@ $$
 The table below briefly compares the difference between stochastic gradient descent (SGD), batch gradient descent (BGD), small batch gradient descent (mini-batch GD), and online GD:
 
 || BGD | SGD | Mini-batch GD | Online GD |
-|:----:|:---:|:-------------:|:---------:|
+|:--:|:----:|:---:|:-------------:|:---------:|
 | training set | fixed | fixed | fixed | real-time update |
 |Single iteration sample number | Whole training set | Single sample | Subset of training set | According to specific algorithm |
 | Algorithm Complexity | High | Low | General | Low |
diff --git "a/ch02_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200/\347\254\254\344\272\214\347\253\240_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200.md" "b/ch02_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200/\347\254\254\344\272\214\347\253\240_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200.md"
index ead57c05..abe810a7 100644
--- "a/ch02_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200/\347\254\254\344\272\214\347\253\240_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200.md"
+++ "b/ch02_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200/\347\254\254\344\272\214\347\253\240_\346\234\272\345\231\250\345\255\246\344\271\240\345\237\272\347\241\200.md"
@@ -795,7 +795,7 @@ $$
 ​ 下表简单对比随机梯度下降(SGD)、批量梯度下降(BGD)、小批量梯度下降(Mini-batch GD)、和Online GD的区别:
 
 ||BGD|SGD|Mini-batch GD|Online GD|
-|:--:|:-:|:-----------:|:-------:|
+|:--:|:--:|:-:|:-----------:|:-------:|
 |训练集|固定|固定|固定|实时更新|
 |单次迭代样本数|整个训练集|单个样本|训练集的子集|根据具体算法定|
 |算法复杂度|高|低|一般|低|
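
For context, the three fixed-training-set variants compared in the patched table differ only in how many samples feed each parameter update: the whole set (BGD), one sample (SGD), or a subset (mini-batch GD). Below is a minimal NumPy sketch of those three update rules for a linear least-squares model; the function names, learning rate, and batch size are illustrative assumptions, not taken from the book or from this patch.

```python
import numpy as np

def gradient(w, X, y):
    # Gradient of the mean squared error 0.5 * mean((X @ w - y)^2) for a linear model.
    return X.T @ (X @ w - y) / len(y)

def bgd_step(w, X, y, lr):
    # Batch GD: a single update uses the whole (fixed) training set.
    return w - lr * gradient(w, X, y)

def sgd_step(w, X, y, lr, rng):
    # SGD: a single update uses one randomly chosen sample.
    i = rng.integers(len(y))
    return w - lr * gradient(w, X[i:i + 1], y[i:i + 1])

def minibatch_gd_step(w, X, y, lr, batch_size, rng):
    # Mini-batch GD: a single update uses a random subset of the training set.
    idx = rng.choice(len(y), size=min(batch_size, len(y)), replace=False)
    return w - lr * gradient(w, X[idx], y[idx])

# Toy run: the same fixed training set feeds all three variants.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=256)
w = np.zeros(3)
for _ in range(1000):
    w = minibatch_gd_step(w, X, y, lr=0.1, batch_size=32, rng=rng)
print(w)  # converges towards [1.0, -2.0, 0.5]
```

Online GD is left out of the sketch because, per the table, its training data is updated in real time (samples arrive as a stream) rather than drawn from a fixed set.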