The difference between epoch, iteration, and batch size


One epoch = one forward pass plus one backward pass over the entire training dataset.

Batch size = the number of training samples used in a single forward/backward pass.

Iterations = the number of passes, where one forward pass plus one backward pass counts as one pass.

Example: if the training set contains 1000 samples and the batch size is set to 500, then completing one epoch requires 2 iterations.
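As a rough illustration only (a minimal sketch with placeholder data; the variable names and loop structure are assumptions, not taken from the original post), the relationship can be written out in a few lines of Python:

```python
import math

# Numbers taken from the example above.
num_samples = 1000   # size of the training set
batch_size = 500     # samples used per forward/backward pass

# One epoch needs this many iterations (round up if the last batch is smaller).
iterations_per_epoch = math.ceil(num_samples / batch_size)
print(iterations_per_epoch)  # -> 2

# Minimal training-loop skeleton; the dataset and the pass itself are placeholders.
dataset = list(range(num_samples))
num_epochs = 3

for epoch in range(num_epochs):               # one epoch = a full pass over the data
    for i in range(iterations_per_epoch):     # one iteration = one batch
        batch = dataset[i * batch_size:(i + 1) * batch_size]
        # forward pass + backward pass on `batch` would go here
```

Running the snippet prints 2, matching the example: 1000 samples divided into batches of 500 means two iterations per epoch.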

Reference: https://stackoverflow.com/questions/4752626/epoch-vs-iteration-when-training-neural-networks
