Difference Between a Batch and an Epoch in a Neural Network
Last Updated on August 15, 2022 Stochastic gradient descent is a learning algorithm that has a number of hyperparameters. Two hyperparameters that often confuse beginners are the batch size and the number of epochs. Both are integer values and seem to do the same thing. In this post, you will discover …
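The relationship between the two hyperparameters can be sketched with a little arithmetic: the batch size sets how many samples are processed before each weight update, while an epoch is one full pass over the training set. A minimal sketch, assuming an illustrative dataset of 1,000 samples, a batch size of 32, and 10 epochs (these values are examples, not from the post):

```python
import math

n_samples = 1000   # size of the training set (assumed for illustration)
batch_size = 32    # samples processed before each weight update
epochs = 10        # full passes over the training set

# One epoch consists of ceil(n_samples / batch_size) batches;
# the final batch may be smaller than batch_size.
batches_per_epoch = math.ceil(n_samples / batch_size)
total_updates = batches_per_epoch * epochs

print(batches_per_epoch)  # 32 batches per epoch (the last batch has only 8 samples)
print(total_updates)      # 320 weight updates over the whole training run
```

Note the asymmetry: the batch size controls the granularity of each gradient estimate, while the number of epochs controls how many times the algorithm revisits the same data.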