Epoch, Batch, and Iteration in Machine Learning


Rijal Ghodi

Posted on November 22, 2024


Background

While learning about Artificial Neural Networks (ANNs), I often ran into confusion around the terms epoch, batch, and iteration, and I still mix them up sometimes. Let's tackle these concepts together!

Definition

  1. Epoch:
    • Definition: An epoch refers to one complete pass of the model through all the available training data.
    • Example: If you have a pack of 52 playing cards, one epoch is one complete pass in which you go through every card once.
  2. Batch:
    • Definition: A batch is a subset of the training data used in one iteration of updating model parameters.
    • Example: Continuing with the playing card analogy, if you divide the cards into 4 stacks and learn the cards stack by stack, you will have 4 batches, each containing 13 cards.
  3. Iteration:
    • Definition: An iteration refers to one step of training in which the model processes a single batch of data and updates its parameters.
    • Example: Continuing with the playing card analogy, you would need 4 iterations (one per stack) to finish one epoch. The loop sketch after this list shows how the three terms fit together.
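
To see how the three terms relate, here is a minimal sketch in plain Python that follows the card analogy. Note that `training_data` simply stands in for the deck and `update_parameters` is a hypothetical placeholder, not a real function.

```python
# A rough sketch of where epoch, batch, and iteration show up in a training loop.
training_data = list(range(52))   # the "deck" of 52 examples
batch_size = 13                   # 52 / 13 = 4 batches per epoch, as in the analogy
num_epochs = 3                    # how many complete passes over the data

iteration = 0
for epoch in range(num_epochs):                          # one epoch = one full pass over the deck
    for start in range(0, len(training_data), batch_size):
        batch = training_data[start:start + batch_size]  # one batch = a 13-card stack
        iteration += 1                                    # processing one batch = one iteration
        # update_parameters(batch)  # hypothetical parameter update would happen here

print(iteration)  # 12 total iterations: 4 batches per epoch * 3 epochs
```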

Relationship

  • In each epoch, the model goes through every batch exactly once, and processing one batch is one iteration, so the number of iterations per epoch equals the number of batches.
  • For example, if you have 100 examples and use a batch size of 20, you'd have 5 iterations per epoch, as the quick calculation below shows.
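
In code, that relationship is just a division (rounded up when the last batch is smaller than the rest); a quick check with the numbers above:

```python
import math

num_examples = 100
batch_size = 20
iterations_per_epoch = math.ceil(num_examples / batch_size)
print(iterations_per_epoch)  # 5 iterations (one per batch) in each epoch
```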

Let's see how these terms are used in a typical ANN training run:

  • Let's say you have 1000 training data points representing house area versus price. You want to build an Artificial Neural Network (ANN) to predict the price of an unseen house from its area. To achieve this, you need to determine the parameters (typically a weight and a bias) that establish the relationship between house area and price.
  • Because handling 1000 training data points at once is resource-intensive, you divide them into 20 batches. Each batch consists of 50 training data points (1000 / 20 = 50), so the batch size is 50.
  • First, initialize the model with random parameters.
  • Next, use the first batch to calculate the loss and update the model parameters accordingly.
  • Then, with the updated parameters from the previous batch, proceed to calculate the loss and update the model parameters using the second batch.
  • Continue this process iteratively until you process the final batch.
  • Congratulations! After completing 20 iterations, you have finished one epoch. In practice, you would repeat this process for several epochs until the loss stops improving, leaving you with model parameters that capture the relationship between house area and price. The sketch below turns this walkthrough into code.
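
Below is a minimal sketch of this walkthrough using NumPy and plain mini-batch gradient descent on a simple linear model (one weight and one bias). The synthetic data, learning rate, and number of epochs are illustrative assumptions, not values from any particular library or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 1000 house areas (in hundreds of square meters) and noisy prices.
num_examples = 1000
areas = rng.uniform(0.5, 5.0, size=num_examples)
prices = 3.0 * areas + 1.0 + rng.normal(0.0, 0.1, size=num_examples)

batch_size = 50        # 1000 / 50 = 20 batches, so 20 iterations per epoch
num_epochs = 10        # in practice, you repeat the pass several times
learning_rate = 0.05

# Step 1: initialize the model (price ~ weight * area + bias) with random parameters.
weight = rng.normal()
bias = rng.normal()

for epoch in range(num_epochs):              # one epoch = one pass over all 1000 points
    indices = rng.permutation(num_examples)  # shuffle so batches differ between epochs
    for start in range(0, num_examples, batch_size):
        batch_idx = indices[start:start + batch_size]
        x, y = areas[batch_idx], prices[batch_idx]  # one batch of 50 points

        # Forward pass: predict and measure the mean squared error loss.
        predictions = weight * x + bias
        error = predictions - y
        loss = np.mean(error ** 2)

        # Backward pass: gradients of the loss with respect to weight and bias.
        grad_weight = 2.0 * np.mean(error * x)
        grad_bias = 2.0 * np.mean(error)

        # Parameter update: this whole block, from forward pass to update, is one iteration.
        weight -= learning_rate * grad_weight
        bias -= learning_rate * grad_bias

    print(f"epoch {epoch + 1}: loss on last batch = {loss:.4f}")
```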

Closing

A clear understanding of these concepts will help you use the terms correctly when training your Artificial Neural Network. Epochs, batches, and iterations work together to form an efficient and effective training process for your model. I hope this explanation clears up any confusion you had about these terms!
