Understanding and Implementing Recurrent Networks (RNNs) from Scratch in Python

Kuth

Posted on March 9, 2024

AI is one of the most talked-about topics across industries today, and it is being put to many different uses. This article is about a powerful class of neural networks: RNNs.

What are Recurrent Neural Networks (RNNs)?

Recurrent Neural Networks (RNNs) are a powerful class of neural networks well suited to processing sequence data, making them invaluable in natural language processing (NLP), time series analysis, and more. In this tutorial, we'll delve into the fundamentals of RNNs and implement a basic version from scratch in Python. By the end, you'll have a solid understanding of how RNNs work and how to build one on your own.

Knowledge and Tools

  • Basic knowledge of Python
  • Familiarity with the NumPy library

Understanding Recurrent Neural Networks (RNNs):
RNNs are designed to work with sequential data, where the order of elements matters. Unlike feedforward neural networks, which process each input independently, RNNs maintain a hidden state that captures information about the sequence seen so far. This hidden state is updated at each time step from the current input and the previous hidden state, which lets RNNs model temporal dependencies in data.
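Concretely, the same update is applied at every time step: the new hidden state is computed from the current input and the previous hidden state, and the output is computed from the new hidden state. Using the same names as the implementation below (Wxh, Whh, Why are weight matrices; bh and by are bias vectors), a single step looks roughly like this:

# One time step of a vanilla RNN (illustrative sketch; the matrices are created in the class below)
h_t = np.tanh(np.dot(Wxh, x_t) + np.dot(Whh, h_prev) + bh)  # new hidden state
y_t = np.dot(Why, h_t) + by                                 # output at this time step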

Implementing an RNN from Scratch

To implement an RNN, we need to define the following components:

  1. Parameters initialization
  2. Forward pass
  3. Backpropagation through time (BPTT)

Let's get started with the implementation:

Step 1: Import the necessary libraries

import numpy as np

Step 2: Define the RNN class

class RNN:
    def __init__(self, input_size, hidden_size, output_size):
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size

        # Initialize weights with small random values
        self.Wxh = np.random.randn(hidden_size, input_size) * 0.01   # input -> hidden
        self.Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
        self.Why = np.random.randn(output_size, hidden_size) * 0.01  # hidden -> output

        # Initialize biases
        self.bh = np.zeros((hidden_size, 1))
        self.by = np.zeros((output_size, 1))

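As a quick sanity check, you can instantiate the class and inspect the weight shapes. The sizes below are arbitrary example values, not anything required by the tutorial:

# Example: 10-dimensional inputs, 32 hidden units, 5-dimensional outputs
rnn = RNN(input_size=10, hidden_size=32, output_size=5)
print(rnn.Wxh.shape, rnn.Whh.shape, rnn.Why.shape)  # (32, 10) (32, 32) (5, 32)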

Step 3: Implement the forward pass (this method goes inside the RNN class)

def forward(self, inputs, h_prev):
    # Lists to store outputs at each time step; hidden states are cached
    # on the object so that BPTT can reuse them later
    self.h_states = [h_prev]
    outputs = []

    h_next = h_prev
    for x in inputs:
        # Update hidden state from the current input and the previous hidden state
        h_next = np.tanh(np.dot(self.Wxh, x) + np.dot(self.Whh, h_prev) + self.bh)

        # Compute the output at this time step
        y = np.dot(self.Why, h_next) + self.by

        outputs.append(y)
        self.h_states.append(h_next)

        h_prev = h_next

    return outputs, h_next
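To see the forward pass in action, you can feed it a short sequence of random column vectors. This reuses the rnn object from the previous example, so the sizes are again just illustrative:

# Example: run a sequence of 4 random inputs through the RNN
seq_len = 4
inputs = [np.random.randn(10, 1) for _ in range(seq_len)]  # one column vector per time step
h0 = np.zeros((32, 1))                                     # initial hidden state

outputs, h_last = rnn.forward(inputs, h0)
print(len(outputs), outputs[0].shape, h_last.shape)  # 4 (5, 1) (32, 1)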

Step 4: Implement backpropagation through time (BPTT). This method also goes inside the RNN class and reuses the hidden states cached during the forward pass.

def backward(self, inputs, targets, outputs):
    # Initialize gradients
    dWxh, dWhh, dWhy = np.zeros_like(self.Wxh), np.zeros_like(self.Whh), np.zeros_like(self.Why)
    dbh, dby = np.zeros_like(self.bh), np.zeros_like(self.by)
    dh_next = np.zeros((self.hidden_size, 1))

    # Backpropagate through time, from the last time step to the first
    for t in reversed(range(len(inputs))):
        x, y_true = inputs[t], targets[t]
        h, h_prev = self.h_states[t + 1], self.h_states[t]

        # Gradient of a squared-error loss w.r.t. the output at this step
        dy = outputs[t] - y_true
        dWhy += np.dot(dy, h.T)
        dby += dy

        # Backprop into the hidden state: from the output and from the next time step
        dh = np.dot(self.Why.T, dy) + dh_next
        dh_raw = (1 - h * h) * dh  # derivative of tanh
        dbh += dh_raw
        dWxh += np.dot(dh_raw, x.T)
        dWhh += np.dot(dh_raw, h_prev.T)
        dh_next = np.dot(self.Whh.T, dh_raw)

    return dWxh, dWhh, dWhy, dbh, dby

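To put the pieces together, here is a minimal, illustrative training step. It assumes a squared-error loss (which is what the dy = output - target gradient above corresponds to) and uses plain gradient descent with gradient clipping; the targets, learning rate, and clipping threshold are arbitrary example values, and it reuses rnn, inputs, h0, and seq_len from the earlier examples:

# One illustrative training step (squared-error loss, plain SGD)
targets = [np.random.randn(5, 1) for _ in range(seq_len)]

outputs, _ = rnn.forward(inputs, h0)
loss = 0.5 * sum(float(np.sum((y - t) ** 2)) for y, t in zip(outputs, targets))

dWxh, dWhh, dWhy, dbh, dby = rnn.backward(inputs, targets, outputs)

learning_rate = 1e-2
for param, grad in [(rnn.Wxh, dWxh), (rnn.Whh, dWhh), (rnn.Why, dWhy),
                    (rnn.bh, dbh), (rnn.by, dby)]:
    np.clip(grad, -5, 5, out=grad)   # clip gradients to limit exploding gradients
    param -= learning_rate * grad    # gradient descent update

print(loss)

In practice you would repeat this step over many sequences and watch the loss decrease; for classification tasks such as character-level language modeling, a softmax output with a cross-entropy loss is more common than squared error.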

Conclusion:

In this tutorial, we've covered the basics of Recurrent Neural Networks (RNNs) and implemented a simple version from scratch in Python. While this implementation is basic, it provides a foundational understanding of how RNNs work and how they can be trained using backpropagation through time (BPTT). Experiment with different architectures and datasets to deepen your understanding and explore the full potential of RNNs in various applications.

If you found this helpful or have questions, don't forget to leave a comment.
