Dann.js - Creating a neural network which learns and counts!

LeviAckerman3855

Posted on August 12, 2021


Ever wanted to learn how huge neural networks work, and create one? Dann.js makes it as easy as ever!

Dann.js is a neural network library for JavaScript which makes creating neural networks super simple. It acts as a playground for experimentation with deep neural networks. It is also fast to implement, which makes it a great choice for various applications.

In this tutorial, we will learn how to set up Dann.js and teach a neural network how to count!

What you will need

  • Node.js
  • A computer with more than 2GB of RAM and a good CPU

Getting started

Setup

To get started, we first install Dann.js as an npm package:

npm i dannjs

This installs Dann.js as a package for Node.js, so we can use it in our JavaScript.

Let's get started!

So open up your favorite editor (I personally recommend Atom) and create a new project. In the main JavaScript file, which is usually main.js or index.js (if there isn't one, create it), import Dann.js using require statements as follows:

const Dannjs = require('dannjs');
const Dann = Dannjs.dann;

This imports the dannjs module we installed into the constant Dannjs so we can refer to it later. It also gives us access to Dann, the base class of the neural network.

Creating the network

Now that we have the Dannjs module imported, we can create our own neural network as an instance of the Dann class.

Since in this project we are going to make a network which counts numbers in binary, we are going to create a basic network called countingDan. You can name it whatever you want.

The basic syntax for creating a neural network instance is new Dann(inputNeurons, outputNeurons). To start basic, we are giving our neural network 4 input neurons and 4 output neurons.

const countingDan = new Dann(4,4);

I've found that 16 hidden neurons work well enough. You can experiment with this value later.

Finishing the creation

We're not done at this point! We still need to add some 'hidden' layers. A hidden layer is essentially a neuron layer that can perform calculations. The name 'hidden' comes from the fact that you don't get to see the values of its neurons, in contrast to the input/output layers. You can learn more about hidden layers & the basics surrounding them here. We are also going to set the activation function to leakyReLU.

countingDan.addHiddenLayer(16,'leakyReLU');
countingDan.lr = 0.01;

Finishing the creation of the counting network

Technically, we have finished the creation of the network countingDan. You can still experiment more, but this should be enough for our current project.

We should test our model by using the .log() method, which displays information about our network, and by feeding the network some data to process:

countingDan.log();
countingDan.feedForward([0,0,1,0],{log:true});

The .feedForward() method takes an array, which it feeds to the input neurons of the network. Remember, we specified 4 input neurons for our network? So we are passing 4 binary bits in an array, one for each neuron. The log option specifies that the network should log the output of its processing.

In my case, it output this:

Dann NeuralNetwork:
  Layers:
    Input Layer:   4
    hidden Layer: 16  (leakyReLU)
    output Layer: 4  (sigmoid)
  Other Values:
    Learning rate: 0.01
    Loss Function: mse
    Latest Loss: 0

Prediction:  [0.5398676080698,0.6730957170697,0.6748749672290,0.6377636387674]

Yours may be different, since we never trained the model; it just gives some random results, as you would expect from a newborn baby!

Training the model

Setting up the dataset

To train the model, we’re going to need a dataset. Here is a lightweight JavaScript dataset for 4-bits binary counting. It basically looks like this:

const dataset4bit = [

    //...
    {
        input:[1,0,1,0],
        target:[1,0,1,1]
    },
    //  {
    //      input:[1,0,1,1],
    //      target:[1,1,0,0]
    //  },
    {
        input:[1,1,0,0],
        target:[1,1,0,1]
    },

    //...
];

You can download the dataset from this link
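If you'd rather generate the data yourself, here's a minimal sketch that builds the same 4-bit counting dataset. The helper names are my own, not part of Dann.js or the gist; note that it skips 11 ([1,0,1,1]) so we keep a test sample the network has never seen:

```javascript
// Convert a number in 0..15 to a 4-element bit array, most significant bit first.
function toBits(n) {
  return [8, 4, 2, 1].map((p) => (n & p ? 1 : 0));
}

// Build { input: x, target: x + 1 } pairs for x = 0..14.
function makeCountingDataset() {
  const data = [];
  for (let n = 0; n < 15; n++) {
    if (n === 11) continue; // hold out [1,0,1,1] as an unseen test sample
    data.push({ input: toBits(n), target: toBits(n + 1) });
  }
  return data;
}

const dataset4bit = makeCountingDataset();
console.log(dataset4bit[10]); // { input: [1,0,1,0], target: [1,0,1,1] }
```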

We can see that this dataset contains one number x in 4-bit binary, as the input value and the number x+1 in 4-bit binary as the target value. I commented out the element [1,0,1,1] so we can have a test sample the neural network has never seen. To access the data, we can copy the code included in the GitHub gist above and save it in binaryDataset.js in the same directory as our project. We can then require the file as a module:

const dataset = require('./binaryDataset.js').dataset;

Here we are importing the object dataset from the file binaryDataset.js.

We can now simply access the object data as:

dataset[i].input
dataset[i].target

as attributes.

Training the model

Now that we have access to the dataset, let's apply it by calling the .backpropagate() method for each data point in our dataset array. This will tune the weights of the model according to the data you give it.

for (const data of dataset) {
    countingDan.backpropagate(data.input,data.target);
}

Since we imported the dataset object as dataset, we access it that way, and train the model on the whole dataset by calling .backpropagate() once for each entry.

Sadly, one epoch (one training pass) is not enough to fully learn any dataset, just as you would expect of a child.

So we should train it multiple times, say 100000 times?
This will allow the network to learn from every entry:

const epoch = 100000;
for (let e=0; e < epoch;e++) {
    for (const data of dataset) {
        countingDan.backpropagate(data.input,data.target);
    }
}

Running the network

Now that we have trained it enough, we should run the model!

countingDan.feedForward([1,0,1,1],{log:true});

This outputs:

Prediction:  [0.999884854,0.9699951248,0.020084607062,0.008207215405]

Which seems good enough, as it is pretty close to [1,1,0,0], the answer we wanted.
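To turn the raw prediction back into a readable number, a small sketch (these helpers are my own, not part of Dann.js) that thresholds each output neuron at 0.5 and converts the resulting bits back to decimal:

```javascript
// Round each raw output neuron value to a clean 0 or 1.
function bitsFromPrediction(prediction) {
  return prediction.map((v) => (v >= 0.5 ? 1 : 0));
}

// Interpret a bit array as a binary number, most significant bit first.
function bitsToNumber(bits) {
  return bits.reduce((acc, bit) => acc * 2 + bit, 0);
}

const raw = [0.999884854, 0.9699951248, 0.020084607062, 0.008207215405];
const bits = bitsFromPrediction(raw); // [1, 1, 0, 0]
console.log(bitsToNumber(bits)); // 12, i.e. 11 ([1,0,1,1]) + 1
```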

Finishing up

You can experiment with many datasets, and change the values as you like. The whole code used in this tutorial is:

const Dannjs = require('dannjs');
const Dann = Dannjs.dann;
const dataset = require('./binaryDataset.js').dataset;

const countingDan = new Dann(4,4);
countingDan.addHiddenLayer(16,'leakyReLU');

countingDan.lr = 0.01;

countingDan.feedForward([1,0,1,1],{log:true});
const epoch = 100000;
for (let e=0; e < epoch;e++) {
    for (const data of dataset) {
        countingDan.backpropagate(data.input,data.target);
    }
}
countingDan.feedForward([1,0,1,1],{log:true});