Paradigm shift: Immutability in Elixir


Kalarani Lakshmanan

Posted on June 19, 2023


I spent this weekend writing an Elixir script to solve a linear regression problem. Then, when I went on to implement gradient descent to train the model, I hit a roadblock.

I came up with this code, which looked perfect to me. The only problem: it didn't do what I expected it to.

```elixir
  def train_from(filename) do
    raw_dataset = read_dataset(filename)
    x = raw_dataset |> Enum.map(fn d -> elem(Integer.parse(Enum.at(d, 0)), 0) end)
    y = raw_dataset |> Enum.map(fn d -> elem(Float.parse(Enum.at(d, 1)), 0) end)
    training_dataset = %{x: x, y: y}
    w = 100
    b = 100

    for n <- 1..5 do
      IO.inspect("Iteration #{n}: #{w}, #{b}")
      [w, b] = gradient_descent_step(x, y, w, b)
      IO.inspect("End of Iteration #{n}: #{w}, #{b}")
    end

    IO.inspect("End values for w: #{w}, b: #{b}")
  end
```

It spat out this output:

```
"Iteration 1: 100, 100"
902964882.6662791
4514335.333331471
"Iteration 2: 100, 100"
902964882.6662791
4514335.333331471
"Iteration 3: 100, 100"
902964882.6662791
4514335.333331471
"Iteration 4: 100, 100"
902964882.6662791
4514335.333331471
"Iteration 5: 100, 100"
902964882.6662791
4514335.333331471
"End values for w: 100, b: 100"
```

I was expecting the values of w and b to be updated and passed on in each iteration. Even though `gradient_descent_step` returned new values for w and b, they weren't carried forward to the next iteration.

It turned out I had missed an important aspect of Elixir, or rather of functional programming:

Immutability.

All variables are immutable; in effect, they are constants. You can't change them once they are initialised. In particular, rebinding w and b inside the `for` comprehension created new bindings scoped to the comprehension, leaving the outer w and b untouched.
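This behaviour can be seen in miniature, independent of any training code. Rebinding a variable inside a `for` comprehension shadows the outer binding rather than replacing it:

```elixir
x = 1

# Each rebinding of x inside the comprehension creates a new binding
# scoped to that iteration; the outer x is never mutated.
results =
  for _ <- 1..3 do
    x = x + 1
    x
  end

IO.inspect(results)  # [2, 2, 2]
IO.inspect(x)        # 1 — the outer binding never changed
```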

If that is the case, how am I supposed to run multiple iterations and feed the output of one iteration into the next?

This is where a paradigm shift from object-oriented programming is required.

By forcing ourselves to deal with immutable variables and functions that shouldn't have side effects, we arrive at an alternative way to solve the problem: using recursion for looping.
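The loop above could be rewritten recursively along these lines. This is only a sketch: `gradient_descent_step/4` is stubbed out with a stand-in, since the real implementation isn't shown here. Each recursive call passes the freshly computed w and b forward as arguments, which is exactly the "carrying forward" the `for` comprehension couldn't do:

```elixir
defmodule Trainer do
  # Hypothetical stand-in for the real step function, which would
  # compute gradients from the training data.
  def gradient_descent_step(_x, _y, w, b) do
    [w - 1, b - 1]
  end

  # Base case: no iterations left; return the accumulated values.
  def train(_x, _y, w, b, 0), do: {w, b}

  # Recursive case: compute new w and b, then recurse with them.
  def train(x, y, w, b, iterations) do
    [new_w, new_b] = gradient_descent_step(x, y, w, b)
    train(x, y, new_w, new_b, iterations - 1)
  end
end

IO.inspect(Trainer.train([], [], 100, 100, 5))  # {95, 95}
```

No variable is ever changed; new values simply become the arguments of the next call.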
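Elixir's standard library also packages this recursion pattern for you: `Enum.reduce/3` threads an accumulator through each iteration, which fits the w/b update naturally. Again a sketch, with a made-up step function standing in for the real one:

```elixir
# Hypothetical step: shrinks w and b by 10% each iteration.
step = fn w, b -> {w * 0.9, b * 0.9} end

# The accumulator {w, b} carries the updated values into each iteration,
# and the final accumulator is the result of the whole loop.
{w, b} =
  Enum.reduce(1..5, {100, 100}, fn _n, {w, b} ->
    step.(w, b)
  end)

IO.inspect({w, b})
```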

Epilogue:

My relationship with Elixir has always been on and off. Having spent a couple of years building websites in RoR, I explored the basics of Elixir a few years back. Recently I came across this blog post from Sean Moriarity, and it rekindled my interest in trying out Elixir in the ML space.

For curious minds, Charles Scalfani has written a series of blog posts covering the fundamentals of functional programming.
