Learn Julia (12): Logistic Regression


Z. QIU

Posted on November 26, 2020


Following the Linear Regression exercise I did yesterday, today I worked through a small exercise on Logistic Regression. It was quite a useful experience, and I picked up more basic knowledge about Julia along the way.

One can refer to the Wikipedia page on Logistic Regression for details.

First, I need to define the famous sigmoid function, σ(x) = 1 / (1 + e^(−x)), whose S-shaped curve is shown in the figure below.

[Figure: expression and curve of the sigmoid function]

Moreover, I need the sigmoid function to accept an array as input. To this end, I learned about the Dot Syntax for Vectorizing Functions. As the official documentation puts it:

Any Julia function f can be applied elementwise to any array (or other collection) with the syntax f.(A). For example, sin can be applied to all elements in the vector A like so: sin.(A).
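
For example, here is a tiny sketch of this broadcasting syntax (the function and array names are just for illustration):

# any scalar function can be broadcast elementwise with a dot
double(x) = 2x
A = [1.0, 2.0, 3.0]
double.(A)   # returns [2.0, 4.0, 6.0]
sin.(A)      # works for Base functions too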

Thus I define the sigmoid function as follows:

## sigmoid taking a single scalar as input
# sigmoid(x) = 1 / (1 + exp(-x))
# x = collect(-10:0.5:10)
# y = [sigmoid(i) for i in x]
# plot(x, y)

## sigmoid taking an array as input (note the dots for broadcasting)
sigmoid(x) = 1 ./ (1 .+ exp.(-x))

##### test of the sigmoid function
using Plots

x = collect(-10:0.5:10)
y = sigmoid(x)

plot(x, y, title="Sigmoid function")

See what the output of this function looks like:

[Figure: plot of the sigmoid function over [-10, 10]]
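
Note that I could equally have kept the scalar definition and broadcast it at the call site with a dot; a tiny equivalent sketch:

# scalar definition, broadcast with the dot syntax at the call site
sigmoid_scalar(x) = 1 / (1 + exp(-x))
y2 = sigmoid_scalar.(x)   # same values as sigmoid(x) above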

Now I need some labelled sample data. I imagined a quite simple scenario: given a pair of floating-point numbers (x1, x2), if their sum is greater than 0, the corresponding label is 1; otherwise the label is 0. The samples are generated as follows:

raw_x = rand(Float64, (50, 2))   ## a 50x2 matrix with entries in [0, 1)
raw_x = 10.0 * (raw_x .- 0.5)    ## rescale the entries to [-5, 5)
# raw_x = [-10 4; 4 3; 5 9; -4 8; 9 -19; 4 9; -3 -1; -6 11; 8 2; 9 1; 3 -7; -4 -3]

## number of sample points
m = size(raw_x, 1)
println("Sample size:  $m")

## label is 1 when x1 + x2 > 0, otherwise 0
raw_y = zeros((m, 1))
for idx in 1:m
    if raw_x[idx, 1] + raw_x[idx, 2] > 0
        raw_y[idx] = 1.0
    end
end
println(raw_y)

gr()  # set the Plots backend to GR

# scatter the samples in 3D: (x1, x2, label)
display(plot(raw_x[:,1], raw_x[:,2], raw_y[:,1], seriestype=:scatter, title="Sample points"))


The sample points plotted in 3D:

[Figure: 3D scatter plot of the sample points, with the label on the vertical axis]

I wrote down some mathematical definitions for this problem. For a sample x = (x1, x2) and a parameter vector v = (v1, v2, v3), the model is

h_v(x) = σ(v1·x1 + v2·x2 + v3)

Accordingly, the loss function (the cross-entropy over the m samples) can be defined as

J(v) = −Σᵢ [ yᵢ·log(h_v(xᵢ)) + (1 − yᵢ)·log(1 − h_v(xᵢ)) ]
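
For completeness, this loss is exactly the negative log-likelihood of a Bernoulli model (a standard derivation, not specific to this exercise). Each labelled sample has likelihood

P(yᵢ | xᵢ; v) = h_v(xᵢ)^yᵢ · (1 − h_v(xᵢ))^(1 − yᵢ)

and taking the logarithm, summing over the m samples and negating gives J(v) above.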

Now I code them in Julia:

## linear part: v[1]*x1 + v[2]*x2 + v[3]
func(v) = (raw_x * v[1:2]) .+ v[3]

## cross-entropy loss over all samples
cost(v) = -sum(raw_y .* log.(sigmoid(func(v))) + (1 .- raw_y) .* log.(1 .- sigmoid(func(v))))

My objective is to find a vector v that minimizes the loss function, so this is an optimization problem. I use the Optim package to solve it.

using Optim

init_v = [1.0, 2.0, 0.0]

# solve the minimization problem
res = optimize(cost, init_v)

# extract the minimizing vector and the minimum value
sol = Optim.minimizer(res)
println("sol: $sol")

val = Optim.minimum(res)
println("val: $val")


This is what the optimization solver yields:

[Figure: solver output showing the minimizer sol and the minimum value val]
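
A note on the solver: when no method is given, optimize falls back to the derivative-free Nelder-Mead method. Here is a small sketch of choosing a gradient-based method instead, assuming Optim.jl's standard keyword API:

# BFGS with forward-mode automatic differentiation of cost;
# for a smooth loss like this one it usually needs fewer iterations
res_bfgs = optimize(cost, init_v, BFGS(); autodiff = :forward)
println("sol (BFGS): ", Optim.minimizer(res_bfgs))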

Now, let's test our solution by making predictions. I chose three points for testing: (7, 12), (5, 8) and (9, -19); we thus expect outputs close to 1, 1 and 0.


# now test the solution on three new points
x_test1 = [7 12; 5 8; 9 -19]
y_test1 = sigmoid((x_test1 * sol[1:2]) .+ sol[3])
println("x_test1: $x_test1")
println("y_test1: $y_test1")

See below what the code predicts:

[Figure: predicted probabilities for the three test points]
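
To turn these probabilities into hard 0/1 labels, one can simply threshold at 0.5; a small sketch, reusing the variables defined above:

# threshold the sigmoid outputs at 0.5 to get hard class labels
labels = Int.(y_test1 .> 0.5)
println("predicted labels: $labels")   # expected: [1, 1, 0]

# training accuracy of the fitted model on the generated samples
train_pred = Int.(sigmoid((raw_x * sol[1:2]) .+ sol[3]) .> 0.5)
println("training accuracy: ", sum(train_pred .== Int.(vec(raw_y))) / m)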

How beautiful life is :D
