b0r
Posted on December 8, 2021
Learn how to build a simple load balancer server in Go.
Table of Contents:
- What is a Load Balancer
- Use cases
- Load Balancing techniques
- Load Balancer implementation
- Conclusion
- Additional information
- Resources
What is a Load Balancer
A load balancer is a server that provides a gateway between the client and one or more origin servers. Instead of connecting directly to one of the origin servers, the client directs the request to the load balancer server, which routes the request to one of multiple origin servers capable of fulfilling it.
Load balancing refers to evenly distributing load (incoming network traffic) across a group of backend resources or origin servers. [1]
A load balancer is a type of reverse proxy with the capability of evenly distributing the load.
Use cases
Typical use cases are:
- distributing client requests or network load efficiently across multiple origin servers
- ensuring high availability and reliability by sending requests only to origin servers that are online
- providing the flexibility to add or subtract servers as demand dictates (elasticity) [2]
Load Balancing techniques
Random order
Requests are distributed across the group of origin servers in random order.
Round Robin
Requests are distributed across the group of origin servers sequentially.
Weighted Round Robin
A weight (priority) is associated with each origin server based on some metric.
Requests are distributed across the group of origin servers sequentially, respecting the priority of each.
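A minimal sketch of weighted round robin selection, assuming each origin server is given an integer weight; the weights, type, and helper below are illustrative and not part of the implementation later in this article:

package main

import "fmt"

type weightedOrigin struct {
	url    string
	weight int
}

// buildSchedule expands every origin URL according to its weight, so that a
// plain round robin over the resulting slice respects the configured priorities.
func buildSchedule(origins []weightedOrigin) []string {
	var schedule []string
	for _, o := range origins {
		for i := 0; i < o.weight; i++ {
			schedule = append(schedule, o.url)
		}
	}
	return schedule
}

func main() {
	// hypothetical weights: 8081 receives twice as many requests as 8082
	schedule := buildSchedule([]weightedOrigin{
		{url: "http://localhost:8081", weight: 2},
		{url: "http://localhost:8082", weight: 1},
	})
	for i := 0; i < 6; i++ {
		fmt.Println(schedule[i%len(schedule)])
	}
}

A scheduler would then step through the expanded slice with a simple counter, just like the round robin index used in the load balancer implementation below.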
Load/Metric based
Requests are distributed across the group of origin servers based on the current load (e.g. as reported by a health check) of each origin server.
IP based
The IP address of the client is used to determine which server receives the request.
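A minimal sketch of IP-based selection, hashing the client address onto the list of origin servers so that the same client is consistently routed to the same server; the helper name and server list are assumptions made for this example:

package main

import (
	"fmt"
	"hash/fnv"
	"net"
)

// pickByClientIP hashes the client IP so the same client is consistently
// routed to the same origin server.
func pickByClientIP(remoteAddr string, origins []string) string {
	host, _, err := net.SplitHostPort(remoteAddr)
	if err != nil {
		host = remoteAddr
	}
	h := fnv.New32a()
	_, _ = h.Write([]byte(host))
	idx := int(h.Sum32() % uint32(len(origins)))
	return origins[idx]
}

func main() {
	origins := []string{"http://localhost:8081", "http://localhost:8082"}
	fmt.Println(pickByClientIP("203.0.113.7:52100", origins))
}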
Path based
Requests are distributed across the group of origin servers based on the path of the request.
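A minimal sketch of path-based selection, mapping request path prefixes to origin servers; the routing table and fallback server below are made up for illustration:

package main

import (
	"fmt"
	"strings"
)

// pickByPath returns the origin server responsible for the request path,
// falling back to a default server when no prefix matches.
func pickByPath(path string, routes map[string]string, fallback string) string {
	for prefix, origin := range routes {
		if strings.HasPrefix(path, prefix) {
			return origin
		}
	}
	return fallback
}

func main() {
	routes := map[string]string{
		"/api":    "http://localhost:8081",
		"/static": "http://localhost:8082",
	}
	fmt.Println(pickByPath("/api/users", routes, "http://localhost:8081"))
}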
Load Balancer implementation
Step 1: Create origin server
In order to test our load balancer, we first need to create and start a simple origin server.
The origin server will be started twice, on ports 8081 and 8082, and it will return a string containing the value "origin server response :8081" or "origin server response :8082".
package main

import (
	"flag"
	"fmt"
	"log"
	"net/http"
	"time"
)

func main() {
	// read the listening port from the command line (default 8081)
	portFlag := flag.Int("port", 8081, "listening port")
	flag.Parse()

	port := fmt.Sprintf(":%d", *portFlag)

	// respond to every request with a string identifying this origin server
	originServerHandler := http.HandlerFunc(func(rw http.ResponseWriter, req *http.Request) {
		fmt.Printf("[origin server] received request: %s\n", time.Now())
		_, _ = fmt.Fprintf(rw, "origin server response %s", port)
	})

	log.Fatal(http.ListenAndServe(port, originServerHandler))
}
Step 1 Test
Start the server:
go run main.go -port=8081
go run main.go -port=8082
Use the curl command to validate that the origin servers (8081, 8082) work as expected:
% curl -i localhost:8081
HTTP/1.1 200 OK
Date: Wed, 08 Dec 2021 20:01:10 GMT
Content-Length: 28
Content-Type: text/plain; charset=utf-8
origin server response :8081%
% curl -i localhost:8082
HTTP/1.1 200 OK
Date: Wed, 08 Dec 2021 20:01:12 GMT
Content-Length: 28
Content-Type: text/plain; charset=utf-8
origin server response :8082
Step 2: Create a load balancer server
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync"
)

var nextServerIndex = 0

func main() {
	var mu sync.Mutex

	// define origin server list to load balance the requests
	originServerList := []string{
		"http://localhost:8081",
		"http://localhost:8082",
	}

	loadBalancerHandler := http.HandlerFunc(func(rw http.ResponseWriter, req *http.Request) {
		// use mutex to prevent data race
		mu.Lock()
		// get next server to send a request to
		originServerURL, _ := url.Parse(originServerList[nextServerIndex%len(originServerList)])
		// increment next server value
		nextServerIndex++
		mu.Unlock()

		// use existing reverse proxy from httputil to route
		// a request to the previously selected server url
		reverseProxy := httputil.NewSingleHostReverseProxy(originServerURL)
		reverseProxy.ServeHTTP(rw, req)
	})

	log.Fatal(http.ListenAndServe(":8080", loadBalancerHandler))
}
Step 2 Test
Start the server:
go run main.go
Use the curl command to validate that the load balancer works as expected:
- The first request should be sent to origin server :8081
% curl -i localhost:8080
HTTP/1.1 200 OK
Content-Length: 28
Content-Type: text/plain; charset=utf-8
Date: Wed, 08 Dec 2021 20:08:04 GMT
origin server response :8081%
In the terminal of the origin server you should see:
[origin server] received request: 2021-12-08 21:08:09.021995 +0100 CET m=+433.153383251
- The second request should be sent to origin server :8082
% curl -i localhost:8080
HTTP/1.1 200 OK
Content-Length: 28
Content-Type: text/plain; charset=utf-8
Date: Wed, 08 Dec 2021 20:08:09 GMT
origin server response :8082%
In the terminal of the origin server you should see:
[origin server] received request: 2021-12-08 21:08:09.402678 +0100 CET m=+423.670045543
Conclusion
In this article, load balancing, its use cases, and common load balancing techniques were described. In addition, a simple implementation of a load balancer server in Go was provided.
Readers are encouraged to improve this example by implementing other load balancing techniques, adding health checks, or making the list of origin servers dynamic.
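As a starting point for the health-check improvement, here is a minimal sketch that periodically probes each origin server and keeps only reachable ones in the rotation; the probe interval, timeout, and structure are assumptions and not part of the implementation above:

package main

import (
	"net/http"
	"sync"
	"time"
)

// healthChecker keeps track of which origin servers are currently reachable.
type healthChecker struct {
	mu      sync.Mutex
	origins []string
	healthy map[string]bool
}

// run probes every origin server on a fixed interval; any server that does not
// answer with HTTP 200 is temporarily removed from the rotation.
func (hc *healthChecker) run(interval time.Duration) {
	client := &http.Client{Timeout: 2 * time.Second}
	for {
		for _, origin := range hc.origins {
			resp, err := client.Get(origin)
			ok := err == nil && resp.StatusCode == http.StatusOK
			if resp != nil {
				resp.Body.Close()
			}
			hc.mu.Lock()
			hc.healthy[origin] = ok
			hc.mu.Unlock()
		}
		time.Sleep(interval)
	}
}

// healthyOrigins returns the servers that passed the last probe.
func (hc *healthChecker) healthyOrigins() []string {
	hc.mu.Lock()
	defer hc.mu.Unlock()
	var list []string
	for _, origin := range hc.origins {
		if hc.healthy[origin] {
			list = append(list, origin)
		}
	}
	return list
}

func main() {
	hc := &healthChecker{
		origins: []string{"http://localhost:8081", "http://localhost:8082"},
		healthy: map[string]bool{},
	}
	go hc.run(10 * time.Second)
	// the load balancer handler from Step 2 would then pick the next server
	// from hc.healthyOrigins() instead of the static originServerList
	select {}
}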
Additional information
Resources
[1] https://docs.microsoft.com/en-us/azure/load-balancer/load-balancer-overview
[2] https://www.nginx.com/wp-content/uploads/2014/07/what-is-load-balancing-diagram-NGINX-640x324.png
[3] https://www.nginx.com/resources/glossary/load-balancing/
[cover image] Photo by Wilson Vitorino from Pexels