Learning Algorithms - Merge Sort


Jenny Yang

Posted on October 8, 2020


Previously, we took a look at Insertion Sort and implemented it.

However, its worst-case running time was O(n²), which is really slow, isn't it? I promised you a more sophisticated and faster algorithm for sorting an array, and that is Merge Sort.

Divide and Conquer

So how can we save time when sorting a list of elements? The answer is to divide and conquer. I assume you already know binary search at this point, since you have started learning algorithms; if not, you can go back and learn binary search first. Let me give you the big picture of the whole plan. First, we have an array of size n, which we will call A. Second, we divide it in half (repeatedly, until it cannot be divided any further), which gives us two arrays, Left and Right. Next, the two arrays are sorted. Lastly, we merge them back into array A in sorted order.

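To make the plan concrete, here is a minimal Python skeleton of it (the structure is illustrative only; we will build the real implementation up piece by piece in the rest of this post):

def merge_sort(A):
    if len(A) <= 1:             # a single element is already sorted
        return
    mid = len(A) // 2
    L, R = A[:mid], A[mid:]     # divide A into a left and a right half
    merge_sort(L)               # sort the left half
    merge_sort(R)               # sort the right half
    # ...merge the sorted L and R back into A (covered below)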

Recursion

Unfortunately, to use Merge Sort we need to know a notoriously confusing concept, so-called recursion. In a nutshell, recursion is a function that calls itself. Let's take a look at the code right below.

def call_me(arg):
    call_me(arg)  # calls itself again before it ever reaches the print below
    print(arg)
call_me("I am calling me")

The function call_me(arg) calls itself, then prints arg. What will happen in this case? It will call itself infinitely, like a black hole, and never even reach the print. This causes the notoriously named error, a stack overflow: the function calls keep stacking up, the program uses up the memory that stores those calls, and an error is thrown (in Python this shows up as a RecursionError). So all we need to do is let this recursion finish at some point; to do that, we need a condition that ends the recursion, which we call the base case.
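You can see this for yourself in Python (the exact recursion limit may differ between environments, so treat the number as illustrative):

import sys

# CPython refuses to recurse past a limit (commonly 1000 frames)
print(sys.getrecursionlimit())

def call_me(arg):
    call_me(arg)
    print(arg)

try:
    call_me("I am calling me")
except RecursionError as e:
    print("Recursion stopped:", e)  # maximum recursion depth exceeded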
Let me stop this recursion with a base case. The call_me function now takes one more argument, i, which is incremented by 1 on every call, and when i becomes 5 the function returns.

def call_me(arg, i):
    if i == 5:  # if i == 5, return
        return

    i += 1
    call_me(arg, i)  # otherwise keep executing with incremented i
    print(arg)
call_me("I am calling me", 0)
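Running this prints “I am calling me” five times: the calls stack up until i reaches 5, the deepest call returns, and each earlier call then runs its print as the stack unwinds.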


Merge Sort

Copying halves of the array A

I assume we are now ready to get cracking. We have the following unsorted array A.

A = [4, 7, 20, 3, 9, 13, 5, 11]

As I mentioned earlier in the planning phase, we are going to copy the left half and the right half of A into arrays L and R respectively.

def merge_sort(A):
    mid = len(A) // 2
    # copy of the slice from index 0 up to mid of array A
    L = A[:mid]
    # copy of the slice from mid to the end of array A
    R = A[mid:]
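For our sample array, those two slices work out like this (a quick check, not part of the final implementation):

A = [4, 7, 20, 3, 9, 13, 5, 11]
mid = len(A) // 2   # 4
print(A[:mid])      # [4, 7, 20, 3]
print(A[mid:])      # [9, 13, 5, 11]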

We are going to recursively halve the arrays until they cannot be divided any further. As I explained earlier, we need a base case to stop the recursion at some point, and that is when the length of the array becomes 1, since a single element has nothing left to sort and cannot be divided.

def merge_sort(A):
    if len(A) > 1:
        # ...previous code (mid, L, R)

        # recursively halve the arrays
        merge_sort(L)
        merge_sort(R)

        # indices: i for L, j for R, k for A
        # (the shorthand i = j = k = 0 works as well)
        i = 0
        j = 0
        k = 0

As a result of the code above, the array keeps getting split in half. If you picture every subarray at the same depth of recursion as one level, an array of length 8 is broken all the way down to single elements in three halving steps (four levels, counting the original array).
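For our sample array, the levels of splitting look roughly like this:

[4, 7, 20, 3, 9, 13, 5, 11]
[4, 7, 20, 3]           [9, 13, 5, 11]
[4, 7]    [20, 3]       [9, 13]    [5, 11]
[4] [7]   [20] [3]      [9] [13]   [5] [11]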

Merge and sort

Now it is time to merge the halves back into array A in sorted order. To sort the elements we compare which one should come first: the smaller one is placed first, the larger one comes next. The implementation of the merge looks like the code below; once an element is placed back into array A, the corresponding index moves on to the next element.

# while indices are within a range of array L and R
# note that indices i for L, j for R, k for A
while i < len(L) and j < len(R):
    if L[i] < R[j]:
        A[k] = L[i]
        i += 1
    else:
        A[k] = R[j]
        j += 1
    k += 1
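One small detail worth knowing: if you compare with L[i] <= R[j] instead of the strict <, equal elements keep their original relative order, which makes the sort stable; with the strict < shown above, an equal element from R would be placed first.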

There is one last thing left: copying any elements that have not yet been moved into array A from L or R. This happens whenever one of the two halves runs out of elements before the other, which is the usual case; the remaining elements of the other half are already sorted, so they can simply be appended.

# ...previous code (the comparison loop)

# copy any elements remaining in L
while i < len(L):
    A[k] = L[i]
    i += 1
    k += 1

# copy any elements remaining in R
while j < len(R):
    A[k] = R[j]
    j += 1
    k += 1

And that is the whole merge step: the two sorted halves are woven back into array A in order.
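Putting all of the snippets above together, here is the complete merge sort in one piece, with a small demo call at the bottom to check that it works:

def merge_sort(A):
    if len(A) > 1:
        mid = len(A) // 2
        L = A[:mid]   # copy of the left half
        R = A[mid:]   # copy of the right half

        # recursively sort both halves
        merge_sort(L)
        merge_sort(R)

        # indices: i for L, j for R, k for A
        i = j = k = 0

        # merge: repeatedly take the smaller front element
        while i < len(L) and j < len(R):
            if L[i] < R[j]:
                A[k] = L[i]
                i += 1
            else:
                A[k] = R[j]
                j += 1
            k += 1

        # copy over whatever is left in L or R
        while i < len(L):
            A[k] = L[i]
            i += 1
            k += 1
        while j < len(R):
            A[k] = R[j]
            j += 1
            k += 1

A = [4, 7, 20, 3, 9, 13, 5, 11]
merge_sort(A)
print(A)  # [3, 4, 5, 7, 9, 11, 13, 20]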

Time and Space Complexity Analysis

Time Complexity of Merge Sort

So what is the time complexity of this algorithm? The divide phase keeps halving the array, which gives O(log n) levels of recursion. What about the merge phase? At every level, the merging loops touch each element once, which is O(n) work per level. Multiplying the O(n) work per level by the O(log n) levels gives a total time complexity of O(n log n), a big improvement over Insertion Sort's worst case of O(n²).
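If you like recurrences, the same conclusion falls out of the standard one for merge sort: T(n) = 2T(n/2) + O(n), i.e. two subproblems of half the size plus linear work to merge them, which solves to T(n) = O(n log n).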

Space Complexity of Merge Sort

How much memory does Merge Sort use? A lot! It copies both halves of the array, and those copies need extra space proportional to the number of elements. Therefore the space complexity of Merge Sort is O(n). In comparison to Insertion Sort, which sorts in place, Merge Sort's O(n) extra space is worse than Insertion Sort's O(1).
It has been a long journey to the end; I hope you now have a good grasp of the concept.
