Understanding the Big O of Merge Sort


Space Complexity

The first thing we’re going to talk about is space complexity.

With merge sort, we start with our original list, and then we create two new lists, called left and right.

And then, we continue breaking these down.

So we start with this original list, and by the time we’ve broken everything down completely, we end up with multiple smaller lists — eventually, eight lists that each contain just one item if we started with eight elements.

So essentially, the number of items being stored in memory has doubled.

That means merge sort requires O(n) extra space.

In other words, the space complexity of merge sort is O(n).

This is quite different from the previous sorting algorithms we looked at — like Bubble Sort, Insertion Sort, and Selection Sort — which all sort the data in place.

Those algorithms only move elements around inside the original list.

Merge Sort, on the other hand, creates new lists every time it splits and merges.

So that’s why merge sort has a higher space complexity than the in-place sorting algorithms.
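To make the memory point concrete, here's a small sketch in Python (the example values are hypothetical). Slicing a list produces brand-new lists, which is exactly where the extra O(n) space comes from:

```python
# A starting list of eight items (hypothetical example data).
original = [5, 3, 8, 1, 9, 2, 7, 4]

# Splitting copies the elements into two brand-new lists;
# the original list is left untouched.
mid = len(original) // 2
left = original[:mid]    # new list holding the first half
right = original[mid:]   # new list holding the second half

# Together the copies hold another n items in memory,
# on top of the n items in the original list.
print(len(left) + len(right))  # 8
```

An in-place sort like Bubble Sort never makes these copies, which is why its extra space is only O(1).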


Time Complexity

Now let’s talk about time complexity.

When merge sort breaks the list down, it does so by halving it repeatedly, one level at a time.

For example, if we start with a list of eight items, we can break it down like this:

  • 8 → 4 → 2 → 1

We broke the list in half three times to reach individual items.

So that’s log base 2 of n steps.

That’s where we get the log n part, because 2³ = 8.

So, breaking the list apart has a complexity of O(log n).
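You can check the halving count with a quick sketch (using n = 8 here just as an example):

```python
n = 8  # example list size

# Count how many times we can halve n before reaching lists of one item.
halvings = 0
size = n
while size > 1:
    size //= 2
    halvings += 1

print(halvings)  # 3, i.e. log base 2 of 8
```

For n = 16 the count would be 4, for n = 32 it would be 5, and so on: the number of levels grows logarithmically, not linearly.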


Now, what about putting it back together?

When we merge those smaller lists back together, remember our merge helper function?

It had those while loops that went through each item, one by one, to combine them into a sorted list.

That merging step requires us to touch every element once per merge step, so that part is O(n).
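The exact helper from earlier in the series isn't reproduced here, but a typical merge function follows this pattern, with while loops walking through both lists item by item:

```python
def merge(left, right):
    """Combine two already-sorted lists into one sorted list.

    Every element of left and right is touched exactly once,
    so this runs in O(n) time for n total elements.
    """
    merged = []
    i = j = 0
    # Compare the front of each list and take the smaller item.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One of the lists is exhausted; drain whatever remains in the other.
    while i < len(left):
        merged.append(left[i])
        i += 1
    while j < len(right):
        merged.append(right[j])
        j += 1
    return merged

print(merge([1, 3, 5], [2, 4, 6]))  # [1, 2, 3, 4, 5, 6]
```

Note that each call to merge touches every element of its two inputs exactly once, which is the O(n) part of the analysis.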


So when we combine these two parts together —

  • breaking down: O(log n)

  • merging back together: O(n)

We get O(n × log n) time complexity for merge sort.
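Putting the two parts together, a complete merge sort might look like this sketch (the merge helper here is a typical version, not necessarily the exact one from earlier in the series):

```python
def merge(left, right):
    """O(n) merge of two sorted lists into one sorted list."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # drain whichever list still has items
    merged.extend(right[j:])
    return merged

def merge_sort(lst):
    """O(log n) levels of halving, with O(n) merging work per level,
    gives O(n log n) overall."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

print(merge_sort([5, 3, 8, 1, 9, 2, 7, 4]))  # [1, 2, 3, 4, 5, 7, 8, 9]
```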


Comparing Merge Sort with Other Algorithms

Let’s look at this on a graph.

When it comes to sorting algorithms, only the two on the left really matter — those are the most efficient general-purpose sorts.

Let’s gray out the rest for a moment.

The first three sorting algorithms we learned — Bubble Sort, Insertion Sort, and Selection Sort — all have a time complexity of O(n²).

But Merge Sort is much more efficient with its O(n × log n) performance.


Now, it might not look like a huge difference on a small graph where the Y-axis only goes up to 75, but if that axis went up to 1,000 or even 1,000,000, the gap between O(n²) and O(n log n) would be enormous.

Merge Sort scales far better as the list size grows.


Final Thoughts

So to summarize:

  • Space Complexity: O(n) — because of the extra lists created.

  • Time Complexity: O(n × log n) — from breaking and merging.

And O(n log n) is the most efficient time complexity you can achieve for a comparison-based sorting algorithm, that is, one that works by comparing elements and so can handle any type of data that can be ordered.

There are a few specialized algorithms, such as counting sort and radix sort, that can sort numbers (or other fixed-size keys) faster, but for general-purpose sorting, Merge Sort is about as efficient as it gets.

And that’s our overview of Merge Sort Big O.
