Big O and Sorting Algorithms

Introduction

I first encountered Big O notation in my Fullstack technical interview. I was given a problem that could easily be solved with a nested for loop, but I knew a more elegant and efficient solution was possible.

The interviewer mentioned Big O notation and explained that my approach was O of n^2 because, for each element in the array, we looped through the array again. I was eventually able (with significant prodding) to come to a solution that had a Big O of n. Cool, I still didn’t get what Big O was, but I knew that the second function was more efficient.

What is Big O?

Big O describes how the time it takes for an algorithm to complete grows as the data set the algorithm processes grows. Big O accounts for the worst-case scenario.

A Big O of 1 means that the algorithm finishes in constant time. It doesn’t matter if the set has 1 item or 1,000,000 items; it will take the same amount of time to finish. Examples of this are accessing data at an index of an array or accessing data in an ideal hash table. You can go directly to that item without looping or checking any other items. These examples are very simplified, and a Big O of 1 doesn’t mean the operation will be fast, only that its time is constant.
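A minimal sketch of both constant-time examples, using a plain Python list and dict:

```python
# Index access into an array (Python list): O(1)
items = ["a", "b", "c", "d"]
third = items[2]

# Lookup in an ideal hash table (Python dict): O(1) on average
ages = {"alice": 30, "bob": 25}
bobs_age = ages["bob"]
```

Neither operation touches any other element, no matter how large the collection grows.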

A Big O of log n indicates that the completion time grows more slowly than the data set. One example of this is searching a balanced Binary Search Tree. Since each step in the search allows you to throw out half of the remaining data, doubling the data only increases the completion time by one step.
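The same halving idea is easiest to see in a binary search over a sorted list; each comparison discards half of the remaining range, so the worst case is O(log n). A minimal sketch:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1
```

Searching a million sorted items this way takes at most about 20 comparisons.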

A Big O of n means that the time to complete the algorithm increases linearly with the size of the set. An example of this is searching a linked list. In the worst case, you start with the first node in the list and your result is in the last node; you have then checked every single item in the list.
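A minimal linked-list search sketch; when the target sits in the last node (or isn't present), every node is visited, which is the O(n) worst case:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def find(head, target):
    """Walk the list from head, returning the first node whose value matches."""
    node = head
    while node is not None:
        if node.value == target:
            return node
        node = node.next
    return None  # target not in the list
```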

A Big O of n log n generally indicates a two-step process where one step requires you to touch every single item, but an optimization lets you break the other step into smaller pieces instead of repeating it n times. An example of this is a merge sort. Merging the sublists at each level requires touching all n elements. But due to repeatedly splitting the list in half, there are only log n levels of merging, rather than n, resulting in the product n log n.
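A merge sort sketch showing both parts: the recursive splitting produces log n levels, and the merge at each level touches all n elements.

```python
def merge_sort(items):
    """Sort a list by splitting it in half and merging the sorted halves."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves: every element is touched once per level.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```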

A Big O of n^2 means that for each item in the list, you may need to touch every other item. An example is the Bubble Sort algorithm. In the worst case, you iterate through the n values in the array n times, each pass bubbling the largest remaining number to the end of the array.
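A bubble sort sketch: the nested loops make the O(n^2) cost visible, with each outer pass bubbling the largest remaining value to the end.

```python
def bubble_sort(items):
    """Sort a list by repeatedly swapping adjacent out-of-order pairs."""
    items = list(items)  # work on a copy
    for end in range(len(items) - 1, 0, -1):
        # After this pass, items[end] holds the largest remaining value.
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
    return items
```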

A Big O of n! can be found in algorithms like finding all permutations of an array. Factorial growth makes these algorithms impractical for all but the smallest inputs; for some problems built on permutations, such as the traveling salesman problem, dynamic programming techniques can come in handy to reduce the complexity.
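A sketch of generating every permutation recursively; an array of n items has n! orderings, so the output itself (and therefore the work) grows factorially:

```python
def permutations(items):
    """Return a list of all orderings of items."""
    if len(items) <= 1:
        return [list(items)]
    result = []
    for i, item in enumerate(items):
        rest = items[:i] + items[i + 1:]          # everything except item
        for perm in permutations(rest):            # n choices * (n-1)! orderings
            result.append([item] + perm)
    return result
```

Even at n = 12 there are already over 479 million permutations, which is why factorial algorithms stop being usable so quickly.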