Sorting algorithms: a must-know for technical interviews

Sorting algorithms are a must-know for any technical interview, whether at a startup or a FANG company like Facebook or Amazon. If you cannot explain which sorting algorithm you are using and why you are using it, you can kiss your interview goodbye. The most important aspect of sorting algorithms is their complexity. We will go through the five best-known algorithms and group them by complexity. Another aspect that is good to know is the stability of a sorting algorithm.

Stability: does the sorting algorithm maintain the relative order of elements that are equal?
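
For example, a stable sort keeps two records with the same key in their original relative order. Below is a minimal sketch of this, assuming java.util.Arrays.sort, which is guaranteed to be stable for object arrays:

    //A small sketch of stability, assuming java.util.Arrays.sort, which is stable for object arrays
    public static void stabilityDemo(){
        //Two "Bob" records share the same sort key (the name)
        String[][] people = { {"Bob", "HR"}, {"Alice", "IT"}, {"Bob", "Sales"} };
        //Sort by name only; a stable sort keeps {"Bob", "HR"} before {"Bob", "Sales"}
        java.util.Arrays.sort(people, (x, y) -> x[0].compareTo(y[0]));
        //Order is now: {"Alice", "IT"}, {"Bob", "HR"}, {"Bob", "Sales"}
    }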

The first group has polynomial complexity, O(n^2), and the second group has linearithmic complexity, O(n log n). Selection sort, bubble sort, and insertion sort fall in the first group. Quicksort, merge sort, and heap sort fall in the second group.

The worst-case complexity of quicksort is O(n^2); however, that is limited to very specific cases such as already-sorted input. In all other cases, if the pivot is selected properly, it gives O(n log n).

Selection sort

The fundamental idea of this sort is: find the minimum element in the remaining unsorted part of the array and put it at the current position. We start with index 0 and go all the way to the last index; at each step, we find the minimum in the remaining array and put it at the current index.

    public static void selectionSort(int[] arr){
        for (int i = 0; i < arr.length - 1; i++){
            int index = i;
            //Select the minimum in the remaining array
            for (int j = i + 1; j < arr.length; j++){
                if (arr[j] < arr[index]){
                    index = j;
                }
            }
            //Swap the smallest number with the number at the current index
            int smallestNumber = arr[index];
            arr[index] = arr[i];
            arr[i] = smallestNumber;
        }
    }

If you are wondering why the complexity of selection sort is O(n^2) even though the inner loop goes from i+1 to the end, note that the inner loop runs (n-1) + (n-2) + ... + 1 = n(n-1)/2 times; I would also suggest reading this post on calculating the time complexity of nested loops. The space complexity is O(1). Note that this swap-based implementation is not stable, because a swap can move an element past an equal element.

Bubble sort

The fundamental idea of bubble sort is: compare two adjacent elements and, if they are out of order, swap them. We start at index 0 and compare each pair of neighbors as we move toward the end of the array. After each pass, one more element has bubbled into its final sorted position.

    void bubbleSort(int[] arr){
        int n = arr.length;
        for (int i = 0; i < n; i++){
            for (int j = 1; j < n - i; j++){
                if (arr[j-1] > arr[j]){
                    //swap arr[j-1] and arr[j]
                    int temp = arr[j];
                    arr[j] = arr[j-1];
                    arr[j-1] = temp;
                }
            }
        }
    }

The time complexity is O(n^2) and the space complexity is O(1). Bubble sort is a stable sorting algorithm.
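
One common refinement, not shown in the code above, is to stop as soon as a full pass makes no swaps, since the array is then already sorted. The worst case stays O(n^2), but the best case drops to O(n). A rough sketch:

    //A sketch of bubble sort with an early exit; the 'swapped' flag is an addition to the code above
    void bubbleSortEarlyExit(int[] arr){
        int n = arr.length;
        for (int i = 0; i < n; i++){
            boolean swapped = false;
            for (int j = 1; j < n - i; j++){
                if (arr[j-1] > arr[j]){
                    int temp = arr[j];
                    arr[j] = arr[j-1];
                    arr[j-1] = temp;
                    swapped = true;
                }
            }
            //No swaps in this pass means the array is already sorted
            if (!swapped){
                break;
            }
        }
    }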

Insertion sort

The third algorithm in the polynomial complexity class is insertion sort. Its fundamental idea is: take an element and put it into its correct position by shifting all larger elements one position to the right. We keep one part of the array sorted, examine each remaining element, and insert it at the correct position in the sorted part of the array.

    void insertionSort(int[] arr){ 
        int n = arr.length; 
        for (int i = 1; i < n; ++i) { 
            int key = arr[i]; 
            int j = i - 1; 
  
            /* Move elements of arr[0..i-1], that are 
               greater than key, to one position ahead 
               of their current position */
            while (j >= 0 && arr[j] > key) { 
                arr[j + 1] = arr[j]; 
                j = j - 1; 
            } 
            arr[j + 1] = key; 
        } 
    } 

The time complexity is O(n^2) and the space complexity is O(1). Insertion sort is a stable sort as well.

Quicksort

The entire idea of quicksort revolves around a pivot. The pivot is an element of the input around which the input is arranged so that all elements on the left side are smaller than the pivot and all elements on the right side are greater. The question is how to select the pivot and put it into its correct position.
To put the pivot at its correct position, start with the element next to the pivot in the input space and scan right to find an element that is greater than the pivot. Let that be the i-th position.

At the same time, start from the end of the array and scan left to find an element that is smaller than the pivot. Let that be the j-th position.

If i and j have not crossed each other, i.e. i < j, swap the elements at the i-th and j-th positions, then continue moving right to find an element greater than the pivot and moving left to find an element smaller than the pivot. Once i and j cross each other, swap the pivot with the element at the j-th position.
After this step, the pivot will be at its correct position and the array will be divided into two parts: all elements on the left side are less than the pivot and all elements on the right side are greater than the pivot.

Recursively apply quicksort to the left and right subarrays until each subarray contains at most one element.

    private void swap(int [] a, int i, int j){
        int temp = a[i];
        a[i] = a[j];
        a[j] = temp;
    }

    private int partition(int [] a, int start, int end){
        int pivot = a[start];
        int i  = start;
        int j  = end;

        while(i < j){
            while(i <= end && a[i] <= pivot) i++;
            while(j >= start && a[j] > pivot) j--;
            
            if(i < j) {
                swap(a, i, j);
            }
        }

        swap(a, start, j);
        return j;
    }

    public void quickSort(int [] a, int start, int end){
        if(start < end){
            int p = partition(a, start, end);
            quickSort(a,start, p-1);
            quickSort(a, p+1, end);
        }
    }

The time complexity of this quicksort implementation depends on the selection of the pivot. If the pivot splits the array into two roughly equal parts (which is the intention), the complexity of quicksort is O(n log n) (by the Master theorem).
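
One common way to avoid the sorted-input worst case is to randomize the pivot. The sketch below is an addition, not part of the code above; it assumes it lives in the same class and reuses the swap and partition helpers, picking a random element and moving it to the start before partitioning.

    //A sketch of randomized pivot selection, reusing the swap and partition methods above
    private final java.util.Random random = new java.util.Random();

    private int randomizedPartition(int[] a, int start, int end){
        //Pick a random index in [start, end] and move that element to the start
        int pivotIndex = start + random.nextInt(end - start + 1);
        swap(a, start, pivotIndex);
        //Partition as before, with the randomly chosen element acting as the pivot
        return partition(a, start, end);
    }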

The worst case of quicksort happens when the array is already sorted or reverse sorted. The array is then partitioned into two subarrays, one of size 1 and the other of size n-1. The subarray with n-1 elements is again divided into subarrays of size 1 and n-2. To sort the array completely, it splits n-1 times, and each split requires traversing up to n elements to find the correct position of the pivot. Hence the overall complexity of quicksort comes out to O(n^2).

What is the space complexity of this algorithm? No auxiliary array is used. However, the recursive implementation puts stack frames on the call stack for the partition indices, function return addresses, and so on. In the worst case there can be n stack frames, so the worst-case space complexity of quicksort is O(n).

Merge sort

Merge sort falls in the divide-and-conquer class of algorithms, where the input space is divided into subproblems, the subproblems are solved individually, and the results are combined to solve the original problem.
A question that arises is: what is the smallest subproblem? The smallest subproblem is one where the input cannot be divided further, i.e. a subproblem with a single item, which is already sorted. Once the split is down to the smallest size, we start merging. Going from the bottom, we start with two one-item subproblems and combine them to create a two-item subproblem, then combine two two-item subproblems to create a four-item subproblem, and so on until we get back to a problem of the original size.
In the merge operation, scan both sorted arrays element by element and, based on the comparison, put the smaller of the two items into the output array, until both arrays are fully scanned.

    void merge(int[] a, int start, int mid, int end){
        //Temporary array to hold the merged result
        int[] temp = new int[end - start + 1];

        int i = start, j = mid + 1, k = 0;
        /* Compare the elements of the two sorted halves and store
           them in the temporary array in sorted order. */
        while(i <= mid && j <= end){
            if(a[i] < a[j]){
                temp[k++] = a[i++];
            }
            else {
                temp[k++] = a[j++];
            }
        }
        //Copy any remaining elements of the left half
        while(i <= mid){
            temp[k++] = a[i++];
        }
        //Copy any remaining elements of the right half
        while(j <= end){
            temp[k++] = a[j++];
        }
        //Copy the temporary array back into the original array
        for(i = 0; i < k; i++){
            a[i + start] = temp[i];
        }
    }

    void mergeSort(int[] a, int start, int end){
        if(start < end){
            int mid = (start + end) / 2;
            //sort the left part
            mergeSort(a, start, mid);
            //sort the right part
            mergeSort(a, mid + 1, end);

            //merge the two sorted halves
            merge(a, start, mid, end);
        }
    }

Each level of merging processes all n elements, and for n elements there are O(log n) such levels. Hence, the complexity of the merge sort algorithm is O(n log n); the recurrence T(n) = 2T(n/2) + O(n) solves to this by the Master theorem.

Every time we merge two subarrays, an auxiliary array is needed to temporarily store the elements, and that incurs O(n) space complexity on the merge sort algorithm.

This is in addition to the overhead caused by recursion, which adds O(log n) stack frames.

Heap Sort

This one depends on the concept of heaps, which is covered on this website here: Heaps. The fundamental idea is that we put all the elements into a priority queue or heap, a min-heap to be precise, and then take the elements out one by one and add them back into the array. In Java this is one of the simplest sorting algorithms to write, given that we get a built-in implementation of a priority queue.

    //Requires java.util.PriorityQueue, java.util.Arrays, and java.util.stream.Collectors
    public void heapSort (int[] a){

        PriorityQueue<Integer> pq = Arrays.stream(a)
                .boxed()
                .collect(Collectors.toCollection(PriorityQueue::new));

        int i = 0;
        while(!pq.isEmpty()){
            a[i] = pq.poll();
            i++;
        }

        Arrays.stream(a)
                .forEach(e -> System.out.print(e + " "));
   }

The time complexity of heap sort is O(n log n), since each of the n insertions and removals on the priority queue costs O(log n). The space complexity of this implementation is O(n), because the priority queue holds all the elements.

There is another class of sorting algorithms that sort the input in linear time, but those algorithms depend on certain conditions being true of the input; for example, for radix sort, the input must be made up of integers with a bounded number of digits.
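
As an illustration, here is a rough sketch of counting sort, the building block of radix sort. It assumes the input contains only integers in a known range [0, maxValue] and runs in O(n + maxValue) time.

    //A sketch of counting sort, assuming all values are integers in the range [0, maxValue]
    public static void countingSort(int[] a, int maxValue){
        int[] count = new int[maxValue + 1];
        for (int value : a){
            count[value]++; //count how many times each value occurs
        }
        int index = 0;
        for (int value = 0; value <= maxValue; value++){
            while (count[value]-- > 0){
                a[index++] = value; //write each value back in sorted order
            }
        }
    }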

Prepare your sorting algorithm basics and understand time complexity, space complexity, and stability. It is very good to know the fundamentals of these algorithms, even if you are using a built-in library or function to do the sorting.

Always account for the complexity of the sorting algorithm in your solution; most of the time, it dominates the overall complexity.

If you are preparing for an interview and need some help with preparation, please reach out to us or book a session with us to understand more.