Common time complexities of functions


In the last two posts, time complexity for algorithms and the Master theorem, we learned the fundamentals of complexity analysis. Today, we will discuss some of the common time complexities, with examples of each to consolidate our understanding.

There are eight time complexities that occur commonly in algorithms: O(1), also called constant time; O(n), called linear; O(n²), called quadratic; O(n^k), called polynomial; O(log n), known as logarithmic; O(n log n), known as linearithmic; O(2^n), known as exponential; and O(n!), known as factorial complexity.

Growth of complexity functions

All these complexity functions grow at different rates, and it is very important to know the difference; that will help you evaluate your algorithms in an objective manner. Below is the graph of the growth of each complexity function as n grows larger.

[Figure: growth of common time complexity functions as n grows]

O(1) – Constant time complexity

Constant time complexity does not mean that no work or operations are done. It just means that the number of operations does not depend on the size of the input: irrespective of the size of the input, it runs a constant number of times.
For example, accessing a specific index in an array has constant time complexity. Similarly, accessing the value for a given key in a dictionary (hash map) is an O(1) operation; it does not matter whether the dictionary holds one entry or a million.
Another misconception about constant time is that the constant should be small. No, that is not required; it can be as large as you like. As long as it does not depend on the input, your algorithm is a constant time, or O(1), algorithm.

A perfect hash function gives a worst-case lookup complexity of O(1). Such an ideal hash function is not practical, so there will be some collisions, which can push the worst-case runtime to O(n). On average, however, the lookup time is O(1).
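As a quick illustration, here is a minimal sketch of two typical O(1) operations: indexing into an array and looking up a key in a HashMap. The array, map, and key names are made up for this example.

// Assumes: import java.util.HashMap; import java.util.Map;
int[] numbers = {4, 8, 15, 16, 23, 42};
// Array indexing: a single memory access, independent of the array length.
int third = numbers[2];

Map<String, Integer> ages = new HashMap<>();
ages.put("alice", 30);
ages.put("bob", 25);
// HashMap lookup: O(1) on average, no matter how many entries the map holds.
Integer aliceAge = ages.get("alice");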

Quick check: What is the complexity of the code below?

for(int i=0; i<100000; i++){
     System.out.println(i);
}

By the look of it, we tend to think that it is O(n). However, notice that the for loop does not run a variable number of times: it runs exactly 100000 times no matter what the input to the function is. This is constant time code.

It becomes more confusing when this kind of loop is nested inside multiple loops. For example,

for(int i=0; i<n; i++){
   for(int j=0; j<n; j++){
       for(int k=0; k<100000; k++){
           System.out.println("Hello");
       }
   }
}

By the look of it, we might think this is O(n³) because of the three nested loops. However, the innermost loop runs exactly 100000 times regardless of the input; only the two outer loops depend on n. The print statement therefore executes n * n * 100000 times, and dropping the constant factor, the complexity of this code is O(n²).

O(n) – Linear time complexity

When the running time is directly proportional to the size of the input, the time complexity of the algorithm is linear. For example,

for(int i=0; i<n; i++){
   System.out.println( i + " ");
}

this function prints all the numbers from 0 to n-1. As n grows, the number of executions of the print statement grows as well. For n=0, it does not execute at all; for n=1, it executes once; for n=2, it executes twice; and in general it executes n times. The work done by the loop is directly proportional to the value of n, hence the complexity of the loop is O(n).

Can you identify the complexity of the following function? It prints all the odd numbers less than n.

for(int i=0; i<n; i++){
   if(i%2 == 1)
     System.out.println( i + " ");
}

We can see that the print statement is executed only half the time, but the complexity of the loop is still O(n) because the number of iterations still depends on the size of the input. The same is true of the complexity of the code below:

for(int i=0; i<n/2; i++){
     System.out.println( i + " ");
}

O(n²) – Quadratic time complexity

A typical quadratic complexity function has this shape: a loop nested inside another loop, with both loops depending on the size of the input.

for(int i=0; i<n; i++){
   for(int j=0; j<n; j++){
      System.out.println(i+j);
   }
}

But there are some cases where it is not quite as evident and can lead to misconceptions. One common quadratic complexity pattern is shown below, and both students and professionals make mistakes identifying it.

for(int i=0; i<n; i++){
   for(int j=i+1; j<n; j++){
      System.out.println(i+j);
   }
}

The cause for concern is that the inner loop does not run exactly n times. Let's see why this piece of code is still O(n²).

For i = 0: the inner loop executes n-1 times
For i = 1: the inner loop executes n-2 times
For i = 2: the inner loop executes n-3 times
…
For i = n-2: the inner loop executes 1 time
For i = n-1: the inner loop executes 0 times

So the total number of times the print statement is executed is (n-1) + (n-2) + (n-3) + … + 1 + 0 = n(n-1)/2, which is O(n²).

Another common piece of code where candidates miscalculate the time complexity is shown below:

for(int i=0; i<a; i++){
   for(int j=0; j<b; j++){
      System.out.println(i+j);
   }
}

It is a common tendency to report the complexity of this code as O(n²) just by looking at the two loops, more so when there is an n anywhere in the function. Be careful: there are two different variables, a and b, controlling the loops, and there is no prior knowledge of any relationship between the two. So the correct way to report the complexity is O(a * b), not O(n²).

Examples of quadratic complexity algorithms are: finding duplicates in an array, insertion sort, bubble sort, finding all the pairs in an array, and many more. We will learn more when we solve problems.
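As one concrete illustration, here is a minimal sketch of finding duplicates in an array with two nested loops (the method name printDuplicates is ours). By the same counting as above, the comparison runs roughly n(n-1)/2 times, so it is O(n²).

// Prints every value that appears more than once in arr.
static void printDuplicates(int[] arr) {
   for(int i=0; i<arr.length; i++){
      for(int j=i+1; j<arr.length; j++){
         if(arr[i] == arr[j])
            System.out.println("Duplicate: " + arr[i]);
      }
   }
}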

O(log n) – Logarithmic time complexity

Logarithmic time complexities usually apply to algorithms that divide the input into multiple parts each time. For instance, in the binary search algorithm, we divide the array into two halves and discard one half each time. We keep doing that until there is no element left to look at. So, for the first iteration, we have size n, then n/2, then n/4, and so on until we reach 1. How many iterations does it take to go from n to 1, dividing n by 2 each time? It is log n.
Hence the complexity of the binary search algorithm is O(log n); you can deduce the same using the Master Theorem.
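For reference, here is a minimal iterative binary search sketch (the method name and parameters are ours, and the input array is assumed to be sorted in ascending order). Each iteration halves the search range, which is why it runs in O(log n).

// Returns the index of target in the sorted array arr, or -1 if it is not present.
static int binarySearch(int[] arr, int target) {
   int low = 0, high = arr.length - 1;
   while(low <= high){
      int mid = low + (high - low) / 2;  // avoids integer overflow for large indices
      if(arr[mid] == target)
         return mid;
      else if(arr[mid] < target)
         low = mid + 1;   // discard the left half
      else
         high = mid - 1;  // discard the right half
   }
   return -1;
}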

Adding a new element to a balanced binary search tree is also a logarithmic algorithm.
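In Java, for example, TreeMap is backed by a red-black tree, a self-balancing binary search tree, so every insertion below costs O(log n). The keys and values here are purely illustrative.

// Assumes: import java.util.TreeMap;
TreeMap<Integer, String> tree = new TreeMap<>();
// Each put walks one root-to-leaf path and rebalances if needed, costing O(log n).
tree.put(50, "fifty");
tree.put(20, "twenty");
tree.put(70, "seventy");
System.out.println(tree.firstKey());  // smallest key: prints 20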

In the next post, we will discuss the remaining four types of complexity. Please comment if you find anything missing or wrong.
If you are preparing for an interview and looking for personalized coaching for preparation, please reach out to us or book a free session with us.