# What is the Master Theorem?

The Master theorem gives us the complexity of a recursive function by substituting values for a few variables. It applies to recurrences of the following form:

T (n) = aT (n/b) + f(n)

where `a ≥ 1` and `b > 1` are constants and `f(n)` is an asymptotically positive function.

These kinds of recurrence relations occur in the analysis of many “divide and conquer” algorithms like binary search or merge sort.

But what do these variables `a`, `b` and `n` mean?

`n` is the size of the input or the problem.

`a` is the number of subproblems in the recursion, i.e., are we dividing the problem into two parts, or 3, or 5? For example, for the binary search algorithm a = 1, and for the merge sort algorithm it is 2.

`n/b` is the relative subproblem size, i.e., at what rate is the input reduced? E.g., binary search and merge sort cut the input in half, so b = 2.

`f(n)` is the cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions to the subproblems.

Once we have `a`, `b`, and `f(n)`, it is easy to find the complexity of the algorithm by substituting values into the expression

O(n^(log_b a))

However, that's not all: we still have `f(n)` in the equation, and the final runtime of your algorithm depends on the relationship between `n^(log_b a)` and `f(n)`.

There are three possible scenarios:

**Scenario 1:** `f(n) = O(n^c)` and `c < log_b a`, then the complexity is `O(n^(log_b a))`. This means the recursion is taking more time than the divide and combine steps.

**Example:**

T(n) = 8T(n/2) + 1000n^2

Here, log_b a = log_2 8 = 3 > c (= 2); hence the complexity is O(n^(log_b a)) = O(n^3).
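We can sanity-check this result by evaluating the recurrence directly. This Python sketch (with an assumed base case T(1) = 1, not stated in the recurrence) shows that doubling n multiplies T(n) by roughly 2^3 = 8, exactly the signature of Θ(n^3) growth:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Evaluate T(n) = 8*T(n/2) + 1000*n^2 for n a power of 2,
    with an assumed base case T(1) = 1."""
    if n == 1:
        return 1
    return 8 * T(n // 2) + 1000 * n * n

# For Theta(n^3) growth, doubling n should multiply T by ~8.
ratio = T(4096) / T(2048)
print(ratio)  # close to 8
```

The 1000n^2 term contributes less and less as n grows, which is exactly why Scenario 1 says the recursion dominates.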

**Scenario 2:** `f(n) = O(n^c log^k n)` for k ≥ 0 and `c = log_b a`, then the complexity is `O(n^(log_b a) log^(k+1) n)`. It essentially means the runtime is the same inside and outside the recursion.

**Example:**

T(n) = 2T(n/2) + 10n

10n can be written as O(n log^0 n) with k = 0.

Here, log_b a = log_2 2 = 1 = c; hence the complexity is O(n^(log_b a) log^(k+1) n) = O(n log n).

**Scenario 3:** `f(n) = O(n^c)` and `c > log_b a`, then the complexity is `O(n^c)`. It essentially means the work done outside the recursion is more than the work to split and recurse.

**Example:**

T(n) = 2T(n/2) + n^2

Here, log_b a = log_2 2 = 1 < c (= 2); hence the complexity is O(n^c) = O(n^2).

## Exceptions to Master Theorem

1. T(n) = 2^n T(n/2) + n^n

This is not admissible because `a` is not constant here. For the Master theorem to be applicable, `a` and `b` must be constants.

2. T(n) = 0.5T(n/2) + n^2

This is not admissible because `a < 1`, which is to say we would be reducing the problem to fewer than one subproblem. For the Master theorem to be applicable, `a` must be at least 1.

3. T(n) = 64T(n/2) − n^2 log n

This is not admissible because `f(n)` is negative.

4. T(n) = 64T(n/2) − 2^n

This is not admissible because `f(n)` is not polynomial.

## Master Theorem Examples

Let’s apply the Master theorem to some well-known algorithms and see how it works.

### Binary Search Algorithm

In the binary search algorithm, depending on the relationship between the middle element and the key, we discard one part of the array and look into the other. Also, from above, we know the Master theorem:

T(n) = aT(n/b) + f(n)

In this case, `a` is 1 as we reduce to only one subproblem. `b` is 2 as we divide the input in half, and no work beyond the comparison is done outside the recursion, hence `f(n)` is `O(1)`.

`log_b a` is `log_2 1 = 0`. Writing `f(n) = O(1)` as `O(n^0 log^0 n)`, we have `c = 0 = log_b a`, so Scenario 2 applies and the complexity is `O(n^(log_b a) log^(k+1) n)` with k = 0. Substituting values, we get the complexity of the binary search algorithm as O(log n).
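As a concrete sketch (the implementation details are illustrative, not from the original post), here is a recursive binary search with the Master theorem parameters called out in the comments:

```python
def binary_search(arr, key, lo=0, hi=None):
    """Recursive binary search over a sorted list.
    a = 1: we recurse into only one half.
    b = 2: each call halves the input, so the subproblem size is n/2.
    f(n) = O(1): constant work (one comparison) outside the recursion.
    Hence T(n) = T(n/2) + O(1), which the Master theorem solves as O(log n)."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1  # key not present
    mid = (lo + hi) // 2
    if arr[mid] == key:
        return mid
    if arr[mid] < key:
        return binary_search(arr, key, mid + 1, hi)  # discard left half
    return binary_search(arr, key, lo, mid - 1)      # discard right half

print(binary_search([1, 3, 5, 7, 9], 7))  # prints 3
```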

### Merge Sort Algorithm

In the merge sort algorithm, we split the array into two equal parts and sort them individually. Apart from the split, we also merge the sorted halves, which takes O(n) time. Let’s find `a` and `b` in the Master theorem equation.

T(n) = aT(n/b) + f(n)

In this case, `a` is 2 as we split into two subproblems. `b` is 2 as we divide the input in half, and the merge step does O(n) work outside the recursion, hence `f(n)` is `O(n)`.

`log_b a` is `log_2 2 = 1`. Writing `f(n) = O(n)` as `O(n^1 log^0 n)`, we have `c = 1 = log_b a`, so Scenario 2 applies and the complexity is `O(n^(log_b a) log^(k+1) n)` with k = 0. Substituting values, we get the complexity of the merge sort algorithm as O(n log n).
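A minimal merge sort sketch (again illustrative, not the post's own code) with the parameters annotated:

```python
def merge_sort(arr):
    """Merge sort returning a new sorted list.
    a = 2: we recurse into two halves.
    b = 2: each half has size n/2.
    f(n) = O(n): the merge loop below does linear work outside the recursion.
    Hence T(n) = 2T(n/2) + O(n), which the Master theorem solves as O(n log n)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves: O(n) work.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5]))  # prints [1, 2, 5, 5, 9]
```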

Please book a free session if you are looking for coaching to prepare for your next technical interview. At Algorithms and Me, we provide personalized coaching and mock interviews to prepare you for Amazon, Google, Facebook, etc. interviews.
