Backtracking : Eight Queens problem

Given an N x N chessboard, find a way to place N queens such that no queen can attack another. A queen can move along the column, the row, and the diagonals of the chessboard.

This is a typical example of a backtracking algorithm. We start by placing the first queen at the bottom-left position and then check whether all the other queens can be placed while satisfying the constraints of the game. If at any point we cannot go forward, we move the previous queen to its next safe location and try again. If we are stuck at a dead end, we may need to move more than one queen from its position before we can make progress.
Let’s take an example: you have a 4 x 4 chessboard and you need to place four queens on it.
We place the first queen at the bottom-left corner. Once we place a queen there, the available positions for queen 2 are as shown.

Looking at the figure below, we can infer that there is no way to place both Q3 and Q4 with the current configuration. Hence, we backtrack and select another available position for Q2.

Again we cannot place Q3 and Q4, so we try all the other available positions for Q2, but none of them works (exercise). Hence, we change the position of Q1.
After changing the position of Q1, we can reach the solution shown in the figure below, with a couple more backtracks of course.

The next step is to figure out whether a queen is safe at a particular position. We need to check the column, the row, and the diagonals to make sure no other queen is placed on any of those squares.

8 queens (1)

8 queens (2)

8 Queen solution implementation

#include <stdio.h>
#define N 8

int isColumnSafe(int chessBoard[N][N], int col){
	
	for(int i=0; i<N; i++){
		if(chessBoard[i][col]) return 0;
	}
	return 1;
}

int isRowSafe(int chessBoard[N][N], int row){
	
	for(int i=0; i<N; i++){
		if(chessBoard[row][i]) return 0;
	}
	return 1;
}

int isDiagonalSafe(int chessBoard[N][N], int row, int col){

	/* Check the upper left diagonal */
	for(int i=row, j=col; i>=0 && j>=0; i--, j--){
		if(chessBoard[i][j]) return 0;
	}

	/* Check the lower left diagonal */
	for(int i=row, j=col; i<N && j>=0; i++, j--){
		if(chessBoard[i][j]) return 0;
	}

	return 1;
}

int isSafe(int chessBoard[N][N], int row, int col){

	int columnSafe = isColumnSafe(chessBoard, col);
	int rowSafe = isRowSafe(chessBoard, row);
	int diagonalSafe  = isDiagonalSafe(chessBoard, row, col);

	return columnSafe && rowSafe && diagonalSafe;
}

void placeQueen(int chessBoard[N][N], int i, int j){
	chessBoard[i][j] =1;
}
void removeQueen(int chessBoard[N][N], int i, int j){
	chessBoard[i][j] =0;
}

int solveQueens(int chessBoard[N][N], int col){
	
	if(col >= N) return 1;

	for(int i=0; i<N; i++){
		if(isSafe(chessBoard, i, col)){
			placeQueen(chessBoard, i, col);
			if(solveQueens(chessBoard,col+1)) return 1;
			removeQueen(chessBoard,i,col);
		}
	}
	
	return 0;
}  

int main(void) {
	int chessBoard[N][N] = {0};

	if(solveQueens(chessBoard, 0)){
		for(int i=0; i<N; i++){
			for(int j=0; j<N; j++)
				printf("%d ", chessBoard[i][j]);
			printf("\n");
		}
	}
	return 0;
}

The worst-case complexity of the backtracking algorithm for the N queens problem is O(N!): there are N candidate rows for the first queen, at most N-1 for the second, and so on. Reference: http://www-isl.ece.arizona.edu/ece175/assignments275/assignment4a/Solving%208%20queen%20problem.pdf

Quick sort algorithm

Quicksort, like merge sort, is a sorting algorithm under the divide and conquer paradigm. The basic idea of the algorithm is to partition the input around a pivot, sort the two smaller parts recursively, and thereby get the original input sorted.

Selection of pivot

The entire idea of quicksort revolves around the pivot. The pivot is an element of the input around which the input is arranged such that all elements on its left side are smaller than the pivot and all elements on its right side are greater. The question is how to select the pivot and put it into its correct position.

To make things simpler to start with, let’s assume that the first element of the input is pivot element.

To put this pivot at its correct position, start with the element next to the pivot and move right until you find the first element which is greater than the pivot. Let that be the ith position.

At the same time, start from the end of the array and move left until you find the first element which is smaller than the pivot. Let that be the jth position.

If i and j have not crossed each other, i.e. i < j, swap the elements at the ith and jth positions, then continue moving right to find an element greater than the pivot and moving left to find an element smaller than the pivot.
Once i and j cross each other, swap the pivot with the element at the jth position. After this step, the pivot is at its correct position and the array is divided into two parts: all elements on the left side are less than the pivot and all elements on the right side are greater than it.

Quick sort partition example

This is a lot to process, I know! Let’s take an example and see how it works. We have the following array:

quick sort

Let’s select first element as pivot, pivot = 3.

quick sort pivot selection

Start from the element next to the pivot and move towards the right of the array until we see the first element which is greater than the pivot, i.e. 3.

From the end of the array, move towards the left until you find an element that is less than the pivot.

Now there are two indices, i and j, where A[i] > pivot and A[j] < pivot. Note that i and j have not yet crossed each other. Hence, we swap A[i] with A[j]. The array at the bottom of the picture shows the result after the swap.

quick sort partition

Again, start with i+1 and follow the same rule: stop when you find an element greater than the pivot. In this case, 10 is greater than 3, hence we stop.

Similarly, move left from the end again until we find an element which is less than the pivot. In this case, we end up at index 2, which holds element 1.

Since i > j, the paths have crossed. This time, instead of swapping the elements at indices i and j, we swap the element at index j with the pivot.

After swapping the pivot with the jth index, the array is divided into two parts with the pivot as the boundary. All elements on the left side of the pivot are smaller than it (though not necessarily sorted among themselves), and all elements on the right side are greater than it (again, not necessarily sorted).

quick sort partitions

We apply the same partition process to the left and right subarrays until the base condition is hit. Here, the base condition is an array with at most one element to be partitioned.

Quick sort algorithm

quickSort(arr, start, end)
1. If the array has more than one element, i.e. (start < end):
   1.1 Find the correct place for the pivot:
       pivot = partition(arr, start, end)
   1.2 Apply the same function recursively to the left of the pivot index:
       quickSort(arr, start, pivot - 1)
       and to the right of the pivot index:
       quickSort(arr, pivot + 1, end)

Quick sort implementation

package AlgorithmsAndMe;

public class QuickSort {

    private void swap(int [] a, int i, int j){
        int temp = a[i];
        a[i] = a[j];
        a[j] = temp;
    }

    private int partition(int [] a, int start, int end){
        int pivot = a[start];
        int i  = start;
        int j  = end;

        while(i < j){
            while(i <= end && a[i] <= pivot) i++;
            while(j >= start && a[j] > pivot) j--;
            
            if(i < j) {
                swap(a, i, j);
            }
        }

        swap(a, start, j);
        return j;
    }

    public void quickSort(int [] a, int start, int end){
        if(start < end){
            int p = partition(a, start, end);
            quickSort(a,start, p-1);
            quickSort(a, p+1, end);
        }
    }
}

There is another implementation based on the Lomuto partition scheme, in which the last element is taken as the pivot. The implementation is more compact, but it performs more swaps on average than the Hoare-style partition above.

#include<stdlib.h>
#include<stdio.h>
 
void swap(int *a, int *b){
    int temp = *a;
    *a = *b;
    *b = temp;
}
 
int partition(int a[], int low, int high)
{
    // Take the last element as the pivot
    int x  = a[high];
 
    // i trails the region of elements known to be <= pivot
    int i = low - 1;
 
    for (int j = low; j <= high-1; j++)
    {
    	/* Whenever the current element is less than or equal to
        the pivot, grow that region and move the element into it */
        if (a[j] <= x){
            i++;
            swap(&a[i], &a[j]);
        }
    }
    // Place the pivot just after the region of smaller elements
 
    swap(&a[i+1], &a[high]);
 
    printf("\n Pivot : %d\n", a[i+1]);
    for(int j=0; j<=high; j++){
 
    	printf("%d ", a[j]);
    }
    // Return the pivot's final index as the partitioning point
    return i+1;
}
 
/* A recursive implementation of quicksort for arrays */
void quickSortUtil(int a[], int low, int high)
{
    if (low < high)
    {
        int p = partition(a,low, high);
        quickSortUtil(a,low, p-1);
        quickSortUtil(a, p+1, high);
    }
}
 
/* Driver program to run above code */
int main(){
 
    int a[] = {5,4,2,7,9,1,6,10,8};
 
    int size = sizeof(a)/sizeof(a[0]);
    quickSortUtil(a, 0, size-1);
 
    for(int i=0; i<size; i++){
    	printf("%d ", a[i]);
    }
    return 0;
}

Complexity analysis of quick sort algorithm

If the pivot splits the original array into two equal parts (which is the intention), the complexity of quicksort is O(n log n). However, the worst case of quicksort happens when the input array is already sorted in increasing or decreasing order. In that case, the array is partitioned into two subarrays, one of size 1 and the other of size n-1. The subarray with n-1 elements is in turn divided into subarrays of size 1 and n-2, and so on. To completely sort the array, it is split n-1 times, and each split requires traversing up to n elements to find the correct position of the pivot. Hence the overall complexity of quicksort comes out as O(n^2).

There is a very interesting question which tests your understanding of system basics: what is the space complexity of this algorithm? No memory is allocated explicitly. However, the recursive implementation internally pushes stack frames for the partition indices, function call return addresses and so on. In the worst case there can be n stack frames, hence the worst-case space complexity of quicksort is O(n).

How can we reduce that? If the partition with the fewest elements is (recursively) sorted first, it requires at most O(log n) stack frames. The other partition is then sorted using tail recursion or iteration, which does not add to the call stack. This idea was described by R. Sedgewick; it keeps the stack depth bounded by O(log n), and hence the space complexity is O(log n).

Quick sort with tail recursion

Quicksort(A, p, r)
{
 while (p < r)
 {
  q = Partition(A, p, r)
  Quicksort(A, p, q)
  p = q+1
 }
}

Selection of Pivot
If the array is completely sorted, the worst-case behavior of quicksort is O(n^2), so another problem arises: how can we select the pivot so that the two subarrays are of almost equal size? Several solutions have been proposed.
1. Taking the median of the array as the pivot. How to select the median of an unsorted array is a problem we will look into separately, but it does guarantee two halves of the same size.
2. Selecting the pivot randomly. This makes the already-sorted worst case extremely unlikely on any given input; nothing more than a random number generator is needed.

Please leave your comment in case you find something wrong or you have some improved version.

Count sort : Sorting in linear time

Are there any sorting algorithms which have a worst-case complexity of O(n)? There are a few, such as counting sort and radix sort. In this post we will discuss counting sort and a couple of problems where this sorting algorithm can be applied.

Counting sort was invented by Harold H. Seward.
To apply counting sort, we need to keep in mind the following assumptions:

  1. The input values are integers; duplicates are allowed.
  2. There are at most K distinct values in the input.
  3. The input data ranges from 0 to K.

Counting sort loses its linear-time advantage when K is much larger than n, e.g. when almost all elements are unique and spread over a huge range, because the O(n + K) cost is then dominated by K. So the conditions above are necessary for this algorithm to work in linear time.

Count Sort - Approach
Let’s see how it works. There are three steps involved in the algorithm:

  • Sampling the data: counting the frequency of each element in the input set.
  • Accumulating the frequencies to find out the relative position of each element.
  • Distributing each element to its appropriate place.

Now let’s take one example and go through the above three steps:
    Input is an array A = [1,2,3,2,2,3,3,1,1,1,1,3,2,2]. Here we can see that K = 3. Now let’s perform the first step of the method, i.e. count the frequency of each element. We keep a count array and track each element’s count. Since values are bounded above by K, a count array of size K+1 suffices. We initialize this array to zero. A is the input array and n is the size of the array.

    int count[K+1];
    
    for(i=0; i<=K; i++){
            count[i] = 0;
    }
    
    for(i=0; i<n; i++){
            count[a[i]]++;
    }
    

    Count looks like count[1..3] = [5,5,4]. The complexity of this step is O(n). The second step accumulates the frequencies, adding the frequencies of all previous elements to the current element:

    F(j) = sum of f(i) for all i with 0 <= i <= j
    
    for(i=1; i<=K; i++){
       count[i] += count[i-1];
    }
    

    The second step runs K times, hence its complexity is O(K). The third step is distribution. Let’s create an output array B of size n. For every element in A, we check the corresponding entry in the count array and place the element at that position. So the first 1 will be at position 4 (there are five 1s in total and the array is indexed from 0), the first 2 will be placed at index 9, and the first 3 at index 13. Lower positions are filled as we encounter further equal elements in the input array.

    for(i=0; i<n;i++){
           B[--count[a[i]]] = a[i];
    }
    

    This step has a complexity of O(n).

    #include<stdlib.h>
    #include<stdio.h>
    
    void count_sort(int a[], int n, int K){
    
            int count[K+1];
            int i;
            int b[n];
            for(i=0; i<=K; i++){
                    count[i] = 0;
            }
    
            for(i=0; i<n; i++){
                    count[a[i]]++;
            }
    
            for(i=1; i<=K; i++){
                    count[i] += count[i-1];
            }
    
            /* Distribute: the pre-decrement places each element at the
               end of its value's span. (Traversing the input from the
               end instead, i = n-1 down to 0, would make the sort stable.) */
            for(i=0; i<n; i++){
                    count[a[i]]--;
                    b[count[a[i]]] = a[i];
            }
    
            /* Copy the sorted output back into the input array */
            for(i=0; i<n; i++){
                    a[i] = b[i];
            }
    }
    
    int main(){
            int a[] = {0,1,1,1,0,0,0,3,3,3,4,4};
            int n = sizeof(a)/sizeof(a[0]);
            
            count_sort(a, n, 4);
    
            for(int i=0; i<n; i++){
                    printf("%d ", a[i]);
            }
            return 0;
    }
    
    Iter 0 :0  0  0  0  1  0  0  0  0  0  0  0  0  0  
    Iter 1 :0  0  0  0  1  0  0  0  0  2  0  0  0  0  
    Iter 2 :0  0  0  0  1  0  0  0  0  2  0  0  0  3  
    Iter 3 :0  0  0  0  1  0  0  0  2  2  0  0  0  3  
    Iter 4 :0  0  0  0  1  0  0  2  2  2  0  0  0  3  
    Iter 5 :0  0  0  0  1  0  0  2  2  2  0  0  3  3  
    Iter 6 :0  0  0  0  1  0  0  2  2  2  0  3  3  3  
    Iter 7 :0  0  0  1  1  0  0  2  2  2  0  3  3  3  
    Iter 8 :0  0  1  1  1  0  0  2  2  2  0  3  3  3  
    Iter 9 :0  1  1  1  1  0  0  2  2  2  0  3  3  3  
    Iter 10 :1  1  1  1  1  0  0  2  2  2  0  3  3  3  
    Iter 11 :1  1  1  1  1  0  0  2  2  2  3  3  3  3  
    Iter 12 :1  1  1  1  1  0  2  2  2  2  3  3  3  3  
    Iter 13 :1  1  1  1  1  2  2  2  2  2  3  3  3  3  
    
    Final O/P :1  1  1  1  1  2  2  2  2  2  3  3  3  3
    

    The total complexity of the algorithm is O(K) + O(n) + O(K) + O(n) = O(n + K), obtained simply by adding up the cost of each step. As long as K is of the order of n, n + K remains linear in n; but if K grows towards n^2, the algorithm takes polynomial rather than linear time.

    There is an immediate application of this algorithm in the following problem:
    Let’s say there is an array which contains black, white and red balls, and we need to arrange these balls in such a way that all black balls come first, white second and red last. Hint: assign black 0, white 1 and red 2 and see. 🙂

    In the next post, we will discuss how the extra space can be saved and how the initial positions of elements can be maintained. We will go through an interesting problem to discuss the above optimization.

    Why do we fill each value’s span from its end? After accumulation, count[v] is the number of elements less than or equal to v, i.e. one past the last slot of v’s span, so the pre-decrement yields each element’s position directly. If the input is additionally traversed from its last element to its first, counting sort becomes a stable sort: equal elements keep their relative order, which matters when they carry satellite data. Here we have seen a sorting algorithm which can give us output in linear time, given that the conditions on the input data are met.