
Data Structures and Algorithms - Time Complexities

| Sorting Algorithm | Best Case | Average Case | Worst Case |
| --- | --- | --- | --- |
| Bubble Sort | O(n^2) | O(n^2) | O(n^2) |
| Insertion Sort | O(n) | O(n^2) | O(n^2) |
| Selection Sort | O(n^2) | O(n^2) | O(n^2) |
| Quick Sort | O(n log n) | O(n log n) | O(n^2) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) |

The reasoning behind each time complexity

Bubble Sort & Selection Sort

For the first pass we go through all n elements, for the second pass n-1 elements, and so on, until the nth pass goes through a single element.

So total runs = n + (n-1) + ... + 1 = n(n+1)/2 = n^2/2 + n/2

For the time complexity we keep only the highest-degree term and drop the coefficients, so the time complexity is O(n^2).

This holds for the best, average, and worst cases.
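The pass structure above can be sketched in Python. This is a minimal bubble sort (not from the original post) that counts comparisons; note the exact count is (n-1) + (n-2) + ... + 1 = n(n-1)/2, which grows quadratically just like the n(n+1)/2 bound in the text.

```python
def bubble_sort(arr):
    """Naive bubble sort: each outer pass shrinks the unsorted region by one."""
    a = list(arr)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):          # n-1 passes in total
        for j in range(n - 1 - i):  # pass i compares n-1-i adjacent pairs
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

sorted_data, count = bubble_sort([5, 1, 4, 2, 8])
print(sorted_data)  # [1, 2, 4, 5, 8]
print(count)        # 4 + 3 + 2 + 1 = 10 comparisons for n = 5
```

The comparison count is the same regardless of the input order, which is why bubble sort (in this naive form) and selection sort are O(n^2) even in the best case.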

Insertion Sort - best case

For the average and worst cases, the calculation is the same as for bubble and selection sort.

But in the best case (an already sorted array), the inner loop checks only one element each time.

Hence the complexity becomes O(n * 1) = O(n)
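A quick sketch (an illustrative implementation, not from the original post) makes the best case visible: on sorted input the inner loop fails its first comparison every time, so each pass does constant work.

```python
def insertion_sort(arr):
    """Insertion sort with a comparison counter for the inner loop."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right. On an already sorted array this
        # loop breaks after a single comparison, giving O(1) work per pass.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

print(insertion_sort([1, 2, 3, 4, 5])[1])  # 4 comparisons: sorted input -> O(n)
print(insertion_sort([5, 4, 3, 2, 1])[1])  # 10 comparisons: reversed input -> O(n^2)
```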

Quick Sort and Merge Sort

In all cases of merge sort, and in the best and average cases of quick sort, we divide the array into two halves each time. In each layer of the resulting division tree we go through all n elements.

The height of that tree is log n.

So the merging of n elements happens log n times.

So the time complexity is O(n log n)
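The divide-and-merge structure can be sketched as follows (an illustrative Python version, not from the original post): each recursive call splits the array in half, giving log n levels, and the merge step at each level touches every element once.

```python
def merge_sort(arr):
    """Split into halves (log n levels), then merge n elements per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```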

Quick Sort worst case

In the worst case of quick sort the time complexity becomes O(n^2), because if we select the worst pivot every time, the first partition goes through n elements, the next through n-1, and so on down to 1, which makes the total runs:

n + (n-1) + ... + 1 = n(n+1)/2 = n^2/2 + n/2

So the time complexity here is O(n^2)
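This worst case is easy to trigger. In the sketch below (an illustrative quicksort using Lomuto partitioning, not from the original post), the pivot is always the last element, so feeding it an already sorted array makes every partition one-sided and the comparison count collapses to the quadratic sum above.

```python
def quick_sort(a, lo=0, hi=None, stats=None):
    """In-place quicksort, Lomuto partition, last element as pivot."""
    if hi is None:
        hi = len(a) - 1
    if stats is None:
        stats = {"comparisons": 0}
    if lo < hi:
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):          # the partition scans hi - lo elements
            stats["comparisons"] += 1
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]        # place the pivot in its final slot
        quick_sort(a, lo, i - 1, stats)
        quick_sort(a, i + 1, hi, stats)
    return stats["comparisons"]

# Sorted input + last-element pivot = worst case: every partition is one-sided.
print(quick_sort(list(range(50))))  # (n-1) + (n-2) + ... + 1 = 1225 comparisons
```

On shuffled input the same function does roughly n log n comparisons, which is why the worst case is rarely hit in practice when pivots are chosen randomly.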