Hello folks 👋, today we are implementing the Quicksort algorithm. Quicksort is an in-place sorting algorithm. Developed by British computer scientist Tony Hoare in 1959 and published in 1961, it is still a commonly used sorting algorithm. When implemented well, it can be somewhat faster than merge sort and about two or three times faster than heapsort.

QuickSort is a Divide and Conquer algorithm. It picks an element as a pivot and partitions the given array around the picked pivot. There are many different versions of quickSort that pick pivot in different ways.

- Always pick the first element as the pivot (implemented below).
- Always pick the last element as the pivot.
- Pick a random element as a pivot.
- Pick median as a pivot.

The key process in quicksort is the partition step. Given an array and an element x of the array as the pivot, the goal of partitioning is to put x at its correct position in the sorted array, with all elements smaller than x placed before it and all greater elements placed after it.

```
# start and end point to the first and last element of the array respectively
def partition(start, end, arr):
    # Initialize the pivot's index to start
    pivot_index = start
    pivot = arr[pivot_index]
    # This loop runs until the start pointer crosses the end pointer;
    # when it does, we swap the pivot with the element at the end pointer
    while start < end:
        # Increment the start pointer until it finds an element greater than the pivot
        while start < len(arr) and arr[start] <= pivot:
            start += 1
        # Decrement the end pointer until it finds an element less than or equal to the pivot
        while arr[end] > pivot:
            end -= 1
        # If start and end have not crossed each other, swap the two numbers
        if start < end:
            arr[start], arr[end] = arr[end], arr[start]
    # Swap the pivot element with the element at the end pointer;
    # this puts the pivot at its correct place
    arr[end], arr[pivot_index] = arr[pivot_index], arr[end]
    # Return the end pointer to divide arr into two parts
    return end

def quickSort(start, end, arr):
    if start < end:
        # p is the partition index; arr[p] is at its right place
        p = partition(start, end, arr)
        # Sort the elements before and after the partition
        quickSort(start, p - 1, arr)
        quickSort(p + 1, end, arr)

arr = [3, 12, 5, 2, 9, 1, 4]
quickSort(0, len(arr) - 1, arr)
print(arr)
# [1, 2, 3, 4, 5, 9, 12]
```
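The listing above picks the first element as the pivot. As a rough sketch of the last-element strategy from the list earlier (the names `lomuto_partition` and `quickSortLast` are mine, not from the original), a Lomuto-style partition could look like this:

```python
# Sketch of quicksort with a last-element (Lomuto) partition.
def lomuto_partition(start, end, arr):
    pivot = arr[end]  # last element is the pivot
    i = start         # boundary of the "<= pivot" region
    for j in range(start, end):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    # Place the pivot just after the smaller elements
    arr[i], arr[end] = arr[end], arr[i]
    return i

def quickSortLast(start, end, arr):
    if start < end:
        p = lomuto_partition(start, end, arr)
        quickSortLast(start, p - 1, arr)
        quickSortLast(p + 1, end, arr)

data = [3, 12, 5, 2, 9, 1, 4]
quickSortLast(0, len(data) - 1, data)
print(data)  # [1, 2, 3, 4, 5, 9, 12]
```

Either partition scheme returns the pivot's final index, so the recursive driver is the same in both versions.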

*Worst Case Time Complexity*:
The worst case occurs when the partition process always picks the greatest or smallest element as the pivot. With the strategy where the first (or last) element is always picked as the pivot, the worst case occurs when the array is already sorted in increasing or decreasing order.

In that case one side of every partition is empty, so the recurrence for the running time is

```
T(n) = T(0) + T(n-1) + O(n)
which is equivalent to
T(n) = T(n-1) + O(n)
```

where the O(n) term is the cost of the partition step.

Solving this recurrence gives the worst-case time complexity of quicksort: O(n^2).
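One way to see the quadratic behaviour is to count comparisons. The sketch below (the helper `comparisons` is hypothetical, not part of the listing above) sorts a copy of its input with a first-element-pivot quicksort and counts element comparisons; on an already-sorted input of n elements it performs exactly n(n-1)/2 comparisons, while a shuffled input costs far fewer:

```python
import random

def comparisons(values):
    # Count element comparisons made by a first-element-pivot quicksort
    # on a copy of `values` (illustrative helper, not in the listing above).
    arr = list(values)
    count = 0

    def sort(lo, hi):
        nonlocal count
        if lo >= hi:
            return
        pivot = arr[lo]
        i = lo + 1  # boundary of the "<= pivot" region
        for j in range(lo + 1, hi + 1):
            count += 1
            if arr[j] <= pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[lo], arr[i - 1] = arr[i - 1], arr[lo]  # pivot to its final place
        sort(lo, i - 2)
        sort(i, hi)

    sort(0, len(arr) - 1)
    return count

n = 200
sorted_cost = comparisons(range(n))  # already sorted: the worst case
random.seed(0)
shuffled = list(range(n))
random.shuffle(shuffled)
random_cost = comparisons(shuffled)  # a typical case
print(sorted_cost)  # 19900, i.e. n * (n - 1) / 2
print(random_cost)  # far fewer, on the order of n * log(n)
```

On the sorted input every partition strips off a single element, which is exactly the T(n) = T(n-1) + O(n) recurrence above.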

*Best case scenario*:
The best case occurs when the partitions are balanced, i.e. the sizes of the sublists on either side of the pivot are equal or differ by at most 1.

Case 1: The sublists on either side of the pivot are equal in size when the subarray has an odd number of elements and the pivot lands right in the middle after partitioning; each partition then has `(n-1)/2` elements.

Case 2: The sizes differ by 1 when the subarray has an even number of elements, n; one partition has `n/2` elements and the other has `(n/2)-1`.

In either case, each partition has at most `n/2` elements, and the best-case time complexity of quicksort is `O(n log n)`.
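Balanced partitions can be made the expected behaviour on any input by using the random-pivot strategy from the list at the top. A minimal, self-contained sketch (the name `randomized_quicksort` is mine): swap a randomly chosen element into the pivot position, then partition as usual, which gives an expected `O(n log n)` running time regardless of the input order:

```python
import random

def randomized_quicksort(start, end, arr):
    # Random-pivot quicksort (sketch): a random element is swapped
    # to the end, then a last-element (Lomuto) partition is applied.
    if start >= end:
        return
    r = random.randint(start, end)
    arr[r], arr[end] = arr[end], arr[r]
    pivot = arr[end]
    i = start
    for j in range(start, end):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[end] = arr[end], arr[i]  # pivot to its final place
    randomized_quicksort(start, i - 1, arr)
    randomized_quicksort(i + 1, end, arr)

data = [3, 12, 5, 2, 9, 1, 4]
randomized_quicksort(0, len(data) - 1, data)
print(data)  # [1, 2, 3, 4, 5, 9, 12]
```

With a random pivot, no particular input (such as an already-sorted array) can reliably trigger the worst case.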

If you like this article, let me know in the comments below or follow me on Twitter and Instagram