The problem of sorting is ideal for introducing many concepts of fundamental algorithm design. Some good reasons to study sorting are:

- sorting is used by many applications,
- sorting is the (first) step of many algorithms,
- many techniques can be illustrated by studying sorting.

For us the last reason is the most important: sorting allows us to apply the algorithmic techniques that we have seen for the searching problem to a more complex algorithmic problem. In the following I give an overview of the sorting algorithms discussed in the course. Before I do so, let me define the swap routine that is used by many of the sorting algorithms.

In [1]:

```
def swap(A, i, j):
    A[i], A[j] = A[j], A[i]
```
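
As a quick check, swap exchanges the entries at the two given positions in place:

```
A = [1, 2, 3]
swap(A, 0, 2)
print(A)   # [3, 2, 1]
```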

Selection Sort works in place: in round $i$ it finds the smallest element of the still unsorted part $A[i..n-1]$ and swaps it to position $i$.

In [9]:

```
def selection_sort(A):
    n = len(A)
    for i in range(n):
        # A[0..i-1] already contains the i smallest elements in sorted order
        smallest = i
        for j in range(i+1, n):
            if A[j] < A[smallest]: smallest = j
        swap(A, smallest, i)

A = [5,2,8,6]
selection_sort(A)
print(A)
```

Insertion Sort is an *incremental algorithm*. That means that it incrementally computes the solution for an increasing subset of the input. Specifically, it sorts the subarray $A[0..j-1]$ for increasing $j$. While Insertion Sort is slow asymptotically ($\Theta(n^2)$ comparisons in the worst case), it is fast for small and for nearly sorted inputs.

In [1]:

```
def insertion_sort(A):
    n = len(A)
    # at the beginning of the j-th iteration the subarray A[0..j-1] is sorted
    for j in range(1, n):
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i+1] = A[i]
            i = i - 1
        A[i+1] = key

A = [5,2,8,6]
insertion_sort(A)
print(A)
```
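
How much the running time depends on the input order can be made visible by counting the element moves of the inner while loop. The sketch below (my own instrumentation, not part of the course code) performs $0$ moves on a sorted array and $n(n-1)/2 = 45$ moves on a reversed array of length $10$:

```
def insertion_sort_moves(A):
    # insertion sort as above, additionally counting the moves
    # made by the inner while loop
    moves = 0
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i+1] = A[i]
            i = i - 1
            moves += 1
        A[i+1] = key
    return moves

print(insertion_sort_moves(list(range(10))))         # sorted input: 0
print(insertion_sort_moves(list(range(9, -1, -1))))  # reversed input: 45
```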

Merge Sort is a *divide&conquer* algorithm. A divide&conquer algorithm divides the problem into smaller subproblems, solves the subproblems recursively, and then combines the solutions to the subproblems.

In [4]:

```
def merge_sort(A):
    n = len(A)
    if n > 1:
        mid = n//2
        A1 = A[:mid]
        A2 = A[mid:]
        # recursive calls:
        merge_sort(A1)
        merge_sort(A2)
        # merge solutions:
        i = 0
        j = 0
        while i < mid and mid+j < n:
            if A1[i] < A2[j]:
                A[i+j] = A1[i]
                i += 1
            else:
                A[i+j] = A2[j]
                j += 1
        # copy the remaining elements of the half that is not yet exhausted
        while i < mid:
            A[i+j] = A1[i]
            i += 1
        while mid+j < n:
            A[i+j] = A2[j]
            j += 1

A = [5,2,8,6]
merge_sort(A)
print(A)
```
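
The merge step scans each half once, so it takes time linear in $n$. The running time of Merge Sort therefore satisfies the standard divide&conquer recurrence $T(n) = 2\,T(n/2) + O(n)$, which solves to $T(n) = O(n \log n)$.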

Quick Sort is also a *divide&conquer* algorithm. In contrast to Merge Sort, the main work of the algorithm happens before the recursive calls: the array is partitioned around a pivot, after which no work is needed to combine the solutions of the subproblems.

In [3]:

```
def partition(A, p, r):
    x = A[r]            # the last element serves as the pivot
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            swap(A, i, j)
    swap(A, i+1, r)     # move the pivot to its final position
    return i+1

def quick_sort(A, p=0, r=None):
    if r is None: r = len(A)-1
    if p < r:
        q = partition(A, p, r)
        quick_sort(A, p, q-1)
        quick_sort(A, q+1, r)

A = [5,2,8,6]
quick_sort(A)
print(A)
```
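
To see the work that happens before the recursive calls, one can run partition once on the example array: the pivot $6$ ends up at its final position, with the smaller elements to its left.

```
A = [5,2,8,6]
q = partition(A, 0, len(A)-1)
print(A, q)   # [5, 2, 6, 8] 2 -- the pivot 6 now sits at index 2
```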

Counting Sort does not compare elements at all. Assuming that the keys are integers in the range $0..k$, it counts for every key how many elements have a key that is at most as large; this count directly gives the position of an element in the sorted output.

In [18]:

```
def counting_sort(A, k):
    B = [0] * len(A)
    C = [0] * (k+1)
    for x in A:                # count the occurrences of each key
        C[x] += 1
    for i in range(1, k+1):    # now C[i] = number of elements <= i
        C[i] += C[i-1]
    for j in reversed(range(len(A))):
        B[C[A[j]]-1] = A[j]    # place A[j] at its final position in B
        C[A[j]] -= 1
    return B

A = [5,2,8,6]
print(counting_sort(A, 8))
```

The running time of Counting Sort is $O(n+k)$, where $n$ is the number of elements and $k$ is the largest key. Note that the last loop runs from back to front; this makes Counting Sort *stable*: elements with equal keys appear in the output in the same order as in the input.
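
To make the stability visible, here is a minimal sketch (my own variant of the code above) that sorts (key, payload) pairs by key; the two pairs with key $1$ keep their input order:

```
def counting_sort_pairs(A, k):
    # like counting_sort, but A contains (key, payload) pairs
    B = [None] * len(A)
    C = [0] * (k+1)
    for key, _ in A:
        C[key] += 1
    for i in range(1, k+1):
        C[i] += C[i-1]
    for j in reversed(range(len(A))):   # back to front: equal keys keep their order
        key = A[j][0]
        B[C[key]-1] = A[j]
        C[key] -= 1
    return B

A = [(1,'a'), (0,'b'), (1,'c')]
print(counting_sort_pairs(A, 1))   # [(0, 'b'), (1, 'a'), (1, 'c')]
```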

Radix Sort sorts numbers of $d$ digits by sorting them one digit at a time, from the least significant to the most significant digit, using a stable sorting algorithm (here: a variant of Counting Sort) for every digit.

In [38]:

```
def counting_sort_digit(A, base, div):
    # stable Counting Sort on the digit (x//div) % base
    k = base-1
    B = [0] * len(A)
    C = [0] * (k+1)
    for x in A:
        C[(x//div)%base] += 1
    for i in range(1, k+1):
        C[i] += C[i-1]
    for j in reversed(range(len(A))):
        index = (A[j]//div)%base
        B[C[index]-1] = A[j]
        C[index] -= 1
    return B

def radix_sort(A, base, d):
    div = 1
    for i in range(d):         # sort by the i-th digit, least significant first
        A = counting_sort_digit(A, base, div)
        div *= base
    return A

A = [15,26,18,16]
print(radix_sort(A, 10, 2))
```

The running time of Radix Sort is $O(d \cdot (n+\text{base}))$. Its correctness relies on the stability of Counting Sort: after the pass for the $i$-th digit, numbers that agree on that digit are still in the order established by the earlier passes, that is, sorted by the lower-order digits.
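
The stability can be watched at work by running the two passes of the example separately:

```
A = [15,26,18,16]
A = counting_sort_digit(A, 10, 1)    # sort by the last digit
print(A)   # [15, 26, 16, 18] -- 26 stays before 16 (both end in 6)
A = counting_sort_digit(A, 10, 10)   # sort stably by the first digit
print(A)   # [15, 16, 18, 26]
```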