Selection Sort
Steps
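A minimal in-place sketch of the steps (function name is my own; assumes a list of mutually comparable items):

```python
def selection_sort(a):
    """Repeatedly select the minimum of the unsorted suffix and swap it into place."""
    for i in range(len(a)):
        m = i  # index of the smallest element seen so far in a[i:]
        for j in range(i + 1, len(a)):
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]  # swap the minimum into position i
    return a
```

The outer loop always runs to completion regardless of input order, which is why best, average, and worst case are all the same.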
Complexity
Time
Best ⇔ Average ⇔ Worst: O(n²)
Space
O(1)
Bubble Sort
Steps
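A minimal sketch of the steps (function name is my own); the early-exit flag is what produces the O(n) best case on already-sorted input:

```python
def bubble_sort(a):
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # bubble the largest remaining element to the end of a[:n-i]
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break  # no swaps means the list is sorted: O(n) best case
    return a
```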
Complexity
Time
Best: O(n) ♦ Average: O(n²) ♦ Worst: O(n²)
Space
O(1)
Insertion Sort (Pairwise Swaps)
Steps
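A minimal sketch of the pairwise-swap variant (function name is my own): each new element is swapped leftward one position at a time until it reaches its spot.

```python
def insertion_sort_swaps(a):
    for i in range(1, len(a)):
        j = i
        # swap a[j] leftward while it is smaller than its left neighbor
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a
```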
Complexity
Time
Best: O(n) ♦ Average: O(n²) ♦ Worst: O(n²)
Space
O(1)
Insertion Sort (Shifting)
Steps
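A minimal sketch of the shifting variant (function name is my own): instead of a full swap per step, larger elements are shifted right with one write each, and the saved key is written once at the end.

```python
def insertion_sort_shift(a):
    for i in range(1, len(a)):
        key = a[i]  # save the element being inserted
        j = i - 1
        # shift larger elements one slot to the right (one write per step)
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key  # single write of the saved element
    return a
```

Same asymptotic complexity as the swap variant, but roughly one write per comparison instead of three.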
Complexity
Time
Best: O(n) ♦ Average: O(n²) ♦ Worst: O(n²)
Space
O(1)
Shellsort
Steps
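A minimal sketch (function name is my own), using Knuth's gap sequence as assumed by the bounds below: a gapped insertion sort is run for each gap, ending with gap 1.

```python
def shellsort(a):
    n = len(a)
    # Knuth's gap sequence: 1, 4, 13, 40, ... (h = 3h + 1)
    gap = 1
    while gap < n // 3:
        gap = 3 * gap + 1
    while gap >= 1:
        # gapped insertion sort: elements gap apart form sorted subsequences
        for i in range(gap, n):
            j = i
            while j >= gap and a[j - gap] > a[j]:
                a[j - gap], a[j] = a[j], a[j - gap]
                j -= gap
        gap //= 3  # step down Knuth's sequence; final pass is gap = 1
    return a
```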
Complexity
Time
Best: O(n log n) ♦ Worst: O(n^(3/2))
Space
O(1)
*Exact analysis is still an open topic
*Complexity depends on the gap sequence used; the bounds above assume Knuth's sequence
Mergesort
Steps
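A minimal top-down sketch (function name is my own): split in half, sort each half recursively, then merge by repeatedly taking the smaller head element.

```python
def mergesort(a):
    if len(a) <= 1:
        return a  # base case: a list of 0 or 1 elements is sorted
    mid = len(a) // 2
    left, right = mergesort(a[:mid]), mergesort(a[mid:])
    # merge: repeatedly take the smaller head element (<= keeps the sort stable)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # append whichever half remains
```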
Complexity
Time
Best ⇔ Average ⇔ Worst: O(n log n)
Space
O(n)
*Stack space is O(log n) (the depth of the recursion tree)
*The merge operation requires an O(n) buffer, allocated during the merge itself rather than held on the call stack, so total space is O(n)
Quicksort
Steps
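A minimal in-place sketch (function and helper names are my own), using a Lomuto-style partition with the last element as pivot — one common choice, not the only one:

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort; partition, then recurse on each side of the pivot."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = _partition(a, lo, hi)
        quicksort(a, lo, p - 1)   # sort elements left of the pivot
        quicksort(a, p + 1, hi)   # sort elements right of the pivot
    return a

def _partition(a, lo, hi):
    """Lomuto scheme: pivot = a[hi]; returns the pivot's final index."""
    pivot = a[hi]
    i = lo  # boundary: a[lo..i-1] holds elements <= pivot
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]  # place the pivot in its final position
    return i
```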
Complexity
Time
Best: O(n log n) ♦ Average: O(n log n) ♦ Worst: O(n²)
Space
Best ⇔ Average: O(log n) ♦ Worst: O(n)
*The O(n²) worst case occurs when the array is sorted in reverse order and the partition operation repeatedly splits off a single element at one end
*Partitioning itself requires constant (i.e., no) auxiliary space (it is an in-place operation); the space figures above are recursion-stack depth
*Stack depth is O(log n) in the best and average cases; in the worst case each recursive call's array is only 1 smaller than the previous, so the height of the tree is n (instead of log n)
Partition
Steps
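A minimal sketch of one common partition scheme (Lomuto, pivot at the end; Hoare's scheme is an alternative). It rearranges `a[lo..hi]` so everything ≤ pivot precedes the pivot, and returns the pivot's final index:

```python
def partition(a, lo, hi):
    """Lomuto partition scheme: pivot = a[hi]."""
    pivot = a[hi]
    i = lo  # invariant: a[lo..i-1] holds elements <= pivot
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]  # grow the <= pivot region
            i += 1
    a[i], a[hi] = a[hi], a[i]  # move the pivot between the two regions
    return i
```

One linear scan, constant auxiliary space — this is what makes quicksort's partitioning step in-place.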
Heapsort
Steps
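A minimal in-place sketch (function names are my own): build a max-heap bottom-up, then repeatedly swap the root (maximum) to the end and sift down to restore the heap.

```python
def heapsort(a):
    n = len(a)

    def sift_down(root, end):
        # push a[root] down until the max-heap property holds in a[root..end]
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child + 1] > a[child]:
                child += 1  # pick the larger of the two children
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    # build a max-heap bottom-up (O(n) overall)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n - 1)
    # repeatedly move the current maximum to the end, then restore the heap
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)
    return a
```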
Complexity
Time
Best ⇔ Average ⇔ Worst: O(n log n)
Space
O(1)
Bucket Sort
Steps
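A minimal sketch (function name is my own), assuming the classic setup of floats uniformly distributed in [0, 1) scattered into k buckets, with insertion sort inside each bucket — which is where the O(n²) worst case comes from:

```python
def bucket_sort(a, k=10):
    """Assumes values are floats in [0, 1); k is the number of buckets."""
    buckets = [[] for _ in range(k)]
    for x in a:
        buckets[int(x * k)].append(x)  # scatter: bucket index from the value
    out = []
    for b in buckets:
        # insertion sort within each bucket (source of the O(n^2) worst case)
        for i in range(1, len(b)):
            key, j = b[i], i - 1
            while j >= 0 and b[j] > key:
                b[j + 1] = b[j]
                j -= 1
            b[j + 1] = key
        out.extend(b)  # gather buckets in order
    return out
```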
Complexity
Time
Best: O(n + k) ♦ Average: O(n + k) ♦ Worst: O(n²)
Space
O(n + k)
*k is the number of buckets
*The O(n²) worst case occurs when all elements land in one bucket, so the per-bucket insertion sort dominates
Counting Sort
Steps
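A minimal sketch of the stable variant (function name is my own), for non-negative integers in the range [0, k): count occurrences, convert counts to starting positions via a prefix sum, then place each element.

```python
def counting_sort(a, k):
    """Stable counting sort for integers in [0, k)."""
    count = [0] * k
    for x in a:
        count[x] += 1  # histogram of key values
    # exclusive prefix sum: count[v] becomes the first output index for value v
    total = 0
    for v in range(k):
        count[v], total = total, total + count[v]
    out = [0] * len(a)
    for x in a:  # left-to-right placement keeps equal keys in order (stable)
        out[count[x]] = x
        count[x] += 1
    return out
```

Stability matters because radix sort (below in these notes) relies on a stable per-digit sort.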
Complexity
Time
Best ⇔ Average ⇔ Worst: O(n + k)
Space
O(n + k)
*k is the range of key values
Radix Sort
Steps
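A minimal LSD (least-significant-digit-first) sketch for non-negative integers (function name is my own), using a stable counting sort per digit as the complexity notes below assume:

```python
def radix_sort(a, base=10):
    """LSD radix sort; base is the size of the digit alphabet (k)."""
    if not a:
        return a
    exp = 1  # current digit's place value: 1, base, base^2, ...
    while max(a) // exp > 0:
        # stable counting sort keyed on digit (x // exp) % base
        count = [0] * base
        for x in a:
            count[(x // exp) % base] += 1
        for d in range(1, base):
            count[d] += count[d - 1]  # inclusive prefix sum of digit counts
        out = [0] * len(a)
        for x in reversed(a):  # reverse traversal preserves stability
            d = (x // exp) % base
            count[d] -= 1
            out[count[d]] = x
        a = out
        exp *= base
    return a
```

Each pass is O(n + k), and there are w passes (one per digit), giving O(w(n + k)).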
Complexity
Time
Best: O(n) ♦ Average: O(w(n + k)) ♦ Worst: O(w(n + k))
Space
O(n + k)
*Complexity assumes counting sort
*The linear best case requires some analysis; it comes from minimizing the running time by choosing the best value of k
*w is the length of the word; k is the size of the digit "alphabet" (the radix)