Chapter 9

Sorting Algorithms

Master fundamental and advanced sorting techniques from bubble sort to radix sort. Learn algorithm properties like stability and in-place behavior, understand complexity tradeoffs, and know exactly which sort to use in any interview or production scenario.

13 Sections · 10+ Algorithms · 50+ Code Examples · 10 Practice Problems
9.1

Sorting Overview

Sorting is the process of arranging data in a specified order. It's fundamental to computer science: it enables binary search, simplifies numerous algorithmic problems, and is often the bottleneck in performance-critical systems. Understanding when to sort—and which algorithm to use—is essential for any software engineer.

Why Sorting Matters

Sorting unlocks efficiency. An unsorted array requires O(n) for linear search; a sorted array enables O(log n) binary search. Many problems become trivial once data is ordered: finding medians, detecting duplicates, merging datasets, and computing ranges all become simpler. In interviews, candidates who recognize when sorting solves a problem stand out.
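The payoff is easy to see with the framework's own methods—a minimal sketch using Array.Sort (O(n log n) once) followed by Array.BinarySearch (O(log n) per lookup):

```csharp
using System;

static class SearchDemo {
    // Sort once, then binary-search: the classic amortized win over linear scans.
    public static int SortedIndexOf(int[] data, int target) {
        Array.Sort(data);                        // O(n log n), done once
        return Array.BinarySearch(data, target); // O(log n) per lookup
    }

    static void Main() {
        int[] data = { 42, 7, 19, 3, 88, 7 };
        // Sorted: [3, 7, 7, 19, 42, 88] — 88 lands at index 5
        Console.WriteLine(SortedIndexOf(data, 88)); // 5
    }
}
```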

Comparison vs Non-Comparison Sorts

Comparison Sorts
Bubble, Selection, Insertion, Merge, Quick, Heap
Lower bound: Ω(n log n), proven by a decision-tree (information-theoretic) argument
Can sort any comparable type
More memory-efficient for small datasets
Non-Comparison Sorts
Counting, Radix, Bucket
Can achieve O(n) under specific constraints
Require knowledge of data distribution
Excellent for integers, strings with known ranges

Stability in Sorting

A sort is stable if elements with equal keys maintain their relative order. Example: sorting [("Alice", 25), ("Bob", 30), ("Charlie", 25)] by age—stable sorts keep Alice before Charlie; unstable sorts might not.

ℹ️
When stability matters: Multi-key sorting (sort by age, then keep original order for same age). Sorting objects where you need to preserve insertion order for equal elements. When combining sorts (primary key by price, secondary by rating).
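A quick way to see stability in action is LINQ's OrderBy, which is documented as a stable sort (the data below mirrors the Alice/Bob/Charlie example):

```csharp
using System;
using System.Linq;

static class StabilityDemo {
    // Sorts (name, age) pairs by age with LINQ's OrderBy,
    // which the .NET docs guarantee to be stable.
    public static string[] NamesByAge((string Name, int Age)[] people) =>
        people.OrderBy(p => p.Age).Select(p => p.Name).ToArray();

    static void Main() {
        var people = new[] { ("Alice", 25), ("Bob", 30), ("Charlie", 25) };
        // Alice stays ahead of Charlie: both are 25, and OrderBy is stable.
        Console.WriteLine(string.Join(", ", NamesByAge(people)));
        // Alice, Charlie, Bob
    }
}
```

Array.Sort, by contrast, uses introsort and makes no stability guarantee, so equal-key order may change for larger inputs.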

In-Place vs Out-of-Place

In-place: Uses O(1) or O(log n) extra space (quick sort, heap sort). Out-of-place: Uses O(n) extra space (merge sort, counting sort). In-place is preferable for memory-constrained systems; out-of-place often enables stability and parallelization.

Master Comparison Table

Algorithm       Best        Average     Worst       Space     Stable  In-Place
Bubble Sort     O(n)        O(n²)       O(n²)       O(1)      Yes     Yes
Selection Sort  O(n²)       O(n²)       O(n²)       O(1)      No      Yes
Insertion Sort  O(n)        O(n²)       O(n²)       O(1)      Yes     Yes
Merge Sort      O(n log n)  O(n log n)  O(n log n)  O(n)      Yes     No
Quick Sort      O(n log n)  O(n log n)  O(n²)       O(log n)  No      Yes
Heap Sort       O(n log n)  O(n log n)  O(n log n)  O(1)      No      Yes
Counting Sort   O(n+k)      O(n+k)      O(n+k)      O(k)      Yes     No
Radix Sort      O(nk)       O(nk)       O(nk)       O(n+k)    Yes     No
Bucket Sort     O(n+k)      O(n+k)      O(n²)       O(n+k)    Yes     No
9.2

Bubble Sort

The simplest sorting algorithm. Repeatedly scan the array, swapping adjacent elements if they're in the wrong order. Largest values "bubble" to the end. Useful primarily for teaching or tiny datasets where simplicity trumps speed.

Algorithm Walkthrough

Bubble sort makes multiple passes through the array. On each pass, it compares adjacent pairs and swaps if the left element is greater. After pass i, the last i elements are in final position. Early termination optimization: if a pass makes no swaps, the array is sorted.

1. Pass 1: Compare arr[0] and arr[1], swap if needed. Continue to end. Largest element is now at final position.
2. Pass 2: Repeat but ignore last element. Second-largest element finds final position.
3. Repeat: Continue until no swaps occur or array is sorted.

Full C# Implementation

Bubble Sort (Optimized)
With early termination flag for nearly sorted data.
Best: O(n) · Average: O(n²) · Worst: O(n²) · Space: O(1)
C#
public static void BubbleSort(int[] arr) {
    int n = arr.Length;
    for (int i = 0; i < n - 1; i++) {
        bool swapped = false;
        // Last i elements already in place
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                // Swap
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                swapped = true;
            }
        }
        // If no swaps, array is sorted
        if (!swapped) break;
    }
}
When to Use Bubble Sort
Only for n < 50 with simplicity as priority. Nearly sorted arrays with early termination. Educational purposes. Never in production for unsorted data.
9.3

Selection Sort

Find the minimum element in unsorted portion and place it at the beginning. Repeat. Always O(n²) regardless of input, but minimizes memory writes—useful when writes are expensive.

Algorithm: Step-by-Step

1. Find minimum: Search unsorted portion (index i to n-1) for smallest element.
2. Swap: Place minimum at position i. Now arr[0..i] is sorted with minimum values.
3. Repeat: Move i forward and repeat until entire array sorted.

C# Implementation

C#
public static void SelectionSort(int[] arr) {
    int n = arr.Length;
    for (int i = 0; i < n - 1; i++) {
        int minIdx = i;
        // Find minimum in remaining array
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[minIdx]) {
                minIdx = j;
            }
        }
        // Swap minimum to position i
        int temp = arr[i];
        arr[i] = arr[minIdx];
        arr[minIdx] = temp;
    }
}

Why Always O(n²)?

Even if the array is already sorted, selection sort must scan the remaining unsorted portion to confirm there's no smaller element. First scan: n-1 comparisons. Second: n-2. Total: (n-1) + (n-2) + ... + 1 = n(n-1)/2 = Θ(n²).
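The count can be checked empirically. A small sketch instrumenting this section's selection sort with a comparison counter—even on already-sorted input the count is exactly n(n-1)/2:

```csharp
using System;

static class SelectionCount {
    // The chapter's selection sort, instrumented with a comparison counter.
    public static long SortAndCount(int[] arr) {
        long comparisons = 0;
        for (int i = 0; i < arr.Length - 1; i++) {
            int minIdx = i;
            for (int j = i + 1; j < arr.Length; j++) {
                comparisons++;                        // one comparison per inner step
                if (arr[j] < arr[minIdx]) minIdx = j;
            }
            int temp = arr[i]; arr[i] = arr[minIdx]; arr[minIdx] = temp;
        }
        return comparisons;
    }

    static void Main() {
        int n = 100;
        var alreadySorted = new int[n];
        for (int i = 0; i < n; i++) alreadySorted[i] = i;

        // Even on sorted input: exactly n(n-1)/2 = 4950 comparisons
        Console.WriteLine(SortAndCount(alreadySorted) == (long)n * (n - 1) / 2); // True
    }
}
```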

Selection Sort Pros
Minimal memory writes (at most n swaps)
Deterministic O(n²) regardless of input
Simple implementation
Selection Sort Cons
Always O(n²)—no best case improvement
Not stable (same-key elements may change order)
Poor cache locality
9.4

Insertion Sort

Build sorted array one element at a time. Excellent for nearly sorted data, small arrays, and as the base case in hybrid sorts like Timsort. Best case O(n) when array is already sorted.

Algorithm: Build Left to Right

Maintain a sorted region on the left. For each new element, insert it into correct position within the sorted region by shifting larger elements right.

1. Start with element 1: Elements 0..0 form sorted region.
2. For each element i from 1 to n-1: Find correct position in sorted region 0..i-1.
3. Shift and insert: Shift larger elements right, insert element at correct position.

Full C# Implementation

Insertion Sort
Stable, in-place, excellent for nearly sorted data.
Best: O(n) · Average: O(n²) · Worst: O(n²) · Space: O(1)
C#
public static void InsertionSort(int[] arr) {
    for (int i = 1; i < arr.Length; i++) {
        int key = arr[i];
        int j = i - 1;
        // Shift larger elements right
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j--;
        }
        // Insert key at correct position
        arr[j + 1] = key;
    }
}

Binary Insertion Sort Variant

Use binary search to find insertion position—reduces comparisons from O(n²) to O(n log n). However, shifts still cost O(n), so overall remains O(n²). Useful when comparisons are expensive but shifts are cheap.

C#
public static void BinaryInsertionSort(int[] arr) {
    for (int i = 1; i < arr.Length; i++) {
        int key = arr[i];
        // Binary search for position
        int pos = BinarySearch(arr, 0, i - 1, key);
        // Shift and insert
        for (int j = i - 1; j >= pos; j--) {
            arr[j + 1] = arr[j];
        }
        arr[pos] = key;
    }
}

private static int BinarySearch(int[] arr, int left, int right, int key) {
    // Upper-bound search: insert after equal elements to keep the sort stable
    while (left <= right) {
        int mid = left + (right - left) / 2;
        if (arr[mid] <= key) left = mid + 1;
        else right = mid - 1;
    }
    return left;
}

Role in Timsort

Python's built-in sort (Timsort) and Java's Arrays.sort() for object arrays use insertion sort as the base case for small subarrays (typically runs of 32-64 elements). Why? Insertion sort has low overhead, excellent cache behavior on small arrays, and an O(n) best case for nearly sorted data—common within larger sorted runs.

Use Insertion Sort When
Data nearly sorted. Array size < 50. Hybrid sort's base case. Stable sort needed and memory constrained. Comparisons expensive, shifts cheap.
9.5

Merge Sort

Divide-and-conquer algorithm with guaranteed O(n log n) time in all cases. Stable sort that requires O(n) extra space. Excellent for linked lists and external sorting (sorting data larger than memory).

Algorithm: Divide & Conquer

1. Divide: Split array at midpoint. Recursively sort left half and right half.
2. Conquer: Base case: arrays of size 1 are trivially sorted.
3. Merge: Combine two sorted halves into single sorted array in O(n) time.

Recursive Merge Sort Implementation

Merge Sort (Top-Down)
Classic recursive divide-and-conquer implementation.
Time (all cases): O(n log n) · Space: O(n) · Stable: Yes · In-Place: No
C#
public static void MergeSort(int[] arr, int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        MergeSort(arr, left, mid);
        MergeSort(arr, mid + 1, right);
        Merge(arr, left, mid, right);
    }
}

private static void Merge(int[] arr, int left, int mid, int right) {
    // Copy to temp arrays
    int[] leftArr = new int[mid - left + 1];
    int[] rightArr = new int[right - mid];

    for (int i = 0; i < leftArr.Length; i++)
        leftArr[i] = arr[left + i];
    for (int i = 0; i < rightArr.Length; i++)
        rightArr[i] = arr[mid + 1 + i];

    // Merge back
    int l = 0, r = 0, k = left;
    while (l < leftArr.Length && r < rightArr.Length) {
        if (leftArr[l] <= rightArr[r]) {
            arr[k++] = leftArr[l++];
        } else {
            arr[k++] = rightArr[r++];
        }
    }
    // Copy remaining elements
    while (l < leftArr.Length)
        arr[k++] = leftArr[l++];
    while (r < rightArr.Length)
        arr[k++] = rightArr[r++];
}

Bottom-Up (Iterative) Merge Sort

Build sorted array bottom-up instead of recursively. Start with size 1, merge to size 2, then 4, 8, etc. Avoids recursion overhead and is more cache-friendly.

C#
public static void BottomUpMergeSort(int[] arr) {
    int n = arr.Length;

    // Start with merge subarrays of size 1, then 2, 4, 8...
    for (int currSize = 1; currSize < n; currSize *= 2) {
        for (int leftStart = 0; leftStart < n - 1; leftStart += currSize * 2) {
            int mid = Math.Min(leftStart + currSize - 1, n - 1);
            int rightEnd = Math.Min(leftStart + currSize * 2 - 1, n - 1);
            Merge(arr, leftStart, mid, rightEnd);
        }
    }
}

Merge Sort for Linked Lists

Merge sort is ideal for linked lists (unlike quicksort which needs random access). It maintains O(n log n) time and O(log n) space (recursion stack) without needing contiguous memory.

C#
public class ListNode {
    public int val;
    public ListNode next;
}

public ListNode MergeSortList(ListNode head) {
    if (head == null || head.next == null) return head;

    // Find middle using slow/fast pointers
    ListNode slow = head, fast = head; // fast starting at head.next breaks 2-node lists
    ListNode prev = null;
    while (fast != null && fast.next != null) {
        prev = slow;
        slow = slow.next;
        fast = fast.next.next;
    }

    if (prev != null) prev.next = null;

    ListNode left = MergeSortList(head);
    ListNode right = MergeSortList(slow);

    return MergeLists(left, right);
}

private ListNode MergeLists(ListNode l1, ListNode l2) {
    ListNode dummy = new ListNode { val = 0 };
    ListNode curr = dummy;
    while (l1 != null && l2 != null) {
        if (l1.val < l2.val) {
            curr.next = l1;
            l1 = l1.next;
        } else {
            curr.next = l2;
            l2 = l2.next;
        }
        curr = curr.next;
    }
    curr.next = l1 ?? l2;
    return dummy.next;
}

Counting Inversions Using Merge Sort

An inversion is a pair (i, j) where i < j but arr[i] > arr[j]. Count inversions by augmenting merge sort: when merging, if an element from the right subarray is chosen before one from the left, that's an inversion.

C#
public static long CountInversions(int[] arr) {
    return MergeSortCount(arr, 0, arr.Length - 1);
}

private static long MergeSortCount(int[] arr, int left, int right) {
    long invCount = 0;
    if (left < right) {
        int mid = left + (right - left) / 2;
        invCount += MergeSortCount(arr, left, mid);
        invCount += MergeSortCount(arr, mid + 1, right);
        invCount += MergeCount(arr, left, mid, right);
    }
    return invCount;
}

private static long MergeCount(int[] arr, int left, int mid, int right) {
    int[] leftArr = new int[mid - left + 1];
    int[] rightArr = new int[right - mid];
    long invCount = 0;

    for (int i = 0; i < leftArr.Length; i++)
        leftArr[i] = arr[left + i];
    for (int i = 0; i < rightArr.Length; i++)
        rightArr[i] = arr[mid + 1 + i];

    int l = 0, r = 0, k = left;
    while (l < leftArr.Length && r < rightArr.Length) {
        if (leftArr[l] <= rightArr[r]) {
            arr[k++] = leftArr[l++];
        } else {
            arr[k++] = rightArr[r++];
            invCount += (leftArr.Length - l);
        }
    }
    while (l < leftArr.Length)
        arr[k++] = leftArr[l++];
    while (r < rightArr.Length)
        arr[k++] = rightArr[r++];

    return invCount;
}
ℹ️
Merge Sort Guarantee: Θ(n log n) worst-case is unmatched by quicksort. When you need guaranteed performance (real-time systems, worst-case SLAs), merge sort is the answer.
9.6

Quick Sort

Fastest average-case sort (O(n log n)) and most cache-friendly. Based on partitioning around a pivot. Worst case O(n²) but rarely happens in practice with good pivot selection. Preferred in production (built into most languages).

Partitioning: Lomuto Scheme

Pick the last element as pivot. Partition the array so all elements less than the pivot are on the left and the rest on the right. Return the pivot's final position. Time O(n).

1. Initialize i: Start at low - 1 (i tracks the end of the less-than-pivot region).
2. Scan j from low to high - 1: If arr[j] < pivot, increment i, then swap arr[i] with arr[j].
3. Place pivot: Swap pivot with arr[i + 1]. Return i + 1 as pivot's final position.

Quick Sort with Lomuto Partition

Quick Sort (Lomuto)
Simple partition scheme. Pivot is last element.
Average: O(n log n) · Worst: O(n²) · Space: O(log n) · Stable: No
C#
public static void QuickSort(int[] arr) {
    QuickSortHelper(arr, 0, arr.Length - 1);
}

private static void QuickSortHelper(int[] arr, int low, int high) {
    if (low < high) {
        int pi = Partition(arr, low, high);
        QuickSortHelper(arr, low, pi - 1);
        QuickSortHelper(arr, pi + 1, high);
    }
}

private static int Partition(int[] arr, int low, int high) {
    int pivot = arr[high];
    int i = low - 1;

    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {
            i++;
            // Swap arr[i] and arr[j]
            int temp = arr[i];
            arr[i] = arr[j];
            arr[j] = temp;
        }
    }
    // Place pivot at its correct position
    int pivotTemp = arr[i + 1];
    arr[i + 1] = arr[high];
    arr[high] = pivotTemp;

    return i + 1;
}

Hoare Partition Scheme

More efficient than Lomuto. Two pointers start at opposite ends and move toward the center. Swap when the left pointer stops at an element ≥ pivot and the right pointer stops at one ≤ pivot. Performs roughly three times fewer swaps than Lomuto on average. Note: Hoare returns a split point, not the pivot's final index—recurse on (low, j) and (j + 1, high).

C#
private static int HoarePartition(int[] arr, int low, int high) {
    int pivot = arr[low];
    int i = low - 1;
    int j = high + 1;

    while (true) {
        // Advance left pointer to an element >= pivot
        do { i++; } while (arr[i] < pivot);
        // Retreat right pointer to an element <= pivot
        do { j--; } while (arr[j] > pivot);

        // Swap if pointers haven't crossed
        if (i < j) {
            int temp = arr[i];
            arr[i] = arr[j];
            arr[j] = temp;
        } else {
            return j;
        }
    }
}

Randomized Pivot Selection

Worst case O(n²) occurs when the pivot is always the smallest or largest element (e.g., a sorted array with a first- or last-element pivot). Solution: pick a random element as pivot. Guarantees O(n log n) with high probability regardless of input.

C#
private static Random rand = new Random();

private static int RandomizedPartition(int[] arr, int low, int high) {
    // Pick random index and swap with high
    int randomIdx = low + rand.Next(high - low + 1);
    int temp = arr[randomIdx];
    arr[randomIdx] = arr[high];
    arr[high] = temp;

    // Continue with standard Lomuto partition
    return Partition(arr, low, high);
}

3-Way Partition for Duplicates

If array has many duplicates, standard partitioning wastes time. 3-way partition (Dutch National Flag) divides array into < pivot, = pivot, > pivot. Elements equal to pivot are already in final position—no recursion needed.

C#
private static void QuickSortThreeWay(int[] arr, int low, int high) {
    if (low < high) {
        int[] indices = PartitionThreeWay(arr, low, high);
        QuickSortThreeWay(arr, low, indices[0] - 1);
        QuickSortThreeWay(arr, indices[1] + 1, high);
    }
}

private static int[] PartitionThreeWay(int[] arr, int low, int high) {
    int pivot = arr[high];
    int lt = low;   // next slot for an element < pivot
    int gt = high;  // boundary of the > pivot region
    int i = low;

    while (i <= gt) {
        if (arr[i] < pivot) {
            // Grow the < region
            int temp = arr[i];
            arr[i] = arr[lt];
            arr[lt] = temp;
            lt++; i++;
        } else if (arr[i] > pivot) {
            // Move to the > region; re-examine the element swapped in
            int temp = arr[i];
            arr[i] = arr[gt];
            arr[gt] = temp;
            gt--;
        } else {
            i++; // equal to pivot: already in the middle region
        }
    }

    // arr[lt..gt] now holds all elements equal to the pivot
    return new int[] { lt, gt }; // { first equal, last equal }
}

Quick Select: Finding Kth Smallest Element

Quick select uses the partitioning logic from quicksort but without recursing on both sides. After partitioning, if pivot's position equals k, we found it. Otherwise, recurse only on relevant side. Average O(n).

C#
public static int QuickSelect(int[] arr, int k) {
    // k is 1-indexed; convert to a 0-indexed array position
    return QuickSelectHelper(arr, 0, arr.Length - 1, k - 1);
}

private static int QuickSelectHelper(int[] arr, int low, int high, int k) {
    if (low == high) return arr[low];

    int pi = Partition(arr, low, high);

    if (k == pi) {
        return arr[k];
    } else if (k < pi) {
        return QuickSelectHelper(arr, low, pi - 1, k);
    } else {
        return QuickSelectHelper(arr, pi + 1, high, k);
    }
}
⚠️
Worst Case Warning: Sorted array + fixed pivot (e.g., always the first or last element) = O(n²). Always use a randomized pivot or median-of-three in production code.
9.7

Heap Sort

Guaranteed O(n log n) in all cases, in-place, unstable. Build max-heap from array, repeatedly extract max element and place at end. Not cache-friendly despite asymptotic optimality—quicksort usually faster in practice.

Heapify Process

Transform unsorted array into valid max-heap (parent ≥ children). Start from last non-leaf node, sift down each element to maintain heap property.

Heap Sort Implementation

Heap Sort
Build heap, extract max repeatedly.
Time (all cases): O(n log n) · Space: O(1) · Stable: No · In-Place: Yes
C#
public static void HeapSort(int[] arr) {
    int n = arr.Length;

    // Build max heap in O(n)
    for (int i = n / 2 - 1; i >= 0; i--) {
        Heapify(arr, n, i);
    }

    // Extract elements from heap one by one
    for (int i = n - 1; i > 0; i--) {
        // Swap root (max) with last element
        int temp = arr[0];
        arr[0] = arr[i];
        arr[i] = temp;

        // Heapify reduced heap
        Heapify(arr, i, 0);
    }
}

private static void Heapify(int[] arr, int n, int i) {
    int largest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;

    // Find largest among node, left child, right child
    if (left < n && arr[left] > arr[largest]) {
        largest = left;
    }
    if (right < n && arr[right] > arr[largest]) {
        largest = right;
    }

    // If largest not root, swap and recurse
    if (largest != i) {
        int temp = arr[i];
        arr[i] = arr[largest];
        arr[largest] = temp;
        Heapify(arr, n, largest);
    }
}

Why Heap Sort Isn't Cache-Friendly

Heap sort accesses memory in scattered patterns (jumping between parent and child indices). Unlike merge sort or quicksort which process sequential memory, heap sort misses the cache frequently, leading to poor real-world performance despite O(n log n) guarantee.

Use Heap Sort When
You need guaranteed O(n log n) worst case with O(1) space. Memory extremely constrained. But prefer quicksort or introsort for typical scenarios.
9.8

Counting Sort & Radix Sort

Non-comparison sorts that break the O(n log n) lower bound. Counting sort works for small integer ranges and runs in O(n+k), where k is the value range. Radix sort processes integers digit-by-digit using counting sort and runs in O(d·(n+b)), where d is the digit count and b the base.

Counting Sort

Assumes integers in range [0, k]. Count frequency of each value, reconstruct sorted array. Stable if we iterate counts backward. Space O(k).

C#
public static int[] CountingSort(int[] arr, int maxVal) {
    int[] count = new int[maxVal + 1];

    // Count frequencies
    foreach (int num in arr) {
        count[num]++;
    }

    // Reconstruct sorted array
    int[] result = new int[arr.Length];
    int idx = 0;
    for (int i = 0; i <= maxVal; i++) {
        for (int j = 0; j < count[i]; j++) {
            result[idx++] = i;
        }
    }

    return result;
}

// Stable version using cumulative counts
public static int[] CountingSortStable(int[] arr, int maxVal) {
    int[] count = new int[maxVal + 1];
    int[] result = new int[arr.Length];

    // Count frequencies
    foreach (int num in arr) {
        count[num]++;
    }

    // Convert to cumulative counts
    for (int i = 1; i <= maxVal; i++) {
        count[i] += count[i - 1];
    }

    // Iterate backwards to maintain stability
    for (int i = arr.Length - 1; i >= 0; i--) {
        result[count[arr[i]] - 1] = arr[i];
        count[arr[i]]--;
    }

    return result;
}

Radix Sort: Least Significant Digit (LSD)

Sort by rightmost digit first using counting sort, then second-rightmost, etc. Each digit pass is stable, so earlier digits' ordering is preserved. Time O(d·n) where d is number of digits.

C#
public static void RadixSortLSD(int[] arr) {
    if (arr.Length == 0) return;

    // Find the maximum to know how many digit passes are needed
    int maxNum = arr[0];
    foreach (int num in arr) {
        if (num > maxNum) maxNum = num;
    }

    // One stable counting-sort pass per digit position
    for (int exp = 1; maxNum / exp > 0; exp *= 10) {
        CountingSortByDigit(arr, exp);
    }
}

private static void CountingSortByDigit(int[] arr, int exp) {
    int[] count = new int[10];
    int[] result = new int[arr.Length];

    // Count by current digit
    foreach (int num in arr) {
        count[(num / exp) % 10]++;
    }

    // Cumulative count
    for (int i = 1; i < 10; i++) {
        count[i] += count[i - 1];
    }

    // Build result (iterate backwards for stability)
    for (int i = arr.Length - 1; i >= 0; i--) {
        int digit = (arr[i] / exp) % 10;
        result[count[digit] - 1] = arr[i];
        count[digit]--;
    }

    // Copy result back
    for (int i = 0; i < arr.Length; i++) {
        arr[i] = result[i];
    }
}
ℹ️
Radix Sort Best For: Integers with known max value. Strings with known alphabet size. When comparison-based sort would be O(n log n) but digits/characters are few.
9.9

Bucket Sort

Distribute elements into buckets based on value ranges, sort each bucket independently, concatenate. Excellent for uniformly distributed data. O(n+k) average, O(n²) worst case.

Algorithm

1. Create buckets: Divide range [min, max] into k equal-sized buckets.
2. Distribute: Place each element into appropriate bucket.
3. Sort buckets: Sort each bucket (using insertion sort for small buckets).
4. Concatenate: Merge sorted buckets in order.

C# Implementation

C#
public static void BucketSort(double[] arr) {
    int n = arr.Length;
    int bucketCount = Math.Max(1, n / 10); // heuristic

    var buckets = new List<double>[bucketCount];
    for (int i = 0; i < bucketCount; i++) {
        buckets[i] = new List<double>();
    }

    // Find min and max
    double min = arr[0], max = arr[0];
    foreach (double num in arr) {
        if (num < min) min = num;
        if (num > max) max = num;
    }

    if (max == min) return; // all elements equal — already sorted
    double range = (max - min) / bucketCount;

    // Distribute elements
    foreach (double num in arr) {
        int bucketIdx = (num == max) ? bucketCount - 1 : (int)((num - min) / range);
        buckets[bucketIdx].Add(num);
    }

    // Sort individual buckets
    int idx = 0;
    foreach (var bucket in buckets) {
        bucket.Sort(); // or InsertionSort for small buckets
        foreach (double num in bucket) {
            arr[idx++] = num;
        }
    }
}
9.10

C# Built-in Sorting

C# uses IntroSort: quicksort with heapsort fallback and insertion sort for small subarrays. Extremely optimized. Use Array.Sort() or List.Sort() and supply custom comparers for non-default ordering.

IntroSort: QuickSort + HeapSort + InsertionSort

Starts with quicksort. If recursion depth exceeds 2·log(n), switches to heapsort to avoid O(n²). For subarrays < 16 elements, uses insertion sort. Combines best of all worlds.
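The real .NET implementation has more refinements, but the control flow can be sketched as follows. The cutoff of 16 and the 2·log₂(n) depth limit match the description above; everything else is an illustrative simplification, not the framework's actual code:

```csharp
using System;

static class IntroSortSketch {
    const int Cutoff = 16; // small-subarray threshold (illustrative)

    public static void Sort(int[] arr) {
        if (arr.Length < 2) return;
        int depthLimit = 2 * (int)Math.Log2(arr.Length);
        Sort(arr, 0, arr.Length - 1, depthLimit);
    }

    static void Sort(int[] arr, int low, int high, int depthLimit) {
        if (high - low + 1 <= Cutoff) { InsertionSort(arr, low, high); return; }
        if (depthLimit == 0) { HeapSort(arr, low, high); return; } // avoid O(n²)
        int pi = Partition(arr, low, high);
        Sort(arr, low, pi - 1, depthLimit - 1);
        Sort(arr, pi + 1, high, depthLimit - 1);
    }

    static void InsertionSort(int[] arr, int low, int high) {
        for (int i = low + 1; i <= high; i++) {
            int key = arr[i], j = i - 1;
            while (j >= low && arr[j] > key) { arr[j + 1] = arr[j]; j--; }
            arr[j + 1] = key;
        }
    }

    static int Partition(int[] arr, int low, int high) { // Lomuto, last-element pivot
        int pivot = arr[high], i = low - 1;
        for (int j = low; j < high; j++)
            if (arr[j] < pivot) { i++; (arr[i], arr[j]) = (arr[j], arr[i]); }
        (arr[i + 1], arr[high]) = (arr[high], arr[i + 1]);
        return i + 1;
    }

    static void HeapSort(int[] arr, int low, int high) { // heap sort on a subrange
        int n = high - low + 1;
        for (int i = n / 2 - 1; i >= 0; i--) Heapify(arr, low, n, i);
        for (int i = n - 1; i > 0; i--) {
            (arr[low], arr[low + i]) = (arr[low + i], arr[low]);
            Heapify(arr, low, i, 0);
        }
    }

    static void Heapify(int[] arr, int low, int n, int i) {
        while (true) { // sift down within arr[low..low+n-1]
            int largest = i, l = 2 * i + 1, r = 2 * i + 2;
            if (l < n && arr[low + l] > arr[low + largest]) largest = l;
            if (r < n && arr[low + r] > arr[low + largest]) largest = r;
            if (largest == i) return;
            (arr[low + i], arr[low + largest]) = (arr[low + largest], arr[low + i]);
            i = largest;
        }
    }
}
```

The depth limit is what converts quicksort's O(n²) worst case into a guaranteed O(n log n): after 2·log₂(n) levels without finishing, the algorithm concludes the pivots are pathological and switches strategies.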

Array.Sort() vs List.Sort()

C#
// In-place sort of array
int[] arr = { 3, 1, 4, 1, 5 };
Array.Sort(arr);

// In-place sort of list
var list = new List<int> { 3, 1, 4, 1, 5 };
list.Sort();

// LINQ OrderBy (creates new collection, not in-place)
var sorted = arr.OrderBy(x => x).ToList();

Custom Comparers: IComparer

Implement the IComparer<T> interface for custom sorting logic. Compare returns negative if first < second, zero if equal, positive if first > second.

C#
public class Person {
    public string Name { get; set; }
    public int Age { get; set; }
}

// Sort by age ascending
public class AgeComparer : IComparer<Person> {
    public int Compare(Person a, Person b) {
        return a.Age.CompareTo(b.Age);
    }
}

var people = new List<Person> {
    new Person { Name = "Alice", Age = 30 },
    new Person { Name = "Bob", Age = 25 }
};

people.Sort(new AgeComparer());

Lambda Comparers with Comparison Delegate

C#
// Sort by age descending using lambda
people.Sort((a, b) => b.Age.CompareTo(a.Age));

// Sort by name (case-insensitive)
people.Sort((a, b) => string.Compare(a.Name, b.Name, StringComparison.OrdinalIgnoreCase));

// Sort by age, then by name
people.Sort((a, b) => {
    int ageCompare = a.Age.CompareTo(b.Age);
    return ageCompare != 0 ? ageCompare : a.Name.CompareTo(b.Name);
});

LINQ OrderBy with Multiple Keys

C#
// Multi-key sort: by age ascending, then by name ascending
var sorted = people
    .OrderBy(p => p.Age)
    .ThenBy(p => p.Name)
    .ToList();

// Descending age, ascending name
sorted = people
    .OrderByDescending(p => p.Age)
    .ThenBy(p => p.Name)
    .ToList();
9.11

Stability & Comprehensive Comparison

Stability determines whether equal-key elements maintain relative order. Critical for multi-key sorting and data integrity in complex systems.

What is Stability?

If array contains [("Alice", 25), ("Bob", 30), ("Charlie", 25)] and we sort by age, a stable sort preserves Alice before Charlie (both age 25). An unstable sort might swap them.

Stable Sorts
Bubble, Insertion, Merge, Counting, Radix, Bucket
Preserve relative order of equal elements
Required for multi-stage sorts
Unstable Sorts
Selection, Quick, Heap
May reorder equal elements
Simpler, often faster in practice

When Stability Matters

Scenario: Sort employee data by age, breaking ties by seniority (hire date). An unstable sort by age destroys any prior hire-date ordering among same-age employees. Solution: sort by hire date first, then sort by age with a stable sort, which preserves hire-date order within each age group.
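The two-pass approach can be sketched with LINQ, whose OrderBy is a stable sort (the staff records here are hypothetical sample data):

```csharp
using System;
using System.Linq;

static class MultiKeyDemo {
    // Two-pass stable sort: secondary key (hire year) first, then primary key (age).
    // Because OrderBy is stable, equal ages keep their hire-year order.
    public static string[] BySeniorityWithinAge(
        (string Name, int Age, int Hired)[] staff) =>
        staff.OrderBy(p => p.Hired)  // pass 1: secondary key
             .OrderBy(p => p.Age)    // pass 2: primary key, stable
             .Select(p => p.Name)
             .ToArray();

    static void Main() {
        var staff = new[] {
            (Name: "Dana", Age: 40, Hired: 2015),
            (Name: "Eli",  Age: 35, Hired: 2010),
            (Name: "Fay",  Age: 40, Hired: 2012),
        };
        Console.WriteLine(string.Join(", ", BySeniorityWithinAge(staff)));
        // Eli, Fay, Dana — Fay precedes Dana because she was hired earlier
    }
}
```

In idiomatic LINQ the same result comes from a single OrderBy(p => p.Age).ThenBy(p => p.Hired) chain; the two-pass form above just makes the stability argument explicit.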

Algorithm Comparison Summary

See the master table in Section 9.1 for detailed complexity and property comparison across all algorithms.

9.12

Practice Problems with Solutions

1. Sort Colors (LeetCode 75)

Sort array containing 0s, 1s, 2s in-place with one pass. Use 3-way partition (Dutch National Flag).

C#
public void SortColors(int[] nums) {
    int low = 0, mid = 0, high = nums.Length - 1;
    while (mid <= high) {
        if (nums[mid] == 0) {
            swap(nums, low++, mid++);
        } else if (nums[mid] == 2) {
            swap(nums, mid, high--);
        } else {
            mid++;
        }
    }
}

private void swap(int[] arr, int i, int j) {
    int temp = arr[i]; arr[i] = arr[j]; arr[j] = temp;
}

2. Merge Intervals (LeetCode 56)

Sort intervals by start time, merge overlapping intervals.

C#
public int[][] Merge(int[][] intervals) {
    if (intervals.Length == 0) return new int[0][];

    Array.Sort(intervals, (a, b) => a[0].CompareTo(b[0]));

    var result = new List<int[]>();
    var current = new int[] { intervals[0][0], intervals[0][1] };

    foreach (var interval in intervals) {
        if (interval[0] <= current[1]) {
            current[1] = Math.Max(current[1], interval[1]);
        } else {
            result.Add(current);
            current = new int[] { interval[0], interval[1] };
        }
    }
    result.Add(current);

    return result.ToArray();
}

3. Largest Number (LeetCode 179)

Arrange numbers to form largest possible number. Custom comparator: compare concatenations.

C#
public string LargestNumber(int[] nums) {
    var strs = Array.ConvertAll(nums, x => x.ToString());
    Array.Sort(strs, (a, b) => {
        string ab = a + b;
        string ba = b + a;
        return ba.CompareTo(ab); // descending
    });

    if (strs[0][0] == '0') return "0";

    return string.Concat(strs);
}

4. Kth Largest Element (LeetCode 215)

Find kth largest element. Use quickselect O(n) average, or heap O(n log k).

C#
// Method 1: Sort (simplest)
public int FindKthLargest(int[] nums, int k) {
    Array.Sort(nums);
    return nums[nums.Length - k];
}

// Method 2: Min heap of size k
public int FindKthLargestHeap(int[] nums, int k) {
    var heap = new PriorityQueue<int, int>();
    foreach (int num in nums) {
        heap.Enqueue(num, num);
        if (heap.Count > k) heap.Dequeue();
    }
    return heap.Peek();
}

5. Valid Anagram (LeetCode 242)

Check if two strings are anagrams. Sort both and compare.

C#
public bool IsAnagram(string s, string t) {
    char[] s1 = s.ToCharArray();
    char[] t1 = t.ToCharArray();
    Array.Sort(s1);
    Array.Sort(t1);
    return new string(s1) == new string(t1);
}

6. Sort List (LeetCode 148)

Merge sort for linked list (O(n log n) time, O(log n) space).

C#
public ListNode SortList(ListNode head) {
    if (head?.next == null) return head;

    ListNode slow = head, fast = head, prev = null; // fast = head.next would leave prev null on 2-node lists
    while (fast?.next != null) {
        prev = slow;
        slow = slow.next;
        fast = fast.next.next;
    }

    prev.next = null;
    ListNode left = SortList(head);
    ListNode right = SortList(slow);

    return Merge(left, right);
}
9.13

Interview Tips & Best Practices

Choosing the Right Sort

Decision Tree
Guaranteed O(n log n) needed? Use merge sort or heap sort.
In-place O(1) space critical? Use quicksort (with randomized pivot) or heap sort.
Nearly sorted data? Use insertion sort or bubble sort with early termination.
Small integers (known max)? Use counting sort or radix sort.
Stability required? Use merge sort, counting sort, or insertion sort.
Production code? Use language built-ins (IntroSort in C#, Timsort in Python).

Red Flags in Interviews

⚠️
Saying "use quicksort": Good instinct, but explain worst-case and randomization.
Forgetting stability: Ask "do equal elements need to maintain order?"
Not considering space: "Can I use O(n) extra space?" determines merge vs. quicksort.
Always sorting: Sometimes sorting isn't optimal. Hash tables, binary search, or frequency counting might be faster.

Stability Questions

Interviewer asks: "Why does this sort fail for this data structure?" Your answer: Likely stability issue. Explain: equal-key elements lost relative order, causing incorrect secondary sort.

Custom Comparator Mistakes

C#
// WRONG: Subtraction can overflow
return a - b; // e.g., a = int.MinValue, b = 1 overflows

// CORRECT: Use CompareTo or safe comparison
return a.CompareTo(b);

Time Complexity Proof Sketches

Why merge sort is O(n log n): The array is halved recursively to a depth of log n levels. Each level does O(n) merging work. Total: n · log n.

Why quicksort average O(n log n): Each partition splits roughly in half. Expected depth log n. Each level O(n) work.

Why counting sort is O(n+k): Count pass O(n). Reconstruct pass O(n+k) (k buckets).

Final Interview Checklist
□ Understand the problem (sortable type, constraints, stability needed)
□ Pick algorithm with clearest reasoning
□ Explain time/space complexity
□ Code carefully (off-by-one errors, edge cases)
□ Test on examples (already sorted, reverse sorted, duplicates)
□ Ask clarifications: "Can I modify in-place?" "Memory constraints?"