Chapter 14

Divide & Conquer

Master the paradigm that powers sorting, searching, and matrix algorithms. Learn to decompose problems, analyze recursive complexity with the Master Theorem, and implement proven algorithms from Merge Sort to Closest Pair.

14.1

The Divide & Conquer Paradigm

Divide & Conquer is a top-down algorithmic paradigm that solves problems by breaking them into smaller, independent subproblems, solving those recursively, and combining their solutions.

Three Core Steps

1. Divide: Break the problem into independent subproblems of roughly equal size. This reduction must approach a base case.
2. Conquer: Solve the subproblems recursively. If they are small enough, solve them directly (base case).
3. Combine: Merge the solutions of subproblems into the solution of the original problem.
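The three steps map directly onto a recursive skeleton. As a minimal illustration (the `DncMax` class name is ours), here is finding the maximum of an array with D&C:

```csharp
using System;

public static class DncMax {
    // Returns the maximum of arr[left..right] by divide & conquer.
    public static int Max(int[] arr, int left, int right) {
        // Base case: a single element is its own maximum
        if (left == right) return arr[left];

        // Divide: split the range at the midpoint
        int mid = left + (right - left) / 2;

        // Conquer: solve each half recursively
        int leftMax = Max(arr, left, mid);
        int rightMax = Max(arr, mid + 1, right);

        // Combine: the answer is the larger of the two results
        return Math.Max(leftMax, rightMax);
    }
}
```

Here T(n) = 2T(n/2) + O(1), which solves to Θ(n) — no faster than a linear scan, but the code follows the Divide/Conquer/Combine template exactly.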

Why Divide & Conquer?

Many problems have a naturally recursive structure. Dividing into equal-sized subproblems produces a recursion tree of logarithmic depth, leading to O(n log n) complexity instead of the O(n²) of naive approaches. Examples: Merge Sort, Quick Sort, Binary Search.

Key Insight: The power of D&C comes from reducing problem size exponentially. If each level processes all n elements and there are log n levels, work = n · log n.
14.2

Master Theorem & Recurrence Relations

The Master Theorem provides a closed-form solution for recurrences of the form: T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1.

The Three Cases

Case 1: n^log_b(a) dominates
If f(n) = O(n^(log_b(a) − ε)) for some ε > 0, then T(n) = Θ(n^log_b(a))
Intuition: Work at the leaves of the recursion tree dominates.
Case 2: Equal work
If f(n) = Θ(n^log_b(a)), then T(n) = Θ(n^log_b(a) · log n)
Intuition: Each level does roughly equal work; there are log n levels.
Case 3: f(n) dominates
If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and a·f(n/b) ≤ c·f(n) for some c < 1 (regularity condition), then T(n) = Θ(f(n))
Intuition: Non-recursive work dominates; the root level is the most expensive.

Master Theorem Examples

Recurrence | Values | Case | Solution
T(n) = 2T(n/2) + n | a=2, b=2, f(n)=n | Case 2 | Θ(n log n)
T(n) = 3T(n/2) + n | a=3, b=2, f(n)=n | Case 1 (n grows slower than n^log₂3 ≈ n^1.585) | Θ(n^1.585)
T(n) = T(n/2) + n² | a=1, b=2, f(n)=n² | Case 3 | Θ(n²)
T(n) = 4T(n/2) + n | a=4, b=2, f(n)=n | Case 1 (n grows slower than n²) | Θ(n²)
Master Theorem Tip: Compare f(n) to n^log_b(a). Whichever grows faster (polynomially) determines the solution.
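The tip can be mechanized for the common polynomial case f(n) = Θ(n^d), where the regularity condition of Case 3 holds automatically. A small sketch (the `MasterTheorem` class name is ours):

```csharp
using System;

public static class MasterTheorem {
    // Classifies T(n) = a*T(n/b) + Theta(n^d) — the polynomial special case
    // of the Master Theorem, comparing d against the critical exponent log_b(a).
    public static string Solve(int a, int b, double d) {
        double critical = Math.Log(a) / Math.Log(b);  // log_b(a)
        const double eps = 1e-9;

        if (Math.Abs(d - critical) < eps)
            return $"Case 2: Theta(n^{d:0.###} log n)";   // equal work per level
        if (d < critical)
            return $"Case 1: Theta(n^{critical:0.###})";  // leaves dominate
        return $"Case 3: Theta(n^{d:0.###})";             // root dominates
    }
}
```

For example, `MasterTheorem.Solve(2, 2, 1)` classifies Merge Sort's recurrence as Case 2, matching the first table row above.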
14.3

Merge Sort

A stable, comparison-based sorting algorithm that divides the array into halves, recursively sorts them, and merges. Guaranteed O(n log n) time, O(n) space.

Algorithm Overview

1. Divide: Split array at midpoint into left and right halves.
2. Conquer: Recursively sort left and right halves.
3. Combine: Merge two sorted halves into one sorted array.

Top-Down Implementation (C#)

C#
public class MergeSort {
    public static void Sort(int[] arr) {
        if (arr.Length <= 1) return;
        MergeSortHelper(arr, 0, arr.Length - 1);
    }

    private static void MergeSortHelper(int[] arr, int left, int right) {
        // Base case: single element
        if (left >= right) return;

        // Divide
        int mid = left + (right - left) / 2;

        // Conquer
        MergeSortHelper(arr, left, mid);
        MergeSortHelper(arr, mid + 1, right);

        // Combine
        Merge(arr, left, mid, right);
    }

    private static void Merge(int[] arr, int left, int mid, int right) {
        // Create temp arrays
        int[] leftArr = new int[mid - left + 1];
        int[] rightArr = new int[right - mid];

        // Copy data to temp arrays
        for (int i = 0; i < leftArr.Length; i++)
            leftArr[i] = arr[left + i];
        for (int i = 0; i < rightArr.Length; i++)
            rightArr[i] = arr[mid + 1 + i];

        // Merge the temp arrays back
        int l = 0, r = 0, k = left;
        while (l < leftArr.Length && r < rightArr.Length) {
            if (leftArr[l] <= rightArr[r]) {
                arr[k++] = leftArr[l++];
            } else {
                arr[k++] = rightArr[r++];
            }
        }

        // Copy remaining elements
        while (l < leftArr.Length)
            arr[k++] = leftArr[l++];
        while (r < rightArr.Length)
            arr[k++] = rightArr[r++];
    }
}

Complexity Analysis

Time (Best)
Θ(n log n)
Time (Average)
Θ(n log n)
Time (Worst)
Θ(n log n)
Space
O(n)

Bottom-Up Merge Sort

Iterative version that avoids recursion overhead by merging subarrays of increasing size (1, 2, 4, 8, ...).

C#
public static void MergeSortIterative(int[] arr) {
    int n = arr.Length;

    // Start with merge subarrays of size 1, then 2, 4, 8, ...
    for (int currSize = 1; currSize < n; currSize *= 2) {
        for (int leftStart = 0; leftStart < n; leftStart += currSize * 2) {
            int mid = Math.Min(leftStart + currSize - 1, n - 1);
            int rightEnd = Math.Min(leftStart + currSize * 2 - 1, n - 1);

            if (mid < rightEnd)
                Merge(arr, leftStart, mid, rightEnd);
        }
    }
}

Count Inversions Using Merge Sort

An inversion is a pair (i, j) where i < j but arr[i] > arr[j]. Merge Sort can count inversions in O(n log n) by counting splits during merge.

C#
public class InversionCounter {
    public static long CountInversions(int[] arr) {
        if (arr.Length <= 1) return 0;
        long[] count = new long[1];
        MergeSortCount(arr, 0, arr.Length - 1, count);
        return count[0];
    }

    private static void MergeSortCount(int[] arr, int left, int right, long[] count) {
        if (left >= right) return;
        int mid = left + (right - left) / 2;
        MergeSortCount(arr, left, mid, count);
        MergeSortCount(arr, mid + 1, right, count);
        MergeCount(arr, left, mid, right, count);
    }

    private static void MergeCount(int[] arr, int left, int mid, int right, long[] count) {
        int[] leftArr = new int[mid - left + 1];
        int[] rightArr = new int[right - mid];

        for (int i = 0; i < leftArr.Length; i++)
            leftArr[i] = arr[left + i];
        for (int i = 0; i < rightArr.Length; i++)
            rightArr[i] = arr[mid + 1 + i];

        int l = 0, r = 0, k = left;
        while (l < leftArr.Length && r < rightArr.Length) {
            if (leftArr[l] <= rightArr[r]) {
                arr[k++] = leftArr[l++];
            } else {
                // All remaining elements in left are inversions with rightArr[r]
                count[0] += leftArr.Length - l;
                arr[k++] = rightArr[r++];
            }
        }

        while (l < leftArr.Length)
            arr[k++] = leftArr[l++];
        while (r < rightArr.Length)
            arr[k++] = rightArr[r++];
    }
}
Merge Sort Advantage: Stable sort (preserves order of equal elements) and guaranteed O(n log n) make it ideal for linked lists and external sorting.
14.4

Quick Sort & Partitioning Strategies

A highly efficient in-place sorting algorithm that partitions around a pivot. Average O(n log n), worst O(n²). Faster than Merge Sort in practice due to cache locality.

Lomuto Partition Scheme

Simple partition strategy: choose the last element as the pivot, scan left to right accumulating smaller elements, then swap the pivot into place at the boundary.

C#
public class QuickSort {
    public static void Sort(int[] arr) {
        QuickSortHelper(arr, 0, arr.Length - 1);
    }

    private static void QuickSortHelper(int[] arr, int left, int right) {
        if (left < right) {
            int pi = LomutoPartition(arr, left, right);
            QuickSortHelper(arr, left, pi - 1);
            QuickSortHelper(arr, pi + 1, right);
        }
    }

    // Partition using Lomuto scheme
    private static int LomutoPartition(int[] arr, int left, int right) {
        int pivot = arr[right];  // Choose rightmost as pivot
        int i = left - 1;     // Index of smaller element - indicates right end of left partition

        for (int j = left; j < right; j++) {
            if (arr[j] < pivot) {
                i++;
                Swap(arr, i, j);
            }
        }
        Swap(arr, i + 1, right);
        return i + 1;
    }

    private static void Swap(int[] arr, int i, int j) {
        int temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }
}

Hoare Partition Scheme

More efficient: two pointers move inward from opposite ends and swap when an out-of-place pair is found. Performs fewer swaps than Lomuto. Note that it returns j with the pivot not necessarily in its final position, so the caller recurses on (left, j) and (j + 1, right) rather than excluding the pivot index.

C#
private static int HoarePartition(int[] arr, int left, int right) {
    int pivot = arr[left];
    int i = left - 1;
    int j = right + 1;

    while (true) {
        // Find leftmost element >= pivot
        do {
            i++;
        } while (arr[i] < pivot);

        // Find rightmost element <= pivot
        do {
            j--;
        } while (arr[j] > pivot);

        // If pointers crossed, partition is done
        if (i >= j) return j;

        Swap(arr, i, j);
    }
}

Randomized Quick Sort

Pick pivot randomly to avoid worst-case O(n²) on sorted input. Expected O(n log n) even on adversarial data.

C#
private static readonly Random rand = new Random();  // Shared instance: one Random per call gives poor randomness

private static int RandomizedPartition(int[] arr, int left, int right) {
    int randomIdx = left + rand.Next(right - left + 1);
    Swap(arr, randomIdx, right);  // Move random pivot to end
    return LomutoPartition(arr, left, right);
}

3-Way Partition (Dutch National Flag)

For arrays with many duplicates, partition into three regions: < pivot, = pivot, > pivot. Avoids redundant comparisons.

C#
private static void QuickSort3Way(int[] arr, int left, int right) {
    if (left < right) {
        int lt = left;      // arr[left..lt-1] < pivot
        int gt = right;     // arr[gt+1..right] > pivot
        int i = left;       // arr[lt..i-1] == pivot
        int pivot = arr[left];

        while (i <= gt) {
            if (arr[i] < pivot) {
                Swap(arr, i++, lt++);
            } else if (arr[i] > pivot) {
                Swap(arr, i, gt--);
            } else {
                i++;
            }
        }
        QuickSort3Way(arr, left, lt - 1);
        QuickSort3Way(arr, gt + 1, right);
    }
}

Complexity Analysis

Time (Best)
Θ(n log n)
Time (Average)
Θ(n log n)
Time (Worst)
O(n²)
Space
O(log n)
Quick Sort Pitfall: Worst case O(n²) occurs on already-sorted input with pivot=first/last. Use randomized pivot or median-of-three to avoid.
14.5

Quick Select: Finding kth Element in O(n)

Find the kth smallest element without full sorting. Uses partitioning like Quick Sort. Average O(n), worst O(n²), but typically much faster than sorting.

Algorithm Concept

After partitioning around pivot at index p: if k == p, found; if k < p, recurse left; if k > p, recurse right. Only one partition branch is explored, unlike Quick Sort which explores both.

Quick Select Implementation (C#)

C#
public class QuickSelect {
    public static int FindKthSmallest(int[] arr, int k) {
        if (k < 1 || k > arr.Length)
            throw new ArgumentException("Invalid k");
        return QuickSelectHelper(arr, 0, arr.Length - 1, k - 1);  // k-1 for 0-indexed
    }

    private static int QuickSelectHelper(int[] arr, int left, int right, int k) {
        if (left == right)
            return arr[left];

        // Partition and get pivot index
        int pi = Partition(arr, left, right);

        if (k == pi) {
            return arr[k];
        } else if (k < pi) {
            return QuickSelectHelper(arr, left, pi - 1, k);
        } else {
            return QuickSelectHelper(arr, pi + 1, right, k);
        }
    }

    private static int Partition(int[] arr, int left, int right) {
        Random rand = new Random();
        int randomIdx = left + rand.Next(right - left + 1);
        Swap(arr, randomIdx, right);

        int pivot = arr[right];
        int i = left - 1;

        for (int j = left; j < right; j++) {
            if (arr[j] < pivot) {
                i++;
                Swap(arr, i, j);
            }
        }
        Swap(arr, i + 1, right);
        return i + 1;
    }

    private static void Swap(int[] arr, int i, int j) {
        int temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }
}

Example Walkthrough

Find the 3rd smallest in [3, 1, 4, 1, 5, 9, 2, 6] (sorted: [1, 1, 2, 3, 4, 5, 6, 9], so the answer is 2):

1. Suppose the random pivot is 5. After Lomuto partitioning, the five smaller elements (3, 1, 4, 1, 2) end up to its left, e.g. [3, 1, 4, 1, 2, 5, 9, 6]; the pivot lands at index pi = 5.
2. k = 2 (0-indexed). Since k < pi, recurse on the left part [3, 1, 4, 1, 2] (indices 0–4).
3. Suppose the next pivot is 2. After partitioning: [1, 1, 2, 3, 4]; the pivot lands at index pi = 2.
4. k = 2 = pi, so return arr[2] = 2, the 3rd smallest element of the original array.

Complexity Analysis

Time (Average)
Θ(n)
Time (Worst)
O(n²)
Space
O(log n)
Quick Select vs Sort: If k is close to 0 or n (finding min/max), Quick Select is much faster than sorting. For k near n/2 (median), both are similar.
14.6

Binary Search as Divide & Conquer

Binary Search is a fundamental D&C algorithm: divide search space by half each iteration, recurse on relevant half. O(log n) time.

Recursive Binary Search (C#)

C#
public class BinarySearch {
    public static int Search(int[] arr, int target) {
        return BinarySearchHelper(arr, target, 0, arr.Length - 1);
    }

    private static int BinarySearchHelper(int[] arr, int target, int left, int right) {
        // Base case: element not found
        if (left > right)
            return -1;

        // Divide
        int mid = left + (right - left) / 2;

        // Check middle element
        if (arr[mid] == target) {
            return mid;  // Found
        } else if (arr[mid] < target) {
            // Conquer: search right half
            return BinarySearchHelper(arr, target, mid + 1, right);
        } else {
            // Conquer: search left half
            return BinarySearchHelper(arr, target, left, mid - 1);
        }
    }
}

Iterative Binary Search

More space-efficient than recursion; no call stack overhead.

C#
public static int BinarySearchIterative(int[] arr, int target) {
    int left = 0, right = arr.Length - 1;

    while (left <= right) {
        int mid = left + (right - left) / 2;

        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            left = mid + 1;
        else
            right = mid - 1;
    }
    return -1;
}

Master Theorem Analysis

T(n) = T(n/2) + O(1). Here a=1, b=2, so n^(log₂ 1) = n⁰ = 1, and f(n) = O(1) = Θ(n⁰). By Case 2, T(n) = Θ(log n).

Binary Search Requirement: The input array must be sorted. If it is unsorted, an O(n) linear search is the only correct approach (or sort first in O(n log n)).
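One variant worth knowing: with duplicates present, the standard search returns an arbitrary matching index. A small tweak finds the first occurrence instead, still in O(log n) (a sketch; `FirstOccurrence` is our name):

```csharp
public static class BinarySearchVariants {
    // Returns the index of the FIRST occurrence of target, or -1 if absent.
    // Same halving as standard binary search, but on a match we record it
    // and keep searching the left half instead of returning immediately.
    public static int FirstOccurrence(int[] arr, int target) {
        int left = 0, right = arr.Length - 1, result = -1;

        while (left <= right) {
            int mid = left + (right - left) / 2;

            if (arr[mid] == target) {
                result = mid;      // Record the match...
                right = mid - 1;   // ...but continue in the left half
            } else if (arr[mid] < target) {
                left = mid + 1;
            } else {
                right = mid - 1;
            }
        }
        return result;
    }
}
```

On [1, 2, 2, 2, 3] with target 2 this returns index 1 rather than whichever duplicate the standard search happens to hit.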
14.7

Closest Pair of Points

Find two points in 2D space with minimum Euclidean distance. Naïve O(n²), D&C achieves O(n log n) by cleverly dividing and merging.

Divide & Conquer Strategy

1. Sort by x-coordinate: Pre-sort all points once (outside recursion).
2. Divide: Split points in half by x-coordinate. Recursively find the closest pair in each half.
3. Conquer: Find pairs spanning the division. Only check points within distance d (where d = min of left/right distances) of the dividing line, sorted by y within that strip.
4. Return: Minimum of three: left closest pair, right closest pair, cross-partition closest pair.

C# Implementation

C#
public class Point {
    public double X, Y;
    public Point(double x, double y) {
        X = x; Y = y;
    }
}

public class ClosestPair {
    public Point P1, P2;
    public double Distance;
    public ClosestPair(Point p1, Point p2) {
        P1 = p1; P2 = p2;
        // Computed inline: the Distance field would otherwise shadow a method call
        double dx = p1.X - p2.X, dy = p1.Y - p2.Y;
        Distance = Math.Sqrt(dx * dx + dy * dy);
    }
}

public class ClosestPairFinder {
    private static double Distance(Point p1, Point p2) {
        double dx = p1.X - p2.X, dy = p1.Y - p2.Y;
        return Math.Sqrt(dx * dx + dy * dy);
    }

    public static ClosestPair FindClosestPair(Point[] points) {
        // Sort by x-coordinate
        Array.Sort(points, (a, b) => a.X.CompareTo(b.X));
        return FindClosestPairHelper(points, 0, points.Length - 1);
    }

    private static ClosestPair FindClosestPairHelper(Point[] points, int left, int right) {
        // Base case: brute force for small sets
        if (right - left <= 2) {
            return BruteForce(points, left, right);
        }

        // Divide
        int mid = left + (right - left) / 2;
        double midX = points[mid].X;

        // Conquer: find closest pairs in left and right
        ClosestPair leftPair = FindClosestPairHelper(points, left, mid);
        ClosestPair rightPair = FindClosestPairHelper(points, mid + 1, right);

        // Current best distance
        double d = Math.Min(leftPair.Distance, rightPair.Distance);
        ClosestPair bestPair = leftPair.Distance < rightPair.Distance ? leftPair : rightPair;

        // Find closest pair across the dividing line
        var stripPoints = new List<Point>();
        for (int i = left; i <= right; i++) {
            if (Math.Abs(points[i].X - midX) < d) {
                stripPoints.Add(points[i]);
            }
        }

        // Sort strip by y-coordinate and check nearby points
        stripPoints.Sort((a, b) => a.Y.CompareTo(b.Y));
        for (int i = 0; i < stripPoints.Count; i++) {
            for (int j = i + 1; j < stripPoints.Count &&
                 (stripPoints[j].Y - stripPoints[i].Y) < d; j++) {
                double dist = Distance(stripPoints[i], stripPoints[j]);
                if (dist < bestPair.Distance) {
                    bestPair = new ClosestPair(stripPoints[i], stripPoints[j]);
                }
            }
        }
        return bestPair;
    }

    private static ClosestPair BruteForce(Point[] points, int left, int right) {
        ClosestPair best = new ClosestPair(points[left], points[left + 1]);
        for (int i = left; i <= right; i++) {
            for (int j = i + 1; j <= right; j++) {
                double dist = Distance(points[i], points[j]);
                if (dist < best.Distance) {
                    best = new ClosestPair(points[i], points[j]);
                }
            }
        }
        return best;
    }
}

Complexity Analysis

Time
O(n log² n) as implemented; Θ(n log n) with optimization
Space
O(n)

Recurrence: T(n) = 2T(n/2) + O(n log n), since the strip is re-sorted by y at every level of the recursion; this solves to O(n log² n). Pre-sorting all points by y once and maintaining that order through the recursion removes the per-level sort and achieves the optimal Θ(n log n).

14.8

Strassen's Matrix Multiplication Concept

Classic D&C result: multiply two n×n matrices in O(n^2.807) instead of naïve O(n³). Remarkable algorithmic breakthrough.

Naïve Approach: O(n³)

Standard definition: C[i][j] = Σ A[i][k] · B[k][j]. Three nested loops, all of length n.

Strassen's Idea: Divide into 2×2 Blocks

Split each n×n matrix into four (n/2)×(n/2) blocks. Instead of 8 multiplications (naive D&C would give T(n) = 8T(n/2) + O(n²) = O(n³)), perform only 7 multiplications and more additions.

Naïve D&C Recurrence
T(n) = 8T(n/2) + O(n²)
By Master Theorem: T(n) = Θ(n³)
Strassen's Recurrence
T(n) = 7T(n/2) + O(n²)
By Master Theorem: T(n) = Θ(n^log₂(7)) ≈ Θ(n^2.807)

The 7 Matrix Multiplications (Strassen's Formulas)

Given matrices split as:

A = [A₁₁ A₁₂]    B = [B₁₁ B₁₂]    C = [C₁₁ C₁₂]
    [A₂₁ A₂₂]        [B₂₁ B₂₂]        [C₂₁ C₂₂]
      
M₁ = (A₁₁ + A₂₂)(B₁₁ + B₂₂)
M₂ = (A₂₁ + A₂₂)B₁₁
M₃ = A₁₁(B₁₂ - B₂₂)
M₄ = A₂₂(B₂₁ - B₁₁)
M₅ = (A₁₁ + A₁₂)B₂₂
M₆ = (A₂₁ - A₁₁)(B₁₁ + B₁₂)
M₇ = (A₁₂ - A₂₂)(B₂₁ + B₂₂)

Then the C blocks are computed from these M values:

C₁₁ = M₁ + M₄ - M₅ + M₇
C₁₂ = M₃ + M₅
C₂₁ = M₂ + M₄
C₂₂ = M₁ - M₂ + M₃ + M₆
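The seven products and the block-combination identities can be checked numerically on the 2×2 scalar case, where each block is a single number (the `Strassen2x2` class name is ours):

```csharp
public static class Strassen2x2 {
    // Multiplies two 2x2 matrices using Strassen's seven products
    // and the block-combination formulas.
    public static long[,] Multiply(long[,] A, long[,] B) {
        long m1 = (A[0,0] + A[1,1]) * (B[0,0] + B[1,1]);
        long m2 = (A[1,0] + A[1,1]) * B[0,0];
        long m3 = A[0,0] * (B[0,1] - B[1,1]);
        long m4 = A[1,1] * (B[1,0] - B[0,0]);
        long m5 = (A[0,0] + A[0,1]) * B[1,1];
        long m6 = (A[1,0] - A[0,0]) * (B[0,0] + B[0,1]);
        long m7 = (A[0,1] - A[1,1]) * (B[1,0] + B[1,1]);

        // Combine: C11 = M1+M4-M5+M7, C12 = M3+M5, C21 = M2+M4, C22 = M1-M2+M3+M6
        return new long[,] {
            { m1 + m4 - m5 + m7, m3 + m5 },
            { m2 + m4,           m1 - m2 + m3 + m6 }
        };
    }
}
```

For A = [[1,2],[3,4]] and B = [[5,6],[7,8]] this produces [[19,22],[43,50]], matching the standard product — seven scalar multiplications instead of eight.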

Strassen's Impact: Not practical for small n (overhead of matrix additions). Useful for very large matrices (n > 500) in specialized applications. More complex variants (Coppersmith-Winograd) push complexity toward n^2.376.
14.9

Maximum Subarray Problem

Find contiguous subarray with largest sum. D&C solution: O(n log n). Kadane's algorithm: O(n). Both approaches demonstrated.

Divide & Conquer Approach

Maximum subarray either lies entirely in left half, right half, or spans the middle. Recursively check left, right, and middle-crossing maximum.

C#
public class SubarrayResult {
    public int LeftIndex, RightIndex, Sum;
    public SubarrayResult(int left, int right, int sum) {
        LeftIndex = left; RightIndex = right; Sum = sum;
    }
}

public class MaximumSubarray {
    public static SubarrayResult FindMaxSubarrayDC(int[] arr) {
        return FindMaxSubarrayHelper(arr, 0, arr.Length - 1);
    }

    private static SubarrayResult FindMaxSubarrayHelper(int[] arr, int left, int right) {
        // Base case: single element
        if (left == right)
            return new SubarrayResult(left, right, arr[left]);

        // Divide
        int mid = left + (right - left) / 2;

        // Conquer: find max in left and right halves
        SubarrayResult leftMax = FindMaxSubarrayHelper(arr, left, mid);
        SubarrayResult rightMax = FindMaxSubarrayHelper(arr, mid + 1, right);

        // Find maximum subarray crossing the middle
        SubarrayResult midMax = FindMaxCrossingSubarray(arr, left, mid, right);

        // Return maximum of three
        if (leftMax.Sum >= rightMax.Sum && leftMax.Sum >= midMax.Sum)
            return leftMax;
        else if (rightMax.Sum >= midMax.Sum)
            return rightMax;
        else
            return midMax;
    }

    private static SubarrayResult FindMaxCrossingSubarray(int[] arr, int left, int mid, int right) {
        int leftSum = int.MinValue;
        int sum = 0, maxLeftIdx = mid;
        // Find max sum in left half ending at mid
        for (int i = mid; i >= left; i--) {
            sum += arr[i];
            if (sum > leftSum) {
                leftSum = sum;
                maxLeftIdx = i;
            }
        }

        int rightSum = int.MinValue;
        sum = 0; int maxRightIdx = mid + 1;
        // Find max sum in right half starting from mid+1
        for (int i = mid + 1; i <= right; i++) {
            sum += arr[i];
            if (sum > rightSum) {
                rightSum = sum;
                maxRightIdx = i;
            }
        }

        return new SubarrayResult(maxLeftIdx, maxRightIdx, leftSum + rightSum);
    }
}

Kadane's Algorithm (Greedy/DP, O(n))

Much simpler and faster: at each position, track max sum ending here. If extending decreases sum, start fresh.

C#
public static SubarrayResult FindMaxSubarrayKadane(int[] arr) {
    int maxSum = arr[0], currentSum = arr[0];
    int maxStart = 0, maxEnd = 0;
    int tempStart = 0;

    for (int i = 1; i < arr.Length; i++) {
        // If adding current element makes sum worse, start fresh
        if (currentSum + arr[i] < arr[i]) {
            currentSum = arr[i];
            tempStart = i;
        } else {
            currentSum += arr[i];
        }

        // Update max if current sum is better
        if (currentSum > maxSum) {
            maxSum = currentSum;
            maxStart = tempStart;
            maxEnd = i;
        }
    }

    return new SubarrayResult(maxStart, maxEnd, maxSum);
}

Comparison

Kadane's Algorithm
O(n) time, O(1) space
Simpler to code and understand
Optimal solution
Divide & Conquer Approach
O(n log n) time
More complex logic
Educational value for D&C pattern
Interview Tip: Kadane's algorithm is the preferred solution for maximum subarray. D&C version demonstrates recursive thinking but isn't practical.
14.10

Count Inversions (Detailed Review)

Extended discussion of inversion counting. Covered in Merge Sort section (14.3); this section reinforces the pattern and variations.

What is an Inversion?

Pair (i, j) where i < j but arr[i] > arr[j]. Inversions measure how "unsorted" an array is. Example: [2, 1, 3] has 1 inversion (2,1).

Why Count Inversions?

Inversions quantify how far an array is from sorted: a sorted array has 0 inversions, a reverse-sorted array has n(n−1)/2. The count appears in rank-correlation measures such as Kendall's tau and in collaborative filtering, where it scores how differently two users rank the same items.

Complexity Note: Brute force: O(n²). Merge Sort approach: O(n log n). For small n, brute force is acceptable; for large n, the Merge Sort method is necessary.
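The O(n²) brute force mentioned above is a direct transcription of the definition and makes a handy cross-check against the merge-sort counter from 14.3 (the class name is ours):

```csharp
public static class Inversions {
    // O(n^2) brute force: examine every pair (i, j) with i < j
    // and count those where arr[i] > arr[j].
    public static long CountBruteForce(int[] arr) {
        long count = 0;
        for (int i = 0; i < arr.Length; i++)
            for (int j = i + 1; j < arr.Length; j++)
                if (arr[i] > arr[j]) count++;
        return count;
    }
}
```

For example, [2, 1, 3] yields 1 and a reverse-sorted [3, 2, 1] yields 3 = n(n−1)/2.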
14.11

Karatsuba Multiplication Concept

Multiply two n-digit numbers in O(n^1.585) instead of naïve O(n²). Classic result showing D&C power.

Naïve Approach: O(n²)

Standard grade-school multiplication: multiply each digit of first number by each digit of second, add results. For n-digit numbers, O(n²) multiplications.

Karatsuba's Insight

Split numbers into high and low parts. For n-digit numbers x and y:

x = x₁·10^(n/2) + x₀
y = y₁·10^(n/2) + y₀

x·y = (x₁·y₁)·10^n + (x₁·y₀ + x₀·y₁)·10^(n/2) + (x₀·y₀)
      

Standard approach: 4 multiplications of (n/2)-digit numbers. Karatsuba uses only 3 by clever substitution:

m₁ = x₁·y₁
m₂ = x₀·y₀
m₃ = (x₁+x₀)·(y₁+y₀) = x₁·y₁ + x₁·y₀ + x₀·y₁ + x₀·y₀

Then: x₁·y₀ + x₀·y₁ = m₃ - m₁ - m₂
      

Recurrence & Complexity

Naïve Recurrence
T(n) = 4T(n/2) + O(n)
Solution: Θ(n²)
Karatsuba Recurrence
T(n) = 3T(n/2) + O(n)
Solution: Θ(n^log₂(3)) ≈ Θ(n^1.585)
Practical Impact: Faster for very large numbers (cryptography, big integer libraries). Modern computers use Toom-Cook or FFT-based methods for even better asymptotic complexity.
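The three-multiplication substitution translates into a short recursive routine. A sketch over BigInteger using base-10 splitting for clarity (the `Karatsuba` class name is ours; production implementations split in binary and assume non-negative inputs, as we do here):

```csharp
using System;
using System.Numerics;

public static class Karatsuba {
    // Multiplies two non-negative integers with three recursive
    // half-size products instead of four.
    public static BigInteger Multiply(BigInteger x, BigInteger y) {
        // Base case: single digit — use the built-in multiply
        if (x < 10 || y < 10) return x * y;

        // Split both numbers at half the digit count of the larger one
        int n = Math.Max(x.ToString().Length, y.ToString().Length);
        BigInteger half = BigInteger.Pow(10, n / 2);

        BigInteger x1 = x / half, x0 = x % half;  // high and low parts of x
        BigInteger y1 = y / half, y0 = y % half;  // high and low parts of y

        BigInteger m1 = Multiply(x1, y1);            // x1*y1
        BigInteger m2 = Multiply(x0, y0);            // x0*y0
        BigInteger m3 = Multiply(x1 + x0, y1 + y0);  // contains the cross terms

        // x*y = m1*10^(2*(n/2)) + (m3 - m1 - m2)*10^(n/2) + m2
        return m1 * half * half + (m3 - m1 - m2) * half + m2;
    }
}
```

Tracing 99 × 99: m₁ = 81, m₂ = 81, m₃ = 18·18 = 324, so the result is 81·100 + (324 − 162)·10 + 81 = 9801.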
14.12

Practice Problems & Full Solutions

Apply D&C patterns to interview-style problems. Complete C# implementations provided.

Problem 1: Pow(x, n) - O(log n)

Prompt: Implement x^n for integer n (may be negative). Use D&C to achieve O(log n).

C#
public class Power {
    public static double MyPow(double x, int n) {
        long N = n;  // Handle int overflow for n = int.MinValue
        if (N < 0) {
            x = 1 / x;
            N = -N;
        }
        return MyPowHelper(x, N);
    }

    private static double MyPowHelper(double x, long n) {
        // Base case: x^0 = 1
        if (n == 0) return 1.0;

        // Divide: compute x^(n/2)
        double half = MyPowHelper(x, n / 2);

        // Combine based on even/odd
        if (n % 2 == 0) {
            return half * half;
        } else {
            return half * half * x;
        }
    }
}

Complexity: T(n) = T(n/2) + O(1). Master Theorem Case 2: Θ(log n).

Problem 2: Majority Element - O(n)

Prompt: Find element appearing > n/2 times. Use D&C (also solvable with Boyer-Moore).

C#
public class MajorityElement {
    public static int FindMajority(int[] nums) {
        return MajorityHelper(nums, 0, nums.Length - 1);
    }

    private static int MajorityHelper(int[] nums, int left, int right) {
        // Base case: single element
        if (left == right)
            return nums[left];

        // Divide
        int mid = left + (right - left) / 2;

        // Conquer: find majority in left and right
        int leftMaj = MajorityHelper(nums, left, mid);
        int rightMaj = MajorityHelper(nums, mid + 1, right);

        // Combine: count occurrences and return majority
        int leftCount = CountInRange(nums, left, right, leftMaj);
        if (leftCount > (right - left + 1) / 2)
            return leftMaj;

        int rightCount = CountInRange(nums, left, right, rightMaj);
        if (rightCount > (right - left + 1) / 2)
            return rightMaj;

        return -1;  // No majority (shouldn't reach if input guaranteed to have majority)
    }

    private static int CountInRange(int[] nums, int left, int right, int target) {
        int count = 0;
        for (int i = left; i <= right; i++) {
            if (nums[i] == target) count++;
        }
        return count;
    }
}

Complexity: T(n) = 2T(n/2) + O(n). Master Theorem Case 2: Θ(n log n). Note: Boyer-Moore voting is O(n) with O(1) space, but this demonstrates D&C.

Problem 3: Sort List (Linked List) - O(n log n)

Prompt: Sort a linked list using Merge Sort (D&C ideal for linked structures).

C#
public class ListNode {
    public int val;
    public ListNode next;
    public ListNode(int x) { val = x; }
}

public class SortList {
    public static ListNode Sort(ListNode head) {
        if (head == null || head.next == null)
            return head;

        // Find middle of list
        ListNode mid = FindMiddle(head);
        ListNode left = head;
        ListNode right = mid.next;
        mid.next = null;  // Split into two lists

        // Recursively sort both halves
        left = Sort(left);
        right = Sort(right);

        // Merge sorted halves
        return Merge(left, right);
    }

    private static ListNode FindMiddle(ListNode head) {
        ListNode slow = head, fast = head.next;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
        }
        return slow;
    }

    private static ListNode Merge(ListNode l1, ListNode l2) {
        ListNode dummy = new ListNode(0);
        ListNode current = dummy;

        while (l1 != null && l2 != null) {
            if (l1.val <= l2.val) {
                current.next = l1;
                l1 = l1.next;
            } else {
                current.next = l2;
                l2 = l2.next;
            }
            current = current.next;
        }

        current.next = (l1 != null) ? l1 : l2;
        return dummy.next;
    }
}

Complexity: T(n) = 2T(n/2) + O(n). Master Theorem Case 2: Θ(n log n). Space: O(log n) recursion depth.

Problem 4: Median of Two Sorted Arrays - O(log(min(m,n)))

Prompt: Find median of two sorted arrays without merging. Use binary search on smaller array.

C#
public class MedianOfTwoArrays {
    public static double FindMedianSortedArrays(int[] nums1, int[] nums2) {
        // Ensure nums1 is smaller
        if (nums1.Length > nums2.Length)
            return FindMedianSortedArrays(nums2, nums1);

        int m = nums1.Length, n = nums2.Length;
        int left = 0, right = m;

        while (left <= right) {
            int partition1 = (left + right) / 2;
            int partition2 = (m + n + 1) / 2 - partition1;

            // Handle edge cases
            int leftMax1 = (partition1 == 0) ? int.MinValue : nums1[partition1 - 1];
            int rightMin1 = (partition1 == m) ? int.MaxValue : nums1[partition1];
            int leftMax2 = (partition2 == 0) ? int.MinValue : nums2[partition2 - 1];
            int rightMin2 = (partition2 == n) ? int.MaxValue : nums2[partition2];

            // Valid partition found
            if (leftMax1 <= rightMin2 && leftMax2 <= rightMin1) {
                if ((m + n) % 2 == 0) {
                    return (Math.Max(leftMax1, leftMax2) + Math.Min(rightMin1, rightMin2)) / 2.0;
                } else {
                    return Math.Max(leftMax1, leftMax2);
                }
            } else if (leftMax1 > rightMin2) {
                right = partition1 - 1;
            } else {
                left = partition1 + 1;
            }
        }
        return -1.0;  // Shouldn't reach
    }
}

Complexity: T(n) = T(n/2) + O(1). Master Theorem Case 2: Θ(log(min(m,n))).

14.13

Divide & Conquer Interview Tips & Paradigm Comparison

Recognizing D&C Problems

TIP 1: Problems mentioning "divide", "split", or "halves". Arrays that are naturally divisible (as in binary search and merge sort) or trees with recursive structure suggest D&C.
TIP 2: Recurrence relations. If analysis gives T(n) = aT(n/b) + f(n), apply the Master Theorem immediately; it helps verify the optimal complexity.
TIP 3: Combining subproblem results is straightforward. D&C shines when the merge/combine step is simple (O(n)). If combining is expensive, other approaches may be better.

D&C vs. Dynamic Programming vs. Greedy

Aspect | Divide & Conquer | Dynamic Programming | Greedy
Subproblems | Independent, non-overlapping | Overlapping, with optimal substructure | Local optimal choices
Approach | Top-down (or bottom-up iterative) | Bottom-up tabulation or top-down memoization | Single pass, never backtrack
Complexity | Usually O(n log n) or O(n^log_b(a)) | Varies; often O(nm) or O(n²) | Usually O(n), or O(n log n) with sorting
Examples | Merge Sort, Quick Sort, Binary Search | Fibonacci, Knapsack, Edit Distance | Activity Selection, Huffman Coding
Memory | O(n) to O(log n) auxiliary | O(n²) or O(nm) for table | O(1) or O(log n) typically

Common Mistakes in Interviews

Mistake 1: Forgetting to define the base case. Always specify when the recursion stops; otherwise you get infinite recursion and a stack overflow.
Mistake 2: Not verifying that the Master Theorem applies. It only works for T(n) = aT(n/b) + f(n) with a ≥ 1, b > 1, and constant-factor divisions.
Mistake 3: Ignoring space complexity. Recursion uses O(log n) call stack; sometimes bottom-up iteration is preferred.
Mistake 4: Assuming subproblems are independent. If subproblems overlap (like tree paths), consider DP with memoization instead.

Best Practices

1. Clarify divisions: Ask if the array is sorted, if splitting must be equal, if duplicates exist. These affect strategy.
2. Write the recurrence first: Before coding, write T(n) = ... and apply the Master Theorem to verify the expected complexity.
3. Test with small examples: Trace through [1, 2, 3] or single-element cases to catch off-by-one errors.
4. Consider iterative variants: Bottom-up versions avoid recursion overhead and potential stack overflow on very large inputs.
5. Communicate complexity: Explicitly state time and space complexity. Mention worst-case scenarios (e.g., Quick Sort on sorted input).
Interview Gold: When asked about sorting, mention both Merge Sort (stable, guaranteed O(n log n)) and Quick Sort (faster average, in-place). Interviewers appreciate understanding trade-offs.