Master fundamental and advanced sorting techniques from bubble sort to radix sort. Learn algorithm properties like stability and in-place behavior, understand complexity tradeoffs, and know exactly which sort to use in any interview or production scenario.
Sorting is the process of arranging data in a specified order. It's fundamental to computer science: it enables binary search, simplifies numerous algorithmic problems, and is often the bottleneck in performance-critical systems. Understanding when to sort, and which sorting algorithm to use, is essential for any software engineer.
Sorting unlocks efficiency. An unsorted array requires O(n) for linear search; a sorted array enables O(log n) binary search. Many problems become trivial once data is ordered: finding medians, detecting duplicates, merging datasets, and computing ranges all become simpler. In interviews, candidates who recognize when sorting solves a problem stand out.
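A quick sketch of that payoff using the built-in `Array.Sort` and `Array.BinarySearch` (sample data is illustrative):

```csharp
using System;

class BinarySearchDemo
{
    static void Main()
    {
        int[] data = { 42, 7, 19, 3, 88, 19 };

        // One O(n log n) sort up front...
        Array.Sort(data);

        // ...then every membership query is O(log n) instead of O(n)
        Console.WriteLine(Array.BinarySearch(data, 19) >= 0); // found
        Console.WriteLine(Array.BinarySearch(data, 5) < 0);   // not found
    }
}
```

`Array.BinarySearch` returns a non-negative index on a hit and a negative value (the bitwise complement of the insertion point) on a miss, so the sign check above suffices for membership tests.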
A sort is stable if elements with equal keys maintain their relative order. Example: sorting [("Alice", 25), ("Bob", 30), ("Charlie", 25)] by age—stable sorts keep Alice before Charlie; unstable sorts might not.
In-place: Uses O(1) or O(log n) extra space (quick sort, heap sort). Out-of-place: Uses O(n) extra space (merge sort, counting sort). In-place is preferable for memory-constrained systems; out-of-place often enables stability and parallelization.
| Algorithm | Best Case | Avg Case | Worst Case | Space | Stable | In-Place |
|---|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No | Yes |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | No |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | Yes |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No | Yes |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(n+k) | Yes | No |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n+k) | Yes | No |
| Bucket Sort | O(n+k) | O(n+k) | O(n²) | O(n+k) | Yes | No |
The simplest sorting algorithm. Repeatedly scan the array, swapping adjacent elements if they're in the wrong order. Largest values "bubble" to the end. Useful primarily for teaching or tiny datasets where simplicity trumps speed.
Bubble sort makes multiple passes through the array. On each pass, it compares adjacent pairs and swaps if the left element is greater. After pass i, the last i elements are in final position. Early termination optimization: if a pass makes no swaps, the array is sorted.
```csharp
public static void BubbleSort(int[] arr) {
    int n = arr.Length;
    for (int i = 0; i < n - 1; i++) {
        bool swapped = false;
        // Last i elements already in place
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                // Swap
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                swapped = true;
            }
        }
        // If no swaps, array is sorted
        if (!swapped) break;
    }
}
```
Find the minimum element in unsorted portion and place it at the beginning. Repeat. Always O(n²) regardless of input, but minimizes memory writes—useful when writes are expensive.
```csharp
public static void SelectionSort(int[] arr) {
    int n = arr.Length;
    for (int i = 0; i < n - 1; i++) {
        int minIdx = i;
        // Find minimum in remaining array
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[minIdx]) {
                minIdx = j;
            }
        }
        // Swap minimum to position i
        int temp = arr[i];
        arr[i] = arr[minIdx];
        arr[minIdx] = temp;
    }
}
```
Even if the array is already sorted, selection sort must scan the remaining unsorted portion to confirm there's no smaller element. First scan: n-1 comparisons. Second: n-2. Total: (n-1) + (n-2) + ... + 1 = n(n-1)/2 = Θ(n²).
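To see this concretely, here is a throwaway sketch (the comparison counter is added purely for illustration) confirming the count is exactly n(n-1)/2 whether the input is sorted or reversed:

```csharp
using System;

class SelectionSortCount
{
    // Selection sort instrumented with a comparison counter
    public static long SortAndCount(int[] arr)
    {
        long comparisons = 0;
        for (int i = 0; i < arr.Length - 1; i++)
        {
            int minIdx = i;
            for (int j = i + 1; j < arr.Length; j++)
            {
                comparisons++;
                if (arr[j] < arr[minIdx]) minIdx = j;
            }
            (arr[i], arr[minIdx]) = (arr[minIdx], arr[i]);
        }
        return comparisons;
    }

    static void Main()
    {
        int n = 100;
        var sorted = new int[n];
        var reversed = new int[n];
        for (int i = 0; i < n; i++) { sorted[i] = i; reversed[i] = n - i; }

        // Both inputs cost exactly n(n-1)/2 = 4950 comparisons for n = 100
        Console.WriteLine(SortAndCount(sorted));   // 4950
        Console.WriteLine(SortAndCount(reversed)); // 4950
    }
}
```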
Build sorted array one element at a time. Excellent for nearly sorted data, small arrays, and as the base case in hybrid sorts like Timsort. Best case O(n) when array is already sorted.
Maintain a sorted region on the left. For each new element, insert it into correct position within the sorted region by shifting larger elements right.
```csharp
public static void InsertionSort(int[] arr) {
    for (int i = 1; i < arr.Length; i++) {
        int key = arr[i];
        int j = i - 1;
        // Shift larger elements right
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j--;
        }
        // Insert key at correct position
        arr[j + 1] = key;
    }
}
```
Use binary search to find insertion position—reduces comparisons from O(n²) to O(n log n). However, shifts still cost O(n), so overall remains O(n²). Useful when comparisons are expensive but shifts are cheap.
```csharp
public static void BinaryInsertionSort(int[] arr) {
    for (int i = 1; i < arr.Length; i++) {
        int key = arr[i];
        // Binary search for insertion position
        int pos = BinarySearch(arr, 0, i - 1, key);
        // Shift and insert
        for (int j = i - 1; j >= pos; j--) {
            arr[j + 1] = arr[j];
        }
        arr[pos] = key;
    }
}

private static int BinarySearch(int[] arr, int left, int right, int key) {
    while (left <= right) {
        int mid = left + (right - left) / 2;
        // <= keeps the sort stable: equal keys are inserted after existing ones
        if (arr[mid] <= key) left = mid + 1;
        else right = mid - 1;
    }
    return left;
}
```
Python's built-in sort (Timsort) and Java's Arrays.sort() use insertion sort as the base case for small subarrays (typically 32-64 elements). Why? Insertion sort has low overhead, excellent cache behavior on small arrays, and O(n) best case for nearly sorted data—common within larger sorted runs.
Divide-and-conquer algorithm with guaranteed O(n log n) time in all cases. Stable sort that requires O(n) extra space. Excellent for linked lists and external sorting (sorting data larger than memory).
```csharp
public static void MergeSort(int[] arr, int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        MergeSort(arr, left, mid);
        MergeSort(arr, mid + 1, right);
        Merge(arr, left, mid, right);
    }
}

private static void Merge(int[] arr, int left, int mid, int right) {
    // Copy to temp arrays
    int[] leftArr = new int[mid - left + 1];
    int[] rightArr = new int[right - mid];
    for (int i = 0; i < leftArr.Length; i++) leftArr[i] = arr[left + i];
    for (int i = 0; i < rightArr.Length; i++) rightArr[i] = arr[mid + 1 + i];

    // Merge back (<= on ties keeps the sort stable)
    int l = 0, r = 0, k = left;
    while (l < leftArr.Length && r < rightArr.Length) {
        if (leftArr[l] <= rightArr[r]) {
            arr[k++] = leftArr[l++];
        } else {
            arr[k++] = rightArr[r++];
        }
    }

    // Copy remaining elements
    while (l < leftArr.Length) arr[k++] = leftArr[l++];
    while (r < rightArr.Length) arr[k++] = rightArr[r++];
}
```
Build sorted array bottom-up instead of recursively. Start with size 1, merge to size 2, then 4, 8, etc. Avoids recursion overhead and is more cache-friendly.
```csharp
public static void BottomUpMergeSort(int[] arr) {
    int n = arr.Length;
    // Merge subarrays of size 1, then 2, 4, 8...
    for (int currSize = 1; currSize < n; currSize *= 2) {
        for (int leftStart = 0; leftStart < n - 1; leftStart += currSize * 2) {
            int mid = Math.Min(leftStart + currSize - 1, n - 1);
            int rightEnd = Math.Min(leftStart + currSize * 2 - 1, n - 1);
            Merge(arr, leftStart, mid, rightEnd);
        }
    }
}
```
Merge sort is ideal for linked lists (unlike quicksort which needs random access). It maintains O(n log n) time and O(log n) space (recursion stack) without needing contiguous memory.
```csharp
public class ListNode {
    public int val;
    public ListNode next;
}

public ListNode MergeSortList(ListNode head) {
    if (head == null || head.next == null) return head;
    // Find middle using slow/fast pointers
    ListNode slow = head, fast = head.next;
    while (fast != null && fast.next != null) {
        slow = slow.next;
        fast = fast.next.next;
    }
    // Cut after slow: left = head..slow, right = slow.next..end
    // (cutting at slow.next also handles the two-node case correctly)
    ListNode mid = slow.next;
    slow.next = null;
    ListNode left = MergeSortList(head);
    ListNode right = MergeSortList(mid);
    return MergeLists(left, right);
}

private ListNode MergeLists(ListNode l1, ListNode l2) {
    ListNode dummy = new ListNode { val = 0 };
    ListNode curr = dummy;
    while (l1 != null && l2 != null) {
        // <= on ties preserves stability
        if (l1.val <= l2.val) { curr.next = l1; l1 = l1.next; }
        else { curr.next = l2; l2 = l2.next; }
        curr = curr.next;
    }
    curr.next = l1 ?? l2;
    return dummy.next;
}
```
An inversion is a pair (i, j) where i < j but arr[i] > arr[j]. Count inversions by augmenting merge sort: when merging, if an element from the right subarray is chosen before one from the left, that's an inversion.
```csharp
public static long CountInversions(int[] arr) {
    return MergeSortCount(arr, 0, arr.Length - 1);
}

private static long MergeSortCount(int[] arr, int left, int right) {
    long invCount = 0;
    if (left < right) {
        int mid = left + (right - left) / 2;
        invCount += MergeSortCount(arr, left, mid);
        invCount += MergeSortCount(arr, mid + 1, right);
        invCount += MergeCount(arr, left, mid, right);
    }
    return invCount;
}

private static long MergeCount(int[] arr, int left, int mid, int right) {
    int[] leftArr = new int[mid - left + 1];
    int[] rightArr = new int[right - mid];
    long invCount = 0;
    for (int i = 0; i < leftArr.Length; i++) leftArr[i] = arr[left + i];
    for (int i = 0; i < rightArr.Length; i++) rightArr[i] = arr[mid + 1 + i];

    int l = 0, r = 0, k = left;
    while (l < leftArr.Length && r < rightArr.Length) {
        if (leftArr[l] <= rightArr[r]) {
            arr[k++] = leftArr[l++];
        } else {
            arr[k++] = rightArr[r++];
            // Every remaining left element is greater: each forms an inversion
            invCount += (leftArr.Length - l);
        }
    }
    while (l < leftArr.Length) arr[k++] = leftArr[l++];
    while (r < rightArr.Length) arr[k++] = rightArr[r++];
    return invCount;
}
```
Typically the fastest comparison sort in practice: O(n log n) average case with cache-friendly, mostly sequential memory access. Based on partitioning around a pivot. Worst case O(n²), but rare in practice with good pivot selection. Preferred in production (it underlies most languages' built-in sorts).
Lomuto partition scheme: pick the last element as pivot. Partition the array so all elements less than the pivot are on the left and the rest on the right. Return the pivot's final position. Time O(n).
```csharp
public static void QuickSort(int[] arr) {
    QuickSortHelper(arr, 0, arr.Length - 1);
}

private static void QuickSortHelper(int[] arr, int low, int high) {
    if (low < high) {
        int pi = Partition(arr, low, high);
        QuickSortHelper(arr, low, pi - 1);
        QuickSortHelper(arr, pi + 1, high);
    }
}

private static int Partition(int[] arr, int low, int high) {
    int pivot = arr[high];
    int i = low - 1;
    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {
            i++;
            // Swap arr[i] and arr[j]
            int temp = arr[i];
            arr[i] = arr[j];
            arr[j] = temp;
        }
    }
    // Place pivot at its correct position
    int pivotTemp = arr[i + 1];
    arr[i + 1] = arr[high];
    arr[high] = pivotTemp;
    return i + 1;
}
```
More efficient than Lomuto. Two pointers start at opposite ends and move toward the center, swapping when the left pointer finds an element ≥ pivot and the right finds one ≤ pivot. It performs roughly three times fewer swaps than Lomuto on average.
```csharp
private static int HoarePartition(int[] arr, int low, int high) {
    int pivot = arr[low];
    int i = low - 1;
    int j = high + 1;
    while (true) {
        // Move right until an element >= pivot
        do { i++; } while (arr[i] < pivot);
        // Move left until an element <= pivot
        do { j--; } while (arr[j] > pivot);
        // Swap if pointers haven't crossed
        if (i < j) {
            int temp = arr[i];
            arr[i] = arr[j];
            arr[j] = temp;
        } else {
            return j;
        }
    }
}
```
Worst case O(n²) occurs when the pivot is always the smallest or largest element — for example, an already sorted array with a first- or last-element pivot. Solution: pick a random element as the pivot. This gives expected O(n log n) on every input, since no adversarial input can consistently produce bad pivots.
```csharp
private static Random rand = new Random();

private static int RandomizedPartition(int[] arr, int low, int high) {
    // Pick random index and swap with high
    int randomIdx = low + rand.Next(high - low + 1);
    int temp = arr[randomIdx];
    arr[randomIdx] = arr[high];
    arr[high] = temp;
    // Continue with standard Lomuto partition
    return Partition(arr, low, high);
}
```
If array has many duplicates, standard partitioning wastes time. 3-way partition (Dutch National Flag) divides array into < pivot, = pivot, > pivot. Elements equal to pivot are already in final position—no recursion needed.
```csharp
private static void QuickSortThreeWay(int[] arr, int low, int high) {
    if (low >= high) return;
    int pivot = arr[high];
    int lt = low;   // arr[low..lt-1]  < pivot
    int gt = high;  // arr[gt+1..high] > pivot
    int i = low;    // arr[lt..i-1]   == pivot; arr[i..gt] unexamined
    while (i <= gt) {
        if (arr[i] < pivot) {
            int temp = arr[lt]; arr[lt] = arr[i]; arr[i] = temp;
            lt++; i++;
        } else if (arr[i] > pivot) {
            int temp = arr[i]; arr[i] = arr[gt]; arr[gt] = temp;
            gt--; // don't advance i: the swapped-in element is unexamined
        } else {
            i++;
        }
    }
    // arr[lt..gt] now equals the pivot and is in final position
    QuickSortThreeWay(arr, low, lt - 1);
    QuickSortThreeWay(arr, gt + 1, high);
}
```
Quick select uses the partitioning logic from quicksort but without recursing on both sides. After partitioning, if pivot's position equals k, we found it. Otherwise, recurse only on relevant side. Average O(n).
```csharp
public static int QuickSelect(int[] arr, int k) {
    // Find kth smallest (k is 1-indexed; convert to a 0-indexed target)
    return QuickSelectHelper(arr, 0, arr.Length - 1, k - 1);
}

private static int QuickSelectHelper(int[] arr, int low, int high, int k) {
    if (low == high) return arr[low];
    int pi = Partition(arr, low, high);
    if (k == pi) {
        return arr[k];
    } else if (k < pi) {
        return QuickSelectHelper(arr, low, pi - 1, k);
    } else {
        return QuickSelectHelper(arr, pi + 1, high, k);
    }
}
```
Guaranteed O(n log n) in all cases, in-place, unstable. Build max-heap from array, repeatedly extract max element and place at end. Not cache-friendly despite asymptotic optimality—quicksort usually faster in practice.
Transform unsorted array into valid max-heap (parent ≥ children). Start from last non-leaf node, sift down each element to maintain heap property.
```csharp
public static void HeapSort(int[] arr) {
    int n = arr.Length;
    // Build max heap in O(n)
    for (int i = n / 2 - 1; i >= 0; i--) {
        Heapify(arr, n, i);
    }
    // Extract elements from heap one by one
    for (int i = n - 1; i > 0; i--) {
        // Swap root (max) with last element
        int temp = arr[0];
        arr[0] = arr[i];
        arr[i] = temp;
        // Heapify reduced heap
        Heapify(arr, i, 0);
    }
}

private static void Heapify(int[] arr, int n, int i) {
    int largest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;
    // Find largest among node, left child, right child
    if (left < n && arr[left] > arr[largest]) largest = left;
    if (right < n && arr[right] > arr[largest]) largest = right;
    // If largest not root, swap and recurse
    if (largest != i) {
        int temp = arr[i];
        arr[i] = arr[largest];
        arr[largest] = temp;
        Heapify(arr, n, largest);
    }
}
```
Heap sort accesses memory in scattered patterns (jumping between parent and child indices). Unlike merge sort or quicksort which process sequential memory, heap sort misses the cache frequently, leading to poor real-world performance despite O(n log n) guarantee.
Non-comparison sorts that sidestep the Ω(n log n) lower bound, which applies only to comparison-based sorting. Counting sort works for small integer ranges. Radix sort processes integers digit by digit, using counting sort as a stable subroutine. Counting sort runs in O(n+k) where k is the value range; radix sort in O(d·n) for d-digit keys.
Assumes integers in range [0, k]. Count the frequency of each value, then reconstruct the sorted array. Stable if we convert counts to cumulative counts and iterate the input backward. Space O(n+k).
```csharp
public static int[] CountingSort(int[] arr, int maxVal) {
    int[] count = new int[maxVal + 1];
    // Count frequencies
    foreach (int num in arr) {
        count[num]++;
    }
    // Reconstruct sorted array
    int[] result = new int[arr.Length];
    int idx = 0;
    for (int i = 0; i <= maxVal; i++) {
        for (int j = 0; j < count[i]; j++) {
            result[idx++] = i;
        }
    }
    return result;
}

// Stable version using cumulative counts
public static int[] CountingSortStable(int[] arr, int maxVal) {
    int[] count = new int[maxVal + 1];
    int[] result = new int[arr.Length];
    // Count frequencies
    foreach (int num in arr) {
        count[num]++;
    }
    // Convert to cumulative counts
    for (int i = 1; i <= maxVal; i++) {
        count[i] += count[i - 1];
    }
    // Iterate backwards to maintain stability
    for (int i = arr.Length - 1; i >= 0; i--) {
        result[count[arr[i]] - 1] = arr[i];
        count[arr[i]]--;
    }
    return result;
}
```
Sort by rightmost digit first using counting sort, then second-rightmost, etc. Each digit pass is stable, so earlier digits' ordering is preserved. Time O(d·n) where d is number of digits.
```csharp
public static void RadixSortLSD(int[] arr) {
    int maxNum = arr.Max(); // requires using System.Linq
    // Sort by each digit position, least significant first
    for (int exp = 1; maxNum / exp > 0; exp *= 10) {
        CountingSortByDigit(arr, exp);
    }
}

private static void CountingSortByDigit(int[] arr, int exp) {
    int[] count = new int[10]; // 10 possible digit values in base 10
    int[] result = new int[arr.Length];
    // Count by current digit
    foreach (int num in arr) {
        count[(num / exp) % 10]++;
    }
    // Cumulative count
    for (int i = 1; i < 10; i++) {
        count[i] += count[i - 1];
    }
    // Build result (iterate backwards for stability)
    for (int i = arr.Length - 1; i >= 0; i--) {
        int digit = (arr[i] / exp) % 10;
        result[count[digit] - 1] = arr[i];
        count[digit]--;
    }
    // Copy result back
    for (int i = 0; i < arr.Length; i++) {
        arr[i] = result[i];
    }
}
```
Distribute elements into buckets based on value ranges, sort each bucket independently, concatenate. Excellent for uniformly distributed data. O(n+k) average, O(n²) worst case.
```csharp
public static void BucketSort(double[] arr) {
    int n = arr.Length;
    if (n == 0) return; // guard: the min/max scan below assumes a non-empty array
    int bucketCount = Math.Max(1, n / 10); // heuristic
    var buckets = new List<double>[bucketCount];
    for (int i = 0; i < bucketCount; i++) {
        buckets[i] = new List<double>();
    }

    // Find min and max
    double min = arr[0], max = arr[0];
    foreach (double num in arr) {
        if (num < min) min = num;
        if (num > max) max = num;
    }
    double range = (max - min) / bucketCount;

    // Distribute elements
    foreach (double num in arr) {
        int bucketIdx = (num == max) ? bucketCount - 1 : (int)((num - min) / range);
        buckets[bucketIdx].Add(num);
    }

    // Sort individual buckets and concatenate
    int idx = 0;
    foreach (var bucket in buckets) {
        bucket.Sort(); // or InsertionSort for small buckets
        foreach (double num in bucket) {
            arr[idx++] = num;
        }
    }
}
```
C# uses introsort: quicksort with a heapsort fallback and insertion sort for small subarrays. Extremely optimized. Use Array.Sort() or List&lt;T&gt;.Sort(), supplying a custom comparer for non-default ordering.
Starts with quicksort. If recursion depth exceeds 2·log(n), switches to heapsort to avoid O(n²). For subarrays under about 16 elements, uses insertion sort. Combines the strengths of all three.
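The control flow can be sketched as follows. This is a simplified illustration, not the actual BCL implementation — the thresholds and partitioning details here are assumptions chosen for clarity:

```csharp
using System;

class IntroSortSketch
{
    public static void Sort(int[] a)
    {
        // Depth budget: 2 * floor(log2 n); exhausting it triggers heapsort
        int depthLimit = 2 * (int)Math.Log(Math.Max(a.Length, 1), 2);
        SortRange(a, 0, a.Length - 1, depthLimit);
    }

    static void SortRange(int[] a, int lo, int hi, int depth)
    {
        while (hi - lo + 1 > 16)              // illustrative small-array cutoff
        {
            if (depth == 0) { HeapSortRange(a, lo, hi); return; } // fallback
            depth--;
            int p = Partition(a, lo, hi);
            SortRange(a, p + 1, hi, depth);   // recurse on one side...
            hi = p - 1;                       // ...and loop on the other
        }
        InsertionSortRange(a, lo, hi);        // finish small subarrays
    }

    static int Partition(int[] a, int lo, int hi)
    {
        int pivot = a[hi], i = lo - 1;
        for (int j = lo; j < hi; j++)
            if (a[j] < pivot) { i++; (a[i], a[j]) = (a[j], a[i]); }
        (a[i + 1], a[hi]) = (a[hi], a[i + 1]);
        return i + 1;
    }

    static void InsertionSortRange(int[] a, int lo, int hi)
    {
        for (int i = lo + 1; i <= hi; i++)
        {
            int key = a[i], j = i - 1;
            while (j >= lo && a[j] > key) { a[j + 1] = a[j]; j--; }
            a[j + 1] = key;
        }
    }

    static void HeapSortRange(int[] a, int lo, int hi)
    {
        int n = hi - lo + 1;
        for (int i = n / 2 - 1; i >= 0; i--) Heapify(a, lo, n, i);
        for (int i = n - 1; i > 0; i--)
        {
            (a[lo], a[lo + i]) = (a[lo + i], a[lo]);
            Heapify(a, lo, i, 0);
        }
    }

    static void Heapify(int[] a, int lo, int n, int i)
    {
        while (true)
        {
            int largest = i, l = 2 * i + 1, r = 2 * i + 2;
            if (l < n && a[lo + l] > a[lo + largest]) largest = l;
            if (r < n && a[lo + r] > a[lo + largest]) largest = r;
            if (largest == i) return;
            (a[lo + i], a[lo + largest]) = (a[lo + largest], a[lo + i]);
            i = largest;
        }
    }
}
```

Note the tail-call-style loop in `SortRange`: recursing only on one partition and looping on the other bounds the stack depth, a standard quicksort refinement.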
```csharp
// In-place sort of array
int[] arr = { 3, 1, 4, 1, 5 };
Array.Sort(arr);

// In-place sort of list
var list = new List<int> { 3, 1, 4, 1, 5 };
list.Sort();

// LINQ OrderBy (creates new collection, not in-place)
var sorted = arr.OrderBy(x => x).ToList();
```
Implement the IComparer&lt;T&gt; interface for custom sorting logic. Compare returns negative if first &lt; second, zero if equal, positive if first &gt; second.
```csharp
public class Person {
    public string Name { get; set; }
    public int Age { get; set; }
}

// Sort by age ascending
public class AgeComparer : IComparer<Person> {
    public int Compare(Person a, Person b) {
        return a.Age.CompareTo(b.Age);
    }
}

var people = new List<Person> {
    new Person { Name = "Alice", Age = 30 },
    new Person { Name = "Bob", Age = 25 }
};
people.Sort(new AgeComparer());
```
```csharp
// Sort by age descending using lambda
people.Sort((a, b) => b.Age.CompareTo(a.Age));

// Sort by name (case-insensitive)
people.Sort((a, b) => string.Compare(a.Name, b.Name, StringComparison.OrdinalIgnoreCase));

// Sort by age, then by name
people.Sort((a, b) => {
    int ageCompare = a.Age.CompareTo(b.Age);
    return ageCompare != 0 ? ageCompare : a.Name.CompareTo(b.Name);
});
```
```csharp
// Multi-key sort: by age ascending, then by name ascending
var sorted = people
    .OrderBy(p => p.Age)
    .ThenBy(p => p.Name)
    .ToList();

// Descending age, ascending name
sorted = people
    .OrderByDescending(p => p.Age)
    .ThenBy(p => p.Name)
    .ToList();
```
Stability determines whether equal-key elements maintain relative order. Critical for multi-key sorting and data integrity in complex systems.
If array contains [("Alice", 25), ("Bob", 30), ("Charlie", 25)] and we sort by age, a stable sort preserves Alice before Charlie (both age 25). An unstable sort might swap them.
Scenario: sort employee data by age, breaking ties by seniority (hire date). If you sort by hire date and then apply an unstable sort by age, the hire-date ordering among equal ages is lost. Solution: sort by hire date first, then sort by age with a stable sort, which preserves the hire-date order within each age group.
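One C#-specific wrinkle: Array.Sort and List&lt;T&gt;.Sort are not stable (they use introsort), while LINQ's OrderBy/ThenBy is documented as stable. A sketch of the scenario using a hypothetical Employee record (names and dates are illustrative):

```csharp
using System;
using System.Linq;

class StableSortDemo
{
    // Hypothetical record type, for illustration only
    public record Employee(string Name, int Age, DateTime HireDate);

    // Primary key: age; tie-break: seniority (earlier hire date first).
    // Trick: sort by the secondary key first, then stable-sort by the primary.
    public static Employee[] ByAgeThenSeniority(Employee[] staff) =>
        staff.OrderBy(e => e.HireDate)  // secondary key first
             .OrderBy(e => e.Age)       // stable: ties keep hire-date order
             .ToArray();

    static void Main()
    {
        var staff = new[]
        {
            new Employee("Dana", 35, new DateTime(2015, 1, 1)),
            new Employee("Eli",  35, new DateTime(2012, 6, 1)),
            new Employee("Fay",  28, new DateTime(2018, 3, 1)),
        };
        foreach (var e in ByAgeThenSeniority(staff))
            Console.WriteLine(e.Name); // Fay, Eli, Dana
    }
}
```

In everyday LINQ you would write the same thing as `OrderBy(e => e.Age).ThenBy(e => e.HireDate)`; the two-pass version above is shown to make the stability mechanism explicit.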
See the master table in Section 9.1 for detailed complexity and property comparison across all algorithms.
Sort array containing 0s, 1s, 2s in-place with one pass. Use 3-way partition (Dutch National Flag).
```csharp
public void SortColors(int[] nums) {
    int low = 0, mid = 0, high = nums.Length - 1;
    while (mid <= high) {
        if (nums[mid] == 0) {
            Swap(nums, low++, mid++);
        } else if (nums[mid] == 2) {
            Swap(nums, mid, high--);
        } else {
            mid++;
        }
    }
}

private void Swap(int[] arr, int i, int j) {
    int temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp;
}
```
Sort intervals by start time, merge overlapping intervals.
```csharp
public int[][] Merge(int[][] intervals) {
    if (intervals.Length == 0) return new int[0][];
    Array.Sort(intervals, (a, b) => a[0].CompareTo(b[0]));

    var result = new List<int[]>();
    var current = new int[] { intervals[0][0], intervals[0][1] };
    foreach (var interval in intervals) {
        if (interval[0] <= current[1]) {
            // Overlaps: extend the current interval
            current[1] = Math.Max(current[1], interval[1]);
        } else {
            result.Add(current);
            current = new int[] { interval[0], interval[1] };
        }
    }
    result.Add(current);
    return result.ToArray();
}
```
Arrange numbers to form largest possible number. Custom comparator: compare concatenations.
```csharp
public string LargestNumber(int[] nums) {
    var strs = Array.ConvertAll(nums, x => x.ToString());
    Array.Sort(strs, (a, b) => {
        string ab = a + b;
        string ba = b + a;
        return ba.CompareTo(ab); // descending
    });
    if (strs[0][0] == '0') return "0"; // all zeros
    return string.Concat(strs);
}
```
Find kth largest element. Use quickselect O(n) average, or heap O(n log k).
```csharp
// Method 1: Sort (simplest)
public int FindKthLargest(int[] nums, int k) {
    Array.Sort(nums);
    return nums[nums.Length - k];
}

// Method 2: Min heap of size k
public int FindKthLargestHeap(int[] nums, int k) {
    var heap = new PriorityQueue<int, int>();
    foreach (int num in nums) {
        heap.Enqueue(num, num);
        if (heap.Count > k) heap.Dequeue();
    }
    return heap.Peek();
}
```
Check if two strings are anagrams. Sort both and compare.
```csharp
public bool IsAnagram(string s, string t) {
    char[] s1 = s.ToCharArray();
    char[] t1 = t.ToCharArray();
    Array.Sort(s1);
    Array.Sort(t1);
    return new string(s1) == new string(t1);
}
```
Merge sort for linked list (O(n log n) time, O(log n) space).
```csharp
public ListNode SortList(ListNode head) {
    if (head?.next == null) return head;
    // Find middle with slow/fast pointers, then cut after slow
    // (cutting at slow.next also handles the two-node case correctly)
    ListNode slow = head, fast = head.next;
    while (fast?.next != null) {
        slow = slow.next;
        fast = fast.next.next;
    }
    ListNode mid = slow.next;
    slow.next = null;
    ListNode left = SortList(head);
    ListNode right = SortList(mid);
    return Merge(left, right);
}
```
Interviewer asks: "Why does this sort fail for this data structure?" Likely a stability issue: elements with equal keys lost their relative order, breaking the secondary sort.
```csharp
// WRONG: subtraction can overflow
return a - b; // e.g. a = int.MinValue, b = 1: wraps to a positive value, reversing the result

// CORRECT: use CompareTo, which cannot overflow
return a.CompareTo(b);
```
Why merge sort is O(n log n): repeatedly halving the array gives log n levels of recursion, and each level does O(n) total merging work. Total: n · log n.
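Written as a recurrence and unrolled level by level:

```
T(n) = 2T(n/2) + Θ(n)
     = 4T(n/4) + Θ(n) + Θ(n)
     = ...
     = n·T(1) + Θ(n)·log₂ n  =  Θ(n log n)
```

The subproblem sizes on any recursion level always sum to n, so every level costs Θ(n), and there are log₂ n levels.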
Why quicksort average O(n log n): Each partition splits roughly in half. Expected depth log n. Each level O(n) work.
Why counting sort is O(n+k): Count pass O(n). Reconstruct pass O(n+k) (k buckets).