Merge Sort (Divide & Conquer Paradigm)
A classic divide-and-conquer sorting algorithm that recursively divides the array into halves until single elements remain, then merges the sorted subarrays. Invented by John von Neumann in 1945, it is one of the most efficient general-purpose sorting algorithms, with guaranteed O(n log n) performance. The algorithm perfectly demonstrates the divide-and-conquer paradigm: divide the problem, solve the subproblems recursively, combine the solutions.
Visualization
(Interactive step-by-step visualization. Color legend: 🔵 Blue: Merging | 🟢 Green: Sorted)
Implementation
function mergeSort(arr: number[]): number[] {
  if (arr.length <= 1) return arr;

  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  return merge(left, right);
}

function merge(left: number[], right: number[]): number[] {
  const result: number[] = [];
  let i = 0, j = 0;

  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i++]);
    } else {
      result.push(right[j++]);
    }
  }

  return result.concat(left.slice(i), right.slice(j));
}

Deep Dive
Theoretical Foundation
Merge sort divides the array into two halves, recursively sorts each half, then merges the two sorted halves. The divide step takes O(1) (computing the midpoint), there are O(log n) levels of recursion, and merging does O(n) total work at each level, giving O(n log n) time overall. The merge operation repeatedly compares the front elements of the two halves and copies the smaller one, maintaining sorted order; because it takes from the left half on ties, the sort is stable.
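The O(log n) levels are easiest to see in a bottom-up (iterative) formulation: one merging pass per level, with run width doubling each pass and each pass doing O(n) work. A sketch (the function name `mergeSortBottomUp` is mine):

```typescript
function mergeSortBottomUp(arr: number[]): number[] {
  let a = arr.slice();
  // One pass per level: widths 1, 2, 4, ... give O(log n) passes of O(n) work each.
  for (let width = 1; width < a.length; width *= 2) {
    const next: number[] = [];
    for (let lo = 0; lo < a.length; lo += 2 * width) {
      // Merge the sorted runs a[lo..lo+width) and a[lo+width..lo+2*width).
      let i = lo;
      let j = Math.min(lo + width, a.length);
      const iEnd = j;
      const jEnd = Math.min(lo + 2 * width, a.length);
      while (i < iEnd && j < jEnd) next.push(a[i] <= a[j] ? a[i++] : a[j++]);
      while (i < iEnd) next.push(a[i++]);
      while (j < jEnd) next.push(a[j++]);
    }
    a = next;
  }
  return a;
}
```

The auxiliary array makes the O(n) extra space explicit: each pass writes a fully merged copy before the next pass begins.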
Complexity
Time
  Best: O(n log n)
  Average: O(n log n)
  Worst: O(n log n)
Space
  O(n)
Applications
Industry Use
External sorting (sorting large files)
Inversion count problems
Sorting linked lists
Parallel sorting algorithms
TimSort (the standard sort in Python and Java), a hybrid of merge sort and insertion sort
Database query optimization
Version control diff algorithms
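The inversion-count application listed above falls out of the merge step almost for free: whenever an element is taken from the right half, every element still remaining in the left half forms an inversion with it. A hedged sketch (the function names are mine):

```typescript
function countInversions(arr: number[]): number {
  // Counts pairs (i, j) with i < j and arr[i] > arr[j].
  function sortCount(a: number[]): [number[], number] {
    if (a.length <= 1) return [a, 0];
    const mid = Math.floor(a.length / 2);
    const [left, leftCount] = sortCount(a.slice(0, mid));
    const [right, rightCount] = sortCount(a.slice(mid));
    const merged: number[] = [];
    let i = 0, j = 0, cross = 0;
    while (i < left.length && j < right.length) {
      if (left[i] <= right[j]) {
        merged.push(left[i++]);
      } else {
        // right[j] precedes all remaining left elements, so each
        // of them forms one inversion with it.
        cross += left.length - i;
        merged.push(right[j++]);
      }
    }
    return [merged.concat(left.slice(i), right.slice(j)), leftCount + rightCount + cross];
  }
  return sortCount(arr)[1];
}
```

Because the counting rides along with the normal merge, the whole computation stays O(n log n), versus O(n²) for checking every pair.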
Related Algorithms
Binary Search (Divide & Conquer)
Efficient search algorithm for sorted arrays using divide-and-conquer. At each step, eliminate half of remaining elements by comparing target with middle element. One of the most fundamental algorithms in computer science with O(log n) time complexity.
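A minimal iterative sketch of the halving step described above, assuming a sorted array of numbers (the function name is mine):

```typescript
function binarySearch(arr: number[], target: number): number {
  let lo = 0;
  let hi = arr.length - 1;
  while (lo <= hi) {
    // Midpoint written to avoid overflow in languages with fixed-width ints.
    const mid = lo + Math.floor((hi - lo) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                   // discard the right half
  }
  return -1; // not found
}
```

Each iteration halves the search range, so at most ⌈log₂ n⌉ + 1 comparisons are made.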
Closest Pair of Points
Find two points with minimum Euclidean distance among n points in 2D plane using divide-and-conquer. Naive O(n²) approach checks all pairs; divide-and-conquer achieves O(n log n) by recursively finding closest pairs in left/right halves and checking split pairs efficiently.
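The split-pair check described above can be sketched as follows. This is a simplified variant that re-sorts the strip by y at every level, giving O(n log² n) rather than the fully optimized O(n log n); the names (`Point`, `closestPair`) are mine:

```typescript
type Point = { x: number; y: number };

// Returns the minimum Euclidean distance among the points (requires n >= 2).
function closestPair(points: Point[]): number {
  const dist = (p: Point, q: Point) => Math.hypot(p.x - q.x, p.y - q.y);
  const byX = points.slice().sort((a, b) => a.x - b.x);

  function solve(pts: Point[]): number {
    if (pts.length <= 3) {
      // Brute-force the small base cases.
      let best = Infinity;
      for (let i = 0; i < pts.length; i++)
        for (let j = i + 1; j < pts.length; j++)
          best = Math.min(best, dist(pts[i], pts[j]));
      return best;
    }
    const mid = Math.floor(pts.length / 2);
    const midX = pts[mid].x;
    const d = Math.min(solve(pts.slice(0, mid)), solve(pts.slice(mid)));
    // Split pairs: only points within d of the dividing line can beat d.
    const strip = pts.filter(p => Math.abs(p.x - midX) < d).sort((a, b) => a.y - b.y);
    let best = d;
    for (let i = 0; i < strip.length; i++)
      // Geometric packing argument bounds this inner loop to a constant
      // number of iterations once strip[j].y - strip[i].y >= best.
      for (let j = i + 1; j < strip.length && strip[j].y - strip[i].y < best; j++)
        best = Math.min(best, dist(strip[i], strip[j]));
    return best;
  }
  return solve(byX);
}
```

The full O(n log n) version avoids the per-level sort by carrying a y-sorted list through the recursion, merge-sort style.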
Karatsuba Multiplication
Fast multiplication algorithm for large integers using divide-and-conquer. Discovered by Anatoly Karatsuba in 1960, it was the first algorithm to demonstrate that multiplication can be done faster than O(n²). It uses three recursive multiplications instead of four, achieving O(n^log₂3) ≈ O(n^1.585) complexity.
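The three-multiplication trick can be sketched with JavaScript's built-in BigInt, splitting each operand at half its bit length (the base-case threshold of 16 is arbitrary; the function name is mine):

```typescript
// Karatsuba multiplication for nonnegative BigInts.
function karatsuba(x: bigint, y: bigint): bigint {
  if (x < 16n || y < 16n) return x * y; // small base case: multiply directly
  // Split both operands at half the bit length of the larger one:
  // x = high1 * 2^half + low1, y = high2 * 2^half + low2.
  const bits = BigInt(Math.max(x.toString(2).length, y.toString(2).length));
  const half = bits / 2n;
  const mask = (1n << half) - 1n;
  const high1 = x >> half, low1 = x & mask;
  const high2 = y >> half, low2 = y & mask;
  // Three recursive products instead of four:
  const z0 = karatsuba(low1, low2);
  const z2 = karatsuba(high1, high2);
  const z1 = karatsuba(low1 + high1, low2 + high2) - z0 - z2; // cross terms
  return (z2 << (2n * half)) + (z1 << half) + z0;
}
```

The key identity is that the cross term high1·low2 + high2·low1 equals (high1 + low1)(high2 + low2) − z2 − z0, so one multiplication recovers what would naively take two.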