0/1 Knapsack Problem
A classic optimization problem where you must select items with given weights and values to maximize total value without exceeding the knapsack's capacity. Each item can be taken only once (0 or 1). This is a fundamental problem in combinatorial optimization, resource allocation, and decision-making scenarios.
• Time: O(n × W) where n = items, W = capacity
• Space: O(n × W) for DP table
• Classic dynamic programming problem
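As a small worked example (the item data is made up for illustration): with three items of weights 2, 3, 4 and values 3, 4, 5 and a knapsack of capacity 5, taking the first two items uses weight 5 and yields the best total value 7, while any combination involving the third item either exceeds the capacity or is worth less. Using the knapsack01 function from the Implementation section below:

const best = knapsack01([2, 3, 4], [3, 4, 5], 5);
console.log(best); // 7 (items with weights 2 and 3)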
Implementation
function knapsack01(weights: number[], values: number[], capacity: number): number {
  const n = weights.length;
  // dp[i][w] = maximum value achievable using the first i items with weight limit w
  const dp: number[][] = Array(n + 1).fill(0).map(() => Array(capacity + 1).fill(0));

  for (let i = 1; i <= n; i++) {
    for (let w = 0; w <= capacity; w++) {
      if (weights[i - 1] <= w) {
        // Either take item i (and solve the rest with the remaining capacity) or skip it
        dp[i][w] = Math.max(
          values[i - 1] + dp[i - 1][w - weights[i - 1]],
          dp[i - 1][w]
        );
      } else {
        dp[i][w] = dp[i - 1][w];
      }
    }
  }

  return dp[n][capacity];
}

// Space-optimized version: keeps only a single row of the DP table
function knapsack01Optimized(weights: number[], values: number[], capacity: number): number {
  const dp: number[] = Array(capacity + 1).fill(0);

  for (let i = 0; i < weights.length; i++) {
    // Iterate w downward so each item is counted at most once per capacity
    for (let w = capacity; w >= weights[i]; w--) {
      dp[w] = Math.max(dp[w], values[i] + dp[w - weights[i]]);
    }
  }

  return dp[capacity];
}
Deep Dive
Theoretical Foundation
The standard solution to the 0/1 Knapsack problem uses dynamic programming to build a table where dp[i][w] represents the maximum value achievable using the first i items with a weight limit of w. By storing and reusing previously computed subproblem solutions, it avoids the exponential cost of trying all 2^n subsets by brute force.
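The table is filled with the recurrence dp[i][w] = max(dp[i - 1][w], values[i - 1] + dp[i - 1][w - weights[i - 1]]) when item i fits (weights[i - 1] <= w), and dp[i][w] = dp[i - 1][w] otherwise. Once the table is built, the selected items can be recovered by walking backwards through it. The sketch below illustrates that reconstruction; the function name knapsack01WithItems and its return shape are illustrative additions, not part of the implementation above.

function knapsack01WithItems(weights: number[], values: number[], capacity: number): { best: number; items: number[] } {
  const n = weights.length;
  const dp: number[][] = Array(n + 1).fill(0).map(() => Array(capacity + 1).fill(0));

  for (let i = 1; i <= n; i++) {
    for (let w = 0; w <= capacity; w++) {
      dp[i][w] = dp[i - 1][w];
      if (weights[i - 1] <= w) {
        dp[i][w] = Math.max(dp[i][w], values[i - 1] + dp[i - 1][w - weights[i - 1]]);
      }
    }
  }

  // Walk backwards: if the value changed when item i was considered, item i was taken
  const items: number[] = [];
  let w = capacity;
  for (let i = n; i >= 1; i--) {
    if (dp[i][w] !== dp[i - 1][w]) {
      items.push(i - 1);      // record the 0-based item index
      w -= weights[i - 1];    // and spend its weight
    }
  }

  return { best: dp[n][capacity], items: items.reverse() };
}

Note that reconstruction needs the full table, so if only the optimal value is required, the space-optimized one-dimensional version above is the better choice.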
Complexity
• Time: O(nW) in the best, average, and worst case
• Space: O(nW) for the full DP table, or O(W) with the space-optimized version
Applications
Industry Use
• Resource allocation in project management
• Portfolio optimization in finance
• Cargo loading in transportation
• Memory management in computing
• Budget allocation with constraints
• Cutting stock problems in manufacturing
Related Algorithms
Longest Common Subsequence (LCS)
Finds the longest subsequence common to two sequences. A subsequence is a sequence that appears in the same relative order but not necessarily contiguously. This is fundamental in diff utilities, DNA sequence analysis, and version control systems like Git.
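For comparison with the knapsack table above, a minimal TypeScript sketch of the standard LCS dynamic program (returning the length only, not the subsequence itself) might look like this:

function longestCommonSubsequence(a: string, b: string): number {
  // dp[i][j] = LCS length of a[0..i) and b[0..j)
  const dp: number[][] = Array(a.length + 1).fill(0).map(() => Array(b.length + 1).fill(0));

  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = a[i - 1] === b[j - 1]
        ? dp[i - 1][j - 1] + 1                       // characters match: extend the LCS
        : Math.max(dp[i - 1][j], dp[i][j - 1]);      // otherwise drop one character
    }
  }

  return dp[a.length][b.length];
}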
Edit Distance (Levenshtein Distance)
Edit Distance, also known as Levenshtein Distance, computes the minimum number of single-character edits (insertions, deletions, or substitutions) required to transform one string into another. Named after Soviet mathematician Vladimir Levenshtein who introduced it in 1965, this fundamental algorithm has applications in spell checking, DNA sequence analysis, natural language processing, and plagiarism detection.
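A minimal sketch of the Levenshtein dynamic program, assuming unit cost for each insertion, deletion, and substitution:

function editDistance(a: string, b: string): number {
  // dp[i][j] = minimum edits to turn a[0..i) into b[0..j)
  const dp: number[][] = Array(a.length + 1).fill(0).map(() => Array(b.length + 1).fill(0));

  for (let i = 0; i <= a.length; i++) dp[i][0] = i;   // delete all of a
  for (let j = 0; j <= b.length; j++) dp[0][j] = j;   // insert all of b

  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = a[i - 1] === b[j - 1]
        ? dp[i - 1][j - 1]                                            // match: no edit needed
        : 1 + Math.min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1]); // delete, insert, or substitute
    }
  }

  return dp[a.length][b.length];
}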
Longest Increasing Subsequence (LIS)
Longest Increasing Subsequence (LIS) finds the length of the longest subsequence in an array whose elements are in strictly increasing order. Unlike a subarray, a subsequence doesn't need to be contiguous: elements can be picked from anywhere as long as their relative order is preserved. This classic DP problem has two standard solutions: O(n²) dynamic programming and O(n log n) binary search with patience sorting. Applications include stock price analysis, scheduling, the Box Stacking Problem, and the Building Bridges Problem.
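A minimal sketch of the O(n log n) approach, which keeps the smallest possible tail for each subsequence length and binary-searches for where the next element belongs:

function longestIncreasingSubsequence(nums: number[]): number {
  // tails[k] = smallest possible tail value of an increasing subsequence of length k + 1
  const tails: number[] = [];

  for (const x of nums) {
    // Binary search for the first tail >= x (strictly increasing subsequences)
    let lo = 0, hi = tails.length;
    while (lo < hi) {
      const mid = (lo + hi) >> 1;
      if (tails[mid] < x) lo = mid + 1;
      else hi = mid;
    }
    tails[lo] = x;   // either extend by one or improve an existing tail
  }

  return tails.length;
}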
Coin Change Problem
The Coin Change Problem finds the minimum number of coins needed to make a given amount using an unlimited supply of each coin denomination. It's a classic dynamic programming problem (a greedy strategy works only for canonical coin systems such as most real currencies), with two main variants: finding the minimum number of coins (optimization) and counting the number of ways to make change (counting). It has direct applications in currency systems, vending machines, and resource optimization.
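A minimal sketch of the minimization variant, which fills dp[a] = minimum coins needed for amount a and returns -1 when the amount cannot be made:

function coinChangeMin(coins: number[], amount: number): number {
  // dp[a] = minimum coins to make amount a (Infinity if unreachable)
  const dp: number[] = Array(amount + 1).fill(Infinity);
  dp[0] = 0;

  for (const coin of coins) {
    for (let a = coin; a <= amount; a++) {
      dp[a] = Math.min(dp[a], dp[a - coin] + 1);
    }
  }

  return dp[amount] === Infinity ? -1 : dp[amount];
}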