DSA Explorer

Candy Distribution

Greedy Methods
O(n) time, O(n) space
Intermediate

Distribute the minimum number of candies to children such that (1) each child gets at least one candy, and (2) a child with a higher rating than a neighbor gets more candies than that neighbor. An elegant two-pass greedy algorithm satisfies the left-neighbor and right-neighbor constraints independently, then combines the results. A classic problem demonstrating the multi-pass greedy strategy.

Prerequisites:
Arrays
Greedy thinking
Two-pass techniques

Implementation

Language: TypeScript

function candy(ratings: number[]): number {
  const n = ratings.length;
  // Every child starts with one candy.
  const candies = new Array(n).fill(1);

  // Left-to-right pass: a child rated higher than the left neighbor
  // gets one more candy than that neighbor.
  for (let i = 1; i < n; i++) {
    if (ratings[i] > ratings[i - 1]) {
      candies[i] = candies[i - 1] + 1;
    }
  }

  // Right-to-left pass: a child rated higher than the right neighbor
  // gets at least one more; max() preserves the first pass's result.
  for (let i = n - 2; i >= 0; i--) {
    if (ratings[i] > ratings[i + 1]) {
      candies[i] = Math.max(candies[i], candies[i + 1] + 1);
    }
  }

  return candies.reduce((a, b) => a + b, 0);
}

Deep Dive

Theoretical Foundation

Candy Distribution requires satisfying local constraints (a higher rating than a neighbor means more candy than that neighbor) while minimizing the total. The key insight is to handle the left and right neighbor constraints independently. The first pass (left to right) sets candies[i] = candies[i-1] + 1 whenever ratings[i] > ratings[i-1]. The second pass (right to left) ensures candies[i] >= candies[i+1] + 1 whenever ratings[i] > ratings[i+1], taking the max so the first pass's result is preserved. After both passes, every constraint is satisfied with the minimum total number of candies. Time: O(n) for the two passes. Space: O(n) for the candies array. This demonstrates a greedy approach to bidirectional constraints.
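The two passes can be traced concretely. The sketch below repeats the candy logic as a standalone function (candyTrace is a name introduced here for illustration) and notes the intermediate arrays for the sample input ratings = [1, 0, 2]:

```typescript
// Standalone trace of the two-pass greedy on ratings = [1, 0, 2].
function candyTrace(ratings: number[]): { candies: number[]; total: number } {
  const n = ratings.length;
  const candies = new Array(n).fill(1); // every child starts with 1

  // Left-to-right pass: satisfy the left-neighbor constraint.
  for (let i = 1; i < n; i++) {
    if (ratings[i] > ratings[i - 1]) candies[i] = candies[i - 1] + 1;
  }
  // For [1, 0, 2] this pass yields [1, 1, 2].

  // Right-to-left pass: satisfy the right-neighbor constraint without
  // breaking the left one (hence the max).
  for (let i = n - 2; i >= 0; i--) {
    if (ratings[i] > ratings[i + 1]) {
      candies[i] = Math.max(candies[i], candies[i + 1] + 1);
    }
  }
  // For [1, 0, 2] this pass yields [2, 1, 2].

  return { candies, total: candies.reduce((a, b) => a + b, 0) };
}

const result = candyTrace([1, 0, 2]);
console.log(result.candies, result.total); // [ 2, 1, 2 ] 5
```

The middle child, rated lowest, keeps a single candy; both higher-rated neighbors end with two, for a minimum total of 5.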

Complexity

Time
  Best: O(n)
  Average: O(n)
  Worst: O(n)

Space
  Required: O(n)

Applications

Industry Use

  1. Fair resource distribution with constraints
  2. Salary allocation based on performance
  3. Bonus distribution in organizations
  4. Reward systems in games
  5. Grade curve adjustment

Use Cases

  • Resource distribution
  • Fair allocation
  • Optimization problems

Related Algorithms

Huffman Coding

Huffman Coding is a lossless data compression algorithm that creates optimal prefix-free variable-length codes based on character frequencies. Developed by David A. Huffman in 1952 as a student at MIT, it uses a greedy approach to build a binary tree where frequent characters get shorter codes. This algorithm is fundamental in ZIP, JPEG, MP3, and many compression formats.

Greedy Methods
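The greedy merge at the heart of Huffman coding can be sketched as follows. This is a minimal illustration, not a production encoder: it re-sorts an array on each merge instead of using a real priority queue, and the frequency table is an assumed example.

```typescript
interface HuffNode { freq: number; char?: string; left?: HuffNode; right?: HuffNode; }

// Build Huffman codes greedily: repeatedly merge the two lowest-frequency
// nodes until one tree remains, then read codes off the root-to-leaf paths.
function huffmanCodes(freqs: Record<string, number>): Record<string, string> {
  let nodes: HuffNode[] = Object.entries(freqs).map(([char, freq]) => ({ char, freq }));
  while (nodes.length > 1) {
    nodes.sort((a, b) => a.freq - b.freq);   // a priority queue in real use
    const [left, right] = nodes.splice(0, 2); // two cheapest nodes
    nodes.push({ freq: left.freq + right.freq, left, right });
  }
  const codes: Record<string, string> = {};
  const walk = (node: HuffNode, prefix: string): void => {
    if (node.char !== undefined) { codes[node.char] = prefix || "0"; return; }
    if (node.left) walk(node.left, prefix + "0");
    if (node.right) walk(node.right, prefix + "1");
  };
  walk(nodes[0], "");
  return codes;
}

const codes = huffmanCodes({ a: 45, b: 13, c: 12, d: 16, e: 9, f: 5 });
// The most frequent symbol "a" gets the shortest code (1 bit).
console.log(codes["a"].length); // 1
```

Rare symbols such as "f" end deeper in the tree and receive longer codes, which is exactly what makes the expected code length minimal.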

Activity Selection Problem

Select the maximum number of non-overlapping activities from a set, where each activity has a start and end time. This classic greedy algorithm demonstrates the greedy choice property: always selecting the activity that finishes earliest leaves the most room for remaining activities. Used in scheduling problems, resource allocation, and interval management. Achieves optimal solution with O(n log n) time complexity.

Greedy Methods
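The earliest-finish-time rule can be sketched like this (a minimal standalone illustration; the Activity interface is an assumed representation):

```typescript
interface Activity { start: number; end: number; }

// Greedy activity selection: sort by finish time, then take each
// activity that starts no earlier than the last selected finish.
function selectActivities(activities: Activity[]): Activity[] {
  const sorted = [...activities].sort((a, b) => a.end - b.end); // O(n log n)
  const chosen: Activity[] = [];
  let lastEnd = -Infinity;
  for (const act of sorted) {
    if (act.start >= lastEnd) { // compatible with everything chosen so far
      chosen.push(act);
      lastEnd = act.end;
    }
  }
  return chosen;
}

// Example: of these six activities, four fit without overlap.
console.log(selectActivities([
  { start: 1, end: 2 }, { start: 3, end: 4 }, { start: 0, end: 6 },
  { start: 5, end: 7 }, { start: 8, end: 9 }, { start: 5, end: 9 },
]).length); // 4
```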

Fractional Knapsack Problem

Given items with values and weights, and a knapsack with capacity, select items (or fractions thereof) to maximize total value. Unlike the 0/1 knapsack where items must be taken whole, the fractional knapsack allows taking fractions of items. The greedy approach of taking items in order of value-to-weight ratio yields the optimal solution in O(n log n) time. This demonstrates when greedy algorithms work vs. when dynamic programming is needed.

Greedy Methods
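A minimal sketch of the ratio-based greedy, assuming a simple {value, weight} item representation:

```typescript
interface Item { value: number; weight: number; }

// Fractional knapsack: take items greedily by value-to-weight ratio,
// splitting the last item if it doesn't fit whole.
function fractionalKnapsack(items: Item[], capacity: number): number {
  const sorted = [...items].sort(
    (a, b) => b.value / b.weight - a.value / a.weight
  );
  let total = 0;
  let remaining = capacity;
  for (const item of sorted) {
    if (remaining <= 0) break;
    const take = Math.min(item.weight, remaining); // whole item or a fraction
    total += (item.value / item.weight) * take;
    remaining -= take;
  }
  return total;
}

// Example: capacity 50 with items (value 60, weight 10), (100, 20), (120, 30)
// yields 60 + 100 + 120 * (20/30) = 240.
console.log(fractionalKnapsack(
  [{ value: 60, weight: 10 }, { value: 100, weight: 20 }, { value: 120, weight: 30 }],
  50
)); // 240
```

The first two items are taken whole; only two thirds of the last item fits, and taking that fraction is exactly what the 0/1 variant forbids.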

Job Sequencing with Deadlines

Schedule jobs with deadlines and profits to maximize total profit. Each job takes 1 unit time and has a deadline and profit. The greedy strategy is to sort jobs by profit (descending) and greedily schedule each job as late as possible before its deadline. This maximizes profit while respecting constraints. Used in task scheduling, CPU job management, and project planning.

Greedy Methods
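The profit-first, latest-slot strategy can be sketched like this (a standalone illustration; the Job shape and the linear backwards slot search are simplifying assumptions):

```typescript
interface Job { id: string; deadline: number; profit: number; }

// Job sequencing: sort by profit descending; place each job in the
// latest free unit-time slot at or before its deadline.
function sequenceJobs(jobs: Job[]): { scheduled: string[]; profit: number } {
  const sorted = [...jobs].sort((a, b) => b.profit - a.profit);
  const maxDeadline = Math.max(...jobs.map((j) => j.deadline));
  const slots: (string | null)[] = new Array(maxDeadline).fill(null);
  let profit = 0;
  for (const job of sorted) {
    // Search backwards from the job's deadline for a free slot.
    for (let t = Math.min(job.deadline, maxDeadline) - 1; t >= 0; t--) {
      if (slots[t] === null) {
        slots[t] = job.id;
        profit += job.profit;
        break;
      }
    }
  }
  return { scheduled: slots.filter((s): s is string => s !== null), profit };
}

// Example: jobs (a, deadline 2, profit 100), (b, 1, 19), (c, 2, 27),
// (d, 1, 25), (e, 3, 15) — jobs d and b are dropped.
const res = sequenceJobs([
  { id: "a", deadline: 2, profit: 100 }, { id: "b", deadline: 1, profit: 19 },
  { id: "c", deadline: 2, profit: 27 }, { id: "d", deadline: 1, profit: 25 },
  { id: "e", deadline: 3, profit: 15 },
]);
console.log(res.scheduled, res.profit); // [ 'c', 'a', 'e' ] 142
```

Scheduling each job as late as its deadline allows keeps earlier slots open for jobs with tighter deadlines, which is why d and b (deadline 1) lose only after the more profitable a and c have claimed slots 1 and 2.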

© 2026 Momin Studio. All rights reserved.
