Common Coding Patterns Flashcards

(134 cards)

1
What is the two pointers pattern?
Using two pointers that move through the data structure (often from opposite ends, or from the same end at different speeds) to solve problems in linear time. Common with sorted arrays and linked lists.
2
Give some example problems that use the two pointers pattern.
1) Two Sum II (sorted array), 2) Remove duplicates from sorted array, 3) Container with most water, 4) Valid palindrome, 5) Three sum
3
What is the sliding window pattern?
Maintaining a ‘window’ that slides over the data, expanding or contracting based on conditions. Used for substring/subarray problems with contiguous elements.
4
What are the two types of sliding window problems?
Fixed-size window (e.g., max sum of k consecutive elements) and variable-size window (e.g., smallest subarray with sum >= target).
5
Give some example problems that use the sliding window pattern.
1) Maximum sum subarray of size k, 2) Longest substring without repeating characters, 3) Minimum window substring, 4) Longest substring with k distinct characters
6
What is the fast and slow pointers pattern?
Using two pointers moving at different speeds through a linked list or array. The fast pointer moves faster (usually 2x), creating a predictable relationship between their positions.
7
Give some example problems that use fast and slow pointers.
1) Linked list cycle detection, 2) Find the middle of a linked list, 3) Palindrome linked list, 4) Find the start of a cycle, 5) Happy number
8
What is the merge intervals pattern?
Dealing with overlapping intervals by sorting and then merging or processing them. Often involves comparing the end of one interval with the start of the next.
9
Give some example problems that use the merge intervals pattern.
1) Merge overlapping intervals, 2) Insert interval, 3) Meeting rooms (can a person attend all?), 4) Meeting rooms II (min rooms needed)
10
What is the cyclic sort pattern?
When given an array containing numbers in a range (e.g., 1 to n), place each number at its correct index by swapping. Useful for finding missing/duplicate numbers.
11
Give some example problems that use the cyclic sort pattern.
1) Find the missing number, 2) Find all missing numbers, 3) Find the duplicate number, 4) Find all duplicates
12
What is the in-place linked list reversal pattern?
Reversing a linked list or a portion of it without extra space by manipulating pointers. Track previous, current, and next nodes.
13
What is the template for reversing a linked list in-place?
prev = null, curr = head. While curr: save next = curr.next, reverse pointer curr.next = prev, advance prev = curr, curr = next. Return prev.
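The template above can be sketched in Python (the `ListNode` class here is just for illustration):

```python
class ListNode:
    def __init__(self, val, next=None):
        self.val = val
        self.next = next

def reverse_list(head):
    """Reverse a singly linked list in-place; returns the new head."""
    prev = None
    curr = head
    while curr:
        nxt = curr.next      # save next
        curr.next = prev     # reverse the pointer
        prev = curr          # advance prev
        curr = nxt           # advance curr
    return prev
```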
14
What is the tree BFS pattern?
Level-order traversal using a queue. Process all nodes at the current level before moving to the next level. Useful for level-based problems.
15
Give some example problems that use tree BFS.
1) Level order traversal, 2) Zigzag traversal, 3) Level averages, 4) Minimum depth of tree, 5) Right side view of tree
16
What are the three types of tree DFS traversals?
Pre-order (root, left, right), in-order (left, root, right), post-order (left, right, root). Each has different use cases.
17
When would you use in-order traversal?
In-order traversal of a BST visits nodes in sorted order. Useful for finding the kth smallest element or validating the BST property.
18
When would you use post-order traversal?
When you need to process children before parents: calculating tree height, deleting trees, evaluating expression trees.
19
What is the two heaps pattern?
Using a max-heap and min-heap together to track a dynamic dataset, often for finding the median or maintaining a division of elements.
20
How do you find the median of a stream using two heaps?
Max-heap stores the smaller half, min-heap stores the larger half. Balance sizes (differ by at most 1). Median is the top of the larger heap, or the average of both tops.
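A minimal Python sketch of this scheme, using `heapq` (a min-heap, so the smaller half is stored negated to simulate a max-heap):

```python
import heapq

class MedianFinder:
    """Max-heap (negated values) for the smaller half, min-heap for the larger."""
    def __init__(self):
        self.small = []  # max-heap via negation; allowed to hold one extra element
        self.large = []  # min-heap

    def add(self, num):
        heapq.heappush(self.small, -num)
        # Ensure every element of small <= every element of large.
        heapq.heappush(self.large, -heapq.heappop(self.small))
        # Rebalance so sizes differ by at most 1.
        if len(self.large) > len(self.small):
            heapq.heappush(self.small, -heapq.heappop(self.large))

    def median(self):
        if len(self.small) > len(self.large):
            return -self.small[0]
        return (-self.small[0] + self.large[0]) / 2
```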
21
What is the backtracking pattern?
Building solutions incrementally, abandoning a partial solution (‘backtracking’) as soon as it’s determined it can’t lead to a valid solution. Explores all possibilities systematically.
22
What is the template for backtracking?
Base case: add valid solution. For each choice: 1) make the choice, 2) recurse, 3) undo the choice (backtrack). Often track ‘used’ elements.
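The choose/recurse/un-choose template can be sketched in Python, here generating subsets:

```python
def subsets(nums):
    """Generate all subsets via the choose/recurse/un-choose template."""
    result, path = [], []

    def backtrack(start):
        result.append(path[:])          # every path is a valid subset
        for i in range(start, len(nums)):
            path.append(nums[i])        # make the choice
            backtrack(i + 1)            # recurse
            path.pop()                  # undo the choice (backtrack)

    backtrack(0)
    return result
```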
23
Give some example problems that use backtracking.
1) Subsets, 2) Permutations, 3) Combination sum, 4) N-Queens, 5) Sudoku solver, 6) Word search, 7) Generate parentheses
24
What is the modified binary search pattern?
Adapting binary search for non-obvious applications: searching in rotated arrays, finding boundaries, searching in infinite/unknown-size arrays.
25
What's the key insight for binary search on rotated sorted arrays?
One half is always sorted. Check which half is sorted by comparing mid with left/right, then determine which half to search based on target.
26
What is the top K elements pattern?
Finding or maintaining the K largest/smallest/most frequent elements, typically using a heap of size K or QuickSelect.
27
What's the approach for finding the K most frequent elements?
Count frequencies with a hash map, then either: 1) use a min-heap of size k, or 2) bucket sort where index = frequency.
28
What is the K-way merge pattern?
Merging K sorted collections efficiently using a min-heap to always get the next smallest element across all collections.
29
What is the monotonic stack pattern?
A stack that maintains elements in increasing or decreasing order. Used for finding the next greater/smaller element for each position.
30
Give some example problems that use monotonic stack.
1) Next greater element, 2) Daily temperatures, 3) Largest rectangle in histogram, 4) Trapping rain water
31
What is the topological sort pattern?
Ordering vertices of a DAG such that for every edge (u, v), u comes before v. Used for dependency resolution and scheduling.
32
What are the two ways to implement topological sort?
1) Kahn's algorithm: BFS with in-degree tracking. 2) DFS: add vertex to result after all descendants are processed (reverse post-order).
33
Give some example problems that use topological sort.
1) Course schedule, 2) Alien dictionary, 3) Task scheduling with dependencies, 4) Build order
34
What is the Union-Find pattern?
A data structure for efficiently tracking connected components, detecting cycles in undirected graphs, and grouping elements. Uses union by rank and path compression.
35
What are the key optimizations for Union-Find?
Path compression: make nodes point directly to root during find. Union by rank: attach smaller tree under larger tree. Together achieve nearly O(1) per operation.
36
Give some example problems that use Union-Find.
1) Number of connected components, 2) Redundant connection (find the extra edge), 3) Accounts merge, 4) Number of islands
37
What is the 0/1 Knapsack DP pattern?
Problems where you have items with properties (value, weight) and must choose to include or exclude each item exactly once to optimize some goal with constraints.
38
What is the unbounded knapsack DP pattern?
Like 0/1 knapsack but you can use items multiple times. Examples: coin change, rod cutting, ribbon cutting.
39
What is the Fibonacci numbers DP pattern?
Problems where each state depends on previous states in a simple recurrence. Often can optimize space to O(1) by keeping only necessary previous values.
40
What is the palindromic subsequence DP pattern?
Problems involving palindromes in strings. Often use 2D DP where dp[i][j] represents the answer for substring from i to j.
41
What is the longest common subsequence DP pattern?
Comparing two sequences to find common elements while preserving order. Uses 2D DP where dp[i][j] = answer for first i elements of seq1 and first j of seq2.
42
What is the matrix chain multiplication DP pattern?
Problems where you need to find optimal way to split/combine a sequence. Usually O(n³) with 2D DP on intervals.
43
What is the key insight for prefix sum arrays?
Pre-compute cumulative sums so you can find sum of any range [i, j] in O(1): rangeSum = prefix[j+1] - prefix[i]. Trade O(n) space for O(1) queries.
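A short Python sketch of this idea (prefix array of length n+1, so prefix[0] = 0):

```python
def build_prefix(nums):
    """prefix[i] = sum of nums[0..i-1]; prefix has length n+1."""
    prefix = [0]
    for x in nums:
        prefix.append(prefix[-1] + x)
    return prefix

def range_sum(prefix, i, j):
    """Sum of nums[i..j] inclusive, in O(1)."""
    return prefix[j + 1] - prefix[i]
```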
44
What is the difference map (sweep line) pattern?
For range update problems: increment at range start, decrement after range end, then compute prefix sum to get final values. O(n + k) for k ranges.
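A minimal sketch of the difference-array idea in Python, assuming inclusive (start, end, delta) updates applied to an array that starts at all zeros:

```python
def apply_ranges(n, updates):
    """Apply (start, end, delta) range updates to a zero array of length n."""
    diff = [0] * (n + 1)
    for start, end, delta in updates:
        diff[start] += delta        # increment at range start
        diff[end + 1] -= delta      # cancel just past range end
    # Prefix-sum the difference array to recover final values.
    result, running = [], 0
    for i in range(n):
        running += diff[i]
        result.append(running)
    return result
```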
45
What is the bit manipulation pattern?
Using bitwise operations (AND, OR, XOR, shifts) to solve problems efficiently. XOR is particularly useful: a ^ a = 0, a ^ 0 = a.
46
How do you find the single number in an array where every other number appears twice?
XOR all numbers together. Duplicates cancel out (x ^ x = 0), leaving only the single number.
47
What is the interval scheduling pattern?
Selecting maximum non-overlapping intervals. Greedy: sort by end time, always pick earliest ending interval that doesn't conflict.
48
What is the greedy algorithm pattern?
Making locally optimal choices at each step hoping to find global optimum. Only works when problem has greedy-choice property and optimal substructure.
49
How do you approach string matching problems?
Consider: two pointers, sliding window for substrings, hash map for character counts, trie for prefix matching, DP for edit distance/LCS.
50
What is the bidirectional BFS pattern?
Searching from both start and goal simultaneously, meeting in the middle. Reduces time from O(b^d) to O(b^(d/2)) where b is branching factor.
51
What is the island pattern (matrix DFS/BFS)?
Exploring connected cells in a 2D grid. Use DFS/BFS from each unvisited target cell, marking visited cells to avoid re-processing.
52
How do you approach 'number of ways' DP problems?
Usually additive: dp[i] = sum of dp[previous states that lead to i]. Count paths by summing path counts from predecessors.
53
How do you approach 'minimum/maximum' DP problems?
Usually uses min/max: dp[i] = min/max over all previous states. Track optimal value, not count.
54
What questions should you ask before choosing a DP approach?
1) Does the problem have optimal substructure? 2) Are there overlapping subproblems? 3) What are the states? 4) What's the recurrence relation? 5) What are base cases?
55
What are the TWO main variations of the two pointers pattern?
1) Opposite ends: pointers start at beginning and end, move toward each other. 2) Same direction: both start at beginning, move at different speeds or conditions (like fast/slow or sliding window).
56
What signals suggest you should use the two pointers pattern?
Sorted array/list, need to find pairs with certain sum/difference, need to compare elements from both ends, need to partition array in-place, or problem mentions 'in-place' with O(1) space.
57
What is the time and space complexity of the two pointers pattern?
Typically O(n) time (each pointer traverses at most n elements) and O(1) space (just storing pointer indices).
58
Walk through the two pointers approach for Two Sum II (sorted array).
Left pointer at index 0, right pointer at last index. Calculate sum: if sum equals target, return indices. If sum < target, move left pointer right (need larger). If sum > target, move right pointer left (need smaller).
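The walkthrough above can be sketched in Python (0-based indices here, though the original LeetCode problem asks for 1-based):

```python
def two_sum_sorted(nums, target):
    """Return 0-based indices of the two numbers summing to target."""
    left, right = 0, len(nums) - 1
    while left < right:
        s = nums[left] + nums[right]
        if s == target:
            return [left, right]
        if s < target:
            left += 1    # need a larger sum
        else:
            right -= 1   # need a smaller sum
    return []
```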
59
What signals suggest you should use the sliding window pattern?
Problems asking for 'contiguous subarray/substring', 'longest/shortest with condition', 'maximum/minimum sum of k elements', or anything involving a range that can expand/contract.
60
What is the time and space complexity of sliding window?
Time: O(n) - each element is added and removed from window at most once. Space: O(1) for fixed window, O(k) for variable window tracking k distinct elements.
61
What is the template for variable-size sliding window?
Initialize left=0. For each right position: expand window by including element at right. While window is invalid: shrink by removing element at left, increment left. Update result if window is valid.
62
What is the template for fixed-size sliding window?
For first k elements, build initial window. Then for each new element: add it to window, remove leftmost element, update result. Window always has exactly k elements.
63
Walk through sliding window for 'longest substring without repeating characters'.
Use Set to track chars in window. Expand right pointer, add char to Set. If char already in Set, shrink from left (remove chars from Set) until no duplicate. Track max length seen.
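A Python sketch of this walkthrough, combining the variable-size window template with a set:

```python
def longest_unique_substring(s):
    """Variable-size window tracked with a set of characters in the window."""
    seen = set()
    left = best = 0
    for right, ch in enumerate(s):
        while ch in seen:            # shrink from left until the duplicate is gone
            seen.remove(s[left])
            left += 1
        seen.add(ch)                 # expand the window
        best = max(best, right - left + 1)
    return best
```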
64
What signals suggest you should use fast and slow pointers?
Linked list cycle detection, finding middle of linked list, finding kth element from end, detecting if linked list is a palindrome, or any cyclic structure.
65
Why does fast and slow pointers work for cycle detection?
If there's a cycle, fast pointer will eventually 'lap' the slow pointer and they'll meet. If no cycle, fast pointer reaches null. The speed difference guarantees they meet within the cycle.
66
How do you find the START of a cycle using fast and slow pointers?
After detecting cycle (fast meets slow), reset one pointer to head. Move both at same speed (1 step). They'll meet at cycle start. This works due to mathematical relationship of distances.
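The two-phase approach can be sketched in Python (the `Node` class is just for illustration):

```python
class Node:
    def __init__(self, val):
        self.val = val
        self.next = None

def cycle_start(head):
    """Floyd's algorithm: detect the cycle, then find its first node."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:             # pointers met inside the cycle
            slow = head              # reset one pointer to head
            while slow is not fast:  # advance both one step at a time
                slow = slow.next
                fast = fast.next
            return slow              # first node of the cycle
    return None                      # no cycle
```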
67
What is the time and space complexity of fast and slow pointers?
Time: O(n) - in worst case traverse entire list. Space: O(1) - only two pointer variables regardless of list size.
68
What signals suggest you should use merge intervals?
Problems involving time intervals, ranges, scheduling, or anything with start/end pairs that might overlap. Keywords: 'overlapping', 'merge', 'intersect', 'conflicts'.
69
What is the template for merge intervals?
Sort intervals by start time. Initialize result with first interval. For each remaining interval: if it overlaps with last in result (start <= last.end), merge by updating end = max(ends). Otherwise, add as new interval.
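A minimal Python sketch of the merge template, assuming intervals are [start, end] pairs:

```python
def merge_intervals(intervals):
    """Sort by start, then merge overlapping intervals."""
    if not intervals:
        return []
    intervals = sorted(intervals)
    merged = [list(intervals[0])]
    for start, end in intervals[1:]:
        if start <= merged[-1][1]:                    # overlaps last merged interval
            merged[-1][1] = max(merged[-1][1], end)   # extend its end
        else:
            merged.append([start, end])               # no overlap: new interval
    return merged
```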
70
How do you check if two intervals overlap?
Intervals [a,b] and [c,d] overlap if a <= d AND c <= b. Equivalently, they DON'T overlap if b < c OR d < a.
71
What is the time and space complexity of merge intervals?
Time: O(n log n) due to sorting. The merging pass is O(n). Space: O(n) for result array (or O(log n) for sorting if done in-place).
72
What signals suggest you should use cyclic sort?
Array contains numbers in range 1 to n (or 0 to n-1), asked to find missing/duplicate numbers, or need to place elements at their 'correct' index.
73
What is the template for cyclic sort?
i = 0. While i < n: if element at i is not at its correct position AND correct position has different element, swap element to its correct position. Else increment i. After sorting, scan for mismatches.
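A Python sketch of this template applied to 'find the missing number' (values 0..n, so values equal to n are skipped during the sort):

```python
def missing_number(nums):
    """Cyclic sort for values 0..n; the index whose value mismatches is missing."""
    n = len(nums)
    i = 0
    while i < n:
        v = nums[i]
        if v < n and v != i:                   # v belongs at index v
            nums[i], nums[v] = nums[v], nums[i]
        else:
            i += 1
    for i in range(n):                         # scan for the mismatch
        if nums[i] != i:
            return i
    return n                                   # all of 0..n-1 present, so n is missing
```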
74
What is the time and space complexity of cyclic sort?
Time: O(n) - each element is swapped at most once to its correct position. Space: O(1) - sorting is done in-place.
75
Walk through cyclic sort for 'find the missing number' in [3,0,1].
Value v belongs at index v; values with no valid index (here 3, since the array has length 3) are skipped. Swap until every in-range value is in place: [3,0,1] → [0,3,1] → [0,1,3]. Then scan for the index where index != value: index 2 holds 3, so 2 is the missing number.
76
What signals suggest you should use in-place linked list reversal?
Problems asking to reverse a linked list or portion of it, checking palindrome in linked list, or reordering nodes without extra space.
77
Walk through reversing a linked list in-place for 1→2→3→null.
prev=null, curr=1. Save next=2, point 1→null, prev=1, curr=2. Save next=3, point 2→1, prev=2, curr=3. Save next=null, point 3→2, prev=3, curr=null. Return prev (3→2→1→null).
78
How do you reverse a portion of a linked list (from position m to n)?
Traverse to position m-1 (connection point). Reverse n-m+1 nodes using standard reversal. Connect: node before m points to new head (old n), old m points to node after n.
79
What is the time and space complexity of linked list reversal?
Time: O(n) - traverse list once. Space: O(1) - only a few pointer variables.
80
What signals suggest you should use BFS for trees?
Level-order traversal, finding minimum depth, connecting nodes at same level, zigzag traversal, or any problem requiring processing level by level.
81
What is the template for tree BFS?
Initialize queue with root. While queue not empty: record queue size (level size). For each node in current level: process node, add children to queue. After inner loop, you've completed one level.
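The level-by-level template can be sketched in Python (the `TreeNode` class is just for illustration):

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def level_order(root):
    """Return node values grouped by level."""
    if not root:
        return []
    levels, queue = [], deque([root])
    while queue:
        level = []
        for _ in range(len(queue)):      # process exactly one level
            node = queue.popleft()
            level.append(node.val)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        levels.append(level)
    return levels
```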
82
What is the time and space complexity of tree BFS?
Time: O(n) - visit each node once. Space: O(w) where w is max width of tree. Worst case O(n) for complete binary tree's last level.
83
What signals suggest you should use DFS for trees?
Path sum problems, tree diameter, validating BST, finding ancestors, serialization, or any problem requiring exploring full paths from root to leaf.
84
What is the template for recursive tree DFS?
Base case: if node is null, return base value. Recursive case: process current node, recurse on left subtree, recurse on right subtree. Combine results based on problem (pre/in/post-order).
85
What is the template for iterative tree DFS (pre-order)?
Initialize stack with root. While stack not empty: pop node, process it, push right child then left child (so left is processed first). This gives pre-order traversal.
86
What is the time and space complexity of tree DFS?
Time: O(n) - visit each node once. Space: O(h) where h is height of tree. Worst case O(n) for skewed tree, O(log n) for balanced tree.
87
What signals suggest you should use two heaps?
Finding median of data stream, sliding window median, or any problem requiring quick access to both the smaller and larger halves of data.
88
How do you maintain balance between two heaps for finding median?
After each insertion, rebalance so heaps differ in size by at most 1. If maxHeap.size > minHeap.size + 1, move maxHeap top to minHeap. Vice versa for minHeap.
89
What is the time and space complexity of the two heaps pattern?
Insert: O(log n) for heap operations. Find median: O(1). Space: O(n) to store all elements across both heaps.
90
What signals suggest you should use backtracking?
Problems asking for 'all combinations', 'all permutations', 'all possible', 'generate all', or constraint satisfaction problems like Sudoku, N-Queens.
91
What is the difference between backtracking and brute force?
Backtracking prunes invalid paths early (abandons partial solutions that can't work), while brute force explores all possibilities. Backtracking is optimized brute force.
92
What is the time complexity of backtracking for permutations vs combinations?
Permutations of n elements: O(n!). Combinations/subsets of n elements: O(2^n). These are inherent to the problem, not the algorithm.
93
Walk through backtracking for generating all subsets of [1,2].
Start with []. Choice: include 1 or not. Path with 1: [1], then include 2 or not → [1,2] and [1]. Path without 1: [], then include 2 or not → [2] and []. Result: [[], [1], [2], [1,2]].
94
What signals suggest you should use modified binary search?
Sorted or rotated sorted array, searching for boundary (first/last occurrence), searching for insertion point, or answer is monotonic (if x works, all values > x work).
95
What is the template for binary search finding leftmost occurrence?
While left < right: mid = left + (right - left) / 2. If condition is satisfied at mid, right = mid (answer could be mid or left of it). Else left = mid + 1. Return left.
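One way to sketch this template in Python, here specialized to finding the first occurrence of a target (the `condition` predicate is assumed to be false then true across the range):

```python
def leftmost_true(lo, hi, condition):
    """Smallest index in [lo, hi] where condition holds, given F..FT..T layout."""
    while lo < hi:
        mid = lo + (hi - lo) // 2
        if condition(mid):
            hi = mid          # answer is mid or to its left
        else:
            lo = mid + 1
    return lo

def first_occurrence(nums, target):
    """First index of target in sorted nums, or -1 if absent."""
    if not nums:
        return -1
    i = leftmost_true(0, len(nums) - 1, lambda m: nums[m] >= target)
    return i if nums[i] == target else -1
```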
96
What is the template for binary search finding rightmost occurrence?
While left < right: mid = left + (right - left + 1) / 2 (round up). If condition satisfied at mid, left = mid. Else right = mid - 1. Return left.
97
What is 'binary search on answer' and when do you use it?
When the answer is a number in a range and you can check if a given answer is feasible. Binary search over possible answers, checking feasibility for each. Example: minimum capacity to ship packages in d days.
98
What signals suggest you should use top K elements?
Problems with 'k largest', 'k smallest', 'k most frequent', 'k closest', or any ranking/selection of k items from a larger set.
99
Should you use a min-heap or max-heap for 'K largest elements'?
Use a MIN-heap of size K. The smallest of the K largest is at the top. If new element > heap top, remove top and add new element. Heap always contains K largest seen.
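This can be sketched in Python with `heapq`:

```python
import heapq

def k_largest(nums, k):
    """Maintain a min-heap of size k; its contents are the k largest seen."""
    heap = []
    for x in nums:
        if len(heap) < k:
            heapq.heappush(heap, x)
        elif x > heap[0]:                 # beats the smallest of the k largest
            heapq.heapreplace(heap, x)    # pop top and push x in one O(log k) step
    return sorted(heap, reverse=True)
```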
100
What is the time complexity of finding K largest using a heap vs sorting?
Heap: O(n log k) - process each of n elements with O(log k) heap operation. Sorting: O(n log n). Heap is better when k << n.
101
What signals suggest you should use K-way merge?
Merging K sorted arrays/lists, finding smallest range covering elements from K lists, or any problem with multiple sorted sequences to combine.
102
What is the template for K-way merge?
Add first element from each list to min-heap (with list index and element index). Extract min, add to result, insert next element from same list. Repeat until heap empty.
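The template can be sketched in Python for merging k sorted Python lists:

```python
import heapq

def merge_k_sorted(lists):
    """Min-heap of (value, list index, element index) drives the merge."""
    heap = [(lst[0], i, 0) for i, lst in enumerate(lists) if lst]
    heapq.heapify(heap)
    result = []
    while heap:
        val, li, ei = heapq.heappop(heap)
        result.append(val)
        if ei + 1 < len(lists[li]):       # feed the next element from the same list
            heapq.heappush(heap, (lists[li][ei + 1], li, ei + 1))
    return result
```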
103
What is the time and space complexity of K-way merge?
Time: O(n log k) where n is total elements across all lists and k is number of lists. Space: O(k) for the heap.
104
What signals suggest you should use monotonic stack?
Problems asking for 'next greater element', 'previous smaller element', 'days until warmer', or span calculations. Key: need to look back/forward for first element satisfying a condition.
105
What is the template for 'next greater element' using monotonic stack?
Iterate right to left. While stack not empty AND stack top <= current: pop. If stack empty, no greater element; else stack top is answer. Push current element. Stack stays decreasing.
106
What is the time and space complexity of monotonic stack?
Time: O(n) - each element pushed and popped at most once. Space: O(n) worst case for stack.
107
Walk through monotonic stack for next greater element in [2,1,2,4,3].
Right to left: 3→stack empty, ans=-1, push → stack [3]. 4→pop 3, ans=-1, push → stack [4]. 2→top 4>2, ans=4, push → stack [4,2]. 1→top 2>1, ans=2, push → stack [4,2,1]. 2→pop 1 and 2 (both <= 2), ans=4, push → stack [4,2]. Result: [4,2,4,-1,-1].
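The walkthrough above corresponds to this Python sketch:

```python
def next_greater(nums):
    """Right-to-left scan with a decreasing stack; -1 when nothing greater exists."""
    ans = [-1] * len(nums)
    stack = []                      # holds values, kept strictly decreasing
    for i in range(len(nums) - 1, -1, -1):
        while stack and stack[-1] <= nums[i]:
            stack.pop()             # smaller-or-equal values can never be the answer
        if stack:
            ans[i] = stack[-1]      # first strictly greater value to the right
        stack.append(nums[i])
    return ans
```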
108
What signals suggest you should use topological sort?
Ordering tasks with dependencies, course prerequisites, build order, detecting cycles in directed graphs, or any problem with 'must come before' relationships.
109
What is Kahn's algorithm for topological sort?
Calculate in-degree for all vertices. Add all vertices with in-degree 0 to queue. While queue not empty: remove vertex, add to result, reduce in-degree of neighbors, add any with in-degree 0 to queue.
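Kahn's algorithm can be sketched in Python, with vertices 0..n-1 and edges (u, v) meaning u must come before v:

```python
from collections import deque

def topo_sort(num_vertices, edges):
    """Kahn's algorithm; returns [] if the graph has a cycle."""
    adj = [[] for _ in range(num_vertices)]
    indegree = [0] * num_vertices
    for u, v in edges:
        adj[u].append(v)
        indegree[v] += 1
    queue = deque(i for i in range(num_vertices) if indegree[i] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                queue.append(v)
    # If some vertices never reached in-degree 0, they lie on or behind a cycle.
    return order if len(order) == num_vertices else []
```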
110
How do you detect a cycle using topological sort?
If after Kahn's algorithm completes, not all vertices are in result, there's a cycle. The remaining vertices are part of or depend on a cycle.
111
What is the time and space complexity of topological sort?
Time: O(V + E) - process each vertex and edge once. Space: O(V) for the in-degree array and queue/result.
112
What signals suggest you should use Union-Find?
Grouping/clustering problems, finding connected components, detecting cycles in undirected graphs, or problems asking 'are X and Y connected?'
113
What are the two main operations in Union-Find?
Find(x): return the root/representative of x's set. Union(x,y): merge the sets containing x and y by connecting their roots.
114
What is path compression in Union-Find?
During Find, make every node on the path point directly to the root. Flattens tree structure, making future operations faster.
115
What is union by rank in Union-Find?
When merging two sets, attach the shorter tree under the root of the taller tree. Keeps trees balanced, ensuring O(log n) operations.
116
What is the time complexity of Union-Find with both optimizations?
Nearly O(1) per operation - technically O(α(n)) where α is the inverse Ackermann function, which is ≤ 4 for any practical input size.
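Union-Find with both optimizations can be sketched in Python:

```python
class UnionFind:
    """Union by rank with path compression."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])   # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False            # already connected (cycle if x-y is a new edge)
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx         # attach the shorter tree under the taller
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```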
117
What signals suggest you should use dynamic programming?
Optimal substructure (optimal solution contains optimal solutions to subproblems), overlapping subproblems (same subproblems solved multiple times), counting problems, or optimization problems.
118
What is the difference between memoization and tabulation?
Memoization (top-down): recursive with cache, solves only needed subproblems. Tabulation (bottom-up): iterative, fills table from base cases, solves all subproblems.
119
When would you prefer memoization over tabulation?
When not all subproblems need to be solved, or when the recursive structure is more natural/easier to understand. Also useful when state space is sparse.
120
When would you prefer tabulation over memoization?
When you need all subproblems anyway, want to avoid recursion stack overhead, or need to optimize space by only keeping necessary previous states.
121
What is the template for 1D dynamic programming?
Define dp[i] = answer for subproblem of size i. Identify base cases (dp[0], dp[1]). Write recurrence: dp[i] = f(dp[i-1], dp[i-2], ...). Fill table from base cases up.
122
What is the template for 2D dynamic programming?
Define dp[i][j] = answer for subproblem with parameters i,j. Identify base cases (first row/column). Write recurrence relating dp[i][j] to smaller subproblems. Fill table row by row or column by column.
123
How do you identify the state variables for DP?
Ask: what information do I need to know to solve a subproblem? Common states: index/position in array, remaining capacity, length processed, boolean flags for constraints.
124
What is the recurrence relation for 0/1 Knapsack?
dp[i][w] = max(dp[i-1][w], dp[i-1][w-weight[i]] + value[i]). Either don't take item i (first term) or take it if it fits (second term).
125
How can you optimize 0/1 Knapsack space from O(nW) to O(W)?
Use 1D array, iterate weights in reverse. dp[w] = max(dp[w], dp[w-weight[i]] + value[i]). Reverse order ensures we use previous row's values.
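A Python sketch of the space-optimized version:

```python
def knapsack(weights, values, capacity):
    """0/1 knapsack with O(capacity) space; weights iterated in reverse."""
    dp = [0] * (capacity + 1)
    for wt, val in zip(weights, values):
        # Reverse order ensures dp[w - wt] still holds the previous item's row,
        # so each item is used at most once.
        for w in range(capacity, wt - 1, -1):
            dp[w] = max(dp[w], dp[w - wt] + val)
    return dp[capacity]
```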
126
What is the recurrence for Longest Common Subsequence?
If chars match: dp[i][j] = dp[i-1][j-1] + 1. If not: dp[i][j] = max(dp[i-1][j], dp[i][j-1]). Match extends LCS, no match takes best of excluding either char.
127
What is the recurrence for Edit Distance (Levenshtein)?
If chars match: dp[i][j] = dp[i-1][j-1]. If not: dp[i][j] = 1 + min(dp[i-1][j], dp[i][j-1], dp[i-1][j-1]). Three options: delete, insert, or replace.
128
What is the recurrence for Coin Change (minimum coins)?
dp[amount] = min(dp[amount], dp[amount - coin] + 1) for each coin. Base case: dp[0] = 0. Initialize others to infinity.
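This recurrence can be sketched in Python:

```python
def coin_change(coins, amount):
    """Minimum number of coins to make amount; -1 if impossible."""
    INF = float('inf')
    dp = [0] + [INF] * amount          # dp[0] = 0, everything else infinity
    for a in range(1, amount + 1):
        for coin in coins:
            if coin <= a and dp[a - coin] + 1 < dp[a]:
                dp[a] = dp[a - coin] + 1
    return dp[amount] if dp[amount] != INF else -1
```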
129
What is the recurrence for Longest Increasing Subsequence?
dp[i] = max(dp[j] + 1) for all j < i where nums[j] < nums[i]. Each dp[i] is length of LIS ending at index i. O(n²) basic, O(n log n) with binary search.
130
What signals suggest you should use graph BFS?
Shortest path in unweighted graph, level-order traversal, finding all nodes at distance k, or any problem where you need to explore neighbors before going deeper.
131
What is the template for graph BFS?
Initialize queue with starting node(s), mark as visited. While queue not empty: dequeue node, process it, for each unvisited neighbor: mark visited, enqueue. Track distance/parent if needed.
132
What signals suggest you should use graph DFS?
Detecting cycles, topological sort, finding connected components, path finding (all paths, not shortest), or problems requiring exploration of entire branches.
133
What is the template for graph DFS?
Mark current node as visited. Process current node. For each neighbor: if not visited, recurse on neighbor. For cycle detection, also track nodes in current recursion stack.
134
What is the time and space complexity of BFS and DFS on graphs?
Both: Time O(V + E) - visit each vertex and edge once. Space O(V) for visited set and queue/recursion stack.