What is Big-O notation and why does it matter in interviews?
Big-O describes how an algorithm’s time or space grows as input size n increases, ignoring constants and lower-order terms. Interviewers ALWAYS ask: what is the time and space complexity of your solution? Order from best to worst: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!). Always state complexity unprompted at the end of your solution — it shows engineering maturity
What is the Big-O of all common Array operations?
Access by index: O(1). Search (unsorted): O(n). Search (sorted + binary search): O(log n). Insert at end (amortized): O(1). Insert at middle/beginning: O(n) (shift elements). Delete at end: O(1). Delete at middle/beginning: O(n). Space: O(n). Key insight: arrays are great for index access but expensive for insertions/deletions in the middle
What is the Big-O of all common LinkedList operations?
Access by index: O(n). Search: O(n). Insert at head: O(1). Insert at tail: O(1) if tail pointer held — else O(n). Insert at middle (given node): O(1). Delete head: O(1). Delete tail: O(n) for singly linked — O(1) doubly linked. Delete middle (given node): O(1). Space: O(n). Key insight: LinkedList is great for O(1) head inserts/deletes but O(n) for random access
What is the Big-O of all common HashMap/HashSet operations?
Insert: O(1) average — O(n) worst case (all keys hash to same bucket). Delete: O(1) average. Search/Get: O(1) average. Space: O(n). Key insight: HashMap is the most powerful tool in coding challenges — use it to reduce O(n²) brute force to O(n) by trading space for time. Always consider: can I precompute a frequency map or prefix sum?
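A minimal sketch of the space-for-time trade: the classic Two Sum, reduced from an O(n²) nested loop to one O(n) pass with a hash map (the function name is illustrative):

```python
def two_sum(nums, target):
    """Return indices of two numbers summing to target, or None.

    One pass with a hash map: O(n) time, O(n) space,
    versus the O(n^2) nested-loop brute force.
    """
    seen = {}  # value -> index of values seen so far
    for i, x in enumerate(nums):
        if target - x in seen:         # the complement was seen earlier
            return [seen[target - x], i]
        seen[x] = i
    return None
```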
What is the Big-O of all common Stack and Queue operations?
Stack (push/pop/peek): O(1) for all. Queue (enqueue/dequeue/peek): O(1) for all. Both use O(n) space. Stack implemented with: array or LinkedList. Queue implemented with: LinkedList or circular array (deque). Java: Stack class (legacy — use Deque) + ArrayDeque (fast stack+queue) + LinkedList (queue). Key insight: stack = LIFO (last in first out). Queue = FIFO (first in first out)
What is the Big-O of all common Tree operations?
BST (balanced): Search O(log n) — Insert O(log n) — Delete O(log n). BST (unbalanced/worst): O(n) for all. Binary Tree traversal (DFS/BFS): O(n) — must visit all nodes. Height of balanced tree: O(log n). Height of skewed tree: O(n). Space for DFS: O(h) where h=height — O(log n) balanced — O(n) skewed. Space for BFS: O(w) where w=max width — O(n) worst case
What is the Big-O of all common Sorting algorithms?
Bubble Sort: O(n²) time — O(1) space — stable. Selection Sort: O(n²) — O(1) — not stable. Insertion Sort: O(n²) worst — O(n) nearly sorted — O(1) — stable — best for small/nearly sorted. Merge Sort: O(n log n) always — O(n) space — stable. Quick Sort: O(n log n) average — O(n²) worst — O(log n) space — not stable. Heap Sort: O(n log n) — O(1) — not stable. Counting/Radix: O(n+k) — for integers. Java Arrays.sort(): dual-pivot quicksort for primitives — Timsort for objects — O(n log n)
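A minimal merge sort sketch, showing the O(n log n) divide-and-merge and why taking from the left on ties (`<=`) keeps it stable:

```python
def merge_sort(arr):
    """Merge sort: O(n log n) always, O(n) extra space, stable."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in original order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]  # append whichever half has leftovers
```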
What is the Big-O of all common Heap/Priority Queue operations?
Insert: O(log n). Extract min/max: O(log n). Peek min/max: O(1). Build heap from array: O(n) (not O(n log n) — important!). Search: O(n). Space: O(n). Java: PriorityQueue (min-heap by default — use Collections.reverseOrder() for max-heap). Key insight: use heap when you need the k-th largest/smallest or need to repeatedly extract min/max. Top-K problems: O(n log k) with a heap of size k
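Python’s heapq makes these costs concrete — heapify builds in O(n), pop is O(log n), peek is O(1):

```python
import heapq

nums = [5, 1, 8, 3, 9, 2]
heapq.heapify(nums)             # build min-heap in place: O(n), not O(n log n)
smallest = heapq.heappop(nums)  # extract min: O(log n)
new_min = nums[0]               # peek at the new min: O(1)
```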
What is the Big-O of Graph traversal algorithms?
BFS: O(V+E) time — O(V) space (queue + visited set) — V=vertices E=edges. DFS: O(V+E) time — O(V) space (stack/recursion + visited). Dijkstra: O((V+E) log V) with min-heap. Bellman-Ford: O(VE) — handles negative weights. Floyd-Warshall (all pairs): O(V³). Topological Sort (Kahn’s or DFS): O(V+E). Union-Find (path compression + rank): O(α(n)) ≈ O(1) amortized
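A minimal BFS sketch over an adjacency list (the dict-of-lists representation is an assumption, not the only option); each vertex enters the queue once and each edge is examined once, giving O(V+E):

```python
from collections import deque

def bfs(graph, start):
    """Graph BFS over an adjacency list: O(V + E) time, O(V) space."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nb in graph.get(node, []):
            if nb not in visited:    # mark on enqueue to avoid duplicates
                visited.add(nb)
                queue.append(nb)
    return order
```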
What is the Big-O of Binary Search?
Time: O(log n). Space: O(1) iterative — O(log n) recursive (call stack). Requires: sorted array. Each step eliminates half the search space. Template: lo=0 hi=n-1. while(lo<=hi): mid=lo+(hi-lo)/2. If arr[mid]==target return mid. If arr[mid]<target lo=mid+1. Else hi=mid-1. Return -1. Key: use lo+(hi-lo)/2 not (lo+hi)/2 to avoid integer overflow. Applies to: sorted arrays + answer-space binary search (bisect on a value range)
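The template above, written out as runnable Python:

```python
def binary_search(arr, target):
    """Iterative binary search on a sorted list: O(log n) time, O(1) space."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = lo + (hi - lo) // 2  # avoids overflow in fixed-width-int languages
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1           # target is in the right half
        else:
            hi = mid - 1           # target is in the left half
    return -1
```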
MEMORY BOOSTER: Big-O quick reference
O(1): HashMap get/put - array index access - stack push/pop. O(log n): binary search - balanced BST ops - heap insert/extract. O(n): linear scan - LinkedList traversal - tree traversal - BFS/DFS. O(n log n): merge sort - quicksort avg - heap sort - sorting. O(n²): bubble/selection/insertion sort - nested loops over array. O(2ⁿ): subsets - Fibonacci naive - exponential backtracking. O(n!): permutations. Space trick: O(n) HashMap to reduce time from O(n²) -> O(n)
What are the 14 essential coding interview patterns?
1) Sliding Window. 2) Two Pointers. 3) Fast and Slow Pointers. 4) Merge Intervals. 5) Cyclic Sort. 6) In-place LinkedList Reversal. 7) Tree BFS. 8) Tree DFS. 9) Two Heaps. 10) Subsets/Combinations. 11) Modified Binary Search. 12) Bitwise XOR. 13) Top-K Elements. 14) K-way Merge. Bonus: Dynamic Programming + Backtracking + Graph (BFS/DFS/Union-Find) + Monotonic Stack. Recognizing the pattern is 80% of solving the problem
What is the Sliding Window pattern and when do you use it?
Use when: contiguous subarray/substring problem + find max/min/average/longest/shortest window satisfying a condition. Avoids O(n²) nested loops. Types: Fixed window (size k) + Variable window (expand/shrink based on condition). Template: left=0. For right in range(n): add arr[right] to window. While window invalid: remove arr[left], left++. Update answer with current window. Examples: Maximum sum subarray of size k + Longest substring without repeating chars + Minimum window substring
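A sketch of the variable-window variant — longest substring without repeating characters:

```python
def longest_unique_substring(s):
    """Variable-size sliding window: O(n) time, O(min(n, alphabet)) space."""
    seen = set()
    left = best = 0
    for right, ch in enumerate(s):
        while ch in seen:          # shrink from the left until valid again
            seen.remove(s[left])
            left += 1
        seen.add(ch)               # expand: arr[right] joins the window
        best = max(best, right - left + 1)
    return best
```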
What is the Two Pointers pattern and when do you use it?
Use when: sorted array + find pair/triplet + remove duplicates + palindrome check + comparing two arrays. Two variants: Opposite ends (left=0 right=n-1 move toward each other — for sorted array pair sum) + Same direction (both start at 0 move at different speeds — for remove duplicates/merge). Examples: Two Sum II (sorted) + 3Sum + Container With Most Water + Remove Duplicates + Merge Sorted Array. Key: usually requires sorted input for opposite-ends variant
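The opposite-ends variant as a sketch — indices of a pair summing to target in a sorted array:

```python
def pair_sum_sorted(arr, target):
    """Opposite-ends two pointers on a sorted array: O(n) time, O(1) space."""
    left, right = 0, len(arr) - 1
    while left < right:
        s = arr[left] + arr[right]
        if s == target:
            return [left, right]
        elif s < target:
            left += 1      # need a larger sum: move left pointer right
        else:
            right -= 1     # need a smaller sum: move right pointer left
    return None
```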
What is the Fast and Slow Pointers pattern?
Also called Floyd’s Tortoise and Hare. Use when: linked list cycle detection + finding middle of linked list + finding cycle start. Fast pointer moves 2 steps — slow moves 1 step. If cycle: fast and slow meet inside cycle. Middle of list: when fast reaches end — slow is at middle. Template: slow=head fast=head. While fast and fast.next: slow=slow.next, fast=fast.next.next. If slow==fast: cycle found. Examples: Linked List Cycle + Happy Number + Find Cycle Start
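Both uses of the template as a sketch (a minimal ListNode is assumed):

```python
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def has_cycle(head):
    """Floyd's tortoise and hare: O(n) time, O(1) space."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:       # pointers meet inside the cycle
            return True
    return False

def middle(head):
    """When fast reaches the end, slow is at the middle."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    return slow
```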
What is the Merge Intervals pattern?
Use when: overlapping intervals — insert interval — merge intervals — find free slots. Sort by start time first. Template: sort intervals by start. result=[intervals[0]]. For each interval: if interval.start <= result.last.end: merge (update end = max(end, interval.end)). Else: add to result. Time: O(n log n) for sort. Examples: Merge Intervals + Insert Interval + Non-overlapping Intervals + Meeting Rooms II (use min-heap for count of simultaneous meetings)
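The template as a runnable sketch:

```python
def merge_intervals(intervals):
    """Sort by start, then merge overlaps: O(n log n) time."""
    intervals = sorted(intervals)          # lexicographic sort = sort by start
    result = [list(intervals[0])]
    for start, end in intervals[1:]:
        if start <= result[-1][1]:         # overlaps the last merged interval
            result[-1][1] = max(result[-1][1], end)
        else:                              # gap: start a new interval
            result.append([start, end])
    return result
```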
What is the Modified Binary Search pattern?
Use when: sorted or rotated-sorted array + finding boundary + answer-space search. Not just arr[mid]==target — apply binary search logic to ANY monotonic condition. Templates: find leftmost (when arr[mid]>=target: hi=mid). Find rightmost (when arr[mid]<=target: lo=mid). Rotated array: check which half is sorted then decide direction. Answer-space: binary search on the answer value (not array index) — works when: feasible(x) is monotonic. Examples: Search Rotated Array + Find Min in Rotated + Koko Eating Bananas + Capacity to Ship
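The find-leftmost template as a sketch — a lower-bound search that returns the first index with arr[i] >= target:

```python
def leftmost(arr, target):
    """Leftmost index i with arr[i] >= target (lower bound): O(log n).

    Returns len(arr) when every element is smaller than target.
    """
    lo, hi = 0, len(arr)
    while lo < hi:
        mid = lo + (hi - lo) // 2
        if arr[mid] >= target:
            hi = mid           # mid could be the answer: keep it in range
        else:
            lo = mid + 1       # mid is too small: discard it
    return lo
```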
What is the Tree BFS pattern?
Use when: level-by-level traversal + shortest path in unweighted tree/graph + minimum depth. Template: queue=deque([root]). While queue: level_size=len(queue). For i in range(level_size): node=queue.popleft(). Process node. Add node.left and node.right if they exist. Result collected per level. Time O(n) Space O(n) (max width). Examples: Binary Tree Level Order + Zigzag Level Order + Minimum Depth + Connect Level Order Siblings + BFS shortest path in grid
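The template as a runnable sketch (a minimal TreeNode is assumed):

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def level_order(root):
    """Level-by-level BFS: O(n) time, O(n) space (max width)."""
    if not root:
        return []
    result, queue = [], deque([root])
    while queue:
        level = []
        for _ in range(len(queue)):   # process exactly one level per pass
            node = queue.popleft()
            level.append(node.val)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        result.append(level)
    return result
```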
What is the Tree DFS pattern?
Use when: path sum + path existence + tree structure problems + preorder/inorder/postorder. Recursive template: if not node: return base_case. left=dfs(node.left, …). right=dfs(node.right, …). return combine(node.val, left, right). Iterative: use explicit stack. Preorder: root-left-right. Inorder: left-root-right (visits a BST in sorted order). Postorder: left-right-root. Examples: Path Sum + Max Path Sum + Diameter + Lowest Common Ancestor + Validate BST + Serialize/Deserialize
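A sketch of the recursive template applied to Path Sum:

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def has_path_sum(node, target):
    """Recursive DFS: is there a root-to-leaf path summing to target?

    O(n) time, O(h) space for the call stack.
    """
    if not node:
        return False
    if not node.left and not node.right:   # leaf: does it close out the sum?
        return node.val == target
    rest = target - node.val
    return has_path_sum(node.left, rest) or has_path_sum(node.right, rest)
```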
What is the Two Heaps pattern?
Use when: find median from data stream + sliding window median + schedule tasks to minimize cost. Use a max-heap for lower half and min-heap for upper half. Keep them balanced (sizes differ by at most 1). Median: if equal size = average of tops. If unequal = top of larger heap. Template: add to max_heap -> rebalance by moving max_heap.top() to min_heap if top > min_heap.top() -> balance sizes. Examples: Find Median from Data Stream + Sliding Window Median
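A minimal sketch, using negation to get a max-heap out of Python’s min-heap heapq:

```python
import heapq

class MedianFinder:
    """Max-heap (negated values) for the lower half, min-heap for the upper."""

    def __init__(self):
        self.lo = []   # max-heap via negation: lower half
        self.hi = []   # min-heap: upper half

    def add(self, num):
        heapq.heappush(self.lo, -num)
        # ensure every element of lo <= every element of hi
        heapq.heappush(self.hi, -heapq.heappop(self.lo))
        # rebalance so lo holds at most one extra element
        if len(self.hi) > len(self.lo):
            heapq.heappush(self.lo, -heapq.heappop(self.hi))

    def median(self):
        if len(self.lo) > len(self.hi):
            return -self.lo[0]                    # odd count: top of larger heap
        return (-self.lo[0] + self.hi[0]) / 2     # even count: average of tops
```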
What is the Top-K Elements pattern?
Use when: find k largest/smallest/most frequent elements. Use heap of size k (not sort entire array). Min-heap of size k for k-largest: add element — if heap size > k: pop min — remaining k elements are k-largest. O(n log k) vs O(n log n) for full sort. Max-heap for k-smallest. Examples: Kth Largest Element + Top K Frequent Elements + K Closest Points to Origin + Find K Pairs with Smallest Sums. Java: PriorityQueue. Python: heapq.nlargest() or maintain heap manually
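The min-heap-of-size-k idea as a sketch:

```python
import heapq

def k_largest(nums, k):
    """Keep a min-heap of size k: O(n log k) vs O(n log n) for a full sort."""
    heap = []
    for x in nums:
        heapq.heappush(heap, x)
        if len(heap) > k:
            heapq.heappop(heap)   # evict the smallest; the k largest remain
    return sorted(heap, reverse=True)
```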
What is the Subsets / Combinations / Permutations pattern?
Use when: generate all subsets + all combinations + all permutations + string permutations. Backtracking template: result=[]. def backtrack(start, current): result.append(current[:]). For i in range(start, n): current.append(nums[i]). backtrack(i+1, current). current.pop(). Subsets: O(2ⁿ). Permutations: O(n!). Combinations k of n: O(C(n,k)). Key: add the result copy at different points — at function entry for subsets, at the leaves for permutations. Pruning: sort + skip duplicates
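The subsets template as runnable Python:

```python
def subsets(nums):
    """Backtracking: record a copy of the path at every node — O(2^n) subsets."""
    result = []

    def backtrack(start, current):
        result.append(current[:])          # entry point: every prefix is a subset
        for i in range(start, len(nums)):
            current.append(nums[i])        # choose nums[i]
            backtrack(i + 1, current)
            current.pop()                  # undo the choice

    backtrack(0, [])
    return result
```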
What is the Backtracking pattern?
Backtracking explores all possibilities by making choices and undoing them. Template: def backtrack(state): if goal: add to results + return. For each choice: if valid(choice): make choice (modify state). backtrack(state). undo choice (restore state). Use when: find all solutions (not just one) + constraint satisfaction + generate combinations/permutations/subsets. Key optimizations: pruning (skip invalid branches early) + sort input (enables deduplication + early termination). Examples: N-Queens + Sudoku Solver + Word Search + Combination Sum
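Combination Sum as a sketch of choose/recurse/undo with pruning:

```python
def combination_sum(candidates, target):
    """Backtracking with pruning; candidates may be reused. Returns all combos."""
    result = []
    candidates.sort()                       # sorted input enables early termination

    def backtrack(start, remaining, path):
        if remaining == 0:                  # goal reached: record a copy
            result.append(path[:])
            return
        for i in range(start, len(candidates)):
            if candidates[i] > remaining:   # prune: no later (larger) choice fits
                break
            path.append(candidates[i])      # make the choice
            backtrack(i, remaining - candidates[i], path)  # i, not i+1: reuse allowed
            path.pop()                      # undo the choice

    backtrack(0, target, [])
    return result
```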
What is the Dynamic Programming pattern?
DP solves problems with optimal substructure and overlapping subproblems. Recognize DP when: find min/max/count/true-false + choices at each step + optimal depends on subproblems. Two approaches: Top-down (memoization: recursion + cache). Bottom-up (tabulation: fill dp array iteratively — usually faster). State definition is everything: dp[i] = answer for first i elements. Transition: dp[i] = f(dp[i-1], dp[i-2], …). Base cases: dp[0] and dp[1]. Examples: Fibonacci + Climbing Stairs + Coin Change + Knapsack + LCS + LIS
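Coin Change as a bottom-up sketch (state: dp[a] = fewest coins to make amount a):

```python
def coin_change(coins, amount):
    """Bottom-up DP: O(amount * len(coins)) time, O(amount) space."""
    INF = float('inf')
    dp = [0] + [INF] * amount              # base case: 0 coins make amount 0
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1      # transition: take coin c
    return dp[amount] if dp[amount] != INF else -1
```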