Time Complexity
Describes how the runtime of an algorithm changes as the size of its input ($n$) grows.
Big O Notation ($O$)
A mathematical notation that describes an upper bound on an algorithm's runtime growth as the input size ($n$) grows; it is conventionally used to express the worst-case scenario.
$O(1)$ - Constant Time
Runtime is the same regardless of input size.

Python Example (accessing an element by index):

```python
def get_first(data):
    return data[0]
```
$O(\log n)$ - Logarithmic Time
Runtime grows with the logarithm of the input size: doubling $n$ adds only a constant amount of extra work. Typical of algorithms that halve the remaining input at each step, such as binary search.
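As a sketch of the canonical $O(\log n)$ example (binary search is not spelled out in these notes; the function name and arguments below are illustrative):

```python
def binary_search(sorted_data, target):
    """Return the index of target in sorted_data, or -1 if absent.

    Each iteration halves the search range, so at most
    log2(n) + 1 iterations are needed.
    """
    lo, hi = 0, len(sorted_data) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_data[mid] == target:
            return mid
        elif sorted_data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Note that binary search requires the input to be sorted; on unsorted data a linear $O(n)$ scan is the baseline.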
$O(n)$ - Linear Time
Runtime grows directly and proportionally with the input size ($n$).

Python Example (iterating through a list):

```python
for item in my_list:
    print(item)
```
$O(n \log n)$ - Quasilinear Time
A very efficient time complexity for comparison-based sorting algorithms.

Python Example (Python's built-in Timsort):

```python
sorted_list = sorted(my_list)
```
$O(n^2)$ - Quadratic Time
Runtime grows as the square of the input size. Often involves nested loops over the same input.

Python Example (nested iteration):

```python
for i in data:
    for j in data:
        print(i, j)
```
$O(2^n)$ - Exponential Time
Runtime roughly doubles with each new element added to the input, so it becomes impractical even for modest $n$.

Python Example (a naive recursive Fibonacci, which recomputes the same subproblems exponentially many times):

```python
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```
Big O Hierarchy (Fastest to Slowest)
$O(1) \ll O(\log n) \ll O(n) \ll O(n \log n) \ll O(n^2) \ll O(2^n)$
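To make the hierarchy concrete, a small sketch (the choice of $n = 20$ is arbitrary, purely for illustration) comparing the operation counts each class implies:

```python
import math

n = 20  # illustrative input size
growth = {
    "O(1)": 1,
    "O(log n)": math.log2(n),
    "O(n)": n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)": n ** 2,
    "O(2^n)": 2 ** n,
}
for name, ops in growth.items():
    print(f"{name:>12}: {ops:,.0f} operations")
```

At $n = 20$ the quadratic class costs 400 operations while the exponential class already costs over a million, which is why the hierarchy matters long before inputs get large.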