Comparing Function Growth: Intro

In computer science, we often need to compare the growth rates of functions such as $f(n) = n^2$ and $g(n) = 2^n - 1$ (both with domain $\mathbb{N}$).

When designing an algorithm, which is a recipe-like method for solving a certain computational problem, we care about how fast its running time grows as the size $n$ of the input (e.g., a set) increases. That is, which program runs faster: one that does an amount of work proportional to $n^2$, or one that does an amount of work proportional to $2^n - 1$, given that both reach the same correct solution?

Initially, for small values of $n$, the function $f(n) = n^2$ may be larger than $g(n) = 2^n - 1$, but as $n$ grows, $g(n)$ eventually surpasses $f(n)$ and stays ahead; see the diagram on slide 3.
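To see where the two functions trade places, we can simply tabulate them for small $n$. The following is a quick sketch in Python (not part of the slides); the helper names `f`, `g`, and `crossover` are illustrative choices, not anything defined in the notes:

```python
# Compare f(n) = n^2 and g(n) = 2^n - 1 for small n to locate the crossover.
def f(n):
    return n ** 2

def g(n):
    return 2 ** n - 1

for n in range(8):
    winner = "g > f" if g(n) > f(n) else "g <= f"
    print(f"n={n}: f(n)={f(n):3d}  g(n)={g(n):3d}  {winner}")

# Smallest n from which g(n) exceeds f(n) on a long stretch afterwards
# (a heuristic check, not a proof that g dominates forever).
crossover = next(n for n in range(100)
                 if all(g(m) > f(m) for m in range(n, n + 20)))
print("crossover at n =", crossover)  # prints: crossover at n = 5
```

The table shows $f(n) \ge g(n)$ up through $n = 4$ (e.g., $16 \ge 15$), while from $n = 5$ on the exponential pulls ahead ($31 > 25$) and never looks back. Proving that it stays ahead for *all* larger $n$ is exactly the kind of claim that proof by induction, introduced below, lets us establish rigorously.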

This leads us to ask: How can we compare the long-term behavior of two functions in a mathematically precise way?

To answer this question, we will explore a few mathematical tools: proof by induction, loop invariants, and asymptotic analysis using Big-O notation. We'll define these terms as we work through each tool.