Function Growth: Big-O Notation
Several more facts that will be useful when you take Brooklyn College's CISC 3220: Analysis of Algorithms course:
- If you know that $f(n) = \mathcal{O}(g(n))$ or $f(n) = o(g(n))$, then we call $g(n)$ an upper bound for $f(n)$ (we also say that $g(n)$ is asymptotically at least as large as $f(n)$, and strictly larger in the little-$o$ case) because, up to a constant factor, $g(n)$ matches or surpasses $f(n)$ for large inputs $n$.
- If you know that $f(n) = \Omega(g(n))$ or $f(n) = \omega(g(n))$, then we call $g(n)$ a lower bound for $f(n)$ (we also say that $g(n)$ is asymptotically at most as large as $f(n)$, and strictly smaller in the little-$\omega$ case) because, up to a constant factor, $f(n)$ matches or surpasses $g(n)$ for large inputs $n$.
- If you know that $f(n) = \Theta(g(n))$, then $g(n)$ is both an upper bound and a lower bound for $f(n)$; we call it a tight bound (see the worked example after this list).
- A function may have many upper bounds and many lower bounds. Within a restricted family of candidate bounds (say, the polynomials $n^k$), a function may have no upper bound at all (for example, $2^n$ grows faster than every polynomial) and likewise may have no lower bound.
- The least (smallest) upper bound, also called the supremum, is the upper bound that lies closest to the function.
- The greatest lower bound, also called the infimum, is the lower bound that lies closest to the function.
- If your current program has a running time of $g(n)$, and you then discover an algorithm whose running time $f(n)$ satisfies $f(n) = \mathcal{O}(g(n))$ (ideally $f(n) = o(g(n))$), you should replace the $g(n)$ code with the $f(n)$ code. Because $f(n)$ grows no faster than $g(n)$, and strictly slower in the little-$o$ case, the $f(n)$ program will run at least as fast as the $g(n)$ one on large inputs. Conclusion: code corresponding to a function that grows slower results in an algorithm that runs faster, as the sketch after this list illustrates.
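As a concrete worked example (the function $f(n) = 3n^2 + 5n$ is an illustrative choice, not one taken from the course), all of the statements below hold:

$$
\begin{aligned}
3n^2 + 5n &= \mathcal{O}(n^3), \quad 3n^2 + 5n = o(n^3) && \text{(upper bounds, not tight)} \\
3n^2 + 5n &= \Omega(n), \quad 3n^2 + 5n = \omega(n) && \text{(lower bounds, not tight)} \\
3n^2 + 5n &= \Theta(n^2), \quad \text{since } 3n^2 \le 3n^2 + 5n \le 8n^2 \text{ for all } n \ge 1 && \text{(a tight bound)}
\end{aligned}
$$

Among the upper bounds listed, $n^2$ is the smallest (the one the "least upper bound" idea points at), and among the lower bounds listed, $n^2$ is the largest.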
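To make the last point concrete, here is a minimal Python sketch; the duplicate-detection problem, the function names `has_duplicate_quadratic` and `has_duplicate_linear`, and the input size are illustrative choices, not anything from the course. Both functions answer the same question, but the $\Theta(n^2)$ version is the one you would want to replace with the $\mathcal{O}(n)$ version.

```python
import random
import time


def has_duplicate_quadratic(items):
    """Check for a repeated element by comparing every pair: Theta(n^2) comparisons in the worst case."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False


def has_duplicate_linear(items):
    """Check for a repeated element with a set: O(n) expected time."""
    seen = set()
    for x in items:
        if x in seen:          # expected O(1) membership test
            return True
        seen.add(x)
    return False


if __name__ == "__main__":
    # Illustrative input: 5000 distinct values, the worst case for both versions
    # (neither can stop early, so both do their full amount of work).
    data = random.sample(range(10**6), 5000)

    for func in (has_duplicate_quadratic, has_duplicate_linear):
        start = time.perf_counter()
        result = func(data)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__}: result={result}, time={elapsed:.4f}s")
```

Even at a few thousand elements the gap between the two running times is visible, and it widens as $n$ grows, which is exactly what the asymptotic comparison predicts.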