Big-O notation describes how fast a function grows as $n$ becomes very large, which lets us compare algorithms by their efficiency. The comparison rules are:
- $f(n) = \mathcal{O}(g(n))$ means that $f(n)$ grows no faster than $g(n)$, up to constant factors. Examples: $n^2 = \mathcal{O}(n^2)$, $100 n^2 = \mathcal{O}(n^2)$, $n^2 + n + 1 = \mathcal{O}(n^2)$, $n^2 = \mathcal{O}(n^3)$, and $n^2 = \mathcal{O}(2^n)$.
- $f(n) = o(g(n))$ means that $f(n)$ grows strictly slower than $g(n)$. Examples: $n^2 = o(n^3)$, $1000n^2 + 500n + 2 = o(n^3)$, $n^2 = o(n^4)$, $n^2 = o(2^n)$, $100 = o(\log_2 n)$, $\log_2 n = o(n)$, and $n = o(n^2)$.
- $f(n) = \Omega(g(n))$ means that $f(n)$ grows at least as fast as $g(n)$, up to constant factors. Examples: $n^2 = \Omega(n^2)$, $100 n^2 = \Omega(n^2)$, $n^2 + n + 1 = \Omega(n^2)$, $n^3 = \Omega(n^2)$, and $2^n = \Omega(n^2)$.
- $f(n) = \omega(g(n))$ means that $f(n)$ grows strictly faster than $g(n)$. Examples: $n^5 = \omega(n^2)$, $n^4 + 300 = \omega(n)$, $2^n = \omega(n^2)$, $\log_2 n = \omega(1)$, $n = \omega(\log_2 n)$, and $n^2 = \omega(n)$.
- $f(n) = \Theta(g(n))$ means that $f(n)$ and $g(n)$ grow at the same rate, up to constant factors. Examples: $1000 = \Theta(1)$, $n^2 + 4n + 4 = \Theta(n^2)$, $n^5 + n^3 = \Theta(n^5)$, $e^n + n^2 = \Theta(e^n)$, and $\log_{10} n = \Theta(\log_2 n)$.
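A convenient way to verify these relations, when the limit of the ratio $f(n)/g(n)$ exists, is the standard limit test (a sufficient condition, not part of the definitions above):

\[
\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0 \;\Rightarrow\; f(n) = o(g(n)), \qquad
\lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty \;\Rightarrow\; f(n) = \omega(g(n)), \qquad
\lim_{n \to \infty} \frac{f(n)}{g(n)} = c \text{ with } 0 < c < \infty \;\Rightarrow\; f(n) = \Theta(g(n)).
\]

For instance, $\lim_{n \to \infty} \frac{\log_{10} n}{\log_2 n} = \log_{10} 2 \approx 0.301$, a positive constant, which confirms $\log_{10} n = \Theta(\log_2 n)$.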
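These growth comparisons can also be checked numerically by evaluating the ratio $f(n)/g(n)$ at a large value of $n$. The sketch below (Python, with hypothetical helper `ratio`) illustrates a few of the example pairs from the list: a ratio shrinking toward $0$ signals $o$, a ratio blowing up signals $\omega$, and a ratio settling near a positive constant signals $\Theta$.

```python
import math

def ratio(f, g, n):
    """Evaluate f(n)/g(n) at a single large n (illustrative helper)."""
    return f(n) / g(n)

# n^2 = o(n^3): the ratio n^2/n^3 = 1/n shrinks toward 0.
print(ratio(lambda n: n**2, lambda n: n**3, 10**6))

# n^2 + n + 1 = Theta(n^2): the ratio approaches the constant 1.
print(ratio(lambda n: n**2 + n + 1, lambda n: n**2, 10**6))

# 2^n = omega(n^2): the ratio grows without bound.
print(ratio(lambda n: 2**n, lambda n: n**2, 100))

# log10(n) = Theta(log2(n)): the ratio is the constant log10(2) ~ 0.301.
print(ratio(lambda n: math.log10(n), lambda n: math.log2(n), 10**6))
```

Note that a single large $n$ is only suggestive; the formal statements are about the limiting behavior as $n \to \infty$, and for slowly diverging ratios (e.g. $n / \log_2 n$) one may need very large $n$ to see the trend clearly.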