Range and Accuracy

Range is defined as the difference between the largest and smallest numbers that can be represented in a given format.
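As a concrete illustration, the limits in the C standard header `<float.h>` report the smallest and largest positive normalized values for the `float` and `double` formats; a minimal sketch (the exact printed values depend on your platform's floating-point formats):

```c
#include <stdio.h>
#include <float.h>

int main(void) {
    /* Smallest positive normalized value and largest finite value
       for single and double precision, as reported by <float.h>. */
    printf("float  range: %e to %e\n", FLT_MIN, FLT_MAX);
    printf("double range: %e to %e\n", DBL_MIN, DBL_MAX);
    return 0;
}
```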

Accuracy tells how close a representable number is to the exact value it is meant to approximate.

Unfortunately, because computers can store only a finite amount of data (our disks don't have infinite capacity), we can't represent every number out there in the universe. For example, you can't store the number $\pi \approx 3.14159265\dots$ exactly, because it has an infinite number of digits after the decimal point.
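A small sketch of this limitation: if we write out $\pi$ to twenty decimal places and store it in a `double`, only the first sixteen digits or so survive; the rest are an artifact of the nearest representable value (the digit count is an assumption based on typical 64-bit IEEE 754 doubles, not something stated above):

```c
#include <stdio.h>

int main(void) {
    /* A double can hold only a finite approximation of pi;
       digits printed past roughly the 16th significant digit
       come from the nearest representable value, not from pi. */
    double pi = 3.14159265358979323846;
    printf("%.20f\n", pi);
    return 0;
}
```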