Recall that a cache is a small, RAM-like memory placed between the CPU and RAM, serving as faster storage than RAM itself.
We have already learned a few memory locality principles, which tell us that if we access a certain variable in main memory, there's a high chance we'll soon access nearby data as well.
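The locality idea above can be sketched with a tiny loop. This is an illustration, not a measurement: the array name and values are made up, and in a real program the benefit comes from the hardware fetching a whole cache line of adjacent elements at once.

```python
# Spatial locality sketch: a sequential loop touches elements that sit
# next to each other (conceptually contiguous in memory), so once the
# hardware fetches data[0]'s cache line, data[1], data[2], ... are
# likely already in cache.

data = list(range(8))       # hypothetical array of neighboring values
total = 0
for i in range(len(data)):
    total += data[i]        # each access is adjacent to the previous one
print(total)                # → 28
```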
We will now learn three different cache data organization methods, called cache schemes, which instruct the CPU where to find data in the cache.
Depending on which of these schemes a particular computer uses, the CPU knows how to check whether a given variable or program instruction is currently stored in the cache. If it is found there (a cache hit) - hurray! We don't need to spend time fetching it from RAM, which takes longer to access than the cache. Otherwise (a cache miss), we have no choice but to get it from RAM.
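The hit/miss decision described above can be sketched in a few lines. This is a simplified model under assumed names (`access`, `cache`, `ram` are all illustrative): it only shows the lookup-then-fallback logic, while the cache schemes covered next determine *where* in the cache each address may live.

```python
# Minimal sketch of a cache lookup: check the cache first; on a miss,
# fall back to (slower) RAM and keep a copy for future accesses.

def access(address, cache, ram):
    """Return (value, outcome) for a memory access."""
    if address in cache:            # cache hit: fast path
        return cache[address], "hit"
    value = ram[address]            # cache miss: slow path, go to RAM
    cache[address] = value          # cache the value for next time
    return value, "miss"

ram = {0x10: 42, 0x14: 7}           # hypothetical memory contents
cache = {}
print(access(0x10, cache, ram))     # first access: (42, 'miss')
print(access(0x10, cache, ram))     # repeated access: (42, 'hit')
```

Note that real caches don't grow without bound like this dictionary does; a scheme also decides which existing entry to evict when the cache is full.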