These concepts hold because of how computer programs are typically written. Related data is generally stored in consecutive memory locations, and a common pattern is to process one item, then move on to the next. If an item undergoes substantial processing, it is accessed more than once, producing temporal locality of reference. Moving on to the next item means a nearby memory location will be read; since memory is typically read in batches, this produces spatial locality of reference.
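The item-by-item pattern described above can be sketched with a simple summation loop. This is a hypothetical illustration: in C the array would be contiguous in memory, while a Python list stores references, so the layout comments describe the general principle rather than Python's internals.

```python
# Hypothetical sketch: summing an array exhibits both kinds of locality.
data = list(range(100))

total = 0
for i in range(len(data)):
    # Temporal locality: `total` is read and written on every iteration,
    # so it stays in fast storage (a register or cache) the whole time.
    # Spatial locality: data[i] is adjacent to data[i - 1], so it is likely
    # already in the memory batch (cache line) fetched for the previous item.
    total += data[i]

print(total)  # sum of 0..99 = 4950
```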
Increasing and exploiting locality of reference are common optimization techniques. Locality can be increased by storing related data together, for example in arrays, or by reducing code size. A cache is a simple example of exploiting locality of reference: it is a specially designed memory area that is faster but smaller than main memory, generally used to keep recently referenced data and data near recently referenced data, which can substantially improve performance.
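Why a cache rewards locality can be made concrete with a toy model. The sketch below simulates a direct-mapped cache with made-up parameters (`BLOCK_SIZE`, `NUM_BLOCKS` are assumptions, not any real hardware's values) and compares hit rates for access patterns with and without locality.

```python
# Minimal sketch of a direct-mapped cache, with hypothetical parameters,
# showing how locality of reference translates into cache hits.

BLOCK_SIZE = 8    # addresses per cache block (the "batch" read from memory)
NUM_BLOCKS = 16   # number of blocks the cache can hold

def hit_rate(addresses):
    """Simulate the cache over an address trace; return the fraction of hits."""
    cache = {}  # cache slot index -> block currently stored there
    hits = 0
    for addr in addresses:
        block = addr // BLOCK_SIZE     # which memory block holds this address
        index = block % NUM_BLOCKS     # which cache slot that block maps to
        if cache.get(index) == block:  # block already cached: a hit
            hits += 1
        else:
            cache[index] = block       # miss: fetch the whole block
    return hits / len(addresses)

# Temporal locality: after the first miss, every repeated access hits.
repeated = hit_rate([42] * 100)                         # 0.99

# Spatial locality: each miss brings in the next 7 addresses for free.
sequential = hit_rate(range(1024))                      # 0.875

# No locality: a stride of one block per access never reuses cached data.
strided = hit_rate(range(0, 1024 * BLOCK_SIZE, BLOCK_SIZE))  # 0.0
```

The contrast between the sequential and strided traces is the whole story: both touch the same number of addresses, but only the sequential one benefits from the batch fetch.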