Systems With Predictable Caching

"The machine prefers cash." -- Soldier in the Rain
These days machines would probably prefer cache, but for 1963 this was a reasonably accurate and remarkably concise prophecy for computer memory. Current computer architectures rely heavily on cache memory. Integrated with the processor on a single large chip, caches enable the processor to operate at high speed: most instructions and data can be rapidly accessed from the caches instead of from main memory, which is usually at least ten times slower. On-chip caches have grown steadily in size over the last decade, and now represent a significant proportion of the cost and power consumption of the processor chip.
Although large caches normally offer better performance than small ones, performance is not directly related to the size of the cache. Moreover, the trend towards optimising for the average case makes it far more difficult to reason about the performance of computer systems, and to give an accurate prediction of the performance of specific programs.
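A minimal sketch of why performance is not a simple function of cache size: the simulator below (the function name and parameters are illustrative, not from the report) models a direct-mapped cache and counts misses for two access patterns that touch the same amount of data through the same cache, yet behave very differently.

```python
def direct_mapped_misses(addresses, num_lines, words_per_line):
    """Count misses for a word-address trace on a direct-mapped cache."""
    cache = [None] * num_lines          # block tag stored in each cache line
    misses = 0
    for addr in addresses:
        block = addr // words_per_line  # which memory block this word is in
        index = block % num_lines       # direct mapping: each block has one line
        if cache[index] != block:       # empty line or tag mismatch: a miss
            cache[index] = block
            misses += 1
    return misses

# Same cache (16 lines of 4 words), same number of accesses:
sequential = list(range(64))            # walk 64 consecutive words
pingpong   = [0, 64] * 32               # alternate between two conflicting words

print(direct_mapped_misses(sequential, 16, 4))  # 16 compulsory misses
print(direct_mapped_misses(pingpong, 16, 4))    # 64: every access conflicts
```

The sequential walk misses only once per cache line, while the alternating pattern maps both words to the same line and misses on every access; a larger cache of the same organisation would not help the second pattern at all, which is why average-case optimisations make per-program prediction hard.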
Henk Muller, David May, James Irwin, and Dan Page. Novel Caches for Predictable Computing. Technical Report CSTR-98-011, Department of Computer Science, University of Bristol, October 1998.
Dan Page, James Irwin, Henk Muller (Henk.Muller@bristol.ac.uk), David May (David.May@bristol.ac.uk). Last modified on Thursday 12 August 1999 at 13:04. © 1999 University of Bristol