Least recently used replacement method with protection

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories

Reexamination Certificate


Details

C711S129000, C711S156000, C711S159000, C711S160000

Reexamination Certificate

active

06393525

ABSTRACT:

FIELD OF THE INVENTION
The invention relates generally to replacement of cached data in an electronic computer system, and more specifically to a least-recently-used replacement method that protects against undesirable replacement.
BACKGROUND OF THE INVENTION
Cache memory is often used in high-performance computing systems, where high processor speeds demand memory fast enough to supply data and instructions without stalling the processor. The faster a processor operates, the more quickly it must retrieve data and instructions from memory. This requires memory that can be accessed quickly, and because of the very high clock speeds involved, the memory is typically located in close proximity to the processor. But fast memory capable of operating as a cache is expensive, and locating large amounts of such memory close to the processor is often impractical. Therefore, a limited number of banks of cache memory, separate from the larger main system memory, are often placed near the processor core.
This cache memory typically consists of high-speed memory, such as static random access memory (SRAM), and a cache controller. The controller manages the copying of data from the relatively slow main memory into the cache based on a prediction of what data the processor is likely to need soon. The cache often comprises between one percent and ten percent of the total system memory, but this proportion may vary over a greater range depending in part on the predictability of the computing system's memory access patterns.
Because successive memory accesses typically fall within a relatively small range of memory addresses, storing the most frequently accessed data in a cache can significantly improve system performance. When the processor finds the data it needs in the relatively fast cache memory, it avoids waiting while the data is fetched from the slower main memory; such an access is referred to as a cache hit. When the data the processor needs is not in the cache and must be retrieved from main memory, the access is said to be a cache miss.
The degree to which the cache speeds up memory access can be measured by the fraction of memory requests that are cache hits rather than cache misses. The goal of the cache controller designer is to place in the cache the data most likely to be needed by the processor, maximizing the ratio of cache hits to cache misses. By employing such a scheme, the system can derive much of the benefit of having a high-speed memory, while reducing overall system cost by storing most data in relatively inexpensive lower-speed memory.
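The hit ratio described above is straightforward to measure in simulation. The following sketch (plain Python, with an ordinary LRU eviction policy standing in for the cache controller; the function name and cache model are illustrative, not from the patent) replays a sequence of addresses against a small fully associative cache and reports the fraction of requests that hit:

```python
from collections import OrderedDict

def measure_hit_ratio(accesses, capacity):
    """Replay `accesses` (a sequence of addresses) against a cache
    holding `capacity` lines, evicting by LRU, and return the hit ratio."""
    cache = OrderedDict()  # keys are addresses; insertion order tracks recency
    hits = misses = 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # mark as most recently used
        else:
            misses += 1
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used line
            cache[addr] = True
    return hits / (hits + misses)
```

For example, replaying the address stream 1, 2, 1, 3, 1, 2 against a two-line cache yields two hits out of six requests.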
One such scheme for replacing less-often-used data with more-often-used data is simply to note which data stored in the cache is least recently used (LRU), and to replace that data as new data is loaded into the cache. Such a method typically employs a number of LRU bits associated with each block of data stored in the cache, which indicate how long it has been since that block was last accessed. Because cache capacity is limited, this scheme may displace data from a large working set before it is needed again. Further, the LRU algorithm may replace data that is used somewhat frequently with data that has been used more recently, even though the more recently used data is accessed much less frequently or only once. A method is therefore needed to ensure that data more likely to be requested again is not replaced by data that was used more recently but less frequently.
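The LRU bits described above can be modeled as a small age counter per block within a cache set. The sketch below is illustrative only, not the patented mechanism: on each access every block's counter is incremented and the touched block's counter is reset to zero, so the block with the largest counter is always the least-recently-used victim.

```python
class LRUSet:
    """One cache set with per-block age counters modeling the LRU bits.
    Higher age means less recently used; the oldest block is the victim."""

    def __init__(self, num_ways):
        self.tags = [None] * num_ways
        self.age = [0] * num_ways

    def access(self, tag):
        """Look up `tag`; on a miss, replace the LRU block. Returns hit/miss."""
        if tag in self.tags:
            way = self.tags.index(tag)          # hit: found in this set
            hit = True
        else:
            # miss: victim is the way with the largest age counter
            way = max(range(len(self.age)), key=lambda w: self.age[w])
            self.tags[way] = tag
            hit = False
        for w in range(len(self.age)):          # age every block...
            self.age[w] += 1
        self.age[way] = 0                       # ...then reset the one touched
        return hit
```

Note that this exhibits exactly the weakness the passage above describes: a burst of new, never-reused tags will evict a block that was being accessed regularly.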
SUMMARY OF THE INVENTION
A method of storing data in a cache memory system is provided. A cache data entry list records cache data entries stored in cache memory, and is divided into a filter list and a reuse list. The filter list is populated by storing all new cache data entries in the filter list, and the reuse list is populated by selectively promoting cache data entries from the filter list. Elements are evicted from the filter list and reuse list by operation of a protection process.
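As a rough illustration of the filter/reuse organization summarized above (the list sizes and the promote-on-first-reuse rule here are assumptions, not the patent's exact protection process), a segmented cache might be sketched as follows. Each list evicts its own least-recently-used entry, so a burst of one-time data passing through the filter list cannot evict entries that have already demonstrated reuse:

```python
from collections import OrderedDict

class ProtectedLRUCache:
    """Sketch of a two-segment cache list: new entries land in the filter
    list; an entry hit again while in the filter list is promoted to the
    reuse list. Each list evicts its own LRU entry when full."""

    def __init__(self, filter_size, reuse_size):
        self.filter = OrderedDict()
        self.reuse = OrderedDict()
        self.filter_size = filter_size
        self.reuse_size = reuse_size

    def access(self, key):
        """Returns True on a hit (in either list), False on a miss."""
        if key in self.reuse:
            self.reuse.move_to_end(key)          # refresh recency in reuse list
            return True
        if key in self.filter:
            del self.filter[key]                 # promote: filter -> reuse
            if len(self.reuse) >= self.reuse_size:
                self.reuse.popitem(last=False)   # evict reuse-list LRU
            self.reuse[key] = True
            return True
        if len(self.filter) >= self.filter_size: # miss: new entry joins filter
            self.filter.popitem(last=False)      # evict filter-list LRU
        self.filter[key] = True
        return False
```

With this split, streaming several new keys through the cache churns only the filter list, while a previously promoted key remains protected in the reuse list.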


REFERENCES:
patent: 5778442 (1998-07-01), Ezzat et al.
patent: 5974507 (1999-10-01), Arimilli et al.
patent: 6105111 (2000-08-01), Hammarlund et al.
patent: 6138213 (2000-10-01), McMinn
“Method for improving least recently used buffer performance using multiple stack insertion points based on data types”, IBM Technical Disclosure Bulletin, 36 (8), pp. 479-480, (Aug. 1993).
Karedla, R., et al., “Caching strategies to improve disk system performance”, Computer, 27 (3), pp. 38-46, (Mar. 1994).
