Method and apparatus for managing cache line replacement...

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories

Reexamination Certificate


Details

U.S. classifications: C711S135000, C711S136000, C711S159000, C711S160000
Type: Reexamination Certificate
Status: active
Patent number: 06510493

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Technical Field
The present invention relates to data storage in general and, in particular, to a cache memory for storing data within a computer system. Still more particularly, the present invention relates to a mechanism for managing cache line replacement within a computer system.
2. Description of the Prior Art
Many high-performance data processing systems include both a system memory and a cache memory. A cache memory is a relatively high-speed memory that stores a copy of information that is also stored in one or more portions of the system memory. A cache memory typically includes a directory for storing address tags and a cache entry array for storing instructions or data. A compare match of an incoming address with one of the tags within the directory indicates a cache “hit;” otherwise, the access is a cache “miss.”
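The tag-compare mechanism described above can be sketched as follows. This is a minimal illustration only; the class and names (`Cache`, `lookup`, `fill`) and the line size and line count are assumptions for the example, not details from the patent.

```python
LINE_SIZE = 64   # bytes per cache line (assumed)
NUM_LINES = 4    # small fully associative cache (assumed)

class Cache:
    def __init__(self):
        # The "directory": one address tag per cache line.
        self.tags = [None] * NUM_LINES

    def lookup(self, address):
        # A compare match of the incoming address tag against
        # the directory indicates a hit; no match is a miss.
        tag = address // LINE_SIZE
        return tag in self.tags

    def fill(self, address, way):
        self.tags[way] = address // LINE_SIZE

cache = Cache()
cache.fill(0x1000, 0)
print(cache.lookup(0x1020))  # True: same cache line, tag matches
print(cache.lookup(0x2000))  # False: no matching tag, a miss
```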
Typically, if there is a cache miss when all the cache lines within the cache memory are filled, one of the cache lines must be selected for replacement. Many cache replacement algorithms are well-known to those skilled in the relevant art, but none of them provides optimal results on all software applications. For example, a least-recently used (LRU) replacement algorithm evicts data whose re-use latency is relatively long. Consequently, if data is re-used only between streams of data large enough to completely fill the cache memory, that data will be flushed from the cache memory, forcing it to be reloaded after each large data stream. It is therefore desirable to provide an improved mechanism for managing cache line replacement within a computer system such that optimal results can be achieved for a larger group of software applications.
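The flushing problem described above can be demonstrated with a small LRU cache of tags. The class and names (`LRUCache`, `access`) and the four-line capacity are illustrative assumptions, not taken from the patent.

```python
from collections import OrderedDict

CAPACITY = 4  # cache lines (assumed)

class LRUCache:
    def __init__(self):
        self.lines = OrderedDict()  # tag -> None, ordered by recency

    def access(self, tag):
        hit = tag in self.lines
        if hit:
            self.lines.move_to_end(tag)        # mark most recently used
        else:
            if len(self.lines) >= CAPACITY:
                self.lines.popitem(last=False)  # evict least recently used
            self.lines[tag] = None
        return hit

cache = LRUCache()
cache.access("reused")          # load data we will want again later
for tag in ("s0", "s1", "s2", "s3"):
    cache.access(tag)           # a stream large enough to fill the cache
print(cache.access("reused"))   # False: the stream flushed the re-used data
```

Even though "reused" is needed again, pure LRU evicts it because the intervening stream makes every streamed line more recently used.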
SUMMARY OF THE INVENTION
In accordance with a preferred embodiment of the present invention, a cache memory comprises multiple cache lines that are partitioned into a first group and a second group. The number of cache lines in the second group is preferably larger than the number of cache lines in the first group. A replacement logic block selectively chooses a cache line from one of the two groups of cache lines for replacement during an allocation cycle.
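The partitioning in the preferred embodiment can be sketched as below. The group sizes and the selection policy shown (directing replacements to the larger second group unless told otherwise) are assumptions for illustration; the patent text itself only states that a replacement logic block selectively chooses a victim from one of the two groups.

```python
import random

FIRST_GROUP = [0, 1]               # smaller group of cache line indices (assumed)
SECOND_GROUP = [2, 3, 4, 5, 6, 7]  # larger second group, per the summary

def choose_victim(use_second_group=True):
    """Replacement logic block: selectively choose a cache line
    from one of the two groups for replacement on an allocation."""
    group = SECOND_GROUP if use_second_group else FIRST_GROUP
    return random.choice(group)

victim = choose_victim()
print(victim in SECOND_GROUP)  # True: lines in the first group are spared
```

By steering streaming allocations into the larger group, lines kept in the smaller group can survive a large data stream instead of being flushed as under pure LRU.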
All objects, features, and advantages of the present invention will become apparent in the following detailed written description.


REFERENCES:
patent: 4928239 (1990-05-01), Baum et al.
patent: 5249282 (1993-09-01), Segers
patent: 5353425 (1994-10-01), Malamy et al.
patent: 5369753 (1994-11-01), Tipley
patent: 5465342 (1995-11-01), Walsh
patent: 5481691 (1996-01-01), Day et al.
patent: 5564035 (1996-10-01), Lai
patent: 5666482 (1997-09-01), McClure
patent: 5737749 (1998-04-01), Patel et al.
patent: 6138213 (2000-10-01), McMinn
patent: 6260114 (2001-07-01), Schug
patent: 6272598 (2001-08-01), Arlitt et al.

