Classification: Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories
Other classes: C711S134000, C711S159000
Type: Reexamination Certificate
Filed: 2001-06-28
Issued: 2004-08-31
Examiner: Portka, Gary (Department: 2188)
Status: active
Patent number: 06785770
ABSTRACT:
The invention relates to a data processing apparatus with a cache memory, to an integrated circuit for use in such an apparatus, and to a method of operating such an apparatus.
Cache memories are well known in the art, for example from U.S. Pat. No. 5,737,752. They serve to bridge the gap between the operating speed of a processor and that of a main memory. The cache memory has a higher operating speed than the main memory. When the processor uses (or is expected to use) data from main memory, a copy of that data is stored in the cache memory. Thus, when the processor needs the data, the data can be fetched from the cache memory faster than it could be fetched from main memory.
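As an illustration only (not taken from the patent), the lookup path described above can be sketched in C as a small direct-mapped cache: a hit is served from the fast cache array, while a miss first copies the word in from the slower main memory. The sizes, names and the main_memory_read stand-in are assumptions made for this sketch.

#include <stdint.h>
#include <stdbool.h>

#define CACHE_LINES 256                      /* hypothetical cache size      */

typedef struct {
    bool     valid;
    uint32_t tag;                            /* which memory block is cached */
    uint32_t data;                           /* one word per line, for brevity */
} cache_line_t;

static cache_line_t cache[CACHE_LINES];
static uint32_t main_memory[1 << 16];        /* stand-in for slow main memory */

static uint32_t main_memory_read(uint32_t addr)
{
    return main_memory[addr & 0xFFFFu];      /* simplified slow path          */
}

uint32_t cached_read(uint32_t addr)
{
    uint32_t index = addr % CACHE_LINES;
    uint32_t tag   = addr / CACHE_LINES;
    cache_line_t *line = &cache[index];

    if (line->valid && line->tag == tag)
        return line->data;                   /* hit: served at cache speed    */

    line->data  = main_memory_read(addr);    /* miss: fetch from main memory  */
    line->tag   = tag;
    line->valid = true;
    return line->data;
}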
Conventionally, cache memories have been used between microprocessors and large off-chip DRAM memories. The cache is an SRAM, preferably incorporated on the same chip as the microprocessor. Current technology allows the manufacture of processors and SRAMs that are orders of magnitude faster than DRAMs. Cache memory has also been used for on-chip main memory, i.e. for integrated circuits in which the processor, the main memory and the cache are all integrated in the same integrated circuit. In this case the main memory is also slow, for example because the on-chip main memory is optimized to reduce silicon area rather than to maximize speed, and here too the cache is a faster memory that bridges the resulting gap in operating speed with the processor. A cache memory may even be applied to a processing apparatus where a first part of the main memory is on-chip together with the processor and the cache memory, and a second part of the main memory (still in the same memory space as the first part) is off-chip. In this case the cache memory is used to bridge two different gaps in operating speed: the gap between the processor and the first part of main memory, and the gap between the processor and the second part of main memory, the latter gap generally being larger than the former.
Cache memory is generally much smaller than main memory. When data from a main memory location is to be stored in the cache and no free cache location is available, data held in the cache for another main memory location has to be overwritten. A cache management unit selects which cache location will be used; the selection depends on the cache replacement strategy. One well-known strategy is the Least Recently Used (LRU) strategy, which selects the cache location that was least recently used by the processor. U.S. Pat. No. 5,737,752 describes a cache that uses this replacement mechanism or another, user-selectable replacement mechanism; as an example of a user-selectable replacement mechanism, that patent mentions random selection.
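A minimal sketch of LRU victim selection as described above, assuming a set-associative cache that keeps a per-way timestamp of the last access; this is illustrative C, not the implementation of U.S. Pat. No. 5,737,752.

#include <stdint.h>

#define WAYS 4                               /* hypothetical associativity   */

typedef struct {
    uint64_t last_used;                      /* timestamp of last access     */
    /* tag, data and valid bit omitted for brevity */
} way_t;

/* Return the index of the least recently used way in one cache set. */
int select_lru_victim(const way_t set[WAYS])
{
    int victim = 0;
    for (int w = 1; w < WAYS; w++)
        if (set[w].last_used < set[victim].last_used)
            victim = w;
    return victim;
}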
Amongst others, it is an object of the invention to provide a processing apparatus and method of operating a processing apparatus with an improved cache replacement strategy.
The method according to the invention is set forth in claim 1. According to the invention, the cache replacement strategy is made dependent on the access latency of the main memory locations for which data is in the cache. When it is necessary to create room in the cache for storing new data, the memory management unit preferably selects a location that is occupied by data for a main memory location that has a shorter access latency than the other main memory locations for which data is in the cache. Thus, the differences in access latency of the main memory are exploited: less time is expected to be lost if the replaced data has to be fetched again from main memory. As an example, the access latency of different memory locations may differ because part of the main memory locations are in an on-chip DRAM and part are in an off-chip DRAM.
REFERENCES:
patent: 5737752 (1998-04-01), Hilditch
patent: 5943687 (1999-08-01), Liedburg
patent: 6272598 (2001-08-01), Arlitt et al.
patent: 6385699 (2002-05-01), Bozman et al.
patent: 6408362 (2002-06-01), Arimilli et al.
patent: 2345987 (2000-07-01), None
Inventors: Hoogerbrugge Jan; Stravers Paul
Assignee: Koninklijke Philips Electronics, N.V.
Attorney/Agent: Ure Michael J.