Cache retry request queue
Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories
Reexamination Certificate
2000-12-18
2004-05-04
Thai, Tuan V. (Department: 2186)
C711S140000, C711S154000
active
06732236
ABSTRACT:
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates in general to cache memory and more particularly to handling cache misses.
2. Description of Related Art
In general, packet processors have cache memory closely coupled to Execution Units (EUs) in an effort to speed access to memory information stored in main memory. However, cache memory only holds a fraction of the content that can be stored in main memory. Thus, the cache memory is constantly replacing its contents with information from the main memory to remain current with incoming access requests from one or more EUs.
Although the cache memory tries to remain current with incoming access requests, at some point a memory location referenced by an EU load or store may not be in the cache memory, resulting in a cache miss. A cache miss triggers a refill operation that may take several clock cycles to complete. Meanwhile, one or more access requests from the EUs may hit on the cache line with the pending refill, causing further cache misses.
One earlier solution to avoiding cache misses on a cache line with a pending refill is simply to stall the EUs until the pending refill on that line is completely processed. However, stalling the EUs from transmitting access requests has a negative performance impact, especially when the cache memory is shared by multiple EUs. A second earlier solution is to treat a subsequent cache miss to a cache line with a pending refill as a normal cache miss, so a fill request for the subsequent miss is sent out to memory. The problem with this solution is that the subsequent fill request is redundant with the already pending refill.
A third earlier solution to avoiding cache misses on a cache line with a pending refill is to send a signal back to the requesting EU rejecting the access request. The problem with this prior art solution is that each EU then requires extra logic to track all of its outstanding access requests; this extra logic consumes more area and makes the EU design considerably more complex. It has therefore become desirable for the cache memory to continue serving subsequent EU access requests to any and all memory locations while a cache fill is in progress, without stalling the EUs. As will be disclosed in more detail below, the present invention advantageously achieves these and other desirable results.
SUMMARY OF THE INVENTION
In accordance with one aspect of the present invention, an access request associated with a cache miss to a single cache line having a pending cache fill can be handled in a non-blocking manner by storing the cache miss in a retry queue while the cache fill is pending. The retry queue then detects the return of the cache fill and inserts the access request associated with the cache miss onto the cache pipeline for processing.
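The retry-queue scheme described above can be illustrated with a small sketch. This is not the patent's implementation; all class and method names below are illustrative assumptions. It models the two behaviors the summary describes: a miss to a line with a fill already in flight is parked in a retry queue rather than issuing a redundant fill or stalling the requester, and when the fill returns, the parked requests are replayed onto the cache pipeline.

```python
from collections import deque

class NonBlockingCache:
    """Toy model of a cache retry queue (names are illustrative,
    not taken from the patent)."""

    def __init__(self):
        self.lines = {}             # line address -> filled data
        self.pending_fills = set()  # line addresses with a fill in flight
        self.retry_queue = deque()  # (line, request) pairs awaiting a fill
        self.fill_requests = []     # fills actually sent to main memory

    def access(self, line, request):
        if line in self.lines:
            return f"hit: {request}"
        if line in self.pending_fills:
            # Subsequent miss on a line with a pending fill:
            # park the request instead of stalling or re-filling.
            self.retry_queue.append((line, request))
            return "queued for retry"
        # First miss: issue exactly one fill to main memory.
        self.pending_fills.add(line)
        self.fill_requests.append(line)
        return "miss: fill issued"

    def fill_return(self, line, data):
        # Fill data arrives: install the line, then replay any parked
        # requests for that line back onto the cache pipeline.
        self.lines[line] = data
        self.pending_fills.discard(line)
        replayed = [req for (l, req) in self.retry_queue if l == line]
        self.retry_queue = deque(
            (l, r) for (l, r) in self.retry_queue if l != line
        )
        return [self.access(line, req) for req in replayed]
```

In this sketch only one fill request reaches main memory per line, regardless of how many EU requests miss on that line while the fill is pending, and none of the requesters is stalled or rejected.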
REFERENCES:
patent: 5317720 (1994-05-01), Stamm et al.
patent: 5333296 (1994-07-01), Bouchard et al.
patent: 5404483 (1995-04-01), Stamm et al.
patent: 5432918 (1995-07-01), Stamm
patent: 5765199 (1998-06-01), Chang et al.
patent: 5809320 (1998-09-01), Jain et al.
patent: 6145054 (2000-11-01), Mehrotra et al.
patent: 6148372 (2000-11-01), Mehrotra et al.
patent: 6339813 (2002-01-01), Smith, III et al.
Morrison & Foerster LLP
Redback Networks Inc.