Directory cache for indirectly addressed main memory
Type: Reexamination Certificate
Filed: 1999-02-22
Issued: 2002-03-05
Examiner: Nguyen, Hiep T. (Department: 2187)
Classification: Electrical computers and digital processing systems: memory – Addressing combined with specific memory configuration or... – Addressing cache memories
US Classification: C711S118000
Status: active
Patent number: 06353871
ABSTRACT:
FIELD OF THE INVENTION
The present invention relates generally to memory addressing schemes in computer systems, and specifically to a directory system and method for reducing memory access latency in systems where CPU-generated real addresses are translated, by means of a directory structure, into accesses to the data contents of main memory.
BACKGROUND OF THE INVENTION
An emerging development in computer organization is the use of data compression in a computer system's main memory, such that each cache line may be compressed before being stored in main memory. As a result, cache lines, which in conventional computer systems without main memory compression occupy a uniform, fixed amount of space in main memory, now occupy varying amounts of space.
Techniques for efficiently storing and accessing variable-size cache lines in main memory can be found in U.S. Pat. No. 5,761,536, and in co-pending U.S. patent application Ser. No. 08/603,976, entitled “COMPRESSION STORE ADDRESSING”, both assigned to the assignee of the present invention. The techniques require the use of a directory structure, in which real memory addresses generated by the CPU(s) (or processors) of the computer system are used to index into the directory, which is then used to find the main memory contents containing the compressed data. In contrast, in conventional computer systems that do not use compressed main memory, real memory addresses are used directly as main memory addresses. An example of a compressed main memory system and directory structure is now described with reference to FIGS. 1-2.
FIG. 1 shows the overall structure of an example computer system using compressed main memory. A CPU 102 reads and writes data from a cache 104. Cache misses and stores result in reads and writes to the compressed main memory 108 by means of a compression controller 106.
FIG. 2 shows in more detail the structure of the cache 104, the components of the compression controller 106, and the compressed main memory 108 of FIG. 1. The compressed main memory is implemented using a conventional RAM memory M 210, which is used to store a directory D 220 and a number of fixed-size blocks 230. The cache 240 is implemented conventionally, using a cache directory 245 for a set of cache lines 248. The compression controller 260 includes a decompressor 262, which is used for reading compressed data; a compressor 264, which is used for compressing and writing data; a number of memory buffers 266, used for temporarily holding uncompressed data; and control logic 268. Each cache line is associated with a given real memory address 250. Unlike in a conventional memory, however, the address 250 does not refer to an address in the memory M 210; rather, the address 250 is used to index into the directory D 220. Each directory entry contains information which allows the associated cache line to be retrieved. The units of compressed data referred to by directory entries in D 220 may correspond to cache lines 248; alternatively, the unit of compression may be larger, that is, sets of cache lines may be compressed together. The following examples assume the units of compressed data correspond to cache lines 248. For example, the directory entry 221 for line 1, associated with address A1 271, is for a line which has compressed to the point that it can be stored entirely within the directory entry; the directory entry 222 for line 2, associated with address A2 272, is for a line which is stored in compressed format using a first, full block 231 and a second, partially filled block 232; finally, the directory entries 223 and 224 for line 3 and line 4, associated with addresses A3 273 and A4 274, respectively, are for lines stored in compressed format using a number of full blocks (blocks 233 and 234 for line 3 and block 235 for line 4), with the remainders of the two compressed lines 3 and 4 combined in block 236.
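To make the addressing scheme concrete, the following C sketch models a directory entry and the lookup from a real address to the compressed data. It is only an illustrative sketch: the line size, block size, entry layout, field names, and the dir_index/fetch_compressed_line helpers are assumptions for the example, not the entry format described by the patent, and the handling of a shared last block (such as block 236 above) is omitted for brevity.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define LINE_SIZE        1024u  /* assumed uncompressed cache line size */
#define BLOCK_SIZE        256u  /* assumed fixed block size in memory M */
#define BLOCKS_PER_LINE  (LINE_SIZE / BLOCK_SIZE)
#define INLINE_BYTES       16u  /* assumed room for a highly compressed line */

#define DIR_INLINE 0x01u        /* line stored entirely inside the entry */

/* One entry of directory D: either the compressed line itself (DIR_INLINE)
 * or the list of fixed-size blocks holding the compressed line; the last
 * block may be only partially filled and shared with another line. */
typedef struct {
    uint8_t  flags;
    uint16_t compressed_len;                 /* compressed size in bytes  */
    uint32_t block[BLOCKS_PER_LINE];         /* block indices in memory M */
    uint8_t  inline_data[INLINE_BYTES];      /* valid only if DIR_INLINE  */
} dir_entry_t;

/* A real address is not a location in memory M; it only selects a directory
 * entry (one entry per cache-line-sized region of the real address space). */
static inline uint32_t dir_index(uint64_t real_addr)
{
    return (uint32_t)(real_addr / LINE_SIZE);
}

/* Fetch the compressed image of the line containing 'real_addr' into 'out'.
 * Each dereference of 'directory' or 'blocks' stands for a main memory
 * access, which is why the indirection roughly doubles read latency. */
size_t fetch_compressed_line(const dir_entry_t *directory,
                             const uint8_t blocks[][BLOCK_SIZE],
                             uint64_t real_addr, uint8_t *out)
{
    const dir_entry_t *e = &directory[dir_index(real_addr)]; /* access #1 */
    size_t len = e->compressed_len;

    if (e->flags & DIR_INLINE) {          /* highly compressed: no blocks */
        memcpy(out, e->inline_data, len);
        return len;
    }
    for (size_t done = 0, i = 0; done < len; i++) {          /* access #2+ */
        size_t n = (len - done < BLOCK_SIZE) ? (len - done) : BLOCK_SIZE;
        memcpy(out + done, blocks[e->block[i]], n);
        done += n;
    }
    return len;   /* the caller hands the result to the decompressor */
}
```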
As a result of using the directory structure, the time for accessing main memory may be substantially longer than in a computer system that does not use such a directory, even disregarding the times for decompression (for reading from main memory) and compression (for writing to main memory). Thus, to a first approximation, the memory latency for reading a line is doubled as compared to directly addressed memories, since a first main memory access is required to read the directory entry, and in general, at least one additional main memory access is required to read or write the main memory data as specified by the directory entry.
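Stated as a rough formula (with an illustrative access time that is an assumption, not a figure from the patent): if a single main memory access takes time $t_m$ and compression and decompression times are ignored, then

$$
t_{\text{direct}} \approx t_m,
\qquad
t_{\text{indirect}} \approx \underbrace{t_m}_{\text{directory entry}} + \underbrace{t_m}_{\text{data blocks}} \approx 2\,t_m ,
$$

so for an assumed $t_m = 60$ ns, reading a line costs roughly 120 ns instead of 60 ns.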
SUMMARY OF THE INVENTION
The present invention is directed to a memory caching system and access methodology wherein a number of recently used directory entries are maintained in a directory cache in fast memory in order to reduce this memory access latency. The directory cache may be implemented as part of the compression controller hardware; alternatively, part of the memory cache that is immediately above main memory in the memory hierarchy may be used to store the lines of main memory containing directory entries.
Use of the directory caching system of the invention greatly reduces this added latency: on a main memory access for which the required directory entry is found in the directory cache, the latency is approximately the same as in a conventional, directly addressed main memory design.
Advantageously, the directory cache may also contain pre-fetched directory entries; for example, all of the directory entries for the lines in a page, or in a set of associated pages, could be loaded on the first reference to any line in that page or set of pages.
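The following C sketch, which reuses the hypothetical dir_entry_t, dir_index, and LINE_SIZE definitions from the earlier example, illustrates one possible organization: a small direct-mapped cache of directory entries held in fast memory, with all of a page's entries prefetched on a miss. The cache geometry, page size, and names are assumptions made for illustration, not the structure claimed by the patent.

```c
#define PAGE_SIZE      4096u                     /* assumed page size            */
#define LINES_PER_PAGE (PAGE_SIZE / LINE_SIZE)   /* directory entries per page   */
#define DCACHE_SLOTS    256u                     /* assumed directory-cache size */

/* One slot of a direct-mapped directory-entry cache kept in fast memory,
 * e.g. inside the compression controller. */
typedef struct {
    int         valid;
    uint32_t    tag;       /* directory index currently held in this slot */
    dir_entry_t entry;
} dcache_slot_t;

static dcache_slot_t dcache[DCACHE_SLOTS];

/* Return the directory entry for 'real_addr'.  On a hit no extra main
 * memory access is needed; on a miss the entry is read from the directory
 * in main memory, and the entries for every other line of the same page
 * are prefetched so that later references to that page hit in the cache. */
static dir_entry_t lookup_dir_entry(const dir_entry_t *directory,
                                    uint64_t real_addr)
{
    uint32_t idx = dir_index(real_addr);
    dcache_slot_t *slot = &dcache[idx % DCACHE_SLOTS];

    if (slot->valid && slot->tag == idx)
        return slot->entry;                  /* hit: fast-memory access only */

    uint32_t first = idx - (idx % LINES_PER_PAGE);
    for (uint32_t i = first; i < first + LINES_PER_PAGE; i++) {
        dcache_slot_t *s = &dcache[i % DCACHE_SLOTS];
        s->valid = 1;
        s->tag   = i;
        s->entry = directory[i];             /* main memory access (prefetch) */
    }
    return dcache[idx % DCACHE_SLOTS].entry;
}
```

Under this organization, with an assumed directory-cache hit ratio h, a line read costs roughly (2 − h) main memory accesses instead of 2, so a high hit ratio brings the latency close to that of a directly addressed memory.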
REFERENCES:
patent: 5412787 (1995-05-01), Forsyth et al.
patent: 5713003 (1998-01-01), DeWitt et al.
patent: 5729228 (1998-03-01), Franaszek et al.
patent: 5813031 (1998-09-01), Chou et al.
patent: 5864859 (1999-01-01), Franaszek
patent: 5903911 (1999-05-01), Gaskins
patent: 6151685 (2000-11-01), Li et al.
patent: 6173381 (2001-01-01), Dye
patent: 6212602 (2001-04-01), Wicki et al.
Inventors: Benveniste, Caroline D.; Franaszek, Peter A.; Robinson, John T.; Schulz, Charles O.
Examiner: Nguyen, Hiep T.
Law firm: Scully, Scott, Murphy & Presser
Attorney: Tassinari, Robert P., Esq.