Cache memory

Static information storage and retrieval – Systems using particular element – Flip-flop

Reexamination Certificate


Details

Classification codes: C365S154000, C365S188000
Type: Reexamination Certificate
Status: active
Patent number: 06813179

ABSTRACT:

The present invention relates to the general field of cache memory circuits.
Cache memory circuits are well known in the art as memory circuitry which may enable optimal response to the needs of a high-speed processor. Cache memories are usable as temporary storage of information, for example information relatively recently used by the processor. Information in cache RAM may be stored based upon two principles, namely spatial locality and temporal locality. The principle of spatial locality is based upon the fact that when data is accessed at an address, there is an above-average likelihood that the data which is next required will have an address close to that of the data which has just been accessed. By contrast, temporal locality is based upon the fact that there is an above-average probability that data which has just been accessed will be accessed again shortly.
In one approach, therefore, when an item of data is accessed, adjacent data is written to the cache memory in anticipation of the need to access it; in another, the item which has been accessed is itself stored. A desirable approach is to do both.
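As a concrete illustration of these two principles (not taken from the patent), consider the simple loop below: consecutive iterations read adjacent addresses, so the cache line fetched for one element also serves the next few accesses (spatial locality), while the accumulator is reused on every iteration (temporal locality).

    #include <stddef.h>

    /* Illustrative sketch only (not from the patent).
     * Spatial locality: data[i] and data[i + 1] occupy adjacent addresses, so
     * the cache line fetched for one element also satisfies the next few reads.
     * Temporal locality: 'sum' and 'i' are reused on every iteration, so they
     * benefit from being kept close to the processor.
     */
    long sum_array(const int *data, size_t n)
    {
        long sum = 0;
        for (size_t i = 0; i < n; i++)
            sum += data[i];
        return sum;
    }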
There are many different cache configurations, ranging from direct-mapped cache memory to fully-associative cache memory.
Although the present invention is described in the context of a set-associative cache memory, it is not envisaged that it be so limited, and the architecture described and the particular circuit details are equally applicable to other types of cache.
In a typical cache memory, there is provided a so-called “tag RAM” and a so-called “data RAM”. The tag RAM stores information representative of an address in the memory of the computer, e.g. the main memory, and the data RAM stores the content of that address. Each entry in the tag RAM thus has an associated entry in the data RAM. In one type of cache, the tag RAM stores the most significant bits of a memory address at a location determined by the least significant bits of that address. Hence application of the least significant bits of the memory address to an address decoder causes the said location to be accessed, and the tag RAM outputs the stored most significant bits of the memory address. A comparison is then made between the output of the tag RAM, namely the most significant bits of the stored address, and the most significant bits of the address being sought. When identity occurs between the output of the tag RAM and the address being sought, there is said to be a hit in the tag RAM. The line or entry in the data RAM associated with the access from the address decoder is then output, this entry consisting of the data stored at the memory address. If there is a hit between the address applied to the cache and the stored tag information, then the contents of the data memory are output from the cache. If there is no hit (this situation is termed “a miss”), then the contents of the data memory are not output.
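The lookup just described can be summarised in a minimal software model, given below. It assumes, purely for illustration, a direct-mapped organisation with word-granularity addresses (no offset bits), and names such as cache_read and NUM_LINES are hypothetical; the patent itself is concerned with the circuit-level implementation of this mechanism.

    #include <stdbool.h>
    #include <stdint.h>

    #define NUM_LINES  256               /* number of tag/data entries (hypothetical)  */
    #define INDEX_BITS 8                 /* log2(NUM_LINES): least significant bits    */

    typedef struct {
        bool     valid;                  /* the entry currently holds cached data       */
        uint32_t tag;                    /* most significant bits of the cached address */
        uint32_t data;                   /* contents of that memory address             */
    } cache_line_t;

    static cache_line_t cache[NUM_LINES];   /* models the paired tag RAM and data RAM   */

    /* Returns true on a hit and places the cached word in *out; on a miss the
     * data output is suppressed, as in the description above. */
    bool cache_read(uint32_t address, uint32_t *out)
    {
        uint32_t index = address & (NUM_LINES - 1);  /* LSBs drive the address decoder  */
        uint32_t tag   = address >> INDEX_BITS;      /* MSBs are compared with the tag  */
        const cache_line_t *line = &cache[index];

        if (line->valid && line->tag == tag) {       /* comparator detects identity     */
            *out = line->data;                       /* data RAM output is enabled      */
            return true;                             /* hit                             */
        }
        return false;                                /* miss                            */
    }

On a miss, a mechanism such as that described in the next paragraph would typically overwrite both the tag entry and the data entry before the access is retried.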
According to the particular technique being used, a mechanism may exist for overwriting both the tag and data RAMs if no hit occurs.
Cache memories are desirably fast in response, since a processor responsive to data stored in the cache cannot complete its action until it has retrieved data from the cache. Equally, the system must be made aware as quickly as possible if the data which is sought is NOT in the cache. It is clearly a truism for many situations that an indication of whether or not data is stored in the cache should only be made available when it has reliably been determined.
Consideration of the operation of a cache memory having a tag RAM, a comparator and a data RAM shows that the critical timing path is that through the tag RAM and the comparator. While the comparator itself needs to be fast in response, it cannot produce a result until it has received valid data at its inputs: one input is an address from the system itself, and the other is the result of an access to the tag RAM via the sense amplifiers which sense the data in the tag RAM.
The person skilled in the art will be aware that sense amplifiers respond to differentials or potentials on bit lines to provide an output which corresponds to the information stored in the memory cells. Sense amplifiers exist which are very sensitive to input differentials, but nevertheless a delay must be provided, from the instant at which the memory cell is activated, before such sense amplifiers are activated if a reliable result is to be achieved. The delay is determined by a number of factors, including the electrical length of the wordlines and the current sourcing/sinking ability of the memory cells attached to the wordlines. Once a clock pulse is provided to activate the sense amplifier, a further period elapses, due to the inherent delay of the sense amplifier, before the output of the sense amplifier will correspond to the memory cell contents.
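To make the timing dependence explicit, the delays described above can be summed informally (the symbols below are introduced purely for illustration and do not appear in the patent):

    t_hit_valid ≈ t_wordline + t_bitline + t_senseamp + t_comparator,
    where t_bitline ≈ (C_bitline × ΔV) / I_cell

Here t_wordline is the wordline propagation delay, t_bitline is the time taken for the accessed cell to develop a sufficient differential ΔV on a bit line of capacitance C_bitline, t_senseamp is the delay of the sense amplifier once it has been enabled, and t_comparator is the comparator delay. Because t_bitline varies inversely with the current drive of the accessed cell, increasing the drive of the tag RAM cells brings forward the point at which the hit or miss indication becomes valid; this is the motivation for the arrangement described below.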
Another problem in the prior art is that of complicated layouts of integrated cache memories. It would be desirable to have an architecture that enables single physically continuous wordlines to run through the tag and data RAMs.
It is accordingly one aim of the present invention to provide a cache memory circuit which at least partly overcomes the problems of the prior art.
According to the present invention there is provided an integrated cache memory circuit comprising a tag RAM, a comparator and a data RAM, each of said tag RAM and said data RAM having an array of memory cells and plural sense amplifiers, each memory cell of said RAMs being connectable via a respective bitline to one of said plural sense amplifiers, said sense amplifiers of said tag RAM having respective outputs coupled to a first input of said comparator, said comparator having a second input for address information, and an output for selectively enabling data output from sense amplifiers of said data RAM, wherein the memory cells of said tag RAM are arranged to have a higher current drive than the memory cells of said data RAM.
In one embodiment, the memory cells of said tag RAM and of said data RAM comprise transistors, and the transistors of said tag RAM cells have a greater electrical width than the transistors of said data RAM.
In another embodiment, the memory cells of said data RAM have a first drive potential and the memory cells of said tag RAM have a second drive potential, said second drive potential being higher than said first drive potential.
Preferably, said data RAM and said tag RAM share common wordlines.


REFERENCES:
patent: 5253203 (1993-10-01), Partovi et al.
patent: 5339399 (1994-08-01), Lee et al.
patent: 6378050 (2002-04-01), Tsuruta et al.
patent: 0 549 218 (1993-06-01), None
Bechade, R. et al., “A 32B 66MHz 1.8W Microprocessor,” IEEE International Solid-State Circuits Conference, New York, vol. 37, Feb. 1994, pp. 208-209, 340.
Gerosa, G. et al., “A 2.2W, 80 MHz Superscalar RISC Microprocessor,” IEEE Journal of Solid-State Circuits, 29(12):140-1454, Dec. 1994.
