Dynamically size configurable data buffer for data cache and...

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories

Reexamination Certificate


Details

C711S119000, C711S129000, C711S137000, C712S207000, C712S224000


active

06314494

ABSTRACT:

FIELD OF THE INVENTION
This invention relates generally to computer memory. More particularly, this invention relates to a sizeable data buffer that includes data cache memory and prefetch cache memory.
BACKGROUND
Most modern computer systems include a controller and a main memory. The speed at which the controller can decode and execute instructions to process data has for some time exceeded the speed at which instructions and data can be transferred from main memory to the controller. In an attempt to reduce the problems caused by this mismatch, most computer systems include a cache memory between the controller and main memory.
The cache memory is used in computer systems to effectively accelerate the execution speed of instructions and commands by the computer controller. Cache memory is relatively small compared to the main memory. The cache memory, however, provides much faster access time than the access time associated with the main memory. The cache memory provides quick access to the instructions and data that are most frequently used by the controller.
Instructions and data received from the main memory by the controller for execution are also stored in the high speed cache memory. Therefore, the controller has ready access to the most recently executed instructions and data should the same instructions or data be needed again by the controller. When the controller requires an instruction or data a second time, rather than initiating a relatively slow main memory retrieval, the controller quickly retrieves the information from the high speed cache memory.
It is important to keep track of which lines of code and data are stored in the cache memory. One technique is to use a TAG cache memory, which includes memory locations for storing TAG addresses that correspond to the addresses of the particular information stored in the cache memory. The controller generates a request for an instruction or data in the form of an address. Retrievals from main memory are made using the entire memory address. The cache memory is smaller than the main memory, so only a subset of the entire memory address is required for retrievals from the cache memory. The portion of the entire memory address that is not included within the subset required for retrievals from the cache memory is stored within the corresponding location of the TAG cache memory. The TAG addresses are used to determine when an address generated by the controller is one for which the cache memory contains the requested information. To accomplish this, the address generated by the controller is compared to the TAG addresses. When the address generated by a request from the controller matches a TAG address, the cache memory contains the requested information, and a TAG hit occurs. If an address generated by the controller fails to match any TAG address, a TAG miss occurs. When a TAG miss occurs, i.e., the requested information is not contained in the cache memory, a memory cycle to the main memory must be generated to obtain the requested information. A memory cycle to the main memory requires much more time than a memory cycle to the cache memory.
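The TAG-lookup scheme described above can be illustrated with a minimal sketch. All names and field widths here are illustrative assumptions for a direct-mapped cache, not details taken from the patent: the low-order bits of an address index the cache, and the remaining bits are stored in, and compared against, the TAG cache.

```python
# Minimal sketch of the TAG hit/miss mechanism described above.
# CACHE_LINES, INDEX_BITS, and all function names are assumptions
# chosen for illustration, not drawn from the patent.

CACHE_LINES = 256        # assumed cache size: 2^8 lines
INDEX_BITS = 8           # low-order address bits select a cache line

tag_ram = [None] * CACHE_LINES   # TAG cache: one stored TAG per line

def split_address(addr):
    """Split a full memory address into (TAG, cache index)."""
    index = addr & (CACHE_LINES - 1)   # subset used to address the cache
    tag = addr >> INDEX_BITS           # remainder stored in the TAG cache
    return tag, index

def lookup(addr):
    """Return True on a TAG hit, False on a TAG miss."""
    tag, index = split_address(addr)
    return tag_ram[index] == tag

def fill(addr):
    """Record a line as having been fetched from main memory."""
    tag, index = split_address(addr)
    tag_ram[index] = tag
```

After `fill(addr)`, a repeated request for `addr` produces a TAG hit, while a different address that maps to the same cache line produces a TAG miss, forcing the slower main-memory cycle the passage describes.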
Many computer systems also include prefetch cache memory in which prefetch information, i.e., information that the controller does not immediately need, is stored. Prefetch information is requested and retrieved from main memory and stored in the prefetch memory because a high probability exists that the controller will request the information in the near future. The stored prefetch information is then available to the controller in the prefetch cache memory, so the controller does not need to retrieve this information from the main memory in the future.
Cache memory and prefetch cache memory in present computer systems exist in separate RAM (random access memory). It is generally believed that by physically separating the two types of cache, the performance of the computer system is enhanced. However, the requirement of separate RAM for each of the types of cache requires at least two separate RAM integrated circuits, which increases the cost of the computer system.
It is desirable to have a computer system in which cache memory and prefetch cache memory can be located within a single RAM, and therefore be less expensive. Additionally, it is desirable that the size of the prefetch cache memory be variably adjusted by a controller within the computer system to allow the controller to optimize the prefetch cache size depending upon the task being performed by the controller.
SUMMARY OF THE INVENTION
The present invention is a computer memory system in which cache memory and prefetch cache can be located within a single RAM integrated circuit. The size of the prefetch cache can be adjusted by a computer system controller.
A first embodiment of this invention includes a size configurable data buffer. The size configurable data buffer includes a plurality of data cache memory registers and a variable number of prefetch memory registers. This embodiment also includes control circuitry that allows for adjusting the number of prefetch memory registers responsive to a controller.
A second embodiment of the invention is similar to the first embodiment. The second embodiment includes a single size configurable data buffer SRAM circuit that includes the data cache memory registers and the prefetch memory registers.
A third embodiment of the invention is similar to the first embodiment. The control circuitry which allows for adjusting the number of prefetch memory registers of the third embodiment includes mask circuitry for masking line address bits when storing information in prefetch memory.
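One way to picture the mask-circuitry idea in the third embodiment is as follows: masking the high line-address bits confines prefetch lines to a region of the shared RAM whose size is set by the mask width, so the controller can resize the prefetch cache by changing a single parameter. The sketch below is a hypothetical illustration; the names and sizes are assumptions, not drawn from the patent claims.

```python
# Hypothetical illustration of masking line address bits to select a
# size-configurable prefetch region. TOTAL_LINES, prefetch_bits, and the
# function name are assumptions for this sketch.

TOTAL_LINES = 16          # lines in the single shared data-buffer RAM
LINE_BITS = 4             # bits needed to address 16 lines

def prefetch_slot(line_addr, prefetch_bits):
    """Map a line address into the prefetch region by masking high bits.

    prefetch_bits selects the region size: 2^prefetch_bits registers are
    reserved for prefetch. The masked-off high bits are returned as well,
    since they would need to be stored (e.g., in an address recovery
    memory) to reconstruct the full line address later.
    """
    mask = (1 << prefetch_bits) - 1
    masked = line_addr & mask              # bits kept to index prefetch memory
    dropped = line_addr >> prefetch_bits   # bits masked off during the store
    return masked, dropped
```

Growing `prefetch_bits` from 2 to 3 doubles the prefetch region from 4 to 8 registers without any change to the underlying RAM, which is the kind of task-dependent tuning the background section calls for.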
A fourth embodiment of the invention is similar to the third embodiment. The fourth embodiment further includes an address recovery SRAM for storing the masked line address bits.
A fifth embodiment of the invention is similar to the fourth embodiment. The fifth embodiment further includes a TAG SRAM for storing non-masked line address bits.
A sixth embodiment of the invention is similar to the fifth embodiment. The sixth embodiment further includes a TAG compare circuit for comparing requested line address bits with line address bits stored in the address recovery SRAM and the TAG SRAM.
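The fourth through sixth embodiments can be tied together in one hedged sketch: the masked line-address bits go to an address recovery SRAM, the non-masked bits go to a TAG SRAM, and the compare circuit declares a hit only when both match the requested address. All names and widths below are illustrative assumptions.

```python
# Hypothetical end-to-end sketch of the TAG compare described above: the
# full line address is reconstructed from the address recovery SRAM
# (masked bits) and the TAG SRAM (non-masked bits), then compared against
# a request. PREFETCH_BITS and all names are assumptions for this sketch.

PREFETCH_BITS = 2                           # assumed prefetch region: 4 lines

recovery_sram = [0] * (1 << PREFETCH_BITS)     # stores masked line-address bits
tag_sram = [None] * (1 << PREFETCH_BITS)       # stores non-masked (TAG) bits

def store_prefetch(line_addr, tag):
    """Store a prefetched line, saving both halves of its address."""
    slot = line_addr & ((1 << PREFETCH_BITS) - 1)
    recovery_sram[slot] = line_addr >> PREFETCH_BITS
    tag_sram[slot] = tag

def tag_compare(line_addr, tag):
    """Hit only if both the recovered line bits and the TAG bits match."""
    slot = line_addr & ((1 << PREFETCH_BITS) - 1)
    return (tag_sram[slot] == tag and
            recovery_sram[slot] == line_addr >> PREFETCH_BITS)
```

Because several line addresses share a prefetch slot once their high bits are masked, the compare must consult the address recovery SRAM as well as the TAG SRAM; a match in only one of the two is still a miss.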
Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.


REFERENCES:
patent: 5293609 (1994-03-01), Shih et al.
patent: 5566324 (1996-10-01), Kass
patent: 5586295 (1996-12-01), Tran
patent: 5680564 (1997-10-01), Divivier et al.
patent: 5737750 (1998-04-01), Kumar et al.
patent: 6134633 (2000-10-01), Jacobs
