Cache memory system and method for accessing a cache memory...

Electrical computers and digital processing systems: memory – Storage accessing and control – Control technique

Reexamination Certificate


Details

Classification: C711S118000, C711S133000
Type: Reexamination Certificate
Status: active
Patent number: 06754791

ABSTRACT:

TECHNICAL FIELD
The present invention relates to data processing systems, and more particularly to memory caches used by such data processing systems.
BACKGROUND OF THE INVENTION
A cache is a small, fast memory that acts as a buffer between a device that uses a large amount of memory and a large, slower main memory. The cache's purpose is to reduce average memory-access time. Caches are effective because of two properties of software programs: spatial and temporal locality. Spatial locality asserts that because programs are generally composed of subroutines and procedures that execute sequentially, they often use data and instructions whose addresses are proximate. Temporal locality recognizes that since many programs contain loops and manipulate data arranged in lists and arrays, recently used information is more likely to be reused than older information.
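The effect of spatial locality described above can be illustrated with a toy cache model. This is a sketch only; the direct-mapped organization, line size, and cache size are illustrative assumptions, not details taken from the patent:

```python
LINE_SIZE = 32   # bytes per cache line (illustrative)
NUM_LINES = 8    # lines in the cache (illustrative)

def run(addresses):
    """Return (hits, misses) for a sequence of byte addresses
    against a tiny direct-mapped cache."""
    tags = [None] * NUM_LINES
    hits = misses = 0
    for addr in addresses:
        line_addr = addr // LINE_SIZE    # which memory line holds addr
        index = line_addr % NUM_LINES    # direct-mapped slot
        if tags[index] == line_addr:
            hits += 1
        else:
            misses += 1
            tags[index] = line_addr      # reload displaces the old line
    return hits, misses

# A sequential scan of 256 bytes touches each 32-byte line once,
# so only 1 access in every 32 misses: run(range(256)) == (248, 8)
```

Because neighboring bytes share a line, a sequential scan pays the miss penalty once per line rather than once per byte, which is precisely why block transfers pay off.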
Memory caches are used in a data processing system to improve system performance by maintaining instructions and/or data that are statistically likely to be used by a microprocessor or execution unit within such data processing system. Such likelihood of use is generally found to exist with instructions/data in close proximity to the currently executing instruction or currently accessed data. Referring to FIG. 1, high speed memory cache 11 is used to quickly provide such instructions or data to the microprocessor or CPU (execution unit) 9, and thus to minimize delays that would be introduced if the processor were required to access slower main memory 13. This slower main memory could be such things as dynamic RAM, a read only memory (ROM), an electrical, magnetic or optical disk or diskette, or any other type of volatile or non-volatile storage device known in the art.
The contents of a memory cache must be periodically replenished with instructions/data from main memory. The rate of data transfer between a cache and main memory can be greatly increased by using block data transfers to move information between them. Cache memory is typically organized into lines of data, with each line comprising a plurality of bytes or words of data. A line is used so that groups of bytes/words are transferred to/from cache instead of a single byte/word. For example, each cache line could have 32, 64 or 128 bytes of data. With a 32 byte cache line, 32 bytes can be fetched using a single block data transfer from the main memory each time a cache miss occurs.
A cache miss occurs whenever the cache does not contain an instruction or data value needed by the CPU. When a cache miss occurs, a cache line is reloaded from the slower memory/storage device with the requested value and the n−1 bytes/words of instructions/data that immediately follow it, where n is the number of bytes/words in the cache line.
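The reload step can be sketched as follows. Note this is a common implementation convention rather than the patent's own wording: real designs typically fetch the naturally aligned line containing the requested address, and the line size here is illustrative:

```python
LINE_SIZE = 32  # n bytes per cache line (illustrative)

def line_to_fetch(addr, line_size=LINE_SIZE):
    """Return the inclusive address range of the block transferred
    from main memory on a miss at byte address addr, assuming
    line-aligned block transfers."""
    base = addr - (addr % line_size)  # align down to a line boundary
    return base, base + line_size - 1

# e.g. line_to_fetch(0x104A) == (0x1040, 0x105F)
```

A single block transfer thus brings in the missed byte together with its neighbors in the same line, amortizing the cost of the slow-memory access.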
However, at times it would be advantageous to fetch a line from memory for certain load/store operations without displacing or overwriting a line presently in the cache. This requirement for non-displacement could occur, for example, when performing matrix operations on data contained in a cache. If a cache miss occurs when accessing non-matrix data, it would be advantageous to not disturb the existing matrix data while accessing this non-matrix data value. This requirement for non-displacement could also occur if it is known that the line to be loaded may not be used again for a long time (e.g. updating a page frame table), or if only one word needs to be read from memory infrequently. Presently known systems do not allow for selective memory access which can preempt a cache line reload.
SUMMARY OF THE INVENTION
It is therefore an object of the present invention to provide an improved data processing system.
It is a further object of the present invention to provide for improved performance in a data processing system.
It is yet a further object of the present invention to provide an improved cache in a data processing system.
It is still a further object of the present invention to minimize the number of cache line reloads required in a data processing system.
It is yet another object of the present invention to selectively access memory while preempting a cache line reload.
An additional line is provided within the data cache by using one of the redundant rows of the storage array. An input signal for the storage array indicates when this additional line is accessed. All operations which can be performed on the other rows of a cache array can also be performed for this additional line.
If array set associativity is considered, then more than one line can be placed into a row. For example, with 4-way set associativity a total of four additional lines can be brought into the additional row. Using such an array redundant row to provide the extra cache line is superior to traditional methods of expanding a cache to include more lines. Those traditional methods require the addition of registers, multiplexors and control logic corresponding to each cache line to be added. By using an array redundant row as an additional cache line, the physical space consumed, and the resulting wiring congestion, are minimized.
A one-way set associative array will be described herein for ease of understanding, but nothing precludes extending it to M-way associativity, thus allowing for more “additional lines”.
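The behavior summarized above can be sketched with a small model. The class and method names are hypothetical, and a one-way array is modeled as in the description, with the redundant row serving as the single additional line:

```python
class CacheWithExtraRow:
    """Toy one-way cache plus one 'additional' line, standing in for
    the redundant array row described in the summary. Illustrative
    sketch only; not the patent's actual circuit-level design."""

    def __init__(self, num_sets=8, line_size=32):
        self.num_sets = num_sets
        self.line_size = line_size
        self.tags = [None] * num_sets   # regular rows (one-way for clarity)
        self.extra_tag = None           # the additional redundant row

    def _line(self, addr):
        return addr // self.line_size

    def load(self, addr, displace=True):
        """Return True on hit. With displace=False, a miss reloads the
        additional row instead of evicting a regular line."""
        line = self._line(addr)
        index = line % self.num_sets
        if self.tags[index] == line or self.extra_tag == line:
            return True
        if displace:
            self.tags[index] = line     # normal reload, evicts this set
        else:
            self.extra_tag = line       # non-displacing reload
        return False

# Matrix data in a regular row survives a conflicting stray access:
c = CacheWithExtraRow()
c.load(0x000)                    # miss fills set 0 with the "matrix" line
c.load(0x100, displace=False)    # conflicting address goes to the extra row
matrix_still_cached = c.load(0x000)   # True: set 0 was not disturbed
```

This mirrors the matrix example in the background: the stray, non-matrix access is satisfied through the additional line, so the frequently reused lines in the regular array are never evicted.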


REFERENCES:
patent: 4197580 (1980-04-01), Chang et al.
patent: 4637024 (1987-01-01), Dixon et al.
patent: 4751656 (1988-06-01), Conti et al.
patent: 5070502 (1991-12-01), Supnik
patent: 5297094 (1994-03-01), Rastegar
patent: 5301153 (1994-04-01), Johnson
patent: 5341381 (1994-08-01), Fuller
patent: 5367655 (1994-11-01), Grossman et al.
patent: 5497347 (1996-03-01), Feng
“Memory System Reliability Improvement through Associative Cache Redundancy,” M. A. Lucente et al., IEEE Journal, vol. 26, no. 23, Mar. 1991.
