Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories
Reexamination Certificate
1999-08-12
2001-11-20
Kim, Matthew (Department: 2186)
C711S145000
active
06321301
ABSTRACT:
CROSS-REFERENCE TO RELATED APPLICATION
This application claims the priority benefit of Taiwan application Ser. No. 88107334, filed May 6, 1999, the full disclosure of which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The invention relates to a cache device and a method of using it for data accesses, and in particular to a cache device whose efficiency is improved by prefetching and storing address data using a prefetch queue comparing circuit.
2. Background of the Related Art
Computers, which have become widespread thanks to great progress in semiconductor technology, have brought great changes to our lives over the past decades. Some companies have now successfully produced central processing units (CPUs) with operating clocks of several hundred MHz. Unfortunately, not all devices in a computer system, such as memory devices, can operate at the same clock rate as the CPU. Although the frequency of the CPU's operating clock is continuously increased, the access speed of dynamic random access memories (hereinafter referred to as DRAMs) has not greatly improved. To resolve this problem, a cache device is introduced. That is, in a computer system, DRAMs serve as the primary memory while static random access memories (SRAMs) serve as a cache device. With such a cache device, data likely to be requested by the CPU is transferred in advance from the primary memory to the cache device. The CPU can then access the higher-speed cache device directly instead of the primary memory, thereby reducing data access time, so a better balance between cost and efficiency can be reached. However, since the cache device has a smaller data capacity than the primary memory, the data required by the CPU may not all be stored in the cache memory. If the data requested by the CPU are stored in the cache memory, this state is called a “cache hit” and allows the CPU to access the data in less time. Conversely, if the data requested by the CPU are not stored in the cache memory, this state is called a “cache miss.” In the “cache miss” state, the CPU has to access the required data from the primary memory, which takes more time. The proportion of “cache hits” among all accesses is called the “hit ratio.”
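The hit/miss bookkeeping described above can be sketched with a small direct-mapped cache model. This is an illustrative sketch, not the patent's circuit; the cache size, variable names, and access sequence are all assumptions:

```python
# Minimal direct-mapped cache model: an address is split into a tag and
# an index, and an access hits when the tag stored at that index matches.
INDEX_BITS = 4                       # 16 cache lines (assumed size)
tag_ram = [None] * (1 << INDEX_BITS)

def access(address):
    """Return True on a cache hit; on a miss, load the line and return False."""
    index = address & ((1 << INDEX_BITS) - 1)
    tag = address >> INDEX_BITS
    if tag_ram[index] == tag:
        return True                  # cache hit
    tag_ram[index] = tag             # cache miss: fetch from primary memory
    return False

accesses = [0x10, 0x10, 0x10, 0x20, 0x20]
hits = sum(access(a) for a in accesses)
print(hits / len(accesses))          # hit ratio: 0.6
```

Note that 0x10 and 0x20 map to the same index, so the first access to each is a miss that evicts the other's line; repeated accesses to a resident line are hits.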
Referring to FIG. 1, a conventional cache device according to the prior art is shown. In FIG. 1, a cache device 100 mainly consists of a cache memory 110 and a cache control circuit 120. The cache control circuit 120 is responsible for the entire operation of the cache device 100 by controlling the cache memory 110. The cache memory 110 includes a data RAM 112 and a tag RAM 114. The data RAM 112 stores data corresponding to the primary memory 140, while the tag RAM 114 stores tag addresses corresponding to the stored data.
For a detailed description, FIG. 2A illustrates the corresponding relationship between the cache memory 110 and the primary memory 140. As shown in FIG. 2A, the primary memory 140 is divided into several blocks, each given a distinct tag address. Furthermore, the index addresses of each block are the same as those of the tag memory 114 and the cache memory 110, wherein each index address corresponds at the same time to a tag address stored in the tag memory 114 and to data stored in the data memory 112. Referring to FIG. 2B, the combination of a tag address and an index address represents a corresponding address of the primary memory 140. In other words, the data stored at an index address of the data memory 112, with a corresponding tag address stored in the tag memory 114, are identical to the data stored at the same address (consisting of the tag address and the index address) of the primary memory 140. As noted above, the cache memory 110 stores only part of the data of the primary memory 140. Therefore, when the cache device 100 handles data accesses requested by the CPU, it must be determined whether the access is a “cache hit” or a “cache miss,” and whether it is necessary to transfer the required data from the primary memory 140 into the cache memory 110. This determination is made as follows: when a data access request is received from the CPU, the address output from the CPU is compared to the tag addresses stored in the tag memory 114 together with their corresponding index addresses. If the comparison finds a match, the access is a “cache hit”; if no match is found, it is a “cache miss.”
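The FIG. 2B relationship, in which a full primary-memory address is recombined from the tag held in the tag memory and the cache index, can be sketched as follows. The field widths here are assumptions chosen purely for illustration:

```python
INDEX_BITS = 8   # assumed index-field width

def split(address):
    """Split a primary-memory address into (tag, index) fields."""
    return address >> INDEX_BITS, address & ((1 << INDEX_BITS) - 1)

def combine(tag, index):
    """Recombine a stored tag with a cache index into the full address."""
    return (tag << INDEX_BITS) | index

tag, index = split(0x3A7)
print(hex(tag), hex(index))      # 0x3 0xa7
print(hex(combine(tag, index)))  # 0x3a7
```

Because the split is lossless, matching the stored tag at a given index is equivalent to matching the full primary-memory address, which is what the hit/miss comparison relies on.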
Assume that reference symbol T_WR designates the data access time of the cache memory 110, T_MEM designates the data access time of the primary memory 140, and R_HIT designates the cache hit ratio of the cache device 100. The average data access time T_AV can be expressed by:

T_AV = R_HIT(T_WR) + (1−R_HIT)(T_WR + T_MEM)   (1)
In equation (1), (T_WR + T_MEM) represents the required access time when the cache device 100 experiences a “cache miss,” wherein T_MEM is generally much longer than T_WR. In other words, the data access time in a “cache miss” state is much longer, resulting in poor system efficiency.
SUMMARY OF THE INVENTION
In view of the above, the invention provides a cache device. The cache device, electrically coupled to a primary memory through a bus, includes a data memory, a tag memory, and a cache control circuit, wherein the cache control circuit has a prefetch queue comparing circuit. The data memory stores data at corresponding index addresses, while the tag memory stores tag addresses at the corresponding index addresses. The prefetch queue comparing circuit includes a cache hit/miss judging circuit, an address queue register, and a prefetch condition judging circuit. The cache hit/miss judging circuit judges whether a currently-read address coming from the bus is a cache hit or a cache miss, wherein the address consists of an index address and a tag address. If the cache hit/miss judging circuit judges that the currently-read address is a cache hit, the address queue register directly stores the index address of the currently-read address plus a corresponding first one-bit flag signal output from the cache hit/miss judging circuit; the address queue register continuously outputs the index addresses already stored therein, in first-in, first-out order, to the data memory for data accesses. If the cache hit/miss judging circuit judges that the currently-read address is a cache miss, the prefetch condition judging circuit judges whether the index address of the currently-read address is the same as any index address already stored in the address queue register. When the index address is not the same as any index address still stored in the address queue register, the original tag address stored at the same index address in the tag memory is replaced with the tag address of the currently-read address under the control of the cache control circuit, and then the index address plus a corresponding second one-bit flag signal output from the cache hit/miss judging circuit are stored in the address queue register.
According to the second one-bit flag signal, the original data stored at the same index address in the data memory are replaced with the required data stored at the currently-read address in the primary memory, and then the second one-bit flag signal is changed into the first one-bit flag signal under the control of the cache control circuit.
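The summarized flow — a hit enqueues its index with the first flag, while a miss whose index is not already queued replaces the stored tag and enqueues with the second flag — could be modeled roughly as below. This is a behavioral sketch, not the claimed circuit; the names, field widths, and flag encoding are invented:

```python
from collections import deque

INDEX_BITS = 4
tag_ram = [None] * (1 << INDEX_BITS)
address_queue = deque()   # entries: (index, flag); 0 = hit, 1 = fill pending

def handle(address):
    """Rough model of the prefetch queue comparing circuit's decision."""
    index = address & ((1 << INDEX_BITS) - 1)
    tag = address >> INDEX_BITS
    if tag_ram[index] == tag:                        # cache hit: first flag
        address_queue.append((index, 0))
    elif all(i != index for i, _ in address_queue):  # miss, index not queued
        tag_ram[index] = tag                         # replace the stored tag
        address_queue.append((index, 1))             # second flag: fill pending

for a in (0x12, 0x12, 0x22):
    handle(a)
print(list(address_queue))   # [(2, 1), (2, 0)]
```

In this trace the second miss to index 2 (address 0x22) is held back because that index is still queued, which mirrors the prefetch condition judging circuit's duplicate-index check.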
Furthermore, the invention provides a method of using the cache device for data accesses. The method includes the following steps. First, the cache hit/miss judging circuit judges whether a currently-read address coming from the bus is a cache hit or a cache miss, wherein the address consists of an index address and a tag address. If it is a cache hit, the index address of the currently-read address, plus a corresponding first one-bit flag signal output from the cache hit/miss judging circuit, is directly stored in the address queue register.
Chen Chung-ching
Kao Ming-Tsan
Lin Ming-Fen
Bataille Pierre-Michel
Huang Jiawei
Industrial Technology Research Institute
J.C. Patents
Kim Matthew