Static information storage and retrieval – Read/write circuit – Having particular data buffer or latch
Reexamination Certificate
1999-11-01
2001-01-23
Nelms, David (Department: 2818)
C365S189070
active
06178120
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a memory structure, and more particularly, to a memory structure that is able to increase the speed of data access.
2. Description of Related Art
With the progress of semiconductor technology, the capacity of memories has increased greatly, and the operational speed of the central processing unit (CPU) has become faster than the data access speed of a memory. Conventionally, to speed up the CPU's access to a memory, a cache memory is provided as a buffer between the CPU and a memory module. With reference to FIG. 4, a cache memory 43 is arranged between a CPU 41 and a memory 42 to improve the data access speed. The cache memory 43 is typically formed of static random access memories (SRAMs), whose access speed is much faster than that of the memory 42. Generally, the cache memory 43 stores the data that has most recently been accessed by the CPU 41. Therefore, if the CPU 41 intends to access data that has been accessed previously and is still stored in the cache memory 43 without having been replaced, the cache memory 43 can send the data to the CPU 41 quickly, without waiting an extra cycle time, and the data access speed is thereby improved. However, with this approach, data being accessed for the first time must still be read from the memory 42; only accesses to data that has already been accessed at least once can be speeded up. Moreover, the cache memory 43 is usually relatively large, so the hardware cost of a memory structure with a cache memory is high.
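By way of illustration only, the conventional CPU-cache-memory arrangement of FIG. 4 can be modeled behaviorally as in the following C sketch. All names and sizes (a direct-mapped cache of 16 lines in front of a 1024-word main memory, round-robin free of replacement policy concerns) are assumptions made for the example and are not taken from the patent; the sketch only shows that a first access misses and must read the slow memory, while a repeated access is served from the fast cache.

```c
/* Minimal behavioral sketch of the conventional arrangement of FIG. 4.
 * Sizes and names are illustrative assumptions, not from the patent. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

#define MEM_WORDS   1024
#define CACHE_LINES 16          /* assumed cache size, much smaller than memory */

static uint32_t main_memory[MEM_WORDS];

typedef struct {
    bool     valid;
    uint32_t tag;               /* address currently occupying this line */
    uint32_t data;
} cache_line_t;

static cache_line_t cache[CACHE_LINES];

/* Read one word: a hit is served from the fast cache, a miss must first
 * fetch from main memory (the slow first access described in the text). */
static uint32_t cached_read(uint32_t addr, bool *hit)
{
    cache_line_t *line = &cache[addr % CACHE_LINES];
    if (line->valid && line->tag == addr) {
        *hit = true;
        return line->data;      /* fast path: no main-memory cycle needed */
    }
    *hit = false;
    line->valid = true;         /* slow path: fill the line from memory */
    line->tag   = addr;
    line->data  = main_memory[addr];
    return line->data;
}

int main(void)
{
    bool hit;
    main_memory[100] = 0xABCD;
    cached_read(100, &hit);
    printf("first access : %s\n", hit ? "hit" : "miss");   /* miss */
    cached_read(100, &hit);
    printf("second access: %s\n", hit ? "hit" : "miss");   /* hit  */
    return 0;
}
```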
In addition, a well-known page mode may also be provided to speed up data access to a memory. FIG. 5 shows a memory structure that employs page-mode operation. As illustrated, the address bus 51 used to access the memory 50 is divided into a column address bus 53 and a row address bus 52. A column decoder 54 and a row decoder 55 are provided to decode the column address bus 53 and the row address bus 52, respectively, so as to address the desired memory cells in the memory 50. Memory cells corresponding to the same row but different columns are referred to as belonging to the same page. When the data to be read resides in memory cells of the same page, i.e., cells with an identical row address, it is only necessary to decode the column address bus 53 to access the data, without decoding the row address bus 52 again. Therefore, the data access speed can be improved. However, if the memory 50 stores both computer instructions and data, the probability of accessing data within the same page is low, because the CPU generally accesses instructions and data alternately (for example, an instruction may require access to data stored in memory). Furthermore, an interrupt may be asserted to the CPU, causing the executing program to jump to another address. As a result, the CPU can rarely access data within the same page, and the data access speed cannot be improved. Therefore, there is a need for the above memory access structures to be improved.
SUMMARY OF THE INVENTION
Accordingly, the object of the present invention is to provide a memory structure which utilizes a few data latches to speed up data access and, further, to reduce hardware cost and power consumption.
According to one aspect of the present invention, a memory structure is provided for speeding up data access. The memory structure stores data to be addressed by an address bus including a row address and column address. The memory structure has a memory unit having a plurality of memory cells for storing data therein. A row decoder is provided for decoding the row address to address the memory cells. A plurality of data latch units, each having at least two latches, are provided for optionally latching the data in the memory cells addressed by the row address into one of the latches of each data latch unit. A compare and select logic unit is provided for determining whether the data of the memory cells addressed by the row address is stored in the plurality of data latch units, and, if not, selecting one of the latches of each data latch unit to latch the data of the memory cells addressed by the row address. A column decoder is provided for decoding the column address to access data addressed by the address bus from the plurality of data latch units.
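As a behavioral illustration of this aspect, the following C sketch models a small set of row latches together with a compare-and-select step that checks whether the addressed row is already latched before the memory array is read. The array dimensions, the number of latches, and the round-robin choice of which latch to overwrite are assumptions made for the example and are not taken from the claims.

```c
/* Behavioral sketch of the first aspect: each data latch unit holds
 * NUM_LATCHES latches, and a compare-and-select step checks whether the
 * addressed row is already latched before the memory array is touched.
 * Sizes and the replacement choice are illustrative assumptions. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>
#include <string.h>

#define ROWS        16
#define COLS        16
#define NUM_LATCHES 2              /* "at least two latches" per unit */

static uint8_t cells[ROWS][COLS];

/* One latch entry spans the latches of all COLS data latch units of a row. */
typedef struct {
    bool    valid;
    int     row;                   /* row address held in this latch */
    uint8_t data[COLS];
} row_latch_t;

static row_latch_t latches[NUM_LATCHES];
static int         next_victim;    /* assumed round-robin replacement */

/* Compare-and-select: return latched data for the row, filling a latch
 * from the memory array only when no latch already holds that row. */
static const uint8_t *compare_and_select(int row, bool *latched)
{
    for (int i = 0; i < NUM_LATCHES; i++) {
        if (latches[i].valid && latches[i].row == row) {
            *latched = true;
            return latches[i].data;          /* row already latched: fast access */
        }
    }
    *latched = false;
    row_latch_t *victim = &latches[next_victim];
    next_victim = (next_victim + 1) % NUM_LATCHES;
    victim->valid = true;
    victim->row   = row;
    memcpy(victim->data, cells[row], COLS);  /* row decode + latch fill */
    return victim->data;
}

/* Column decode: pick the addressed column out of the latched row data. */
static uint8_t read_cell(int row, int col, bool *latched)
{
    return compare_and_select(row, latched)[col];
}

int main(void)
{
    bool latched;
    cells[2][4] = 7;
    read_cell(2, 4, &latched);
    printf("row 2 first access : %s\n", latched ? "latched" : "from array");
    read_cell(2, 9, &latched);
    printf("row 2 second access: %s\n", latched ? "latched" : "from array");
    return 0;
}
```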
According to another aspect of the present invention, a memory structure for speeding up data access is provided. The memory structure has a memory unit having a plurality of memory cells for storing data therein. A row decoder is provided for decoding the row address to address the memory cells. A pre-column decoder is provided for decoding part of the column address to address the memory cells addressed by the row decoder. A plurality of data latch units, each having at least two latches, are provided for optionally latching the data in the memory cells addressed by the row decoder and the pre-column decoder into one of the latches of each data latch unit. A compare and select logic unit is provided for determining whether the data of the memory cells addressed by the row address and the part of the column address is stored in the plurality of data latch units, and, if not, selecting one of the latches of each data latch unit to latch the data of the memory cells addressed by the row address and the part of the column address. A post-column decoder is provided for decoding the remaining column address to access data addressed by the address bus from the plurality of data latch units.
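A corresponding sketch for this second aspect is given below. Here an assumed pre-column decode (the upper column-address bits) selects which segment of the addressed row is latched, and an assumed post-column decode (the remaining bits) selects a word within the latched segment; the segment width, latch count, and replacement policy are again illustrative assumptions rather than features recited in the claims.

```c
/* Behavioral sketch of the second aspect: only the segment selected by the
 * pre-column decode is latched; the post-column decode resolves the rest.
 * Widths and the replacement choice are illustrative assumptions. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>
#include <string.h>

#define ROWS        16
#define COLS        16
#define SEG_COLS    4              /* columns covered by one latched segment */
#define NUM_LATCHES 2              /* "at least two latches" per unit        */

static uint8_t cells[ROWS][COLS];

typedef struct {
    bool    valid;
    int     row;                   /* row address of the latched segment     */
    int     segment;               /* pre-decoded part of the column address */
    uint8_t data[SEG_COLS];
} seg_latch_t;

static seg_latch_t latches[NUM_LATCHES];
static int         next_victim;    /* assumed round-robin replacement        */

static uint8_t read_cell(int row, int col, bool *latched)
{
    int segment = col / SEG_COLS;  /* pre-column decode: upper column bits   */
    int offset  = col % SEG_COLS;  /* post-column decode: remaining bits     */

    for (int i = 0; i < NUM_LATCHES; i++) {
        if (latches[i].valid && latches[i].row == row &&
            latches[i].segment == segment) {
            *latched = true;                 /* segment already latched       */
            return latches[i].data[offset];
        }
    }
    *latched = false;
    seg_latch_t *victim = &latches[next_victim];
    next_victim = (next_victim + 1) % NUM_LATCHES;
    victim->valid   = true;
    victim->row     = row;
    victim->segment = segment;
    memcpy(victim->data, &cells[row][segment * SEG_COLS], SEG_COLS);
    return victim->data[offset];
}

int main(void)
{
    bool latched;
    cells[5][6] = 3;
    read_cell(5, 6, &latched);
    printf("first access : %s\n", latched ? "latched" : "from array");
    read_cell(5, 7, &latched);     /* same row, same segment */
    printf("second access: %s\n", latched ? "latched" : "from array");
    return 0;
}
```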
The various objects and advantages of the present invention will be more readily understood from the following detailed description when read in conjunction with the appended drawings.
REFERENCES:
patent: 4894770 (1990-01-01), Ward et al.
patent: 5184320 (1993-02-01), Dye
patent: 5586078 (1996-12-01), Takase et al.
patent: 5644747 (1997-07-01), Kusuda
patent: 5887272 (1999-03-01), Sartore et al.
Bacon & Thomas PLLC
Ho Hoai V.
Nelms David
Sunplus Technology Co. Ltd.