Cache memory system with memory request address queue, cache...

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories

Details

Classifications: C711S118000, C711S169000
Type: Reexamination Certificate
Status: active
Patent number: 06327645

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The invention relates to a cache memory system suitable for use in a graphics rendering system to reduce memory traffic and improve the performance of the graphics rendering system.
2. Description of the Related Art
A typical three-dimensional (3D) graphics display system includes a graphics rendering processor, a display memory unit and a display monitor. The display memory unit stores many kinds of data, including pixel color values (R, G, B), pixel transparency or alpha values (A), depth values (Z), texture image data, etc.
These data are generally read out of the display memory, processed, and then, if necessary, written back into the display memory.
In order to generate photo-realistic 3D graphics images, texture mapping, alpha blending, fog blending and similar operations may be applied to each pixel. Texture mapping is a process in which texture image data are read from the texture memory and applied to each pixel. Depending on the rendering quality requirement, 1, 2, 4, 8 or more texture image data items at different resolutions may be combined for each pixel, necessitating a great many texture memory accesses and increasing the load on memory traffic. It is apparent that, when many texture-mapped objects are rendered, memory bandwidth will dominate system performance and become the bottleneck of the graphics rendering operation, no matter how fast the graphics rendering processor runs.
A general strategy for resolving this problem is to introduce a cache memory system into the graphics rendering system. With a suitable cache replacement scheme, the number of texture memory accesses can be reduced. However, since a pipeline structure is commonly employed in 3D graphics engine designs, most designs stall their pipelines when a cache miss occurs. In a graphics rendering system with a deep pipeline, if the requested memory data returns only after n clock cycles, n bubbles emerge in the pipeline. These bubbles leave the rendering engine idle and degrade the overall performance of the graphics rendering system.
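As a rough illustration of this cost, the sketch below estimates the average number of cycles per texture fetch in a blocking pipeline that stalls on every miss. The miss rate and latency figures are hypothetical values chosen for illustration; they do not come from the patent.

// Minimal arithmetic sketch of the stall cost described above. The miss rate
// and the n-cycle miss latency are illustrative assumptions, not figures from
// the patent.
#include <iostream>

int main() {
    const double missRate = 0.05;  // assumed fraction of texture fetches that miss
    const double missCost = 40.0;  // assumed memory latency n, in clock cycles
    // A blocking pipeline pays the full n-cycle bubble on every miss.
    const double cyclesPerFetch = 1.0 + missRate * missCost;
    std::cout << "blocking design: " << cyclesPerFetch
              << " cycles per texture fetch (vs. 1.0 ideally)\n";
}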
SUMMARY OF THE INVENTION
Therefore, the object of the present invention is to provide a cache memory system that incorporates a memory request address queue, a cache write address queue and a cache read address queue to absorb the latency of memory access when a cache miss condition occurs, and to minimize the occurrence of pipeline stalling in a pipelined design.
According to the present invention, a cache memory system is adapted for use with a main memory unit, and comprises:
a main memory controller adapted to be connected to the main memory unit so as to retrieve memory data therefrom;
a cache memory connected to the main memory controller for writing the memory data retrieved by the main memory controller therein;
a tag memory module adapted to receive an address signal and to detect presence of a cache hit condition, indicating that the address signal has a corresponding data entry in the cache memory, or a cache miss condition, indicating a need for accessing the main memory unit;
a read data controller interconnecting the tag memory module and the cache memory, the read data controller including a cache read address queue that receives a cache memory address corresponding to the address signal from the tag memory module, and that provides the cache memory address as a cache read address to the cache memory to control reading of the memory data from the cache memory;
a data request controller interconnecting the tag memory module and the main memory controller, the data request controller including a memory request address queue that receives a main memory address and the cache memory address that correspond to the address signal from the tag memory module in the presence of the cache miss condition, and that provides the main memory address to the main memory controller to control retrieval of the memory data from the main memory unit; and
a write data controller interconnecting the data request controller and the cache memory, the write data controller including a cache write address queue that receives the cache memory address from the memory request address queue, and that provides the cache memory address as a cache write address to the cache memory to control writing of the memory data in the cache memory.
Preferably, a data ready bit array is employed to inhibit the read data controller from providing the cache read address to the cache memory when the corresponding memory data from the main memory unit has yet to be written into the cache memory.
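The following is a minimal, cycle-stepped software sketch of the data flow described above, assuming a direct-mapped cache with one word per line, a fixed main-memory latency, and illustrative names (CacheSystem, access(), step(), MEM_LATENCY) that are not terms from the patent. It models the memory request address queue, the cache write address queue, the cache read address queue and the data ready bit array, but ignores line-replacement hazards and other details a real implementation would have to handle.

// Behavioral sketch only: a cycle-stepped, direct-mapped software model of the
// queue structure summarized above. Sizes, latencies and names are illustrative
// assumptions, not values taken from the patent.
#include <cstdint>
#include <iostream>
#include <optional>
#include <queue>
#include <vector>

constexpr int      LINES       = 16;  // assumed number of cache lines (direct-mapped)
constexpr int      MEM_LATENCY = 8;   // assumed main-memory latency in clock cycles
constexpr uint32_t NO_TAG      = UINT32_MAX;

struct MemRequest  { uint32_t mainAddr; int cacheLine; };  // memory request address queue entry
struct PendingFill { uint32_t data; int cyclesLeft; };     // read in flight at the main memory controller

struct CacheSystem {
    std::vector<uint32_t> tags     = std::vector<uint32_t>(LINES, NO_TAG);
    std::vector<uint32_t> lines    = std::vector<uint32_t>(LINES, 0);
    std::vector<bool>     readyBit = std::vector<bool>(LINES, false);  // data ready bit array

    std::queue<MemRequest>  memRequestQ;  // memory request address queue
    std::queue<int>         cacheWriteQ;  // cache write address queue
    std::queue<int>         cacheReadQ;   // cache read address queue
    std::queue<PendingFill> inFlight;     // models main memory controller latency

    // Tag module: detect a hit or miss; on a miss, queue a main-memory request.
    // The cache read address is queued in either case, so issue never blocks here.
    void access(uint32_t addr) {
        int line = static_cast<int>(addr % LINES);
        if (tags[line] != addr) {          // cache miss
            tags[line]     = addr;
            readyBit[line] = false;        // data not yet written into the cache
            memRequestQ.push({addr, line});
        }
        cacheReadQ.push(line);
    }

    // One clock step of the data request, write data and read data controllers.
    // Returns data whenever the read data controller can complete a read.
    std::optional<uint32_t> step() {
        // Data request controller: forward one request to the main memory controller
        // and hand its cache write address to the cache write address queue.
        if (!memRequestQ.empty()) {
            MemRequest r = memRequestQ.front(); memRequestQ.pop();
            inFlight.push({r.mainAddr ^ 0xA5A5A5A5u, MEM_LATENCY});  // fake memory data
            cacheWriteQ.push(r.cacheLine);
        }
        // Write data controller: when memory data returns, write it into the cache
        // line named by the cache write address queue and set its ready bit.
        if (!inFlight.empty() && --inFlight.front().cyclesLeft <= 0) {
            int line = cacheWriteQ.front(); cacheWriteQ.pop();
            lines[line]    = inFlight.front().data;
            readyBit[line] = true;
            inFlight.pop();
        }
        // Read data controller: read only when the ready bit of the head entry is
        // set; otherwise the read waits without stalling the stages before the queues.
        if (!cacheReadQ.empty() && readyBit[cacheReadQ.front()]) {
            int line = cacheReadQ.front(); cacheReadQ.pop();
            return lines[line];
        }
        return std::nullopt;
    }
};

int main() {
    CacheSystem cache;
    for (uint32_t a : {3u, 3u, 7u, 3u}) cache.access(a);  // one miss per distinct address
    for (int cycle = 0; cycle < 40; ++cycle)
        if (auto d = cache.step())
            std::cout << "cycle " << cycle << ": read data 0x"
                      << std::hex << *d << std::dec << "\n";
}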


REFERENCES:
patent: 4761731 (1988-08-01), Webb
patent: 5148536 (1992-09-01), Witek et al.
patent: 5327570 (1994-07-01), Foster et al.
patent: 5355467 (1994-10-01), MacWilliams
patent: 5515521 (1996-05-01), Whitted, III et al.
patent: 5598551 (1997-01-01), Barajas et al.
patent: 5687348 (1997-11-01), Whittaker
patent: 5761445 (1998-06-01), Nguyen
patent: 5761708 (1998-06-01), Cherabuddi
patent: 5765220 (1998-06-01), Kipp
patent: 5809530 (1998-09-01), Samra et al.
patent: 5860117 (1999-01-01), Cherabuddi
patent: 5860158 (1999-01-01), Pai et al.
patent: 5905509 (1999-05-01), Jones et al.
patent: 6021471 (2000-02-01), Stiles et al.
patent: 6226713 (2001-05-01), Mehrotra
