Level 2 smartcache architecture supporting simultaneous...

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories

Reexamination Certificate

Details

Classification: C711S129000
Type: Reexamination Certificate
Status: active
Patent number: 06745293

ABSTRACT:

FIELD OF THE INVENTION
This invention generally relates to microprocessors, and more specifically to improvements in cache memory access circuits, systems, and methods of making the same.
BACKGROUND
Microprocessors are general-purpose processors that provide high instruction throughput to execute the software running on them, and they can have a wide range of processing requirements depending on the particular software applications involved. A cache architecture is often used to increase the speed of retrieving information from a main memory. A cache memory is a high-speed memory situated between the processing core of a processing device and the main memory. The main memory is generally much larger than the cache, but also significantly slower. Each time the processing core requests information from the main memory, the cache controller checks the cache memory to determine whether the address being accessed is currently in the cache memory. If so, the information is retrieved from the faster cache memory instead of the slower main memory to service the request. If the information is not in the cache, the main memory is accessed, and the cache memory is updated with the newly retrieved information.
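
The lookup-then-refill behavior described above can be summarized by the following C sketch of a direct-mapped cache read; the sizes and the names cache_read and main_memory are illustrative assumptions rather than details of the claimed architecture.

#include <stdint.h>
#include <string.h>

#define LINE_SIZE 32                    /* bytes per cache line (assumed)  */
#define NUM_LINES 256                   /* cache capacity: 8 KB (assumed)  */
#define MEM_SIZE  (1u << 20)            /* 1 MB of slower main memory      */

typedef struct {
    int      valid;                     /* does this line hold valid data? */
    uint32_t tag;                       /* which memory block it holds     */
    uint8_t  data[LINE_SIZE];
} cache_line_t;

static cache_line_t cache[NUM_LINES];
static uint8_t      main_memory[MEM_SIZE];

uint8_t cache_read(uint32_t addr)
{
    uint32_t offset = addr % LINE_SIZE;
    uint32_t index  = (addr / LINE_SIZE) % NUM_LINES;
    uint32_t tag    = addr / (LINE_SIZE * NUM_LINES);
    cache_line_t *line = &cache[index];

    if (line->valid && line->tag == tag) {
        /* Hit: service the request from the fast cache memory. */
        return line->data[offset];
    }

    /* Miss: access the slower main memory and update the cache line. */
    memcpy(line->data, &main_memory[addr - offset], LINE_SIZE);
    line->tag   = tag;
    line->valid = 1;
    return line->data[offset];
}
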
Many different types of processors are known, of which microprocessors are but one example. Digital Signal Processors (DSPs), for example, are widely used for specific applications such as mobile processing. DSPs are typically configured to optimize the performance of the applications concerned, and to achieve this they employ more specialized execution units and instruction sets. Particularly, but not exclusively, in applications such as mobile telecommunications, it is desirable to provide ever-increasing DSP performance while keeping power consumption as low as possible.
To further improve performance of a digital system, two or more processors can be interconnected. For example, a DSP may be interconnected with a general purpose processor in a digital system. The DSP performs numeric intensive signal processing algorithms while the general purpose processor manages overall control flow. The two processors communicate and transfer data for signal processing via shared memory. A direct memory access (DMA) controller is often associated with a processor in order to take over the burden of transferring blocks of data from one memory or peripheral resource to another and to thereby improve the performance of the processor.
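
As an illustration of how a DMA controller relieves the processor of block transfers, the following hypothetical C fragment programs a memory-mapped DMA channel; the register layout and the name dma_start_copy are assumptions made for illustration, not a specific device interface.

#include <stdint.h>

/* Hypothetical memory-mapped DMA channel registers (layout assumed). */
typedef struct {
    volatile uint32_t src;      /* source address of the block           */
    volatile uint32_t dst;      /* destination address in shared memory  */
    volatile uint32_t count;    /* number of bytes to transfer           */
    volatile uint32_t ctrl;     /* bit 0 (assumed): start the transfer   */
} dma_channel_t;

/* Program the channel and start the transfer; the processor is then free
 * to continue executing while the DMA engine moves the block. Completion
 * would typically be signalled by an interrupt or status bit (not shown). */
void dma_start_copy(dma_channel_t *ch, uint32_t src, uint32_t dst, uint32_t bytes)
{
    ch->src   = src;
    ch->dst   = dst;
    ch->count = bytes;
    ch->ctrl  = 1u;
}
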
SUMMARY OF THE INVENTION
Particular and preferred aspects of the invention are set out in the accompanying independent and dependent claims. In accordance with a first embodiment of the invention, there is provided a method of operating a digital system that has a cache with a plurality of request ports. A set of requests received on the request ports is evaluated to determine the hit/miss status of each request, and multiple hits are then serviced concurrently via separate ports of the cache.
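
A minimal C sketch of this multi-port flow is given below; it assumes four request ports and a simple direct-mapped tag check, and the sequential loop stands in for what dedicated hardware would evaluate in parallel. All names and sizes are illustrative.

#include <stdbool.h>
#include <stdint.h>

#define NUM_PORTS  4
#define NUM_LINES  256
#define LINE_WORDS 8

typedef struct {
    bool     valid;     /* a request is present on this port   */
    uint32_t addr;      /* requested word address              */
    bool     hit;       /* result of the hit/miss evaluation   */
    uint32_t data;      /* returned data when the request hits */
} port_request_t;

static struct {
    bool     valid;
    uint32_t tag;
    uint32_t data[LINE_WORDS];
} lines[NUM_LINES];

static bool lookup(uint32_t addr, uint32_t *data)
{
    uint32_t idx = (addr / LINE_WORDS) % NUM_LINES;
    uint32_t tag = addr / (LINE_WORDS * NUM_LINES);
    if (lines[idx].valid && lines[idx].tag == tag) {
        *data = lines[idx].data[addr % LINE_WORDS];
        return true;                      /* hit  */
    }
    return false;                         /* miss */
}

void service_requests(port_request_t req[NUM_PORTS])
{
    /* Evaluate hit/miss status for the whole set of requests. */
    for (int p = 0; p < NUM_PORTS; p++)
        if (req[p].valid)
            req[p].hit = lookup(req[p].addr, &req[p].data);

    /* In hardware, every hit above is returned on its own port in the
     * same cycle; misses would be queued for refill (not shown here). */
}
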
In another embodiment, the cache is configured as two or more distinct portions, and a request in each portion can be serviced concurrently. A set of requests is sorted into separate queues, each associated with a different portion of the cache.
In another embodiment, the requests within each queue are ordered according to a priority associated with each request.
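
The following C sketch illustrates one plausible way such per-portion priority queues could be organized; the portion-selection rule, queue depth, and the name enqueue_request are assumptions made for illustration.

#include <stdint.h>

#define NUM_PORTIONS 2
#define QUEUE_DEPTH  8

typedef struct {
    uint32_t addr;
    int      priority;               /* higher value = more urgent */
} request_t;

typedef struct {
    request_t entries[QUEUE_DEPTH];
    int       count;
} request_queue_t;

static request_queue_t queues[NUM_PORTIONS];

/* Assumed mapping of a request to a cache portion, e.g. by one address bit. */
static int portion_of(uint32_t addr) { return (addr >> 5) & 1; }

/* Insert a request into its portion's queue, keeping the queue sorted so
 * that the highest-priority request is serviced first. */
void enqueue_request(request_t r)
{
    request_queue_t *q = &queues[portion_of(r.addr)];
    if (q->count == QUEUE_DEPTH)
        return;                      /* queue full: drop or stall (not shown) */

    int i = q->count++;
    while (i > 0 && q->entries[i - 1].priority < r.priority) {
        q->entries[i] = q->entries[i - 1];   /* shift lower-priority entries */
        i--;
    }
    q->entries[i] = r;
}
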
In another embodiment, the cache is configured as two or more distinct portions based on a range of addresses assigned to one portion. A request for an address that is assigned to this portion is blocked from other portions of the cache. A request in each portion can be serviced concurrently.
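
A simple C sketch of such address-range-based routing is shown below; the range registers and portion identifiers are hypothetical values chosen only to illustrate the blocking behavior.

#include <stdbool.h>
#include <stdint.h>

static uint32_t range_base = 0x10000000u;   /* assumed reserved-range base */
static uint32_t range_size = 0x00010000u;   /* assumed reserved-range size */

#define RESERVED_PORTION 1
#define GENERAL_PORTION  0

int select_portion(uint32_t addr)
{
    bool in_range = (addr >= range_base) && (addr < range_base + range_size);
    /* A request in the assigned range may only use the reserved portion;
     * all other requests are blocked from it and use the general portion. */
    return in_range ? RESERVED_PORTION : GENERAL_PORTION;
}
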
In another embodiment, a digital system is provided with a cache that has a plurality of sets, each of which has a plurality of lines for holding data and an associated tag indicating whether the data stored in each line is valid. One detection circuit is operable to detect whether a first requested data item is present in the cache by examining one set of tags, and another detection circuit is operable to detect whether a second requested data item is present in the cache by examining a different set of tags. Both detection circuits operate concurrently. In this embodiment the cache is a level-two cache, but in other embodiments it may be a first-level or a higher-level cache.
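
The following C sketch models two such detection circuits, each examining the tags of a different set; the two calls run sequentially in software but represent concurrent hardware paths. The set-associative organization and all sizes are assumptions, not claim limitations.

#include <stdbool.h>
#include <stdint.h>

#define NUM_SETS   128
#define NUM_WAYS   2
#define LINE_BYTES 64

typedef struct {
    bool     valid;
    uint32_t tag;
} tag_entry_t;

static tag_entry_t tags[NUM_SETS][NUM_WAYS];

/* One detection circuit: compare the tags of a single set against the
 * requested address and report whether the data is present. */
static bool detect_hit(uint32_t addr)
{
    uint32_t set = (addr / LINE_BYTES) % NUM_SETS;
    uint32_t tag = addr / (LINE_BYTES * NUM_SETS);
    for (int way = 0; way < NUM_WAYS; way++) {
        if (tags[set][way].valid && tags[set][way].tag == tag)
            return true;
    }
    return false;
}

/* Two requests that map to different sets can be checked at the same time;
 * in hardware the two detect_hit() evaluations proceed in parallel. */
void detect_two(uint32_t addr_a, uint32_t addr_b, bool *hit_a, bool *hit_b)
{
    *hit_a = detect_hit(addr_a);
    *hit_b = detect_hit(addr_b);
}
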
In another embodiment, a first level cache embodying the present invention may make requests to a second level cache that also embodies the present invention.


REFERENCES:
patent: 5442747 (1995-08-01), Chan et al.
patent: 5752260 (1998-05-01), Liu
patent: 5905997 (1999-05-01), Stiles
patent: 6038647 (2000-03-01), Shimizu
patent: 6253297 (2001-06-01), Chauvel et al.
patent: 6665775 (2003-12-01), Maiyuran et al.
patent: 0 284 751 (1988-10-01), None
patent: WO 93/09497 A2 (1993-05-01), None
Texas Instruments Incorporated, S/N: 09/591,537, filed Jun. 9, 2000, Smart Cache.
