DMA driven processor cache

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories


Details

C711S143000, C710S022000

Reexamination Certificate

active

06658537

ABSTRACT:

BACKGROUND
1. Field of the Invention
The present invention relates to computer systems that store instructions and data within a cache memory, and more particularly to a method for initiating cache prefetch and copyback operations through instructions issued by an external controller, such as a DMA controller.
2. Related Art
The performance of advanced microprocessors depends on high hit rates in their internal instruction and data caches. Network routing applications, which route packets through a data network, have ideal characteristics for a high instruction cache hit rate. The flow-through nature of packet data, however, keeps the data cache hit rate low, because the data is often removed from the data cache when it is modified by the external interfaces and must be reloaded before a processor can access it.
Microprocessors, such as the PowerPC 604e manufactured by the Motorola Corporation, include a 64-bit bus to interface with external memory and the I/O subsystem. This bus is optimized for moving data to and from the internal L1 caches and for maintaining cache coherency among multiple processors. The 60x bus includes a control path to specify the bus operations between processors, including data transfer, cache coherency, and synchronization operations.
The 604e processor can be clocked internally at four or more times the rate of its external interface. This results in a multiple cycle delay when the processor needs to access data from external memory. This delay is on top of the normal latency for memory accesses.
Compilers can sometimes hide part of this delay for static memory references by moving a cache “load” instruction earlier within the executable code, so that more time elapses between the load operation and the use of the data it retrieves. For dynamic memory references, however, in which the location of a desired data item may not be known beforehand, this is more difficult, if not impossible.
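The following C fragment is only a source-level illustration of that scheduling principle (in practice the compiler performs this reordering at the instruction level); the function names and surrounding arithmetic are hypothetical, not taken from the patent:

    #include <stdint.h>

    /* Naive ordering: the load immediately precedes its use, so a cache miss
     * stalls the pipeline for the full memory latency. */
    int32_t use_immediately(const int32_t *p, int32_t a, int32_t b)
    {
        int32_t other = a * b;
        int32_t x = *p;          /* load issued just before it is needed */
        return other + x * 3;
    }

    /* Scheduled ordering: the load is hoisted ahead of independent work, so
     * the miss latency overlaps useful computation instead of stalling it. */
    int32_t use_scheduled(const int32_t *p, int32_t a, int32_t b)
    {
        int32_t x = *p;          /* load issued early                  */
        int32_t other = a * b;   /* independent work hides the latency */
        return other + x * 3;
    }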
A special case that occurs frequently is a dynamically referenced structure within a loop where the (N+1)th memory address is known during the Nth loop iteration. The 604e processor includes cache prefetch instructions that can be used to bring the (N+1)th data into the cache while the Nth iteration is executing.
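As a hedged sketch of that loop pattern (not code from the patent), the following C fragment uses GCC's __builtin_prefetch as a portable stand-in for a PowerPC cache-touch (dcbt) instruction; the packet_desc structure and total_bytes function are hypothetical:

    #include <stddef.h>

    struct packet_desc {
        struct packet_desc *next;   /* address of the (N+1)th descriptor */
        const unsigned char *data;  /* payload location                  */
        size_t              len;    /* payload length in bytes           */
    };

    /* Walk a descriptor list; while the Nth node is processed, prefetch the
     * (N+1)th node, whose address is already known. */
    static size_t total_bytes(const struct packet_desc *d)
    {
        size_t sum = 0;

        while (d != NULL) {
            if (d->next != NULL)
                __builtin_prefetch(d->next, 0 /* read */, 1 /* low locality */);

            sum += d->len;          /* work on the Nth node overlaps the fetch */
            d = d->next;
        }
        return sum;
    }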
When there is an interaction between the processor, the memory, and an external DMA controller, the problem becomes more difficult still. If a prefetch, whether placed by compiler scheduling or issued as an explicit instruction, is moved too far ahead of the use of the prefetched data, the DMA controller may modify the data in memory in the meantime, thus negating the advantage of the prefetch.
What is needed is a mechanism that initiates prefetching of data, such as flow-through I/O data, into a processor cache.
SUMMARY
The present invention provides a mechanism whereby caching operations, such as prefetch and copyback operations, can be initiated by an external direct memory access (DMA) controller. This allows the DMA controller to govern the inclusion as well as the exclusion of data from a processor cache in such a way as to avoid unnecessary cache faults, and thereby to improve system performance. Thus, the present invention effectively provides a synchronization mechanism between an external DMA controller and the processor cache.
The present invention can be characterized as a computing system, comprising: a processor including a cache; a memory coupled with the processor; and a direct memory access device coupled with the processor and the memory; wherein the processor includes a mechanism that, in response to a command from the direct memory access device, potentially modifies an entry in the cache.
The present invention may also be characterized as a method for updating a cache, the method operating in a system including a processor including the cache, a memory coupled with the processor, and a direct memory access device coupled with the processor and the memory, the method comprising the steps of: receiving at the processor a command from the direct memory access device; and, in response to the command, potentially updating an entry in the cache.
According to one aspect of the present invention, the above-mentioned method includes the step of modifying an entry within the memory.
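To make the arrangement concrete, the following C sketch is a purely illustrative software model of a processor-side handler; the names dma_command, CMD_PREFETCH, CMD_COPYBACK, and the cache_* primitives are hypothetical and are not defined in the patent:

    #include <stdint.h>
    #include <stdbool.h>

    /* Hypothetical command issued by the DMA controller over the bus. */
    enum dma_cache_op { CMD_PREFETCH, CMD_COPYBACK };

    struct dma_command {
        enum dma_cache_op op;      /* requested cache operation          */
        uint32_t          addr;    /* physical address of the cache line */
    };

    /* Stand-ins for the processor's cache primitives (illustrative only). */
    extern bool cache_line_present(uint32_t addr);
    extern void cache_prefetch_line(uint32_t addr);   /* load a line into the cache   */
    extern void cache_copyback_line(uint32_t addr);   /* write a dirty line to memory */

    /* Processor-side handler: in response to a command from the DMA device,
     * potentially modify the corresponding cache entry. */
    void handle_dma_command(const struct dma_command *cmd)
    {
        switch (cmd->op) {
        case CMD_PREFETCH:
            if (!cache_line_present(cmd->addr))
                cache_prefetch_line(cmd->addr);   /* bring fresh I/O data into the cache  */
            break;
        case CMD_COPYBACK:
            if (cache_line_present(cmd->addr))
                cache_copyback_line(cmd->addr);   /* push modified data out ahead of DMA  */
            break;
        }
    }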


REFERENCES:
patent: 4933835 (1990-06-01), Sachs et al.
patent: 5119485 (1992-06-01), Ledbetter, Jr. et al.
patent: 5347634 (1994-09-01), Herrell et al.
patent: 5524208 (1996-06-01), Finch et al.
patent: 5572701 (1996-11-01), Ishida et al.
patent: 5613153 (1997-03-01), Arimilli et al.
patent: 5652915 (1997-07-01), Jeter
patent: 5659710 (1997-08-01), Sherman
patent: 5668956 (1997-09-01), Okazawa et al.
patent: 5796979 (1998-08-01), Arimilli et al.
patent: 5813036 (1998-09-01), Ghosh et al.
patent: 5859990 (1999-01-01), Yarch
patent: 5860111 (1999-01-01), Martinez, Jr. et al.
patent: 5875352 (1999-02-01), Gentry et al.
patent: 5893141 (1999-04-01), Kulkarni
patent: 5893153 (1999-04-01), Tzeng et al.
patent: 5900017 (1999-05-01), Genduso
patent: 5941968 (1999-08-01), Mergard et al.
patent: 5950227 (1999-09-01), Kulkarni
patent: 5953538 (1999-09-01), Duncan
patent: 6018763 (2000-01-01), Hughes et al.
