Classification: Electrical computers and digital processing systems: memory – Storage accessing and control – Memory configuring
Type: Reexamination Certificate
Filed: 1998-12-30
Issued: 2001-11-27
Examiner: Kim, Matthew (Department: 2186)
Other classes: C074S123000, C074S129000
Status: active
Patent number: 06324632
ABSTRACT:
TECHNICAL FIELD
The present invention relates to a method and computer system for processing a data stream, particularly but not exclusively for continuous video or audio data.
BACKGROUND OF THE INVENTION
Many algorithms involve the processing of a continuous “stream” of data, the size of the stream usually being many times that of any on-chip cache memory which may be provided. As is well known in the art, a cache memory operates between a processor and the main memory of a computer. Data and/or instructions required by the process running on the processor can be held in the cache while that process runs. An access to the cache is normally much quicker than an access to main memory. If the processor does not locate a required data item or instruction in the cache memory, it accesses main memory directly to retrieve it, and the requested data item or instruction is loaded into the cache. There are various known systems for using and refilling cache memories; in particular, cache memories can exhibit advantageous characteristics in respect of pre-fetching into the cache items from main memory which are expected to be required by the processor, and of aggregating the writing of data items out from the cache into the main memory.
Despite these advantages, the usefulness of on-chip caches for processing streams of data may be limited, since the data forming an input stream is likely to be read only once, and data in the output stream is not accessed again once it has been written back to main memory.
A data stream can be considered to constitute a continuous sequence of bytes of data. Streams can be classified as either input streams or output streams. An input stream can be considered as a continuous sequence of data items which are subject to a processing step using predefined program data to generate output data. The output data may be in the form of an output data stream or may take some other output format. An output stream of data can be considered as a continuous sequence of bytes of data which have been generated by the execution of processing steps using predefined program data. The output stream may be generated from an input stream, or may be generated directly from the program data itself.
For the reasons outlined above, streamed data generally shows poor temporal locality of reference, and can reduce the effectiveness of existing cache memories by causing the eviction of more suitable cache occupants, in particular the program data used to process the stream. The program data forms an ideal cache occupant because it is accessed repeatedly while the data stream is processed.
The present invention seeks to allow for processing of a data stream with enhanced performance and greater predictability.
SUMMARY OF THE INVENTION
According to one aspect of the present invention there is provided a method of processing a data stream using a set of program data in a computer system comprising an execution unit, a main memory and a cache memory divided into a plurality of cache partitions wherein the data stream is to be stored in a first memory space in the main memory and the set of program data is to be stored in a second memory space in the main memory. The method includes allocating exclusively to the first memory space a first one of the cache partitions for use by the data stream, allocating exclusively to the second memory space a second one of the cache partitions for use by the program data, and transferring the data stream between the execution unit and the main memory via the first allocated cache partition whereby in effecting this transfer the program data is not evicted from the cache memory.
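By way of illustration only, the following C sketch shows the allocation step under an assumed interface, cache_partition_assign(), which binds a region of main memory exclusively to one cache partition. This helper is hypothetical and stands in for whatever partition-control mechanism a particular system provides.

/* Hypothetical sketch of the allocation step: the helper below is an
 * assumed interface, not a real operating-system or hardware call. */
#include <stddef.h>

#define STREAM_PARTITION   0   /* first cache partition: data stream only   */
#define PROGRAM_PARTITION  1   /* second cache partition: program data only */

/* Assumed interface: bind a main-memory region exclusively to a partition. */
extern int cache_partition_assign(const void *base, size_t len, int partition);

void configure_partitions(const void *stream_space,  size_t stream_len,
                          const void *program_space, size_t program_len)
{
    /* The data stream may only occupy the first partition, so filling that
     * partition with stream data can never evict the program data. */
    cache_partition_assign(stream_space, stream_len, STREAM_PARTITION);

    /* The program data is confined to the second partition, where it stays
     * resident while the stream is processed. */
    cache_partition_assign(program_space, program_len, PROGRAM_PARTITION);
}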
The invention can be used in a number of ways; the following three contexts are illustrative.
In a first context, the data stream is an incoming data stream which is to be processed using the program data to generate output data. In that connection, the incoming data stream is pre-fetched into the first cache partition from the main memory prior to processing. It will readily be appreciated that as the incoming data stream is continuously pre-fetched into the first cache partition, it does not matter that the overall size of the stream may be much greater than the size of the cache partition which is available. As that cache partition is allocated exclusively to the incoming data stream, it can continuously operate to pre-fetch “chunks” of the data stream to have them ready for processing.
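As a rough sketch of this first context, the loop below consumes a stream much larger than the cache in chunks no larger than its dedicated partition. The helpers prefetch_into_partition() and process_chunk(), and the chunk size, are assumptions introduced purely for the illustration.

/* Illustrative only: the two helpers are assumptions, not real APIs. */
#include <stddef.h>
#include <stdint.h>

#define CHUNK_BYTES 4096   /* assumed chunk size, no larger than the partition */

extern void prefetch_into_partition(const void *addr, size_t len, int partition);
extern void process_chunk(const uint8_t *chunk, size_t len);

void consume_stream(const uint8_t *stream, size_t stream_len)
{
    for (size_t off = 0; off < stream_len; off += CHUNK_BYTES) {
        size_t len = stream_len - off < CHUNK_BYTES ? stream_len - off
                                                    : CHUNK_BYTES;

        /* Bring the next chunk into the stream's own partition ahead of use;
         * only that partition is disturbed, however long the stream is. */
        prefetch_into_partition(stream + off, len, /*partition=*/0);
        process_chunk(stream + off, len);
    }
}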
The output data may be generated in the form of an outgoing data stream, or in a “closed end” form. The case where the output data is in the form of an outgoing data stream constitutes a second context.
A further aspect of the present invention according to the second context provides a method of processing an incoming data stream using a set of program data to generate an outgoing data stream in a computer system comprising an execution unit, a main memory and a cache memory divided into a plurality of cache partitions, wherein the incoming data stream is to be stored in a first memory space in the main memory, the set of program data is to be stored in a second memory space in the main memory and the outgoing data stream is to be stored in a third memory space in the main memory. The method includes: allocating exclusively to the first memory space a first one of the cache partitions for use by the incoming data stream; allocating exclusively to the second memory space a second one of the cache partitions for use by the program data; allocating exclusively to the third memory space a third one of the cache partitions for use by the outgoing data stream; prefetching data from the incoming data stream into the first cache partition; processing the data using the program data to generate the outgoing data stream and transferring the outgoing data stream to the third cache partition; and writing the outgoing data stream from the third cache partition into the third memory space.
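The three-partition method can be pictured with the following C sketch, in which partition 0 holds the incoming stream, partition 1 the program data and partition 2 the outgoing stream. The prefetch and write-back helpers and the byte-wise table decode are assumptions made purely for illustration, not part of the patented method.

/* Sketch of the three-partition pipeline; helpers and table are assumed. */
#include <stddef.h>
#include <stdint.h>

#define IN_PARTITION   0
#define PROG_PARTITION 1
#define OUT_PARTITION  2
#define CHUNK          4096

extern void prefetch_into_partition(const void *addr, size_t len, int partition);
extern void writeback_partition(const void *addr, size_t len, int partition);

/* Program data: a look-up table that should remain cache-resident. */
extern const uint8_t decode_table[256];

void decode_stream(const uint8_t *in, uint8_t *out, size_t len)
{
    for (size_t off = 0; off < len; off += CHUNK) {
        size_t n = len - off < CHUNK ? len - off : CHUNK;

        /* 1. Prefetch the next piece of the incoming stream (partition 0). */
        prefetch_into_partition(in + off, n, IN_PARTITION);

        /* 2. Process it using the cache-resident program data (partition 1). */
        for (size_t i = 0; i < n; i++)
            out[off + i] = decode_table[in[off + i]];

        /* 3. Write the outgoing stream back out through partition 2. */
        writeback_partition(out + off, n, OUT_PARTITION);
    }
}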
According to a third context, the outgoing data stream may be generated from the program data itself and/or from input data in a “closed end” format, rather than from an input data stream.
The present invention also provides a computer system for processing a data stream, including: a main memory having a first memory space for holding a data stream and a second memory space for holding program data for use in processing said data stream; an execution unit for executing a process using said program data; a cache memory divided into a plurality of cache partitions; a cache access mechanism for controlling the storage of items in the cache memory and operable to allocate exclusively a first one of the partitions for items held in the first memory space and a second one of the partitions for items held in the second memory space; and a data transfer mechanism operable to continuously transfer the data stream between the execution unit and the main memory via the first allocated cache partition without evicting program data from the cache memory.
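A minimal data-structure view of such a system is sketched below; the type, field and function names are illustrative assumptions only and do not come from the patent.

/* Illustrative bookkeeping for the cache access mechanism: which partition
 * serves which memory space. Names are assumptions, not from the patent. */
#include <stddef.h>

struct memory_space {
    char  *base;        /* start of the region in main memory */
    size_t length;      /* size of the region                 */
    int    partition;   /* cache partition allocated to it    */
};

struct stream_system {
    struct memory_space stream_space;   /* holds the data stream   */
    struct memory_space program_space;  /* holds the program data  */
    int                 num_partitions; /* partitions in the cache */
};

/* Return the partition allocated to the memory space containing addr,
 * or -1 if the address lies in neither space. */
int partition_for(const struct stream_system *sys, const char *addr)
{
    const struct memory_space *spaces[2] = { &sys->stream_space,
                                             &sys->program_space };
    for (int i = 0; i < 2; i++) {
        if (addr >= spaces[i]->base &&
            addr <  spaces[i]->base + spaces[i]->length)
            return spaces[i]->partition;
    }
    return -1;
}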
The invention is particularly useful for video and audio algorithms, for example MPEG-2, where the input stream represents a digitally encoded bit stream, the program data comprises look-up tables and the output stream is a decoded version of the input stream. The program data may also include an instruction sequence defining a process to be executed.
A central concept of the present invention is therefore to use cache partitioning to provide a mechanism by which stream processing may enjoy the performance benefits of the cache, while the rest of the cache contents are protected from being adversely affected by the presence of such transient stream data.
REFERENCES:
patent: 4905141 (1990-02-01), Brenza
patent: 5434992 (1995-07-01), Mattson
patent: 5442747 (1995-08-01), Chan et al.
patent: 5535359 (1996-07-01), Hata et al.
patent: 5584014 (1996-12-01), Nayfeh et al.
patent: 5875465 (1999-02-01), Kilpatrick et al.
patent: 5966734 (1999-10-01), Mohamed et al.
patent: 6061763 (2000-05-01), Rubin et al.
patent: 2214336A (1989-08-01), None
patent: 2292822A (1996-03-01), None
patent: 2311880A (199
Chace Christian P.
Iannucci Robert
Kim Matthew
Seed IP Law Group PLLC
STMicroelectronics Limited