Method for employing a page prefetch cache for database...

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories

Reexamination Certificate



C711S122000, C711S140000, C711S133000


active

06829680

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to database processing. More particularly, the present invention relates to a method for improving the processing speed of database instructions using a page prefetch cache.
2. The Background Art
One of the primary functions a computer performs is processing the information stored in databases. When a database is large, this processing must be as efficient as possible, or valuable time and computer resources are wasted.
Database applications spend a significant fraction of time waiting on data stream cache misses. It is well known that cache misses contribute to a significant fraction of cycles per instruction (CPI) on database workloads. Many of these cache misses are due to accessing database data structures called “pages”.
Database engines store data records in physical memory and on a hard disk in the form of pages. Each page holds several data or index records. The page size is typically in the range 2 kilobytes (KB) to 8 KB. The total amount of data that databases maintain in pages is large, e.g. gigabytes. Accessing pages within the database has a propensity to cause compulsory or capacity cache misses.
The end result of these cache misses is that they delay instruction processing. If the data required to process an instruction is not available in the data cache, valuable central processing unit (CPU) cycles are wasted while the required data is retrieved from external memory, a process which often takes 100 or more CPU cycles. During these cycles, the CPU is either executing instructions that cause the data to be retrieved or waiting for the required data to arrive in the data cache.
It would therefore be beneficial to provide an apparatus and method for ensuring that data required for the execution of an instruction is present in a data cache memory or other memory storage close to the pipeline prior to that data being required for the execution of an instruction.
SUMMARY OF THE INVENTION
The present invention comprises an apparatus for reducing the impact of cache misses using a page prefetch cache. The page prefetch cache resides on the CPU chip or in an adjacent off-chip architecture. It has space to store “n” complete database pages, where the page size varies but is typically in the range of 2 KB to 8 KB, and “n” is typically a small number. By way of example and not of limitation, the number of database pages stored by the page prefetch cache is four.
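The structure described above can be sketched as a small fixed-capacity cache of whole pages. This is a minimal illustration only: the page ids, sizes, and the FIFO replacement policy are assumptions, not details taken from the patent.

```python
# Minimal sketch of a page prefetch cache holding up to "n" complete
# database pages (n = 4 here, matching the example in the text).
# FIFO replacement is an illustrative assumption.
from collections import OrderedDict

PAGE_SIZE = 8 * 1024   # 8 KB pages (upper end of the 2 KB-8 KB range)
N_ENTRIES = 4          # "n" complete pages

class PagePrefetchCache:
    def __init__(self, n=N_ENTRIES):
        self.n = n
        self.pages = OrderedDict()   # page_id -> page bytes

    def prefetch(self, page_id, page_bytes):
        """Load an entire database page into the prefetch cache."""
        if page_id in self.pages:
            return
        if len(self.pages) >= self.n:      # no free entry: evict oldest
            self.pages.popitem(last=False)
        self.pages[page_id] = page_bytes

    def lookup(self, page_id):
        """Return the page if present (a prefetch-cache hit), else None."""
        return self.pages.get(page_id)
```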
In operation, during a database application, the page prefetch cache is enabled. The CPU issues a page prefetch instruction to load pages into the page prefetch cache. For optimal benefit, a database page should be loaded into the page prefetch cache prior to its first use. All load instructions check the page prefetch cache; if the requested database page is present, accesses to that page do not slow the CPU and the data is returned to the pipeline much faster.
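The load path above can be illustrated with a toy model in which every load first checks the prefetch cache and a hit avoids the roughly 100-cycle trip to external memory mentioned earlier. The structures and exact cycle counts are assumptions for illustration only.

```python
# Illustrative load path: a prefetch-cache hit is cheap; a miss pays
# the external-memory penalty. Cycle counts are illustrative.
prefetch_cache = {}                            # page_id -> page bytes
external_memory = {7: b"page-7 records"}       # backing store stand-in

HIT_CYCLES, MISS_CYCLES = 2, 100

def load(page_id):
    """Return (page, cycles): check the prefetch cache before memory."""
    if page_id in prefetch_cache:              # prefetch-cache hit
        return prefetch_cache[page_id], HIT_CYCLES
    page = external_memory[page_id]            # slow path: external memory
    return page, MISS_CYCLES

_, miss_cost = load(7)                         # before prefetching: miss
prefetch_cache[7] = external_memory[7]         # the page prefetch step
_, hit_cost = load(7)                          # subsequent load hits
```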
Numerous means for generating page prefetch instructions can be employed. By way of example, the means for generating a page prefetch instruction include a compiler or developer software, which identifies locations in the code where access to a particular database page will start. Locations are identified through profile feedback or through a plurality of “pragmas” inserted in the source code by the developers. The compiler inserts the new page prefetch instruction at the locations designated by the profile feedback or pragmas. The newly inserted page prefetch instructions bring the page to be accessed into the page prefetch cache; when a page prefetch instruction is executed, the entire page is received by the page prefetch cache. Subsequent load instructions that hit in the page prefetch cache return the prefetched database pages to the pipeline significantly faster.
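A toy pass can illustrate the pragma-driven insertion described above: it scans source lines for a marker and emits a prefetch instruction at that point, so the page is in the prefetch cache before its first use. The `#pragma prefetch_page(...)` syntax and the `page_prefetch` name are hypothetical, chosen only for this sketch.

```python
# Toy compiler pass: replace a hypothetical
# "#pragma prefetch_page(<id>)" marker with a generated
# page_prefetch(<id>) instruction ahead of the page's first use.
import re

PRAGMA = re.compile(r"#pragma\s+prefetch_page\((\w+)\)")

def insert_prefetches(source_lines):
    out = []
    for line in source_lines:
        m = PRAGMA.search(line)
        if m:   # emit the prefetch instruction in place of the pragma
            out.append(f"page_prefetch({m.group(1)})")
        else:
            out.append(line)
    return out

code = [
    "#pragma prefetch_page(orders_page)",
    "for rec in scan(orders_page):",
    "    process(rec)",
]
```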
The present invention also provides a method for reducing cache misses and improving performance. The method provides for the issuance of prefetch commands which store one or more database pages in the page prefetch cache before each page must be accessed. Simultaneously, as database page prefetch commands are issued, the method determines whether the page prefetch cache has a free entry for a new database page.


REFERENCES:
patent: 5151991 (1992-09-01), Iwasawa et al.
patent: 5293609 (1994-03-01), Shih et al.
patent: 5347654 (1994-09-01), Sabot et al.
patent: 5361357 (1994-11-01), Kionka
patent: 5551046 (1996-08-01), Mohan et al.
patent: 6182111 (2001-01-01), Inohara et al.
patent: 6182133 (2001-01-01), Horvitz
patent: 0 437 712 (1991-07-01), None
patent: 0 509 231 (1992-10-01), None
Klaiber, A. C., et al., “An Architecture for Software-Controlled Data Prefetching,” Computer Architecture News, May 1991, vol. 19, no. 3, New York, U.S., 6 pages.
R. Sugumar, “Page Prefetching for Database Workloads,” Second Workshop on Computer Architecture Evaluation using Commercial Workloads, CAECW '99, Orlando, Florida, Jan. 10, 1999.
