Multi-level page cache for enhanced file system performance...

Electrical computers and digital processing systems: memory – Address formation – Address mapping


Details

Type: Reexamination Certificate
Status: active
Number: 10903438

ABSTRACT:
A physical read-ahead is implemented at the file system level by using a two-level page cache. When a request is received for a page of data within a file, such that the file has a corresponding inode number, a page cache is searched for the requested page of data based on the corresponding inode number and a page number corresponding to the requested page of data. The request is translated into an actual location on the storage device when the page of data is not found in the page cache, and a search of the page cache is performed using an inode representing the storage device. A handle identifying the page of data in the page cache is updated to logically associate the page with a user file inode. Least recently used physical read-ahead data is evicted from the page cache.
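To make the two-level lookup concrete, below is a minimal sketch in C of how such a scheme might work, assuming a fixed-size cache table, a DEVICE_INODE sentinel for the storage device, and helper names (cache_lookup, translate_to_block, lookup_two_level, evict_lru_readahead) that are purely illustrative and not taken from the patent itself:

```c
/* Minimal sketch of the two-level page-cache lookup described in the abstract.
 * All names and structures here are illustrative assumptions, not the patented
 * implementation. */
#include <stdio.h>
#include <stdbool.h>

#define CACHE_SLOTS  8
#define DEVICE_INODE 0UL              /* inode standing in for the raw device */

struct cache_entry {
    bool          valid;
    bool          readahead;          /* brought in by physical read-ahead    */
    unsigned long inode;              /* owning inode (device or user file)   */
    unsigned long page_no;            /* page number within that inode        */
    unsigned long last_used;          /* LRU stamp                            */
};

static struct cache_entry cache[CACHE_SLOTS];
static unsigned long tick;

/* Search the cache for (inode, page_no); bump the LRU stamp on a hit. */
static struct cache_entry *cache_lookup(unsigned long inode, unsigned long page_no)
{
    for (int i = 0; i < CACHE_SLOTS; i++) {
        if (cache[i].valid && cache[i].inode == inode && cache[i].page_no == page_no) {
            cache[i].last_used = ++tick;
            return &cache[i];
        }
    }
    return NULL;
}

/* Stand-in for the file system's block map (file page -> device location). */
static unsigned long translate_to_block(unsigned long inode, unsigned long page_no)
{
    return inode * 1024UL + page_no;
}

/* Two-level lookup: a miss under the user-file inode is retried under the
 * device inode; on a hit there, the handle is updated so the page is
 * logically associated with the user file from then on. */
static struct cache_entry *lookup_two_level(unsigned long file_inode, unsigned long page_no)
{
    struct cache_entry *e = cache_lookup(file_inode, page_no);
    if (e)
        return e;                                          /* logical hit   */
    e = cache_lookup(DEVICE_INODE, translate_to_block(file_inode, page_no));
    if (e) {                                               /* physical hit  */
        e->inode     = file_inode;                         /* re-tag handle */
        e->page_no   = page_no;
        e->readahead = false;                              /* now demanded  */
    }
    return e;                                              /* NULL = miss   */
}

/* Evict the least recently used physical read-ahead page, if any remain. */
static void evict_lru_readahead(void)
{
    struct cache_entry *victim = NULL;
    for (int i = 0; i < CACHE_SLOTS; i++) {
        if (cache[i].valid && cache[i].readahead &&
            (!victim || cache[i].last_used < victim->last_used))
            victim = &cache[i];
    }
    if (victim)
        victim->valid = false;
}

int main(void)
{
    /* Simulate a physical read-ahead: the block backing page 5 of inode 2 is
     * already cached under the device inode. */
    cache[0] = (struct cache_entry){ .valid = true, .readahead = true,
                                     .inode = DEVICE_INODE,
                                     .page_no = translate_to_block(2, 5),
                                     .last_used = ++tick };

    struct cache_entry *e = lookup_two_level(2, 5);   /* request page 5 of inode 2 */
    printf("hit: %s, now owned by inode %lu\n", e ? "yes" : "no", e ? e->inode : 0UL);

    evict_lru_readahead();   /* no speculative pages remain, so nothing is dropped */
    return 0;
}
```

The key point the sketch tries to capture is that speculatively fetched data is initially keyed by the device inode and a physical location, and only re-tagged to a user file inode once a logical request actually claims it, which is also what makes the unclaimed read-ahead pages easy to find and evict on an LRU basis.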

REFERENCES:
patent: 2003/0158873 (2003-08-01), Sawdon et al.
patent: 2003/0182389 (2003-09-01), Edwards
patent: 2004/0143711 (2004-07-01), So et al.


