Distributed data cache with memory allocation model

Electrical computers and digital processing systems: memory – Storage accessing and control – Memory configuring

Reexamination Certificate


Details

Classification: C711S172000, C711S119000
Type: Reexamination Certificate
Status: active
Patent number: 06453404

ABSTRACT:

TECHNICAL FIELD
The present invention is related generally to a data cache and, more particularly, to a distributed data cache whose memory can be allocated in accordance with a memory allocation model.
BACKGROUND OF THE INVENTION
A data cache is a well-known tool for the temporary storage of data. Typically, data is downloaded from a data source into the data cache and temporarily saved for subsequent use, thereby avoiding the need to download the same data again from the data source. For example, a data cache may be used for the temporary storage of data downloaded from an Internet web site. In this example, a computer, such as a conventional personal computer (PC), executes a web browser application program. Data may be downloaded from a web site for display on the PC. The data is stored in a data cache within the PC for subsequent display, so that the web site need not be accessed a second time to download the same data. This caching process greatly enhances the speed of operation by eliminating repeated downloads of the same data.
Computers, such as PCs, workstations, or the like, are frequently connected to other computing platforms to form a computer network. Each computer in the network may include its own data cache as an integral part of one or more application programs executing on that computer. Unfortunately, these data caches are accessible only through the particular application programs being executed on each computer and are not available for use by other portions of the computer network.
Therefore, it can be appreciated that there is a significant need for a distributed data cache having a general form that can be accessible by any portion of the computer network. The present invention provides this and other advantages, as will be apparent from the following detailed description and accompanying figures.
SUMMARY OF THE INVENTION
A data cache is implemented on a computer platform having an operating system capable of memory allocation. A large block of the memory is set aside by the operating system for allocation by the data cache software. A cache controller associated with the memory allocates memory portions independent of the operating system to be used to store data. The cache controller receives an allocation request to allocate a first amount of memory to store a first data item. In response to the allocation request, the cache controller allocates one or more blocks of the memory, each having a predetermined block size, wherein the memory allocated in the one or more blocks of memory is less than or equal to the first amount of memory requested in the allocation request. If the memory allocated in the one or more blocks of memory is less than the first amount of memory requested in the allocation request, the operating system may allocate an additional memory portion such that the total amount of memory allocated is sufficient to meet the first allocation request.
In one embodiment, the blocks of memory allocated by the cache controller are equal in size. In one embodiment, the allocated blocks are less than one kilobyte (KByte) in size. The allocated blocks of memory may comprise a contiguous portion of the memory or discontinuous portions of the memory. If multiple blocks are allocated, a first of the allocated blocks of memory may include a link to a second, subsequently allocated block of memory. The cache controller may retrieve the first data item from the allocated blocks of memory using the link to the second, subsequently allocated block of memory.
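One plausible realization of linked blocks is a singly linked chain in which each block carries a pointer to the next. The following C sketch stores a data item across such a chain and retrieves it by following the links; the structure layout, block size, and function names are assumptions for illustration:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

#define BLOCK_SIZE 32  /* small block size chosen for illustration */

/* Each allocated block holds a payload and a link to the next block. */
struct cache_block {
    struct cache_block *next;
    unsigned char payload[BLOCK_SIZE];
};

/* Store `len` bytes across a chain of blocks; returns the head block. */
static struct cache_block *store_item(const unsigned char *data, size_t len) {
    struct cache_block *head = NULL, **tail = &head;
    while (len > 0) {
        struct cache_block *b = calloc(1, sizeof *b); /* zeroed payload and link */
        size_t n = len < BLOCK_SIZE ? len : BLOCK_SIZE;
        memcpy(b->payload, data, n);
        data += n;
        len -= n;
        *tail = b;
        tail = &b->next;
    }
    return head;
}

/* Retrieve the item by following the link from each block to the next;
 * returns the number of bytes written (a multiple of BLOCK_SIZE). */
static size_t retrieve_item(const struct cache_block *b, unsigned char *out) {
    size_t total = 0;
    for (; b != NULL; b = b->next) {
        memcpy(out + total, b->payload, BLOCK_SIZE);
        total += BLOCK_SIZE;
    }
    return total;
}
```

Because each block links only to its successor, the chain works equally well whether the blocks happen to be contiguous or scattered through the cache memory.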
The system may also include a status indicator for each of the blocks of memory to indicate whether each block of memory is allocated or is free and available for allocation. The cache controller uses the status indicator to select the one or more blocks of memory indicated as free and available for allocation to store the first data item.
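A per-block status indicator can be as simple as one flag per block, with the controller scanning for a free entry. The sketch below assumes a flat flag array and hypothetical `alloc_block`/`free_block` names; the patent does not specify this particular representation:

```c
#include <assert.h>
#include <stdbool.h>

#define NUM_BLOCKS 8

/* One status indicator per block: false = free, true = allocated. */
static bool allocated[NUM_BLOCKS];

/* Select the first block marked free, mark it allocated, and return
 * its index; returns -1 if no block is free and available. */
static int alloc_block(void) {
    for (int i = 0; i < NUM_BLOCKS; i++) {
        if (!allocated[i]) {
            allocated[i] = true;
            return i;
        }
    }
    return -1;
}

/* Mark a block free and available for allocation again. */
static void free_block(int i) {
    allocated[i] = false;
}
```

A production design might pack these flags into a bitmap or maintain a free list, but the status-indicator idea is the same: the controller consults the flags rather than the operating system when choosing blocks.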
The portions of memory allocated by the operating system are generally smaller in size than the predetermined block size of the memory blocks allocated by the cache controller. The operating system can preallocate a plurality of memory portions smaller than the predetermined size, and may allocate a first set of memory portions having a first uniform size and a second set of memory portions having a second uniform size different from the first uniform size. The residual portion of the data item may be stored in one of the sets of memory portions allocated by the operating system whose uniform size is sufficient to accommodate the residual portion of the data item. In a typical environment, a sufficiently large number of memory portions are allocated by the operating system to avoid contention when storing data items within the cache.
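The selection among preallocated sets of uniform sizes resembles a size-class lookup: the residual goes into the smallest class large enough to hold it. The class sizes and the `pick_class` function below are illustrative assumptions:

```c
#include <assert.h>
#include <stddef.h>

/* Two preallocated size classes for residual data, in ascending order;
 * the sizes 64 and 256 are assumed for illustration. */
static const size_t class_sizes[] = { 64, 256 };
#define NUM_CLASSES 2

/* Choose the smallest preallocated size class whose uniform size is
 * sufficient for the residual; returns -1 if none is large enough. */
static int pick_class(size_t residual) {
    for (int i = 0; i < NUM_CLASSES; i++) {
        if (residual <= class_sizes[i]) {
            return i;
        }
    }
    return -1;
}
```

Keeping many portions preallocated in each class means concurrent stores rarely wait on the operating system, which is the contention-avoidance point made above.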


REFERENCES:
patent: 5263142 (1993-11-01), Watkins et al.
patent: 5454107 (1995-09-01), Lehman et al.
patent: 5513353 (1996-04-01), Fujimoto
patent: 5651136 (1997-07-01), Denton et al.
patent: 5717886 (1998-02-01), Miyauchi
patent: 5802600 (1998-09-01), Smith et al.
patent: 5933844 (1999-08-01), Young
patent: 5983313 (1999-11-01), Heisler et al.
patent: 6076151 (2000-06-01), Meier
