Buffer caching method and apparatus for the same in which...

Electrical computers and digital processing systems: memory – Storage accessing and control – Hierarchical memories

Reexamination Certificate


Details

Classification codes: C711S001000, C711S003000, C711S004000, C711S113000, C711S133000, C711S146000

Status: active

Patent number: 06549982

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a file system, and more particularly to a buffer cache control method and an apparatus for the same.
2. Description of the Related Art
A switching system is provided with a communication channel unit 12 to control a telephone line between telephone subscribers 10, as shown in FIG. 1. The communication channel unit 12 is controlled by a central processing unit 14 such as a computer. That is, a central control unit 16 reads a program from a main storage unit 18 and an auxiliary storage unit 20, and interprets and executes each command of the program. Thus, necessary instructions and data are sent to the communication channel unit 12 so that all functions, such as a charging function and a call processing function, are controlled. Also, a maintenance and operation unit (console) 22 exchanges data with the central control unit 16.
In such a switching system, files of various types are used. In a conventional switching system, caching is carried out using a buffer cache with a predetermined block size. Referring to FIG. 2, a general file operation in the conventional switching system will be described. Suppose that there is a file aaa on a magnetic disk unit 18 as the main storage unit. The file aaa is divided on the magnetic disk in units of blocks and is stored in the blocks 100, 101, 102, 205, 206, 255, 256 and 300. When the file size is extended, a new block is connected to the last block. When the new block cannot be connected to the last block, a block at another place is used. For example, the block 301 next to the block 300 is used for the new block; if the block 301 is already in use, a released block is used instead.
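For illustration, the block layout and extension described above might be sketched in C as follows; the structure and function names, the toy allocation bitmap, and the size limits are assumptions introduced here, not taken from the patent text.

```c
#include <stdbool.h>
#include <stdio.h>

#define DISK_BLOCKS     512   /* illustrative disk size, not from the patent */
#define MAX_FILE_BLOCKS  64   /* illustrative per-file limit */

static bool block_in_use[DISK_BLOCKS];   /* toy allocation bitmap */

/* Hypothetical per-file block map: the file "aaa" above occupies
 * the blocks 100, 101, 102, 205, 206, 255, 256 and 300. */
struct file_block_map {
    unsigned blocks[MAX_FILE_BLOCKS];
    unsigned count;
};

/* Pick any released (free) block when the contiguous one is taken. */
static unsigned find_released_block(void)
{
    for (unsigned b = 0; b < DISK_BLOCKS; b++)
        if (!block_in_use[b])
            return b;
    return 0;   /* toy fallback; a real allocator would report "disk full" */
}

/* Extend the file by one block: prefer the block next to the last one
 * (e.g. 301 after 300); otherwise use a released block at another place. */
static unsigned extend_file(struct file_block_map *map)
{
    unsigned last = map->blocks[map->count - 1];
    unsigned next = (last + 1 < DISK_BLOCKS && !block_in_use[last + 1])
                        ? last + 1
                        : find_released_block();
    block_in_use[next] = true;
    map->blocks[map->count++] = next;
    return next;
}

int main(void)
{
    struct file_block_map aaa = {
        .blocks = {100, 101, 102, 205, 206, 255, 256, 300}, .count = 8
    };
    for (unsigned i = 0; i < aaa.count; i++)
        block_in_use[aaa.blocks[i]] = true;

    printf("new block: %u\n", extend_file(&aaa));   /* prints 301 */
    return 0;
}
```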
By the way, it is not possible to read the whole file aaa into the buffer cache, because the occupied memory area would be too large. Therefore, only the contents of the blocks containing the necessary portion of data are read into the buffer cache. For example, when the necessary data portion is in the blocks 101, 102 and 205 of the file aaa composed of the blocks 100, 101, 102, 205, 206, 255, 256 and 300, the data in the blocks 101, 102 and 205 are read into the buffer cache. Also, when the necessary data portion is in the block 102 of the file aaa, only the block 102 is read into the buffer cache.
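The selective read into a cache with a predetermined block size can be sketched as follows; the buffer layout, the block size, and the disk_read stand-in are illustrative assumptions. Note that each needed block becomes its own read operation, which is the point taken up next.

```c
#include <stdio.h>
#include <string.h>

#define BLOCK_SIZE 512   /* assumed fixed block size of the buffer cache */
#define CACHE_BUFS   8   /* assumed number of cache buffers */

struct cache_buf {
    unsigned block_no;          /* disk block held in this buffer */
    char     data[BLOCK_SIZE];
    int      valid;
};

static struct cache_buf cache[CACHE_BUFS];

/* Toy stand-in for a disk read: a real system would issue one I/O here. */
static void disk_read(unsigned block_no, char *dst)
{
    memset(dst, 0, BLOCK_SIZE);
    printf("I/O: read block %u\n", block_no);
}

/* Read only the blocks that contain the needed data, one buffer per block.
 * With a fixed block size, every block is fetched by a separate I/O. */
static void read_needed_blocks(const unsigned *blocks, unsigned n)
{
    for (unsigned i = 0; i < n && i < CACHE_BUFS; i++) {
        cache[i].block_no = blocks[i];
        disk_read(blocks[i], cache[i].data);
        cache[i].valid = 1;
    }
}

int main(void)
{
    /* Only blocks 101, 102 and 205 of file "aaa" contain the needed data. */
    unsigned needed[] = {101, 102, 205};
    read_needed_blocks(needed, 3);   /* issues three separate reads */
    return 0;
}
```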
In a buffer caching method using a buffer cache with a predetermined block size, it is possible to speed up the read operation if the block size of the file is smaller than the size of the buffer cache. However, when several blocks are accessed continuously, as in the file system of a switching apparatus, the read access operation is divided into a plurality of I/O processes. For this reason, a sufficient effect is not attained.
In the technique described in Japanese Laid Open Patent Application (JP-A-Heisei 5-158792), an input/output buffer is provided, and when the access size is large, the data is first taken into the input/output buffer. The data is then divided and supplied to the buffer cache.
Also, in a method of caching using the buffer cache with a predetermined block size, access of a predetermined data length is carried out regardless of the block size of the file. Therefore, when the block size of the buffer cache is large, blocks are used wastefully when a small file is manipulated.
It has been proposed to make the block size of the buffer cache variable to cope with such a problem. For example, such a technique is disclosed in Japanese Laid Open Patent Application (JP-A-Heisei 10-69429), in which a buffer memory area on a memory is allocated in accordance with a request from an application program. However, this variable-length buffer cache method is intended for a communication control program in which many data items with predetermined sizes are manipulated. It is not suitable for the file system of the switching apparatus, because the required data have various sizes.
Also, a method of controlling a buffer cache using a hash list is disclosed in Japanese Laid Open Patent Application (JP-A-Heisei 7-311710). However, applying a hash list to a buffer cache with a variable length is not advantageous for high-speed access.
In the file system of the switching apparatus, a data area with a large size is required, so the above conventional examples are not suitable from the viewpoint of realizing high-speed access. For this reason, caching is still carried out by the method using the buffer cache with a predetermined block size, and high-speed access cannot be attained.
In conjunction with the above description, Japanese Laid Open Patent Application (JP-A-Heisei 2-39343) discloses a transmission and reception buffer control system for data transfer between a host computer and a display. A buffer is provided with a data storage area of variable length, a head pointer storage area for storing a pointer to the head address of the data storage area, and a tail pointer storage area for storing a pointer to the tail address of the data storage area.
Also, an image forming apparatus such as a laser printer or a facsimile is disclosed in Japanese Laid Open Patent Application (JP-A-Heisei 7-210365). In this reference, a timer is used to detect that a reception buffer is full. If the buffer is still full after a predetermined time, the buffer size is increased.
SUMMARY OF THE INVENTION
Therefore, an object of the present invention is to provide a buffer caching method in which a buffer cache can be made variable in size, and a buffer caching apparatus for the same.
Another object of the present invention is to provide a buffer caching method in which files with various sizes can be accessed at high speed, and a cache buffer apparatus for the same.
Still another object of the present invention is to provide a buffer cache method in which buffers can be released or inserted based on the size of the requested data, and a cache buffer apparatus for the same.
Yet still another object of the present invention is to provide a buffer cache method in which buffers with various sizes are provided in advance such that one of the buffers is selected, and a cache buffer apparatus for the same.
It is also an object of the present invention to provide a buffer cache method in which a balanced tree is used, and a cache buffer apparatus for the same.
Another object of the present invention is to provide a buffer caching method in which imbalance of a balanced tree can be corrected, and a cache buffer apparatus for the same.
In order to achieve an aspect of the present invention, a buffer cache apparatus includes a storage unit storing data, a buffer cache, and a control unit. The buffer cache stores a balanced tree and free lists. Buffers containing data are provided hierarchically as nodes of the balanced tree, and the free lists are respectively provided for different sizes. Each of the free lists includes at least one free buffer, when at least one free buffer exists. The control unit is responsive to a data request. When the requested data does not exist in the balanced tree, the control unit searches the free lists for a desired buffer based on the size of the data requested by the data request. The control unit also adds the desired buffer to the balanced tree and stores the requested data from the storage unit into the desired buffer. The desired buffer has a size larger than the requested data size and as small as possible.
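As a rough C sketch of the structures described in this paragraph, the following declarations pair a balanced tree of data-holding buffers with free lists provided for different sizes; the field names, the choice of key, the tree flavor, and the concrete sizes are assumptions for illustration only.

```c
#include <stddef.h>

/* A buffer node held in the balanced tree while it contains cached data.
 * The key and the tree flavor (AVL, red-black, ...) are illustrative
 * assumptions; the text only states that a balanced tree is used. */
struct cache_buffer {
    unsigned long        key;        /* e.g. identifier of the cached data */
    size_t               size;       /* capacity of the data area */
    struct cache_buffer *left;       /* balanced-tree children */
    struct cache_buffer *right;
    struct cache_buffer *next_free;  /* link used while on a free list */
    char                *data;
};

/* Head section of one free list: it indicates the buffer size that this
 * list serves and links the free buffers of that size, if any exist. */
struct free_list_head {
    size_t               size;
    struct cache_buffer *first_free;   /* NULL when the list is empty */
};

/* Free lists are provided for different sizes; the sizes below are
 * placeholders, not values taken from the description. */
#define NUM_FREE_LISTS 4
static struct free_list_head free_lists[NUM_FREE_LISTS] = {
    { 1024, NULL }, { 4096, NULL }, { 16384, NULL }, { 65536, NULL },
};

/* Root of the balanced tree holding the buffers that currently cache data. */
static struct cache_buffer *cache_tree_root;
```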
Here, each of the free lists comprises a head section indicating the size and the at least one free buffer connected to the head section, when the at least one free buffer exists. The control unit: (a) searches the head sections of the free lists based on the requested data size to find one of the free lists which is provided for the smallest one of the sizes larger than or equal to the requested data size, (b) determines whether the found free list contains the at least one free buffer, (c) sets another of the free lists for the
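Steps (a) and (b) of this search might look like the following sketch, which scans head sections sorted by size for the smallest size that is at least the requested size and, when the found list is empty, moves on to the next larger list; that fallback behavior and all names here are assumptions for illustration, not the definitive procedure of the invention.

```c
#include <stddef.h>

/* Minimal re-declarations from the structure sketch above, so that this
 * fragment stands on its own. */
struct cache_buffer {
    size_t               size;
    struct cache_buffer *next_free;
};

struct free_list_head {
    size_t               size;         /* buffer size this list serves */
    struct cache_buffer *first_free;   /* NULL when the list is empty */
};

/* (a) Find the free list provided for the smallest size that is larger
 *     than or equal to the requested data size.
 * (b) Determine whether that list actually contains a free buffer.
 * (c) The assumed fallback here is to try the next larger free list. */
static struct cache_buffer *
get_free_buffer(struct free_list_head *lists, size_t nlists, size_t req_size)
{
    for (size_t i = 0; i < nlists; i++) {            /* lists sorted by size */
        if (lists[i].size < req_size)
            continue;                                /* too small, keep looking */
        if (lists[i].first_free != NULL) {           /* (b) list has a buffer */
            struct cache_buffer *buf = lists[i].first_free;
            lists[i].first_free = buf->next_free;    /* unlink from free list */
            return buf;                              /* smallest adequate buffer */
        }
        /* (c) assumed: fall through to the next larger free list. */
    }
    return NULL;   /* no adequate free buffer available */
}
```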
