Prefetched data in a digital broadcast system

Electrical computers and digital processing systems: multicomputer data transferring – Computer network managing – Network resource allocating

Reexamination Certificate

Details

Classifications: C709S241000, C709S217000, C709S233000
Type: Reexamination Certificate
Status: active
Patent number: 06725267

ABSTRACT:

BRIEF DESCRIPTION OF THE INVENTION
This invention relates generally to data-on-demand systems. In particular, this invention relates to video-on-demand systems.
BACKGROUND OF THE INVENTION
Video-on-demand (VOD) systems are one type of data-on-demand (DOD) system. In VOD systems, video data files are provided by a server or a network of servers to one or more clients on a demand basis.
In a conventional VOD architecture, a server or a network of servers communicates with clients in a standard hierarchical client-server model. For example, a client sends a request to a server for a data file (e.g., a video data file), and in response the server sends the requested file to the client. In the standard client-server model, a client's request for a data file can be fulfilled by one or more servers, and the client may be able to store any received data file locally in non-volatile memory for later use. The standard client-server model requires a two-way communications infrastructure. Currently, two-way communication requires building new infrastructure, such as hybrid fiber-coaxial (HFC) cables or an all-fiber infrastructure, because existing cables can provide only one-way communication. Replacing existing cables is very costly, and the resulting services may not be affordable to most users.
In addition, the standard client-server model has many limitations when a service provider (e.g., a cable company) attempts to provide VOD services to a large number of clients. One limitation is that the service provider must implement a mechanism that continuously listens for and fulfills every request from each client within the network; thus, the number of clients who can receive service depends on the capacity of that mechanism. One such mechanism uses massively parallel computers with large, fast disk arrays as local servers. However, even the fastest existing local server can deliver video data streams to only about 1,000 to 2,000 clients at one time. Thus, in order to serve more clients, the number of local servers must increase, which in turn requires more upper-level servers to maintain control of the local servers.
Another limitation of the standard client-server model is that each client requires its own bandwidth, so the total required bandwidth is directly proportional to the number of subscribing clients. Cache memory within local servers has been used to mitigate this bandwidth limitation, but caching does not solve the problem because cache memory is also limited.
Presently, in order to make video-on-demand services more affordable for clients, existing service providers increase the ratio of clients per local server beyond the server's capabilities. Typically, a local server capable of serving 1,000 clients is committed to serve 10,000 clients. This technique may work if most subscribing clients do not order videos at the same time, but it is prone to failure because most clients are likely to want to view videos at the same times (i.e., evenings and weekends), overloading the local server.
Thus, it is desirable to provide a system that is capable of providing on-demand services to a large number of clients over virtually any transmission medium without replacing existing infrastructure.
SUMMARY OF THE INVENTION
In an exemplary embodiment, at a server side, a method for sending data to a client to provide data-on-demand services comprises the steps of: receiving a data file, specifying a time interval, parsing the data file into a plurality of data blocks based on the time interval such that each data block is displayable during the time interval, determining a required number of time slots to send the data file, allocating to each time slot at least a first of the plurality of data blocks and optionally one or more additional data blocks, such that the plurality of data blocks is available in sequential order to a client accessing the data file during any time slot, and sending the plurality of data blocks based on the allocating step. In one embodiment, the parsing step includes the steps of: determining an estimated data block size, determining a cluster size of a memory in a channel server, and parsing the data file based on the estimated data block size and the cluster size. In another embodiment, the determining step includes the step of assessing resource allocation and bandwidth availability.
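The following minimal sketch (in Python) illustrates one way the parsing and allocation steps could look. The function names and the allocation rule, under which data block k is rebroadcast every k+1 time slots, are assumptions for illustration, not the patent's own scheme.

```python
# Illustrative sketch only: parse a data file into fixed-interval blocks and
# decide which blocks to broadcast in each time slot.  The rule "block k
# repeats every k + 1 slots" is one way to guarantee that a client starting
# in any slot can consume the blocks in sequential order; it is an assumed
# allocation, not necessarily the patent's.

def parse_into_blocks(data: bytes, block_size: int) -> list[bytes]:
    """Split the file into blocks, each displayable during one time interval."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

def blocks_for_slot(slot: int, num_blocks: int) -> list[int]:
    """Indices of the blocks broadcast during a given time slot.

    Any window of k + 1 consecutive slots contains block k at least once, so
    a client that begins listening at an arbitrary slot holds block k before
    it must be displayed, k + 1 slots later.
    """
    return [k for k in range(num_blocks) if slot % (k + 1) == 0]

if __name__ == "__main__":
    video = bytes(100)                        # stand-in for a video data file
    blocks = parse_into_blocks(video, 10)     # 10 blocks, one per time interval
    for t in range(6):
        print(f"slot {t}: send blocks {blocks_for_slot(t, len(blocks))}")
```

Under this kind of schedule the earliest blocks are repeated most often, which is what makes them natural candidates for the prefetching described next.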
In one embodiment, the method further comprises the steps of selecting a set of prefetch data blocks from the plurality of data blocks and separately sending the set of prefetch data blocks in a dedicated channel used for sending prefetch data, program guides, commercials, firmware updates, etc. In an exemplary embodiment, the step of selecting a set of prefetch data blocks includes the steps of: (1) determining a bandwidth reduction, a bandwidth allocation for prefetch data in the dedicated channel, and a delay time; and (2) selecting the prefetch data blocks based on the bandwidth reduction, the bandwidth allocation, and the delay time.
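A rough sketch of such a selection, continuing the assumptions above: earlier blocks repeat most often on the main channel, so prefetching them over the dedicated channel saves the most bandwidth, subject to the channel's bandwidth allocation and the acceptable delay time. The 1/(k+1) cost model and the stopping conditions are illustrative assumptions, not formulas from the patent.

```python
# Assumption-laden sketch: take leading blocks as prefetch data until the
# target bandwidth reduction is met or the dedicated channel cannot deliver
# them within the allowed delay time.

def select_prefetch_blocks(block_size: int, num_blocks: int,
                           target_reduction: float,     # fraction of main-channel bandwidth to save
                           prefetch_bandwidth: float,   # bytes/s allocated to prefetch data
                           delay_time: float) -> list[int]:
    total_cost = sum(1.0 / (k + 1) for k in range(num_blocks))   # relative repeat cost of all blocks
    selected, saved, prefetch_bytes = [], 0.0, 0
    for k in range(num_blocks):
        if saved / total_cost >= target_reduction:
            break                                        # desired bandwidth reduction reached
        if prefetch_bytes + block_size > prefetch_bandwidth * delay_time:
            break                                        # would not fit within the allowed delay
        selected.append(k)
        saved += 1.0 / (k + 1)
        prefetch_bytes += block_size
    return selected

print(select_prefetch_blocks(block_size=1_000_000, num_blocks=20,
                             target_reduction=0.5,
                             prefetch_bandwidth=500_000, delay_time=10.0))
```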
In another embodiment, the method further comprises the steps of receiving a request for a preview, randomly selecting a set of data blocks from the plurality of data blocks to compose the preview, and causing a display of the preview. In yet another embodiment, the method further comprises sending a set of commercial data blocks in the dedicated channel and causing a display of the set of commercial data blocks at predetermined times. In an exemplary embodiment, the commercial data blocks are continuously sent in the dedicated channel. In this embodiment, the step of displaying the set of commercial data blocks includes the steps of receiving a user selection of a price based on a frequency of commercial display and causing a display of the set of commercial data blocks based on the user selection.
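As a trivial illustration of the preview step, a preview could be composed by sampling a few block indices at random and playing those blocks in order; the sample size below is arbitrary.

```python
# Minimal sketch: compose a preview from randomly selected data blocks.
import random

def compose_preview(blocks: list[bytes], preview_len: int = 3) -> list[bytes]:
    chosen = sorted(random.sample(range(len(blocks)), k=min(preview_len, len(blocks))))
    return [blocks[i] for i in chosen]      # play these back-to-back as the preview
```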
In yet another embodiment, the method further comprises the steps of checking a packet header of the data file for an emergency bit, tuning to the dedicated channel to receive emergency information when the emergency bit is detected, and causing a display of the emergency information. In one embodiment, this method further comprises the steps of determining whether the emergency information is for a relevant region and displaying the emergency information if the emergency information is for the relevant region.
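The client-side emergency path might look like the sketch below. The flag position, the region field, and the tuner interface are all invented for illustration; the patent states only that an emergency bit in the packet header triggers tuning to the dedicated channel and a region-relevance check before display.

```python
# Hypothetical emergency-alert handling; the header layout and tuner API are assumed.

EMERGENCY_BIT = 0x01                          # assumed position of the emergency flag

class Tuner:
    def tune_to_dedicated_channel(self) -> dict:
        # Stand-in for receiving emergency information from the dedicated channel.
        return {"region": 42, "message": "Severe weather warning"}

def handle_packet(header_flags: int, client_region: int, tuner: Tuner) -> None:
    if header_flags & EMERGENCY_BIT:                  # emergency bit detected
        info = tuner.tune_to_dedicated_channel()
        if info["region"] == client_region:           # display only if relevant to this region
            print("EMERGENCY:", info["message"])

handle_packet(header_flags=0x01, client_region=42, tuner=Tuner())
```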
In an exemplary embodiment, at a client side, a method for processing data received from a server to provide data-on-demand services comprises the steps of: (a) receiving a selection of a data file during a first time slot; (b) receiving at least one data block of the data file during a second time slot; and (c) during each next time slot, receiving any data block not already received and sequentially displaying a data block of the data file, repeating step (c) until all data blocks of the data file have been received and displayed. In one embodiment, the method for processing data received from a server is performed by a set-top box at the client side.
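The set-top-box loop can be pictured as follows. receive_slot() and display() are assumed interfaces standing in for the tuner and playback path, and starting playback in the same slot as the first reception is a simplification of steps (a) through (c).

```python
# Sketch of the client-side loop: each time slot, cache any blocks not yet
# held, then play the next block in sequential order.

def play_on_demand(num_blocks: int, receive_slot, display) -> None:
    cache = {}            # blocks received so far, keyed by block index
    next_to_play = 0      # index of the next block to display
    while next_to_play < num_blocks:
        for index, data in receive_slot():      # receive any data block not already received
            cache.setdefault(index, data)
        if next_to_play in cache:                # display one block per time slot, in order
            display(cache[next_to_play])
            next_to_play += 1

if __name__ == "__main__":
    blocks = [f"block {k}".encode() for k in range(5)]
    slot = 0
    def receive_slot():
        global slot
        sent = [(k, blocks[k]) for k in range(len(blocks)) if slot % (k + 1) == 0]
        slot += 1
        return sent
    play_on_demand(len(blocks), receive_slot, lambda b: print("play", b.decode()))
```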
In an exemplary embodiment, a data file is divided into a number of data blocks and a scheduling matrix is generated based on the number of data blocks. At the server side, the scheduling matrix provides a send order for sending the data blocks, such that a client can access the data blocks in sequential order at a random time. In an exemplary embodiment, a method for generating a scheduling matrix for a data file comprises the steps of: (a) receiving a number of data blocks [x] for a data file; (b) setting a first variable [j] to zero; (c) setting a second variable [i] to zero; (d) clearing all entries in a reference array; (e) writing at least one data block stored in matrix positions of a column [(i+j) modulo x] in a matrix to a reference array, if the reference array does not already contain the data block; (f) writing a data bloc
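The step listing above breaks off, so the following sketch only reconstructs the general shape of such a matrix: x columns, one per time slot in the repeating period, filled so that block i appears somewhere in columns j through (i + j) modulo x for every possible start column j. The specific placement rule, appending block i to column (i + j) modulo x whenever the reference array does not yet contain it, is an assumption rather than the patent's verbatim step (f).

```python
# Rough reconstruction of a scheduling-matrix generator; the placement rule
# is assumed, not quoted from the patent.

def build_scheduling_matrix(x: int) -> list[list[int]]:
    columns = [[] for _ in range(x)]           # one column of block indices per time slot
    for j in range(x):                          # j: slot at which a client may start
        seen = set()                            # reference array: blocks already reachable
        for i in range(x):                      # i: block that must be available i slots later
            col = (i + j) % x
            seen.update(columns[col])           # step (e): note blocks already in this column
            if i not in seen:                   # assumed step (f): add block i if still missing
                columns[col].append(i)
                seen.add(i)
    return columns

for t, col in enumerate(build_scheduling_matrix(4)):
    print(f"slot {t}: send blocks {col}")
```

For x = 4 this produces the columns [0], [1, 0, 2], [2, 0], and [3, 1, 0], so a client starting at any of the four slots can retrieve blocks 0 through 3 in sequential order.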
