Scalable architecture for media-on-demand servers

Electrical computers and digital processing systems: multicomputer data transferring – Computer-to-computer protocol implementing – Computer-to-computer data streaming

Reexamination Certificate


Details

Classifications: C709S203000, C709S223000
Status: active
Patent number: 06279040


FIELD OF THE INVENTION
The present invention relates to media-on-demand systems in which stored multimedia comprising multiple digital bit streams are retrieved and delivered by a media server on a real-time and on-demand basis. More particularly, the invention relates to a scalable architecture for a media server which may be used to implement multimedia data delivery systems servicing large numbers of simultaneous subscribers.
BACKGROUND OF THE INVENTION
Multimedia server design is emerging as a key technology in the trend toward interactive multimedia services such as video-on-demand (VOD), teleshopping, digital video broadcasting and distance learning. A media server primarily acts as an engine, reading multimedia data streams from disk storage devices and delivering the streams to clients at a proper delivery rate. The multimedia bit streams are digital bit streams representing video, audio and other types of data. Each multimedia bit stream is generally delivered subject to a quality-of-service (QOS) constraint, such as average bit rate or maximum delay jitter. One of the most important performance criteria of an interactive multimedia system is the maximum number of real-time multimedia data streams that can be simultaneously supported. A media server generally must be able to deliver retrieved multimedia streams in a timely manner while simultaneously supporting real-time retrieval requests of a large number of clients. A number of different bottlenecks limit the stream retrieval and delivery capability of a media server. These bottlenecks include, for example, storage device input/output (I/O) limitations, network bandwidth restrictions, and central processing unit (CPU) processing overhead.
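As an illustration of the delivery-rate constraint described above, the short Python sketch below computes per-block delivery deadlines for a single stream paced at a fixed average bit rate; the block size, bit rate, and function name are illustrative assumptions rather than details taken from the patent.

# Hypothetical sketch: pacing a single stream at a target average bit rate.
# Numbers and names are illustrative, not taken from the patent.

def block_deadlines(block_bytes, avg_bit_rate_bps, num_blocks):
    """Return the time (seconds from stream start) by which each fixed-size
    block must be delivered to sustain the agreed average bit rate."""
    seconds_per_block = (block_bytes * 8) / avg_bit_rate_bps
    return [i * seconds_per_block for i in range(1, num_blocks + 1)]

if __name__ == "__main__":
    # 256 KB blocks of a 4 Mb/s stream: one block due roughly every 0.52 s.
    for i, t in enumerate(block_deadlines(256 * 1024, 4_000_000, 5), start=1):
        print(f"block {i} due at t = {t:.2f} s")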
FIG. 1 shows an exemplary prior art video server 10 suitable for use in a multimedia data delivery system. The server 10 includes a microprocessor 12 coupled to a memory 14. A storage controller 16 directs the storage and retrieval of multimedia data streams in a disk storage device 18, which may be a multiple-disk array. The server 10 also includes a network controller 20 which serves as an interface to an access network shared by a plurality of subscribers. The microprocessor 12, storage controller 16 and network controller 20 are interconnected by a system bus 22. The network controller 20 receives requests for retrieval of stored video streams from subscribers via the access network and passes the requests via system bus 22 to the microprocessor 12. The microprocessor 12 utilizes a disk scheduling algorithm to generate retrieval instructions which are supplied to the storage controller 16 to direct the retrieval of the requested data streams from the storage device 18. The server 10 is configured to provide simultaneous retrieval of multiple stored streams in response to corresponding requests from the subscribers. The operation of video server 10 is described in greater detail in, for example, F. A. Tobagi and J. Pang, "StarWorks - A Video Application Server," IEEE COMPCON, Spring '93, pp. 4-11, and W. Tseng and J. Huang, "A High Performance Video Server For Karaoke Systems," IEEE Transactions on Consumer Electronics, Vol. 40, No. 3, August 1994, pp. 329-336. The server computer 10 of FIG. 1 suffers from a significant problem in that it is generally unable to simultaneously support retrieval requests for real-time video from a large number of clients. The server 10 is instead better suited for use in local area network (LAN) applications in which a personal computer (PC) or workstation is configured to serve a relatively small number of clients.
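The request path just described (the network controller queues subscriber requests, the microprocessor runs a disk scheduling algorithm, and the storage controller fetches the blocks) can be pictured with the following Python sketch. The SCAN-style ordering by block address and all identifiers are assumptions made for illustration; the patent does not name a specific scheduling algorithm.

# Hypothetical sketch of the request path in the FIG. 1 style server.
# A service round orders outstanding requests by disk block address so the
# storage controller can sweep the disk in one direction (assumed policy).

from collections import deque

pending = deque()  # requests handed over by the network controller

def enqueue_request(stream_id, block_addr):
    pending.append((stream_id, block_addr))

def schedule_round():
    """One service round: sort pending requests by block address and hand
    the ordered batch to the storage controller."""
    batch = sorted(pending, key=lambda r: r[1])
    pending.clear()
    return batch

if __name__ == "__main__":
    enqueue_request("stream-A", 9120)
    enqueue_request("stream-B", 1040)
    enqueue_request("stream-C", 5500)
    for stream_id, block in schedule_round():
        print(f"retrieve block {block} for {stream_id}")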
FIG. 2 illustrates a prior art architecture for scaling a video server 10 such as that shown in FIG. 1 in order to increase the number of simultaneous data stream retrievals and thereby the number of subscribers which can be supported. The scaled server network of FIG. 2 includes m of the video servers 10-i connected to a switch network 24. The switch network 24 is connected to n of the subscribers 26-i. The switch network delivers the outputs of the video servers 10-i to the subscribers 26-i in accordance with subscriber requests and thereby provides some increase in the number of subscribers which can be supported simultaneously. However, these and other switch-based scalable servers are generally unable to provide a multimedia distribution system accessible by a sufficiently large number of subscribers.
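A rough capacity estimate makes the scaling limitation concrete: the aggregate stream count is bounded by whichever resource saturates first, the pool of servers or the switch fabric that fronts them. The figures and function below are illustrative assumptions, not numbers from the patent.

# Back-of-the-envelope capacity model for the FIG. 2 switch-based scaling.
# All numbers are illustrative assumptions.

def max_streams(m_servers, streams_per_server, switch_bw_bps, stream_rate_bps):
    server_limit = m_servers * streams_per_server
    switch_limit = switch_bw_bps // stream_rate_bps
    return min(server_limit, switch_limit)

if __name__ == "__main__":
    # 8 servers pumping 40 streams each behind a 622 Mb/s switch link,
    # serving 4 Mb/s video: the switch, not the servers, is the bottleneck.
    print(max_streams(8, 40, 622_000_000, 4_000_000))  # -> 155, not 320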
Other prior art systems provide video-on-demand service architectures combined with network capability. Examples of such systems may be found in U.S. Pat. No. 5,442,749 issued Aug. 15, 1995 to J. D. Northcutt et al., assigned to Sun Microsystems Inc. and entitled “Network Video Server System Receiving Requests From Clients for Specific Formatted Data Through a Default Channel and Establishing Communication Through Separate Control and Data Channels,” U.S. Pat. No. 5,508,732 issued Apr. 16, 1996 to J. F. Bottomley et al., assigned to IBM Corp. and entitled “Data Server, Control Server and Gateway Architecture System and Method for Broadcasting Digital Video on Demand,” U.S. Pat. No. 5,521,631 issued May 28, 1996 to H. S. Budow et al., assigned to SpectraVision Inc. and entitled “Interactive Digital Video Services System With Store and Forward Capabilities,” U.S. Pat. No. 5,471,318 issued Nov. 28, 1995 to S. R. Ahuja et al., assigned to AT&T Corp. and entitled “Multimedia Communications Network,” and Republic of China Patent No. 252248 85110129-0 72228, July 1995. These other systems fail to address and solve the scalability issue and thus cannot support a sufficient number of subscribers.
As is apparent from the above, a need exists for a scalable media server architecture which may be used to implement multimedia data delivery systems supporting large numbers of subscribers and simultaneous real-time data stream retrievals.
SUMMARY OF THE INVENTION
The present invention provides a scalable media server which can be used to implement a scaled server for simultaneous retrieval and delivery of a large number of media data streams. Various aspects of the invention relate to the design of a stream pumping engine used as a basic building block in a scalable media server, the manner in which multiple scalable servers may be interconnected to provide a scaled server with a desired data delivery capability, and a stream multiplexer for delivering the multiple media data streams from a scaled server in accordance with agreed-upon quality of service restrictions.
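To make the stream multiplexer's role concrete, the sketch below interleaves blocks from several paced streams by always sending the block whose delivery deadline comes first. This earliest-deadline-first discipline is an assumption chosen for illustration; the summary does not commit to a particular multiplexing policy.

# Hedged sketch of a multiplexer's send-side decision: transmit next from
# whichever stream is closest to missing its delivery deadline (assumed
# earliest-deadline-first policy).

import heapq

def multiplex(streams):
    """streams: list of (stream_id, [deadline, ...]) giving each stream's
    per-block delivery deadlines in seconds. Yields the global send order."""
    heap = [(deadlines[0], sid, 0, deadlines) for sid, deadlines in streams]
    heapq.heapify(heap)
    while heap:
        deadline, sid, idx, deadlines = heapq.heappop(heap)
        yield sid, idx, deadline
        if idx + 1 < len(deadlines):
            heapq.heappush(heap, (deadlines[idx + 1], sid, idx + 1, deadlines))

if __name__ == "__main__":
    for sid, block, due in multiplex([("A", [0.5, 1.0, 1.5]), ("B", [0.4, 1.2])]):
        print(f"send {sid} block {block} (due {due:.1f} s)")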
In accordance with one aspect of the invention, a scalable media server is provided which includes a plurality of stream pumping engines. Each of the stream pumping engines is connected between a distinct storage device of a storage system and a system bus of the scalable media server. A given stream pumping engine retrieves a requested data stream stored on the distinct storage device to which it is connected, and delivers the requested data stream to an appropriate subscriber. The scalable server also includes a server processor coupled to the stream pumping engines via the system bus. The server processor receives retrieval requests from clients and directs the operations of the plurality of stream pumping engines in accordance with the retrieval requests. A given stream pumping engine may include a storage controller coupled to the corresponding storage device, and a network controller coupled to the storage controller. The storage controller retrieves a data stream from the corresponding storage device in response to particular retrieval requests, while the network controller delivers the retrieved data stream to a network accessible by the appropriate client. The given stream pumping engine also includes a stream pumping engine processor which is coupled to the storage controller and network controller and directs the operations of those elements. The given stream pumping engine may also include a shared memory accessible by the server processor via a host system bus and accessible by the stream pumping engine processor.
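A minimal software model of this organization, assuming a simple lookup of which storage device holds a requested title, might look like the following; the class names, the title-to-device mapping, and the request interface are all hypothetical, since the patent describes hardware elements rather than software objects.

# Minimal sketch of the claimed organization under assumed naming.

class StreamPumpingEngine:
    """One engine: a storage controller paired with its own network
    controller, attached to a distinct storage device."""
    def __init__(self, device_id, titles):
        self.device_id = device_id
        self.titles = set(titles)  # streams stored on this device

    def pump(self, title, subscriber):
        # Storage controller retrieves the stream; network controller
        # delivers it toward the subscriber's access network.
        return f"device {self.device_id}: pumping '{title}' to {subscriber}"

class ServerProcessor:
    """Receives client retrieval requests over the system bus and directs
    the engine whose storage device holds the requested stream."""
    def __init__(self, engines):
        self.engines = engines

    def handle_request(self, title, subscriber):
        for engine in self.engines:
            if title in engine.titles:
                return engine.pump(title, subscriber)
        raise LookupError(f"no storage device holds '{title}'")

if __name__ == "__main__":
    engines = [StreamPumpingEngine(0, ["movie-1", "movie-2"]),
               StreamPumpingEngine(1, ["movie-3"])]
    server = ServerProcessor(engines)
    print(server.handle_request("movie-3", "subscriber-17"))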
