Method and apparatus for facilitating group musical...

Music – Instruments – Electrical musical tone generation

Reexamination Certificate

Details

C084S600000, C084S645000, C709S200000, C709S201000, C709S204000, C709S217000, C709S231000

Reexamination Certificate

active

06353174

ABSTRACT:

FIELD OF THE INVENTION
This invention relates to electronic music systems and, more particularly, to an electronic music system by which a group of musicians connected by a network achieve musical collaboration in “near real time.”
BACKGROUND OF THE INVENTION
Music is a temporal medium, the organization of sound in time. Accordingly, music making is highly timing sensitive. When a musician presses a key on a piano, the musician expects the result to be immediately audible. Any delay in hearing the sound, even one as brief as a few milliseconds, produces a perceived sluggishness that impedes the musician's ability to use the instrument.
Music making is also often a collaborative effort among many musicians who interact with each other. With the advent of the Internet, musicians have sought ways to collaborate and interact with each other from remote locations. A primary inadequacy of the Internet for such purposes, however, is the inherent latency of data transmissions over the network. Such latency often exceeds hundreds or thousands of milliseconds, which is far beyond the threshold of tolerable real-time musical interaction.
Therefore, a need exists for a system and method that enable musicians to achieve near real-time musical collaboration over a high-latency network, such as the Internet.
SUMMARY OF THE INVENTION
It is an object of the invention to provide a system and method by which a group of users connected to a network can collaborate on a task in near real time. It is a further object of the invention to enable the group of users to achieve near real-time musical collaboration.
In general, in one aspect, the invention features a method for achieving near real-time musical collaboration. A stream of musical data is played to each user. Each musical data stream represents the musical collaboration upon which the users are collaborating. In some embodiments of the invention, the playing of each musical data stream to each user occurs automatically, repetitiously, or both.
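By way of illustration only, the following is a minimal sketch of automatic, repetitious playback of a short stream of musical data to one user. The `MusicalEvent` structure, the `play_stream` function, and the `sound_output` callback are hypothetical names introduced here for the sketch, not terms taken from the patent.

```python
import time
from dataclasses import dataclass

@dataclass
class MusicalEvent:
    beat: float      # position within the looped collaboration, in beats
    pitch: int       # MIDI note number
    velocity: int    # MIDI velocity

def play_stream(events, loop_beats, tempo_bpm, sound_output, loops=4):
    """Automatically and repetitiously play a stream of musical data to one user.

    `sound_output` is any callable that renders a MusicalEvent audibly,
    for example by sending a note-on message to a MIDI synthesizer.
    """
    seconds_per_beat = 60.0 / tempo_bpm
    for _ in range(loops):                                   # repetitious playback
        loop_start = time.monotonic()
        for event in sorted(events, key=lambda e: e.beat):   # automatic playback in time order
            wait = loop_start + event.beat * seconds_per_beat - time.monotonic()
            if wait > 0:
                time.sleep(wait)
            sound_output(event)
        # wait out the remainder of the loop before repeating
        remaining = loop_start + loop_beats * seconds_per_beat - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
```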
Each user is allowed to modify the musical data of the musical data stream as that musical data are played to that user. During the playing of the musical data stream, each user may add, delete, or modify musical data of that musical data stream. Any musical data modifications made by one of the users are transmitted to another user over the network.
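The sketch below suggests one way the add, delete, and modify operations on the shared stream could be represented, transmitted, and incorporated locally. The dictionary-based event format and the JSON-over-TCP transport are assumptions made purely for illustration, not the patent's data format or wire protocol.

```python
import json
import socket

def make_modification(action, beat, pitch, velocity=0):
    """Build one modification ("add", "delete", or "modify") to the shared musical data."""
    return {"action": action, "beat": beat, "pitch": pitch, "velocity": velocity}

def send_modification(modification, peer_host, peer_port):
    """Transmit one modification to another user over the network."""
    with socket.create_connection((peer_host, peer_port)) as connection:
        connection.sendall(json.dumps(modification).encode("utf-8") + b"\n")

def apply_modification(events, modification):
    """Incorporate a received modification into the local musical data stream."""
    action = modification["action"]
    if action == "add":
        events.append(modification)
    elif action == "delete":
        events[:] = [e for e in events
                     if not (e["beat"] == modification["beat"]
                             and e["pitch"] == modification["pitch"])]
    elif action == "modify":
        for e in events:
            if e["beat"] == modification["beat"] and e["pitch"] == modification["pitch"]:
                e["velocity"] = modification["velocity"]
```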
The playing of the musical data streams is staggered such that each user is located at a different time in the musical collaboration, and thus in the musical data stream played to that user, than every other user. In one embodiment, the staggering of the musical data streams separates any two users by a temporal offset that exceeds the maximum time required to transmit musical data modifications from one user to another user over the network. The length of the temporal offset ensures that the destination computer receives the transmitted modifications in time to incorporate them into the musical data stream played by that computer.
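To make the offset rule concrete, here is a hedged sketch that, given an assumed worst-case one-way network latency and a fixed tempo, assigns each user a starting position far enough behind the previous user that transmitted modifications arrive before they are needed. The `compute_offsets` helper and its parameters are hypothetical, not drawn from the patent.

```python
import math

def compute_offsets(num_users, max_latency_s, tempo_bpm):
    """Assign each user a distinct starting position in the collaboration.

    The offset between adjacent users is chosen to exceed the maximum
    time needed to transmit a modification over the network, so a
    modification always arrives before the destination computer plays
    the affected part of the stream.
    """
    seconds_per_beat = 60.0 / tempo_bpm
    # Round the latency up to a whole number of beats, then add one beat
    # of headroom so the offset strictly exceeds the worst-case latency.
    offset_beats = math.ceil(max_latency_s / seconds_per_beat) + 1
    return [user * offset_beats for user in range(num_users)]

# Example: 4 users, 800 ms worst-case latency, 120 BPM (0.5 s per beat)
# -> adjacent users start 3 beats (1.5 s) apart: positions [0, 3, 6, 9].
print(compute_offsets(4, 0.8, 120))
```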
In one embodiment, the musical data modifications made by one user are transmitted to every other user in a broadcast fashion. In another embodiment, the modifications pass from user to user in peer-to-peer communication. In still another embodiment, the musical data modifications made by one user are transmitted to another user through a server.
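The three transmission arrangements can be summarized in a short sketch; `send` stands in for whatever transport carries a modification to a destination, and the function names are illustrative only.

```python
def broadcast(modification, peers, send):
    """Broadcast: the originating user sends the modification to every other user."""
    for peer in peers:
        send(modification, peer)

def relay(modification, next_peer, send):
    """Peer-to-peer: each user forwards the modification only to the next user."""
    send(modification, next_peer)

def via_server(modification, server, send):
    """Server-mediated: the user sends to a server, which redistributes it."""
    send(modification, server)
```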
In another aspect, the invention relates to a method for achieving near real-time collaboration on a task by a plurality of users connected by a network. A stream of data representing the collaboration is output to each user. The outputting of the data streams is staggered such that each user is located at a different time in the collaboration, and thus in the data stream output to that user, than every other user. Each user can modify the data of the data stream as that data are output to that user. Data modifications made by one of the users are transmitted to another user over the network.
In still another aspect, the invention relates to a system for achieving near real-time musical collaboration by a plurality of users connected by a network. In general, the system includes a plurality of computers connected by a network. Each computer has an output system playing a stream of musical data representing the musical collaboration to a user of that computer and an input system by which the user of that computer modifies the musical data as the output system plays that musical data. Each computer also has a transmitter that transmits the musical data modifications to another computer over the network. The computers of the system stagger the playing of the musical data streams such that each computer plays musical data located at a different time in the musical collaboration than every other computer. In one embodiment, the input system includes a MIDI instrument. In another embodiment, the output system includes a MIDI synthesizer.
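As a rough sketch of one computer in such a system, the following uses the open-source `mido` library (with a MIDI backend such as python-rtmidi installed) for MIDI input and output. The class layout, method names, and the `transmit` callback standing in for the transmitter are assumptions for illustration, not the patented implementation.

```python
import mido

class CollaborationClient:
    """One computer in the system: MIDI output, MIDI input, and a transmitter."""

    def __init__(self, transmit, in_port_name=None, out_port_name=None):
        self.transmit = transmit                             # sends modifications over the network
        self.synth = mido.open_output(out_port_name)         # output system: MIDI synthesizer
        self.instrument = mido.open_input(in_port_name)      # input system: MIDI instrument

    def play(self, message):
        """Play one event of the musical data stream through the synthesizer."""
        self.synth.send(message)

    def listen(self):
        """Read the user's MIDI instrument and transmit each modification."""
        for message in self.instrument:
            if message.type in ("note_on", "note_off"):
                self.transmit(message.bytes())               # raw MIDI bytes as the payload
```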


REFERENCES:
patent: 5054360 (1991-10-01), Lisle et al.
patent: 5393926 (1995-02-01), Johnson
patent: 5689641 (1997-11-01), Ludwig et al.
patent: 5916302 (1999-06-01), Dunn et al.
The Distributed Real-Time Groove Network (DRGN), Matthew D. Moller and Canton Becker, 1995.
