Inter-track communication of musical performance data

Music – Instruments – Electrical musical tone generation

Reexamination Certificate


Details

Classification: C084S609000
Type: Reexamination Certificate
Status: active
Patent number: 06541689

ABSTRACT:

TECHNICAL FIELD
This invention relates to systems and methods for computer generation of musical performances. Specifically, the invention relates to a software architecture that allows dependent music tracks to obtain control data from dynamically chosen controlling music tracks.
BACKGROUND OF THE INVENTION
Musical performances have become a key component of electronic and multimedia products such as stand-alone video game devices, computer-based video games, computer-based slide show presentations, computer animation, and other similar products and applications. As a result, music generating devices and music playback devices are now tightly integrated into electronic and multimedia components.
Musical accompaniment for multimedia products can be provided in the form of digitized audio streams. While this format allows recording and accurate reproduction of non-synthesized sounds, it consumes a substantial amount of memory. As a result, the variety of music that can be provided using this approach is limited. Another disadvantage of this approach is that the stored music cannot be easily varied. For example, it is generally not possible to change a particular musical part, such as a bass part, without re-recording the entire musical stream.
Because of these disadvantages, it has become quite common to generate music based on a variety of data other than pre-recorded digital streams. For example, a particular musical piece might be represented as a sequence of discrete notes and other events corresponding generally to actions that might be performed by a keyboardist: pressing or releasing a key, pressing or releasing a sustain pedal, activating a pitch bend wheel, changing a volume level, changing a preset, and so on. An event such as a note event is represented by some type of data structure that includes information about the note, such as pitch, duration, volume, and timing. Music events such as these are typically stored in a sequence that roughly corresponds to the order in which the events occur. Rendering software retrieves each music event and examines it for relevant information, such as its timing and the particular device or “instrument” to which it applies. The rendering software then sends the music event to the appropriate device at the proper time, where it is rendered. The MIDI (Musical Instrument Digital Interface) standard is an example of this type of music generation technique, representing a musical performance as a series of events.
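As a concrete illustration, the sketch below (in C++, with invented names and fields; it is not drawn from the MIDI specification or from the patent) models a note event as such a data structure and shows a dispatch loop that routes each event to its device:

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Hypothetical event structure: pitch, volume, and timing information,
// roughly as an event-based format such as MIDI models a note.
struct NoteEvent {
    uint32_t startTime;  // performance time, in ticks
    uint32_t duration;   // length of the note, in ticks
    uint8_t  channel;    // device/"instrument" that should render the event
    uint8_t  pitch;      // note number; 60 = middle C in MIDI convention
    uint8_t  velocity;   // key velocity, used as volume
};

// Rendering loop: walk the time-ordered sequence and route each event to
// the device identified by its channel. A real renderer would wait until
// startTime before dispatching; here the dispatch is only logged.
void render(const std::vector<NoteEvent>& sequence) {
    for (const NoteEvent& ev : sequence) {
        std::cout << "t=" << ev.startTime
                  << " channel=" << int(ev.channel)
                  << " note=" << int(ev.pitch)
                  << " velocity=" << int(ev.velocity) << '\n';
    }
}

int main() {
    render({{0, 240, 0, 60, 100}, {240, 240, 0, 64, 96}});
}
```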
There are a variety of different techniques for storing and generating musical performances, in addition to the event-based technique utilized by the MIDI standard. As one example, a musical performance can be represented by the combination of a chord progression and a “style”. The chord progression defines a series of chords, and the style defines a note pattern in terms of chord elements. To generate music, the note pattern is played against the chords defined by the chord progression.
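A minimal sketch of this idea, with hypothetical types and tick values: each pattern note names a chord element, and is resolved to a concrete pitch against whichever chord the progression defines at that moment.

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// A chord is modeled as the pitches of its elements; a progression is a
// time-ordered series of chords.
struct Chord {
    uint32_t startTime;            // tick at which this chord takes effect
    std::vector<uint8_t> pitches;  // e.g. {60, 64, 67} for C major
};

// A style note names a chord element rather than an absolute pitch.
struct PatternNote {
    uint32_t time;           // when to play, in ticks
    std::size_t chordIndex;  // which element of the current chord to sound
};

// Return the chord in effect at time t (progression sorted by startTime).
const Chord& chordAt(const std::vector<Chord>& progression, uint32_t t) {
    const Chord* current = &progression.front();
    for (const Chord& chord : progression) {
        if (chord.startTime <= t) current = &chord;
    }
    return *current;
}

int main() {
    std::vector<Chord> progression = {
        {0,   {60, 64, 67}},  // C major from tick 0
        {480, {57, 60, 64}},  // A minor from tick 480
    };
    std::vector<PatternNote> pattern = {{0, 0}, {240, 2}, {480, 1}};

    // Playing the pattern against the progression: each pattern note
    // resolves to a concrete pitch from the chord active at its time.
    for (const PatternNote& note : pattern) {
        const Chord& chord = chordAt(progression, note.time);
        std::cout << "t=" << note.time << " pitch="
                  << int(chord.pitches[note.chordIndex]) << '\n';
    }
}
```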
A “template” is another example of a way to represent a portion of a musical performance. A template works in conjunction with other composition techniques to create a unique performance based on a musical timeline.
These different techniques correspond to different ways of representing music. When designing a computer-based music generation and playback system, it is desirable for the system to support a number of different music representation technologies and formats, such as the MIDI, style and chord progression, and template technologies mentioned above. In addition, the playback and generation system should support the synchronized playback of traditional digitized audio files, streaming audio sources, and other combinations of music-related information such as lyrics in conjunction with sequenced notes.
A concurrently filed United States Patent Application, entitled “Track Based Music Performance Architecture” by inventors Todor C. Fay and Mark T. Burton, describes an architecture that accommodates many different types of music generation techniques. In the system described in that application, a piece of music is embodied as a programming object, referred to as a segment or segment object. The segment object has an interface that can be called by a playback program to play identified intervals of the music piece. Each segment comprises a plurality of tracks, embodied as track objects. The track objects are of various types that generate music in different ways, based on different data formats. Each track, regardless of its type, supports an identical interface, referred to as a track interface, that is available to the segment object. When the segment object is instructed to play a music interval, it passes the instruction on to its constituent tracks, which perform the actual music generation. In many cases, the tracks cooperate with each other to produce music. The cited application describes inter-track object interfaces that facilitate communication between the tracks, allowing one track to obtain data from another. This is used, for example, by a style track to obtain chord information from a chord progression track; the style track needs the chord information to properly interpret its own notes, which are defined in terms of chord elements.
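In outline, that arrangement might look like the following simplified C++ sketch (the class names and signatures are assumptions for illustration, not the interfaces described in the cited application):

```cpp
#include <cstdint>
#include <iostream>
#include <memory>
#include <utility>
#include <vector>

// The identical interface every track supports, whatever its type:
// MIDI-style events, a style, a chord progression, a template, etc.
class ITrack {
public:
    virtual ~ITrack() = default;
    // Generate music for the interval [start, end) of the piece.
    virtual void play(uint32_t start, uint32_t end) = 0;
};

// A segment embodies a piece of music as a collection of track objects
// and forwards play instructions to each of them.
class Segment {
public:
    void addTrack(std::unique_ptr<ITrack> track) {
        tracks_.push_back(std::move(track));
    }
    // Called by the playback program; the tracks do the actual generation.
    void play(uint32_t start, uint32_t end) {
        for (auto& track : tracks_) {
            track->play(start, end);
        }
    }
private:
    std::vector<std::unique_ptr<ITrack>> tracks_;
};

// A trivial track type used only to demonstrate the forwarding.
class LoggingTrack : public ITrack {
public:
    void play(uint32_t start, uint32_t end) override {
        std::cout << "playing [" << start << ", " << end << ")\n";
    }
};

int main() {
    Segment segment;
    segment.addTrack(std::make_unique<LoggingTrack>());
    segment.play(0, 480);  // the playback program plays an interval
}
```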
The application cited above assumes that inter-track communications take place primarily between track objects of a single segment, although the disclosed inter-track object interfaces can also be called from track objects of other segments. However, no mechanism is provided for a track of one segment to obtain a reference to the track interface of a track in a different segment. Obtaining such a reference can be problematic, especially when the system allows segments to be initiated and terminated dynamically during a music performance. In this case, a given track should not be designed to rely on a particular track in another segment, because that segment may not even exist when the performance is actually rendered.
The invention described below relates to a more flexible way for music track objects to utilize control data from other music track objects—especially from music track objects of different music segment objects. Using the invention, multiple segment objects can be active at any given time and can change during a musical performance. Individual tracks are designed without specifying the exact source of their control information. Rather, the system determines an appropriate track to provide such control information during the performance itself. This provides a tremendous amount of flexibility in designing a performance.
SUMMARY OF THE INVENTION
In accordance with one aspect of the invention, a segment manager is used to coordinate a performance and to facilitate inter-track communications. Tracks are implemented as objects. A given track object can be a dependent track object, a control track object, or both. A dependent track object is one whose data is interpreted against control data from another track object. A control track object is one that supplies control data for use by another track object.
At any given time, a performance can include any number of active segments. The active segments have relative priorities, determined either dynamically or at the time the performance is authored. When a dependent track needs control data of a particular type, it requests that data from the segment manager. The segment manager notes the type of data requested and, in response, queries the active segments in order of priority to find a track containing control data of the requested type. When such a track is found, its control data is returned to the requesting track.
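A sketch of that priority-ordered lookup, covering both the control-track role and the segment manager's query (all names are assumed for illustration, and the control-data payload is elided; the patent does not prescribe this exact API):

```cpp
#include <algorithm>
#include <iostream>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// A string tag stands in for whatever identifier the real system uses to
// name a type of control data (chord data, tempo data, and so on).
struct ControlData {
    std::string type;
    // ...payload elided...
};

// A track acts as a control track if it supplies data of a given type;
// a dependent track would call SegmentManager::getControlData below.
class Track {
public:
    explicit Track(std::shared_ptr<const ControlData> supplies = nullptr)
        : supplies_(std::move(supplies)) {}
    const ControlData* getControlData(const std::string& type) const {
        return (supplies_ && supplies_->type == type) ? supplies_.get()
                                                      : nullptr;
    }
private:
    std::shared_ptr<const ControlData> supplies_;
};

// An active segment carries a relative priority and a set of tracks.
struct Segment {
    int priority;  // higher values are queried first
    std::vector<Track> tracks;
};

// The segment manager queries the active segments in priority order to
// find a track containing control data of the requested type.
class SegmentManager {
public:
    void activate(Segment segment) {
        segments_.push_back(std::move(segment));
        std::sort(segments_.begin(), segments_.end(),
                  [](const Segment& a, const Segment& b) {
                      return a.priority > b.priority;
                  });
    }
    const ControlData* getControlData(const std::string& type) const {
        for (const Segment& segment : segments_) {
            for (const Track& track : segment.tracks) {
                if (const ControlData* data = track.getControlData(type)) {
                    return data;  // first match wins: highest priority
                }
            }
        }
        return nullptr;  // no active segment supplies this type
    }
private:
    std::vector<Segment> segments_;
};

int main() {
    auto chords = std::make_shared<const ControlData>(ControlData{"chord"});
    SegmentManager manager;
    manager.activate({1, {Track{}, Track{chords}}});  // supplies chord data
    manager.activate({0, {Track{}}});                 // no control data
    std::cout << (manager.getControlData("chord") ? "found chord data\n"
                                                  : "no chord data\n");
}
```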

