Title: Track-based music performance architecture
Classification: Music – Instruments – Electrical musical tone generation
Type: Utility Patent
Filed: 1999-02-02
Issued: 2001-01-02
Examiner: Donels, Jeffrey (Department: 2837)
U.S. Classes: C084S613000, C084S645000
Status: Active
Patent Number: 06169242
ABSTRACT:
TECHNICAL FIELD
This invention relates to systems and methods for computer generation of musical performances. Specifically, the invention relates to a software architecture that allows a music generation and playback program to play music based on new technologies, without modifying the playback program itself.
BACKGROUND OF THE INVENTION
Musical performances have become a key component of electronic and multimedia products such as stand-alone video game devices, computer-based video games, computer-based slide show presentations, computer animation, and other similar products and applications. As a result, music generating devices and music playback devices are now tightly integrated into electronic and multimedia components.
Musical accompaniment for multimedia products can be provided in the form of digitized audio streams. While this format allows recording and accurate reproduction of non-synthesized sounds, it consumes a substantial amount of memory. As a result, the variety of music that can be provided using this approach is limited. Another disadvantage of this approach is that the stored music cannot be easily varied. For example, it is generally not possible to change a particular musical part, such as a bass part, without re-recording the entire musical stream.
Because of these disadvantages, it has become quite common to generate music based on a variety of data other than pre-recorded digital streams. For example, a particular musical piece might be represented as a sequence of discrete notes and other events corresponding generally to actions that might be performed by a keyboardist—such as pressing or releasing a key, pressing or releasing a sustain pedal, activating a pitch bend wheel, changing a volume level, changing a preset, etc. An event such as a note event is represented by some type of data structure that includes information about the note, such as pitch, duration, volume, and timing. Music events such as these are typically stored in a sequence that roughly corresponds to the order in which the events occur. Rendering software retrieves each music event and examines it for relevant information, such as timing information and information relating to the particular device or “instrument” to which the music event applies. The rendering software then sends the music event to the appropriate device at the proper time, where it is rendered. MIDI (Musical Instrument Digital Interface) is an example of a music generation standard of this type, representing a musical performance as a series of events.
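The event-and-dispatch model described above can be sketched as follows. This is an illustrative minimal sketch only: the field names, tick-based timing, and the `render` dispatch function are assumptions, not structures defined by the patent or by the MIDI specification.

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    time: int      # offset in ticks from the start of the sequence
    channel: int   # identifies the target device or "instrument"
    pitch: int     # e.g. a MIDI note number, 60 = middle C
    velocity: int  # volume of the attack
    duration: int  # length of the note in ticks

def render(events, send):
    """Examine each event and dispatch it to its device in time order."""
    for ev in sorted(events, key=lambda e: e.time):
        send(ev.channel, ev)

# Events may be stored only roughly in order; rendering sorts and dispatches.
played = []
render(
    [NoteEvent(480, 0, 64, 100, 240), NoteEvent(0, 0, 60, 100, 480)],
    lambda channel, ev: played.append((channel, ev.pitch)),
)
```

In a real renderer, `send` would target hardware or a synthesizer at the proper wall-clock time rather than appending to a list.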
There are a variety of different techniques for storing and generating musical performances, in addition to the event-based technique utilized by the MIDI standard. As one example, a musical performance can be represented by the combination of a chord progression and a “style”. The chord progression defines a series of chords, and the style defines a note pattern in terms of chord elements. To generate music, the note pattern is played against the chords defined by the chord progression. A scheme such as this is described in a previously issued patent.
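The chord-progression-and-style technique can be illustrated with a short sketch. The data layout here (chords as lists of pitches, a pattern as indices into the current chord) is a hypothetical simplification chosen for clarity; the patent does not prescribe these structures.

```python
# A chord progression: each chord is a list of pitches (MIDI note numbers).
progression = [
    [60, 64, 67],  # C major
    [65, 69, 72],  # F major
]

# A "style" pattern expressed in terms of chord elements: each entry is an
# index into whichever chord is current when the pattern is played.
pattern = [0, 1, 2, 1]

def realize(progression, pattern):
    """Play the note pattern against each chord of the progression in turn."""
    notes = []
    for chord in progression:
        notes.extend(chord[degree] for degree in pattern)
    return notes

# realize(progression, pattern) yields [60, 64, 67, 64, 65, 69, 72, 69]
```

Because the pattern refers to chord elements rather than absolute pitches, changing the chord progression changes the realized notes without editing the pattern itself.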
A “template” is another example of a way to represent a portion of a musical performance. A template works in conjunction with other composition techniques to create a unique performance based on a musical timeline.
U.S. Pat. No. 5,753,843, assigned to Microsoft Corporation and issued on May 19, 1998, describes a system that implements techniques such as those described above. These different techniques correspond to different ways of representing music. When designing a computer-based music generation and playback system, it is desirable for the system to support a number of different music representation technologies and formats, such as the MIDI, style and chord progression, and template technologies mentioned above. In addition, the playback and generation system should support the synchronized playback of traditional digitized audio files, streaming audio sources, and other combinations of music-related information such as lyrics in conjunction with sequenced notes.
However, it is impossible to anticipate the development of new music technologies. Because of this, a given music performance program might need significant rewriting to support a newly developed music technology. Furthermore, as more performance technologies are added to an application program, the program becomes increasingly complex. Such complexity increases the size and cost of the program, while also increasing the likelihood of bugs.
SUMMARY OF THE INVENTION
The invention allows a music playback program or performance supervisor to accommodate different types of playback technologies and formats without requiring such technologies to be embedded in the program itself. A piece of music is embodied as a programming object, referred to herein as a segment or segment object. The segment object has an interface that can be called by the playback program to play identified intervals of the music piece.
Each segment comprises a plurality of tracks, embodied as track objects. The track objects are of various types for generating music in a variety of different ways, based on a variety of different data formats. Each track, regardless of its type, supports an identical interface, referred to as a track interface, that is available to the segment object. When the segment object is instructed to play a music interval, it passes the instruction on to its constituent tracks, which perform the actual music generation.
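In present-day terms, the segment/track split might be sketched as follows. The method name `play(start, end)` and the interval semantics are assumptions for illustration; the patent does not bind the track interface to a particular language or signature.

```python
from abc import ABC, abstractmethod

class Track(ABC):
    """The identical interface supported by every track, whatever its type."""
    @abstractmethod
    def play(self, start, end):
        """Generate this track's portion of the music for [start, end)."""

class MidiTrack(Track):
    """One possible track type: plays back stored (time, event) pairs."""
    def __init__(self, events):
        self.events = events
    def play(self, start, end):
        return [e for e in self.events if start <= e[0] < end]

class Segment:
    """The playback program calls only this interface; the music-generation
    details stay hidden inside the constituent track objects."""
    def __init__(self, tracks):
        self.tracks = tracks
    def play(self, start, end):
        out = []
        for track in self.tracks:  # pass the instruction on to each track
            out.extend(track.play(start, end))
        return out
```

A new music technology would be supported by writing a new `Track` subclass; the `Segment` and the playback program above it remain unchanged.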
In some cases, the tracks cooperate with each other to produce the music. Inter-track interfaces can be implemented to facilitate communication between the tracks. Tracks are distinguished from each other by object type identifiers, group specifications, and index values.
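A lookup over the three distinguishing attributes named above might look like the following sketch. The bitmask representation of group membership and the function name `get_track` are hypothetical choices, not details given in the patent.

```python
class Track:
    def __init__(self, type_id, groups):
        self.type_id = type_id  # object type identifier (track technology)
        self.groups = groups    # group specification as a bitmask

def get_track(tracks, type_id, group, index):
    """Return the index-th track of the given type within the given group,
    or None if no such track exists."""
    n = 0
    for t in tracks:
        if t.type_id == type_id and (t.groups & group):
            if n == index:
                return t
            n += 1
    return None
```

A track wanting to cooperate with a sibling (for example, a style track consulting a chord-progression track) could use such a lookup to find its partner within the same segment.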
This architecture allows a musical piece to be embodied as a segment, with the details of the music generation being hidden within the track objects of the segment. As a result, the playback program does not need to implement methodologies for actual music generation techniques. Therefore, the playback program is compatible with any future methods of music generation, and will not need to be modified to support any particular music generation technique.
REFERENCES:
patent: 4526078 (1985-07-01), Chadabe
patent: 4716804 (1988-01-01), Chadabe
patent: 5052267 (1991-10-01), Ino
patent: 5164531 (1992-11-01), Imaizumi et al.
patent: 5179241 (1993-01-01), Okuda et al.
patent: 5218153 (1993-06-01), Minamitaka
patent: 5278348 (1994-01-01), Eitaki et al.
patent: 5281754 (1994-01-01), Farrett et al.
patent: 5286908 (1994-02-01), Jungleib
patent: 5315057 (1994-05-01), Land et al.
patent: 5355762 (1994-10-01), Tabata
patent: 5455378 (1995-10-01), Paulson et al.
patent: 5496962 (1996-03-01), Meier et al.
patent: 5596159 (1997-01-01), O'Connell
patent: 5734119 (1998-03-01), France et al.
patent: 5753843 (1998-05-01), Fay
patent: 5902947 (1999-05-01), Burton et al.
Inventors: Fay, Todor C.; Burton, Mark T.
Examiner: Donels, Jeffrey
Attorney: Lee & Hayes PLLC
Assignee: Microsoft Corporation