Live performance control of computer graphic characters

Computer graphics processing and selective visual display system – Display driving control circuitry – Controlling the condition of display elements

Details

US classes: C345S215000, C345S473000, C707S793000
Type: Reexamination Certificate
Status: active
Patent number: 06377281

ABSTRACT:

FIELD OF THE INVENTION
The present invention generally relates to data processing. The invention relates more specifically to computer software and systems useful in controlling computer graphic characters and electromechanically actuated puppet characters as used in the production of motion pictures.
BACKGROUND OF THE INVENTION
In the performing arts, puppetry traditionally has been a manually performed art that relies on the skill, creativity, and coordinated movement of a human puppeteer who manipulates a passive, mechanical puppet. Sophisticated artificial characters or “creatures” that resemble puppets, developed by Jim Henson and The Jim Henson Company, offer a wide range of motion and character expression. However, generally they require a human performer to hold them or otherwise use close manual control. This is a disadvantage in certain types of motion picture productions that feature large or fully autonomous creatures that cannot be held by or otherwise directly manipulated by a human performer.
Partly in response to this problem, remotely controlled, robotic, electromechanically actuated puppet creatures have been developed. An electromechanically actuated puppet may simulate any desired animal, human character, or other creature, real or imagined, of any size. Typically an electromechanically actuated puppet comprises: a movable skeleton made of metals or other rigid materials; servos, motors, hydraulics, or other actuators that move parts of the skeleton (limbs, facial features, eyes, etc.); and a skin that covers the skeleton and simulates real skin, muscle, hair, etc. Electromechanically actuated puppets also offer the ability to control complex expressions involving numerous actuation points in a small area. Electromechanically actuated puppets may be remotely controlled using a cable tether that connects a remote onboard computer to a control computer, or using a wireless link. Electromechanically actuated puppets have been featured in such motion pictures and television productions as “Babe,” “George Of The Jungle,” “Dr. Doolittle,” “Animal Farm,” and others.
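For illustration only, the following sketch shows one way such an actuation scheme might be modeled in software. The channel names, position units, and travel limits are hypothetical and are not drawn from any particular puppet or control system.

```python
# Hypothetical sketch: modeling the actuation channels of an
# electromechanically actuated puppet. Channel names, units, and
# limits below are illustrative only.

from dataclasses import dataclass


@dataclass
class ActuationChannel:
    """One servo, motor, or hydraulic actuator on the puppet skeleton."""
    name: str       # e.g. "jaw", "left_eyelid", "neck_tilt"
    min_pos: int    # lower travel limit, in raw servo units
    max_pos: int    # upper travel limit, in raw servo units


class PuppetController:
    """Clamps commanded positions to each channel's safe range before
    they are sent to the puppet over a cable tether or wireless link."""

    def __init__(self, channels):
        self.channels = {c.name: c for c in channels}

    def command(self, name, position):
        c = self.channels[name]
        return max(c.min_pos, min(c.max_pos, position))


# Example: a face with several closely spaced actuation points.
face = PuppetController([
    ActuationChannel("jaw", 0, 1023),
    ActuationChannel("left_eyelid", 200, 800),
    ActuationChannel("right_eyelid", 200, 800),
])
print(face.command("jaw", 2000))  # -> 1023 (clamped to the safe limit)
```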
Unfortunately, electromechanically actuated puppets are relatively costly, and they require considerable time and skill to create, test, and debug. In some cases, the amount of time that a performer can spend rehearsing with an electromechanically actuated puppet is limited because of concerns about wearing out its mechanical parts. As a result, in past productions the performers of electromechanically actuated puppets have sometimes had difficulty gaining familiarity with a new puppet. It is therefore desirable to provide a way for the performer to have ample rehearsal time to create and test expressions with less wear on the puppet.
Moreover, improvements in computer graphics technology and networking technology have led to increased interest in creating and distributing performances of computer graphic (CG) characters. Unlike an electromechanically actuated puppet, a CG character is intangible and takes form only as an animated graphical display generated by a computer. CG characters may be created, for example, using 3D graphics computer software such as 3D Studio Max, commercially available from Kinetix, a unit of Autodesk, Inc. There is now tremendous interest in creating and performing CG characters for World Wide Web and video game content and productions. Conventional graphics software systems, however, require the performer to learn how to use the software in order to create expressions and performances for the CG characters. It is desirable to have a way for a performer to create expressions and performances for CG characters in a manner that is homologous to a puppetry performance, without having to use graphics software or other traditional computer input methods.
In particular, existing CG tools provide only low-level approaches for generating expressions and performances of characters. Typically they are complex, time consuming to use, and prone to user error. There is a need for a performance control system that offers users high-level configuration tools for rapid generation of expressions and performances.
Control systems used to manipulate electromechanically actuated puppets are known. For example, since approximately 1989, Jim Henson's Creature Shop of London, England, a unit of The Jim Henson Company, has used a computer system known as “Big1” to control performances of electromechanically actuated puppet characters, as described in the document “Big1: User Manual,” printed Aug. 23, 1990. Generally, the Big1 system comprises a dedicated, custom-designed computer; a handset for providing manual input to the computer; a motor driver for receiving signals from the computer and driving motors of an electromechanically actuated puppet; a power supply; and an I/O unit known as Smarter Tool. The Big1 can provide real-time control of 32 servos and 8 motor actuators by one puppeteer. The Jim Henson Company won an Academy Award for Technical Achievement for the design of this control system. Most recently, the Big1 system was used in the production of “Jack Frost” and “Farscape.”
In 1990, an edit suite called Performance Editor was developed to work in tandem with the Big1 control system. It captures a puppeteer's performance and allows that performance to be edited and synchronized to a sound track for accurate lip-synching. The edit suite was first used in the production of the film “Babe,” for which The Jim Henson Company won an Academy Award for Best Special Effects.
The Big1 system, however, has certain known disadvantages. For example, the use of a custom, dedicated control computer causes problems in modifying the system to add new features and functions; such a computer cannot be expanded in terms of speed, memory, or peripherals in an easy or cost-effective manner. All principal functionality executes on the control computer, resulting in a monolithic architecture that provides a single point of failure. The software is likewise monolithic and runs without an operating system, making distribution of program components impractical. This architecture has no memory protection, no multi-tasking, no device support, and no application programming interface. Hence, the system lacks modularity and is therefore difficult to update, modify, or extend. It also cannot use the widely available applications and tools written for other operating systems.
Another fundamental disadvantage of the Big1 system is that it is not readily usable with computer graphic (CG) characters.
Based on the foregoing, there is a need for a way for the performer to create and test out expressions more easily and rapidly.
There is also a need for a way for a performer to create expressions and performances for CG characters in a manner that is homologous to a puppetry performance, without having to use graphics software or other traditional computer input methods.
SUMMARY OF THE INVENTION
These needs, and other needs that will become apparent from the following description, are fulfilled by the present invention, which comprises, in one aspect, a method and apparatus for performing a computer graphic character live in a manner homologous to a live puppetry performance. Character representation information is created and stored using a first computer. Performer movement information is received at the first computer from a manual input device that receives live manual manipulations and converts the manipulations into the performer movement information. Character motion information is created and stored by combining the performer movement information with the character representation information. The character motion information is communicated in real time to a second computer. The second computer converts the character motion information into movements of a computer graphic character, which is displayed substantially synchronized to the live manual manipulations. Control objects define elements of the manual input device. Actuators define movable elements of the computer graphic character.
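As a rough illustration of this data flow, and not of the patented implementation itself, the sketch below assumes hypothetical names such as ControlObject, Actuator, and CharacterRig, along with an assumed UDP/JSON transport: the first computer maps raw input-device readings through the stored character representation into per-frame character motion values, and streams each frame to a second computer that drives the displayed character.

```python
# Minimal sketch of the two-computer data flow described above.
# All class names, field names, and the UDP/JSON transport are
# assumptions made for illustration, not taken from the patent.

import json
import socket
from dataclasses import dataclass


@dataclass(frozen=True)
class ControlObject:
    """An element of the manual input device (e.g. one stick axis)."""
    name: str
    raw_min: float
    raw_max: float


@dataclass(frozen=True)
class Actuator:
    """A movable element of the computer graphic character (e.g. 'jaw')."""
    name: str
    lo: float
    hi: float


class CharacterRig:
    """Character representation information: maps control objects
    of the input device onto actuators of the CG character."""

    def __init__(self, mapping):
        self.mapping = mapping  # {ControlObject: Actuator}

    def motion_frame(self, performer_movement):
        """Combine performer movement information with the character
        representation to produce one frame of character motion information."""
        frame = {}
        for control, actuator in self.mapping.items():
            raw = performer_movement.get(control.name, control.raw_min)
            # Normalize the raw reading, then rescale it into the actuator range.
            t = (raw - control.raw_min) / (control.raw_max - control.raw_min)
            frame[actuator.name] = actuator.lo + t * (actuator.hi - actuator.lo)
        return frame


def stream_performance(rig, read_input_device, viewer=("127.0.0.1", 9000)):
    """First computer: read live manual input, convert it, and send motion
    frames in real time to the second (rendering) computer over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        performer_movement = read_input_device()      # live manual manipulation
        frame = rig.motion_frame(performer_movement)  # character motion info
        sock.sendto(json.dumps(frame).encode(), viewer)
```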
