Generating action data for the animation of characters

Computer graphics processing and selective visual display system – Computer graphics processing – Animation


Details

Reexamination Certificate

active

06522332

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to the generation of action data for animating a character such that body part locations for a selected character are positioned in response to body part locations captured from performance data. The invention relates to apparatus in a computer animation system, a method for animating a character and a computer-readable medium carrying instructions.
2. Description of the Related Art
In the field of three-dimensional graphics, motion is generally considered to be a succession of configurations of geometric three-dimensional objects over time. The movement is achieved by a succession of frames wherein the positions of the three-dimensional objects are defined within each of these frames. When object positions are generated on a frame-by-frame basis at a rate equal to the frame display rate, the animation is considered to be displayed in real time. In this way, computer-generated animations may be used as an alternative to traditional artistic solutions and the resulting product may also be combined with real action captured by camera for use in cinematographic, video or computer-based presentations.
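As an illustration of the frame-based model just described, the following Python sketch (all names and values are hypothetical, not taken from the patent) produces one object configuration per displayed frame; the animation runs in real time provided each configuration can be computed within the frame period.

```python
import time

FRAME_RATE = 30  # display rate in frames per second

def object_positions_at(t):
    """Hypothetical pose function: returns object positions for time t (seconds)."""
    return {"object_1": (t * 0.1, 0.0, 0.0)}  # e.g. a simple translation over time

def play(duration_s):
    """Generate one configuration per frame; real time is achieved when each
    configuration is produced within the frame period."""
    frame_period = 1.0 / FRAME_RATE
    for frame in range(int(duration_s * FRAME_RATE)):
        start = time.perf_counter()
        pose = object_positions_at(frame * frame_period)
        # ... hand `pose` to the renderer here ...
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, frame_period - elapsed))  # pace output to the display rate
```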
An example of a known procedure for animating a character in response to captured performance data is illustrated in FIG. A. The system shown in FIG. A makes use of a process referred to as a solver, which applies forward kinematics, inverse kinematics, three-point rotation and other techniques that may collectively be referred to as a set of geometric constraints. The solver calculates the locations of one or more nodes in order to provide a best-fit representation of the segments connecting these nodes. These techniques generally require a significant amount of processing and may introduce additional inconsistencies into the motion data. As processing power has increased and become more generally available, it has become possible to deploy these techniques in a commercial animation environment.
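The role of such a solver can be illustrated with a minimal sketch, assuming a trivial setup of noisy sensor readings and a single fixed segment-length constraint; this is not the patent's solver, merely one instance of fitting node locations and then enforcing a geometric constraint.

```python
import numpy as np

def fit_node(sensor_positions):
    """Least-squares estimate of a node location from several noisy sensor
    readings: the mean minimises the sum of squared residuals."""
    return np.mean(np.asarray(sensor_positions, dtype=float), axis=0)

def enforce_segment_length(parent, child, length):
    """Simple geometric constraint: keep the child node at a fixed distance
    from its parent along the observed direction."""
    direction = child - parent
    norm = np.linalg.norm(direction)
    if norm < 1e-9:
        return parent + np.array([length, 0.0, 0.0])  # degenerate case: arbitrary direction
    return parent + direction * (length / norm)

# Example: shoulder and elbow nodes estimated from markers, then the upper-arm
# segment constrained to an assumed length of 30 cm.
shoulder = fit_node([[0.01, 1.40, 0.00], [-0.01, 1.41, 0.02]])
elbow_raw = fit_node([[0.02, 1.12, 0.01], [0.00, 1.10, 0.03]])
elbow = enforce_segment_length(shoulder, elbow_raw, length=0.30)
```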
A solver A1 receives captured motion A2 derived from sensors applied to a moving performer. These sensors may be considered as defining locations within the solver's model and as such it is necessary to associate each location A3 such that real sensors placed on the performer are individually associated with modelled body parts of the animated character. The character itself is described by means of a character description A4 and movements of the character are defined with reference to geometric constraints A5. Thus, in response to input data defining a character description A4, the captured motion A2, the association definitions A3 and the geometric constraints A5, it is possible to generate character animation A6.
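The data flow of FIG. A may be summarised in a short sketch; the function and field names below are hypothetical and merely stand in for the labelled items A1 to A6.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class SolverInputs:
    captured_motion: List[Dict[str, Vec3]]   # A2: per-frame sensor locations
    associations: Dict[str, str]             # A3: sensor name -> modelled body part
    character_description: Dict[str, float]  # A4: e.g. segment lengths per body part
    constraints: List[Callable]              # A5: geometric constraint functions

def solve(inputs: SolverInputs) -> List[Dict[str, Vec3]]:
    """A1: the solver; produces character animation (A6) frame by frame."""
    animation = []
    for frame in inputs.captured_motion:
        # Map each sensor reading onto the body part it is associated with.
        pose = {inputs.associations[s]: p
                for s, p in frame.items() if s in inputs.associations}
        # Apply each geometric constraint to the candidate pose (details omitted).
        for constraint in inputs.constraints:
            pose = constraint(pose, inputs.character_description)
        animation.append(pose)
    return animation
```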
Thus, it can be seen that for each character to be animated, it is necessary to go through a process of obtaining captured motion and then applying this, in combination with the character description, the associations and the geometric constraints, in order to produce a character animation in which a specific character is animated in accordance with the captured motion data. In particular, for each animated performance, it is necessary to define the association definitions A3, which, in order to ensure accurate registration, is a relatively time-consuming process. In some circumstances, it may be desirable for several characters of different sizes and shapes to perform similar motions. Given the necessity to establish good registration with the locations, this would usually involve performing the process repeatedly for each character animation. Thus, the situation may be summarised by saying that for each particular character animation to be produced, it is necessary to create a specific marker set which can then be used to ensure that the captured motion registers correctly with the character animation. Put another way, for each specific set of motion capture data A2 and for each specific character animation A6, it is necessary to define a specific association A3 so as to associate the two.
A problem that has been encountered more recently is that, given the availability of affordable processing capability, the overriding overhead in the production activity becomes that of actually establishing the association definitions, which ultimately restricts the number of applications in which these techniques may be employed.
BRIEF SUMMARY OF THE INVENTION
According to a first aspect of the present invention, there is provided apparatus for generating action data for animating a character such that body part locations for a selected character are positioned in response to body part locations captured from performance data. When implemented in a computer system, the apparatus includes storage means configured to store program instructions for processing means, together with performance data, registration data for the performance and registration data for the character. The processing means is configurable by the stored program instructions to perform several processing steps. These steps comprise identifying the locations of body parts of a generic actor in response to the performance data in combination with a bio-mechanical model. Thereafter, locations of body parts are identified for a character in response to the position and orientation of body parts of the generic actor, again in combination with a bio-mechanical model. The registration data for the performance associates body parts in the performance data with body parts in the generic actor. The registration data for the character associates body parts in the generic actor with body parts in the character.
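The two-stage structure of this aspect can be sketched as follows; all identifiers are illustrative assumptions rather than the patent's own terminology. The performance registration maps markers onto generic-actor body parts, and the character registration maps generic-actor body parts onto the selected character.

```python
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

def solve_generic_actor(frame: Dict[str, Vec3],
                        performance_registration: Dict[str, str]) -> Dict[str, Vec3]:
    """Step 1: identify generic-actor body part locations from one frame of
    performance data, using the performance registration (marker -> body part).
    A real system would also apply the bio-mechanical model here."""
    return {performance_registration[m]: p
            for m, p in frame.items() if m in performance_registration}

def solve_character(actor_pose: Dict[str, Vec3],
                    character_registration: Dict[str, str]) -> Dict[str, Vec3]:
    """Step 2: position the selected character's body parts from the generic
    actor's pose, using the character registration (actor part -> character part)."""
    return {character_registration[b]: p
            for b, p in actor_pose.items() if b in character_registration}

def retarget(performance: List[Dict[str, Vec3]],
             performance_registration: Dict[str, str],
             character_registration: Dict[str, str]) -> List[Dict[str, Vec3]]:
    """Any performance registered to the generic actor can drive any character
    registered to the same generic actor."""
    return [solve_character(solve_generic_actor(f, performance_registration),
                            character_registration)
            for f in performance]
```

Because both registrations refer to the same generic actor, a stored performance can be paired with any stored character without redefining associations, which is the re-use advantage noted below.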
An advantage of the invention is that any performances registered to the generic actor may be combined with any character registered to the same generic actor. The re-usability of performance data and character definitions is thereby significantly increased.
The invention provides for the use of only two sets of geometric constraints for any marker configuration and any character, given that both may be represented by the same bio-mechanical model. The human bio-mechanical model itself implicitly defines most of the geometric constraints. The first set of geometric constraints is used to extract the human motion and the second is used to apply the human motion. The marker set is used by the extracting solver so as to calculate an appropriate marker configuration. The solver defines the relationships between body parts in terms of mathematical expressions; consequently, appropriate marker configurations are defined by the solver in terms of a specific parameterisation for the solver. The characterisation is used by the mapping solver to calculate a geometric configuration of the character. The registration data for the performance associates markers in the performance with body parts in the generic actor.
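One reading of this passage is that a single bio-mechanical model supplies the constraints shared by the extracting solver and the mapping solver; the following sketch of such a shared model uses assumed segment lengths and joint limits purely for illustration.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class BioMechanicalModel:
    """Shared human model: the same segment topology and joint limits constrain
    both the extracting solver and the mapping solver."""
    # parent body part -> (child body part, segment length in metres); assumed values
    segments: Dict[str, Tuple[str, float]] = field(default_factory=lambda: {
        "hips": ("spine", 0.25),
        "spine": ("head", 0.45),
        "shoulder_l": ("elbow_l", 0.30),
        "elbow_l": ("wrist_l", 0.27),
    })
    # joint -> (min, max) flexion angle in degrees; assumed values
    joint_limits: Dict[str, Tuple[float, float]] = field(default_factory=lambda: {
        "elbow_l": (0.0, 150.0),
    })

    def clamp_joint(self, joint: str, angle: float) -> float:
        """Enforce a joint-limit constraint on a candidate joint angle."""
        lo, hi = self.joint_limits.get(joint, (-180.0, 180.0))
        return min(max(angle, lo), hi)
```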


