Manipulation of motion data in an animation editing system

Computer graphics processing and selective visual display system – Computer graphics processing – Animation

Reexamination Certificate


Status: active

Patent number: 06806879

ABSTRACT:

BACKGROUND
A common problem in animation is creating animated characters that move along a specified path. There are several ways to specify this kind of motion. One way is to use motion capture data. Another way is to specify a series of key frames to describe the motion using a combination of inverse and forward kinematics and character rigging techniques.
In both motion capture and dense key frame animation, it can be difficult to visually edit motion that shifts between being locked at a single point in space and being in motion. The most common example of this scenario is the way a foot locks to a position on the ground, accelerates to step forward and then decelerates to a new locked position. This problem is particularly acute when the animation is defined using motion capture data. For example, an animator may have motion capture data representing a person walking. Repositioning where the character steps currently requires laborious manual editing or offsetting of motion curves.
FIG. 1 illustrates the typical problem. The motion capture data represents a character walking, shown over time as line 100. The animator wants to reposition the character's feet over time, for example to line 102, by manipulating the motion capture data. Editing the motion capture data to reposition the character in this way typically requires a great deal of work.
Current solutions to this problem include describing the original motion capture rotation information in reference to a path and blending in foot stepping positions only at points where the foot is in contact with the ground. The foot contact positions are introduced as a secondary layer. Such a solution is described in "Motion Editing with Spacetime Constraints" by Michael Gleicher, in Proceedings of the 1997 Symposium on Interactive 3D Graphics, and related work.
SUMMARY
The problem of editing motion data can be solved by providing a way to specify control points (herein called “handles”) along the path of the motion data and to describe the motion data as a combination of layers of information in relationship to these handles.
For example, a first layer may describe, for each point in the motion data, that point's position along the path between the handles. A path between two handles may be defined; each point in the motion data is closest to some point along that path, which may be called a reference point. The reference point lies at some distance from each of the two handles, and these distances may be expressed as a percentage of the length of the path. A second layer may describe the offset of each point in the motion data from its reference point on the path between the two handles.
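As a sketch of how these two layers might be computed, the following assumes 2D motion points and a straight-line path between two handles; the function name, the use of NumPy, and the clamping choice are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def decompose(points, h0, h1):
    """Split motion points into two layers relative to the straight
    path from handle h0 to handle h1 (an illustrative sketch):
      layer 1: fraction of the path length at each reference point
      layer 2: offset vector from the reference point to the motion point
    """
    d = h1 - h0
    length_sq = np.dot(d, d)
    fractions, offsets = [], []
    for p in points:
        # Project p onto the segment h0->h1 and clamp to the segment,
        # giving the reference point closest to p on the path.
        t = float(np.clip(np.dot(p - h0, d) / length_sq, 0.0, 1.0))
        ref = h0 + t * d              # reference point on the path
        fractions.append(t)           # distance layer, as a fraction of length
        offsets.append(p - ref)       # offset layer
    return np.array(fractions), np.array(offsets)
```

Storing the distance layer as fractions rather than absolute distances keeps the layers valid when the handles are later moved closer together or farther apart.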
In one embodiment, the handles may be identified by the animator either by spatial position, timing or a combination of the two. In another embodiment, the handles may be identified automatically, for example, at predetermined intervals in the motion data or by identifying points where motion stops, changes course, or dips below a minimum velocity.
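One way to realize the automatic variant is a simple velocity threshold: mark a handle wherever the motion's speed drops below a minimum. The function name and threshold value below are assumptions for illustration; the patent does not specify a particular detector:

```python
import numpy as np

def find_handles(positions, dt, min_speed=0.05):
    """Return indices of candidate handles: the first and last frames,
    plus every frame where the speed dips below min_speed after having
    been above it (i.e. the motion enters a locked span)."""
    vel = np.diff(positions, axis=0) / dt       # finite-difference velocity
    speed = np.linalg.norm(vel, axis=1)
    slow = speed < min_speed
    handles = [0]
    for i in range(1, len(slow)):
        if slow[i] and not slow[i - 1]:         # transition into a slow span
            handles.append(i)
    handles.append(len(positions) - 1)
    return handles
```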
To edit an animation using the motion data, an animator can simply manipulate the handles in three-dimensional space and/or manipulate the offsets.
Thus, by describing motion as a series of connected handles, together with layers describing distances and offsets along the path between the handles, the motion data can be easily modified through a graphical user interface that allows an animator to reposition the handles and modify the offsets.
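A round-trip sketch of the edit itself: given the two layers, rebuilding the points against repositioned handles drags the whole motion along while preserving its shape. The layer values and handle coordinates below are made up for illustration:

```python
import numpy as np

# Hypothetical layers for three motion points: position along the
# path (as a fraction of its length) and offset from the path.
fractions = np.array([0.0, 0.5, 1.0])
offsets = np.array([[0.0, 0.0], [0.0, 0.2], [0.0, 0.0]])

def apply_handles(fractions, offsets, h0, h1):
    """Rebuild motion points against (possibly repositioned) handles."""
    refs = h0[None, :] + fractions[:, None] * (h1 - h0)[None, :]
    return refs + offsets

# The animator drags the second handle from (2, 0) up to (2, 1);
# the rebuilt motion follows, keeping its offsets from the new path.
original = apply_handles(fractions, offsets, np.array([0.0, 0.0]), np.array([2.0, 0.0]))
edited = apply_handles(fractions, offsets, np.array([0.0, 0.0]), np.array([2.0, 1.0]))
```

Note that the endpoints track the handles exactly, while the interior point keeps its 0.2 offset from the (now tilted) path.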


REFERENCES:
patent: 5649133 (1997-07-01), Arquie
patent: 5758093 (1998-05-01), Boezeman et al.
patent: 6144972 (2000-11-01), Abe et al.
patent: 6388668 (2002-05-01), Elliott
Bindiganavale, Ramamani N., "Building Parameterized Action Representations From Observation", PhD Dissertation in Computer & Information Sciences, University of Pennsylvania, 2000, pp. iv-110.
Bindiganavale, Rama, et al., “Motion Abstraction and Mapping with Spatial Constraints”, Proc. of Int'l Workshop on Modeling and Motion Capture Techniques for Virtual Environments, Captech '98, Nov. 1998, pp. 70-82.
Choi, Kwangjin, et al., “Processing Motion Capture to Achieve Positional Accuracy”, Graphical Models and Image Processing, vol. 61, No. 5, 1999, pp. 260-273.
Gleicher, Michael, “Animation From Observation: Motion Capture And Motion Editing”, Computer Graphics, vol. 33, No. 4, 1999, pp. 51-54.
Gleicher, Michael, “Comparative Analysis of Constraint-Based Motion Editing Methods”, Dept. of Computer Science, University of Wisconsin, Jan. 2001, pp. 1-38.
Gleicher, Michael, “Motion Editing with Spacetime Constraints”, Proc. of the 1997 Symposium on Interactive 3D Graphics, 1997, pp. 1-10.
Gleicher, Michael, “Motion Path Editing”, Computer Science Dept., University of Wisconsin, ACM Symposium on Interactive 3D Graphics, 2001, pp. 195-202.
Gleicher, Michael, et al., "Constraint-Based Motion Adaptation", Apple TR 96-153, Jun. 14, 1996, pp. 1-30.
Kovar, Lucas et al., “Footskate Cleanup for Motion Capture Editing”, ACM SIGGRAPH Jul. 2002, pp. 97-104.
Lee, Jehee, et al., “A Hierarchical Approach to Interactive Motion Editing for Human-Like Figures”, Proc. SIGGRAPH 1999, pp. 39-48.
