System and method for parameter-based image synthesis using hierarchical networks

Details

Classifications: 395/133; 395/173; G06T 11/00

Type: Patent

Status: Active

Patent number: 056490860

ABSTRACT:
Synthesis of novel images from example images is achieved by interpolating among example images based on user selection of parameter values in a hierarchy of networks referred to as parent networks and child networks. Child networks describe distinct physical characteristics of the image, such as thickness of eyebrows. Parent networks describe more general or abstract characteristics related to the image, such as emotional states expressed by the image.
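
The abstract describes a two-level hierarchy in which child networks interpolate among example images along individual physical characteristics, while a parent network maps a more abstract characteristic (such as an emotional state) onto parameter values for those children. The Python sketch below is a minimal, hypothetical illustration of that structure; all class names, parameter names, and the simple linear blending are assumptions made for illustration and are not taken from the patent itself.

```python
import numpy as np

# Minimal sketch of the parent/child hierarchy described in the abstract.
# All names (ChildNetwork, ParentNetwork, synthesize, the example data and
# the mapping functions) are illustrative assumptions, not the patent's API.

class ChildNetwork:
    """Interpolates among example images along one physical characteristic,
    e.g. eyebrow thickness, as the parameter t runs from 0 to 1."""

    def __init__(self, examples):
        self.examples = examples  # example images ordered by parameter value

    def synthesize(self, t):
        # Linear blend of the two nearest example images (a stand-in for a
        # learned interpolation network).
        t = float(np.clip(t, 0.0, 1.0))
        pos = t * (len(self.examples) - 1)
        lo, hi = int(np.floor(pos)), int(np.ceil(pos))
        w = pos - lo
        return (1.0 - w) * self.examples[lo] + w * self.examples[hi]


class ParentNetwork:
    """Maps an abstract characteristic (e.g. an emotional state) onto
    parameter values for its child networks and blends their outputs."""

    def __init__(self, children, mapping):
        self.children = children  # {name: ChildNetwork}
        self.mapping = mapping    # {name: callable(abstract_value) -> child t}

    def synthesize(self, abstract_value):
        parts = [child.synthesize(self.mapping[name](abstract_value))
                 for name, child in self.children.items()]
        return np.mean(parts, axis=0)  # simple average of the child images


# Usage: tiny arrays stand in for real example images.
eyebrow = ChildNetwork([np.zeros((4, 4)), np.ones((4, 4))])      # thin -> thick
mouth = ChildNetwork([np.zeros((4, 4)), np.full((4, 4), 2.0)])   # closed -> open
face = ParentNetwork(
    {"eyebrow": eyebrow, "mouth": mouth},
    {"eyebrow": lambda s: s, "mouth": lambda s: 1.0 - s},  # hypothetical mapping
)
image = face.synthesize(0.7)  # 0.7 on an abstract "surprise" scale
```

In the patent's framing the interpolation is carried out by networks trained on example images (cf. the cited Poggio et al., "Networks for Approximation and Learning"); plain linear blending is used here only to keep the sketch self-contained.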

REFERENCES:
U.S. Patent 5,325,475 (Jun. 1, 1994), Poggio et al.
Librande, Steve, "Example-Based Character Drawing," Massachusetts Institute of Technology, 1992.
Poggio, Tomaso, et al., "Networks for Approximation and Learning," Proceedings of the IEEE, vol. 78, no. 9, Sep. 1990.
Cassell, Justine, et al., "Animated Conversation: Rule-based Generation of Facial Expression, Gesture & Spoken Intonation for Multiple Conversational Agents," Computer Graphics Proceedings, Annual Conference Series, 1994, pp. 413-420.
Waters, Keith, et al., "DECface: An Automated Lip-Synchronization Algorithm for Synthetic Faces," Digital Equipment Corporation, Sep. 23, 1993.
