Diffusion-based interactive extrusion of 2D images into 3D...

Image analysis – Image transformation or preprocessing – Mapping 2-d image onto a 3-d surface

Reexamination Certificate


Details

U.S. Classifications: C382S154000, C382S162000, C382S256000, C345S419000, C345S420000

Type: Reexamination Certificate
Status: active
Patent number: 07630580

ABSTRACT:
Systems and methods are provided for performing diffusion-based image extrusion. According to one embodiment, a three-dimensional model is created by polygonizing an input image to produce an inflatable image. The input image may be either a 2D or a 3D image or icon. The set of pixels making up the input image is represented as a plurality of polygons, and an initial value is assigned to the z-coordinate of each pixel. After polygonizing the input image to create the inflatable image, the inflatable image is extruded by applying a biased, image-based diffusion process that generates appropriate z-coordinate values for a reference point associated with each polygon. In various embodiments, an end-user may interactively change one or more parameters of the inflatable image and/or the diffusion process.
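The extrusion step described above can be read as an iterative, biased diffusion: each interior height value is repeatedly replaced by the average of its neighbors plus a constant "pressure" bias, while pixels outside the icon are pinned to zero. The sketch below is a minimal illustration of that idea in Python/NumPy, not the patent's actual implementation: it uses a binary mask in place of the polygonized pixel representation, and the function and parameter names (`inflate`, `pressure`, `iterations`) are hypothetical.

```python
import numpy as np

def inflate(mask, pressure=0.05, iterations=200):
    """Biased diffusion sketch: each step replaces every in-mask height
    with the mean of its 4-neighbors plus a constant pressure bias,
    while out-of-mask pixels are held at z = 0 (Dirichlet boundary)."""
    z = np.zeros(mask.shape, dtype=float)
    for _ in range(iterations):
        # Average of the four axis-aligned neighbors (Jacobi-style update).
        avg = (np.roll(z, 1, axis=0) + np.roll(z, -1, axis=0) +
               np.roll(z, 1, axis=1) + np.roll(z, -1, axis=1)) / 4.0
        # Bias the diffusion inside the icon; clamp the outside to zero.
        z = np.where(mask, avg + pressure, 0.0)
    return z

# A square icon inflates into a dome-like height field: highest at the
# center, falling to zero at the silhouette.
mask = np.zeros((9, 9), dtype=bool)
mask[2:7, 2:7] = True
heights = inflate(mask)
```

Because the boundary is held at zero while interior values receive a constant bias, the iteration converges to a bounded, dome-shaped solution (a discrete Poisson problem), which is what gives the "inflated" look; the cited Repenning (2005) article discusses the interactive version of this process.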

REFERENCES:
patent: 6246806 (2001-06-01), Hara et al.
patent: 6342884 (2002-01-01), Kamen et al.
patent: 6456287 (2002-09-01), Kamen et al.
patent: 6525744 (2003-02-01), Poggio et al.
patent: 6549201 (2003-04-01), Igarashi et al.
patent: 6750873 (2004-06-01), Bernardini et al.
patent: 7006093 (2006-02-01), Fujiwara et al.
patent: 7035433 (2006-04-01), Mihara et al.
Kuijk et al., "Faster Phong Shading via Angular Interpolation", Computer Graphics Forum, vol. 8, 1989, pp. 315-324.
Repenning, “Inflatable Icons: Diffusion-Based Interactive Extrusion of 2D Images into 3D Models”, Journal of Graphics Tools, vol. 10, No. 1, 2005.
Markosian et al., “Skin: A Constructive Approach to Modeling Free-form Shapes”, SIGGRAPH 1999.
Bloomenthal et al., “Interactive Techniques for Implicit Modeling”, Symposium for Interactive 3D Graphics, 1990, pp. 109-116.
Debunne, G. et al., “Dynamic Real-Time Deformations Using Space & Time Adaptive Sampling.”
Desbrun, M. et al., “Implicit Fairing of Irregular Meshes Using Diffusion and Curvature Flow.”
Gross, Mark D., "The Fat Pencil, the Cocktail Napkin, and the Slide Library." College of Architecture and Planning. pp. 1-24.
Igarashi, T. et al., “Teddy: A Sketching Interface for 3D Freeform Design.” University of Tokyo, Tokyo Institute of Technology.
Klein, R. et al., "Mesh Reduction with Error Control." Wilhelm-Schickard-Institut.
Lamb, D. et al., "Interpreting a 3D Object From a Rough 2D Line Drawing." Department of Computer Science. pp. 59-66.
Repenning, A. et al., "AgentSheets: End-User Programmable Simulations." Journal of Artificial Societies and Social Simulation, vol. 3, no. 3, 2000.
Schweikardt, E., "Digital Clay: Deriving Digital Models from Freehand Sketches." Design Machine Group, University of Washington. 1998.
Yi-Luen Do, E., "Drawing Marks, Acts, and Reacts: Toward a Computational Sketching Interface for Architectural Design." Design Machine Group, Department of Architecture, University of Washington. 2002; pp. 149-171.


