Method capable of automatically transforming 2D image into...

Computer graphics processing and selective visual display system – Computer graphics processing – Three-dimension

Reexamination Certificate


Details

Subclasses: C345S419000, C345S420000, C345S427000, C382S133000, C382S285000

Status: active

Patent number: 07876321

ABSTRACT:
The invention discloses a method for transforming a 2D image into a 3D image. The method comprises the steps of: (a) selecting an object in the 2D image; (b) setting a base line in the 2D image; (c) based on the base line, judging whether the object is located in the foreground or the background of the 2D image; (d) assigning a displacement to the object; (e) moving the object by the displacement to generate a plurality of continuous images; and (f) sequentially outputting each of the continuous images to generate the 3D image. Accordingly, after the user selects an object in the 2D image, the method automatically transforms the 2D image into the 3D image.
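
The steps (a) through (f) summarized above can be illustrated with a short sketch. The Python/NumPy snippet below is a hypothetical illustration, not the patented implementation: the function name transform_2d_to_3d, the rule that an object whose centroid lies below the base line counts as foreground, and the specific displacement values are all assumptions made for demonstration.

    import numpy as np

    def transform_2d_to_3d(image, object_mask, base_line_y, num_frames=8, max_shift=12):
        """Illustrative sketch of steps (a)-(f): shift a selected object
        relative to a base line to build a sequence of frames that can be
        played back to convey depth."""
        object_mask = object_mask.astype(bool)

        # (c) Judge foreground vs. background from the base line. Here we
        # assume an object whose centroid lies below the base line is in
        # the foreground; the abstract does not specify this rule.
        ys, _ = np.nonzero(object_mask)
        is_foreground = ys.mean() > base_line_y

        # (d) Assign a displacement: assume foreground objects move farther
        # than background objects to mimic parallax.
        displacement = max_shift if is_foreground else max_shift // 3

        # (e) Move the object by the displacement to generate a plurality
        # of continuous images.
        frames = []
        for i in range(num_frames):
            dx = int(round(displacement * i / (num_frames - 1)))
            frame = image.copy()
            shifted_mask = np.roll(object_mask, dx, axis=1)
            shifted_pixels = np.roll(image, dx, axis=1)
            frame[shifted_mask] = shifted_pixels[shifted_mask]
            frames.append(frame)

        # (f) Sequentially output each of the continuous images.
        return frames

A caller would supply the user-selected object mask from step (a) and a base-line row from step (b), then write the returned frames out one by one as the output sequence.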

REFERENCES:
patent: 6434277 (2002-08-01), Yamada et al.
patent: 6515659 (2003-02-01), Kaye et al.
patent: 6590573 (2003-07-01), Geshwind
patent: 7102633 (2006-09-01), Kaye et al.
patent: 2008/0246759 (2008-10-01), Summers


Profile ID: LFUS-PAI-O-2639442
