Pose estimation method and apparatus

Image analysis – Applications – 3-d or stereo imaging analysis


Details

U.S. Class: C382S289000
Type: Reexamination Certificate
Status: active
Application Number: 10209860

ABSTRACT:
Three-dimensional model data indicating the three-dimensional shape of an object and the reflectivity or color at every point on the object is formulated and saved in a memory. For each of multiple pose candidates, an image space is created that represents the brightness values of a set of two-dimensional images of the object placed in the same position and orientation as that pose candidate. The brightness values are those that would be obtained if the object were illuminated under varying lighting conditions. For each pose candidate, an image candidate is detected within the image space using the 3D model data, and the distance from the image candidate to an input image is determined. The pose candidate corresponding to the image candidate whose distance is smallest is then selected. The image space is preferably created from each of a set of pose variants of each pose candidate.
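The selection step described above is commonly realized with a linear illumination subspace: for a roughly Lambertian surface, the images of the object at a fixed pose under all lighting conditions span a low-dimensional space, so the image candidate is the projection of the input image onto the span of a few rendered basis images. The following is a minimal Python/NumPy sketch under that assumption; render_basis is a hypothetical renderer (not specified by the patent) that produces basis images of the stored shape-and-reflectance model at a given pose.

import numpy as np

def image_candidate_distance(basis_images, input_image):
    """Distance from the input image to the image space of one pose.

    basis_images : (k, h, w) array of the model at this pose under k
                   lighting conditions (output of a hypothetical renderer)
    input_image  : (h, w) array, the observed query image
    """
    k = basis_images.shape[0]
    B = basis_images.reshape(k, -1).T.astype(float)  # (h*w, k) basis matrix
    y = input_image.ravel().astype(float)            # flattened query image
    # Least-squares lighting coefficients: brightness is approximately
    # linear in the illumination, so the image space of a fixed pose is
    # spanned by the basis images.
    coeffs, *_ = np.linalg.lstsq(B, y, rcond=None)
    candidate = B @ coeffs                           # closest image in the span
    return np.linalg.norm(y - candidate)

def estimate_pose(pose_candidates, render_basis, input_image):
    """Return the pose candidate whose image space lies closest to the input.

    render_basis(pose) -> (k, h, w) array is an assumed renderer for the
    stored 3D shape/reflectance model; it is not defined by the patent.
    """
    distances = [image_candidate_distance(render_basis(p), input_image)
                 for p in pose_candidates]
    return pose_candidates[int(np.argmin(distances))]

# Usage sketch (candidate_poses, render_basis, query_image are assumptions):
# best_pose = estimate_pose(candidate_poses, render_basis, query_image)

Per the final sentence of the abstract, pose_candidates may also be expanded with small perturbations (pose variants) of each candidate before calling estimate_pose, trading extra rendering for a finer search.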

REFERENCES:
patent: 5208763 (1993-05-01), Hong et al.
patent: 5710876 (1998-01-01), Peercy et al.
patent: 6002782 (1999-12-01), Dionysian
patent: 6526156 (2003-02-01), Black et al.
patent: 6580821 (2003-06-01), Roy
patent: 6888960 (2005-05-01), Penev et al.
patent: 2001/0031073 (2001-10-01), Tajima
patent: 2001/0033685 (2001-10-01), Ishiyama
patent: 2001/0043738 (2001-11-01), Sawhney et al.
patent: 2002/0097906 (2002-07-01), Ishiyama
patent: 0 141 706 (1985-05-01), None
patent: 1 139 269 (2001-10-01), None
patent: 2 315 124 (1998-01-01), None
patent: A 11-51611 (1999-02-01), None
patent: A 2000-339468 (2000-12-01), None
patent: A 2001-283229 (2002-10-01), None
Kayanuma et al., "A New Method to Detect Object and Estimate the Position and Orientation from an Image Using a 3D Model Having Feature Points," IEEE, 1999.
Suen et al., "The Analysis and Recognition of Real-World Textures in Three Dimensions," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 5, May 2000.
Nomura et al., "3D Object Pose Estimation Based on Iterative Image Matching: Shading and Edge Data Fusion," Proceedings of ICPR '96, IEEE, 1996.
Tsukamoto et al., "Pose Estimation of Human Face Using Synthesized Model Images," IEEE, 1994.
Haralick et al., "Pose Estimation from Corresponding Point Data," IEEE Transactions on Systems, Man, and Cybernetics, vol. 19, No. 6, Dec. 1989.
Wunsch et al., "Real-Time Pose Estimation of 3D Objects from Camera Images Using Neural Networks," Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Apr. 1997.
Edwards, "An Active Appearance Based Approach to the Pose Estimation of Complex Objects," Proc. IROS '96, IEEE, 1996.
Ishiyama et al., "A Range Finder for Human Face Measurement," Technical Report of IEICE, Jun. 1999.
Radu Horaud et al., "An Analytic Solution for the Perspective 4-Point Problem," Computer Vision, Graphics and Image Processing, vol. 47, 1989, pp. 33-44.
Long Quan et al., "Linear N ≧ 4-Point Pose Determination," Proceedings of the IEEE International Conference on Computer Vision, vol. 6, 1998, pp. 778-783.
Covell et al., "Articulated-Pose Estimation Using Brightness- and Depth-Constancy Constraints," Computer Vision and Pattern Recognition, 2000, Proceedings, IEEE Conference on, vol. 2, Jun. 13-15, 2000, pp. 438-445.
