Pose estimation based on critical point analysis

Image analysis – Pattern recognition – Feature extraction

Reexamination Certificate


Details

C382S154000, C382S190000, C382S201000, C382S224000, C382S291000

Type: Reexamination Certificate

Status: active

Patent number: 07317836

ABSTRACT:
Methods and systems for estimating a pose of a subject. The subject can be a human, an animal, a robot, or the like. A camera receives depth information associated with the subject, a pose estimation module determines a pose or action of the subject from the images, and an interaction module outputs a response to the perceived pose or action. The pose estimation module separates portions of the image containing the subject into classified and unclassified portions. The portions can be segmented using k-means clustering. The classified portions can be known objects, such as a head and a torso, that are tracked across the images. The unclassified portions are swept across the x and y axes to identify local minimums and local maximums. The critical points are derived from these local minimums and local maximums. Potential joint sections are identified by connecting various critical points, and the joint sections having sufficient probability of corresponding to an object on the subject are selected.
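The k-means segmentation step mentioned in the abstract can be sketched over raw depth values, treating each pixel's depth as a one-dimensional sample. This is an illustrative reading only; `kmeans_segment`, its quantile-based initialization, and its parameters are hypothetical, not the patented implementation.

```python
import numpy as np

def kmeans_segment(depth, k=3, iters=20):
    """Cluster a 2-D depth image into k depth bands with plain k-means.

    Each pixel is a 1-D sample (its depth). Hypothetical helper, not
    the patent's code.
    """
    samples = depth.reshape(-1, 1).astype(float)
    # Deterministic init: centroids at evenly spaced depth quantiles.
    centroids = np.quantile(samples, np.linspace(0.0, 1.0, k)).reshape(k, 1)
    for _ in range(iters):
        # Assign each pixel to its nearest centroid.
        dists = np.abs(samples - centroids.T)        # shape (n_pixels, k)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old value for empty clusters.
        for j in range(k):
            members = samples[labels == j]
            if members.size:
                centroids[j, 0] = members.mean()
    return labels.reshape(depth.shape)
```

With k=2 on a scene of a near subject against a far background, the two clusters would correspond roughly to subject and background pixels.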

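The axis sweep for local minimums and maximums can likewise be sketched: take a binary mask of the unclassified subject region, record the top boundary row per column, and flag columns where that boundary is a local extremum (candidate critical points such as fingertips or armpits). The helper `sweep_critical_points` and the single-axis simplification are assumptions for illustration, not the patent's method.

```python
import numpy as np

def sweep_critical_points(mask):
    """Sweep a binary subject mask along x and return (x, y, kind)
    tuples where the top boundary is a local extremum.

    Smaller row index means higher in the image, so a boundary that
    rises above both neighbors is tagged 'max'. Illustrative only.
    """
    h, w = mask.shape
    # Top-most foreground row per column; h (off-image) where empty.
    tops = np.where(mask.any(axis=0), mask.argmax(axis=0), h)
    points = []
    for x in range(1, w - 1):
        left, here, right = tops[x - 1], tops[x], tops[x + 1]
        if here == h:
            continue  # no subject pixels in this column
        if here < left and here < right:
            points.append((x, int(here), 'max'))  # local peak
        elif here > left and here > right:
            points.append((x, int(here), 'min'))  # local valley
    return points
```

A full pipeline would repeat the sweep along y, merge the extrema into critical points, and score connections between them as candidate joint sections.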
REFERENCES:
patent: 5426712 (1995-06-01), Nakajima
patent: 6243106 (2001-06-01), Rehg et al.
patent: 6301370 (2001-10-01), Steffens et al.
patent: 6628821 (2003-09-01), Covell et al.
patent: 6741756 (2004-05-01), Toyama et al.
patent: 2003/0169906 (2003-09-01), Gokturk et al.
patent: 2003/0235334 (2003-12-01), Okubo
patent: 2004/0120581 (2004-06-01), Ozer et al.
patent: 2004/0240706 (2004-12-01), Wallace et al.
patent: 2005/0265583 (2005-12-01), Covell et al.
Li et al., "Articulated Pose Identification with Sparse Point Features," IEEE Transactions on Systems, Man and Cybernetics—Part B: Cybernetics, vol. 34, no. 3, Jun. 2004, pp. 1412-1422.
Leung et al., "First Sight: A Human Body Outline Labeling System," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 17, no. 4, Apr. 1995, pp. 359-377.
Thirion, "The Extremal Mesh and the Understanding of 3D Surfaces," IEEE Workshop on Biomedical Image Analysis, Jun. 1994, pp. 3-12.
Bowden, R. et al., "Reconstructing 3D Pose and Motion from a Single Camera View," Brunel University, Uxbridge, Middlesex, UK, 9th British Machine Vision Conference, ICPR '98, Vienna, Austria, 10 pages.
Chu, C. et al., "Markerless Kinematic Model and Motion Capture from Volume Sequences," Department of Computer Science, University of Southern California, Los Angeles, CA, 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Proceedings, Jun. 2003, vol. 2, pp. II-475-II-482.
Drouin, S. et al., “Simultaneous Tracking and Estimation of a Skeletal Model for Monitoring Human Motion,” Department of Electrical and Computer Engineering, Laval University, Sainte-Foy, QC, Canada, Vision Interface 2003, pp. 1-8.
Poppe, R., “Real-Time Pose Estimation from Monocular Image Sequences Using Silhouettes,” Apr. 2004, Department of Electrical Engineering, Mathematics and Computer Science, University of Twente, the Netherlands, 8 pages.
International Search Report and Written Opinion, PCT/US06/09875, Aug. 20, 2007, 10 pages.

