Target orientation estimation using depth sensing

Classification: Image analysis – Applications – Target tracking or detecting
Additional classes: C382S159000, C382S291000, C348S169000
Type: Reexamination Certificate
Status: active
Patent number: 08031906
Filed: 2009-10-02
Issued: 2011-10-04
Examiner: Strege, John (Department: 2624)
ABSTRACT:
A system for estimating orientation of a target based on real-time video data uses depth data included in the video to determine the estimated orientation. The system includes a time-of-flight camera capable of depth sensing within a depth window. The camera outputs hybrid image data (color and depth). Segmentation is performed to determine the location of the target within the image. Tracking is used to follow the target location from frame to frame. During a training mode, a target-specific training image set is collected with a corresponding orientation associated with each frame. During an estimation mode, a classifier compares new images with the stored training set to determine an estimated orientation. A motion estimation approach uses an accumulated rotation/translation parameter calculation based on optical flow and depth constraints. The parameters are reset to a reference value each time the image corresponds to a dominant orientation.
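To make the two estimation strategies in the abstract concrete, the following is a minimal Python sketch, not taken from the patent. The OrientationEstimator class illustrates the training and estimation modes with a simple nearest-neighbour comparison of a new depth frame against a stored, target-specific training set; the AccumulatedRotation class illustrates accumulating per-frame rotation and resetting it to a reference value whenever the current frame corresponds to the dominant orientation. All class and parameter names, and the mean-squared-error distance, are illustrative assumptions; the patent's actual classifier and its optical-flow/depth-constraint parameter computation are not reproduced here.

import numpy as np

class OrientationEstimator:
    """Training mode stores (depth frame, orientation) pairs; estimation mode
    returns the orientation of the closest stored frame (nearest neighbour)."""

    def __init__(self):
        self.train_frames = []   # list of (H, W) depth arrays
        self.train_angles = []   # orientation label recorded with each frame

    def add_training_frame(self, depth_frame, orientation_deg):
        # Training mode: collect a target-specific image with its orientation.
        self.train_frames.append(np.asarray(depth_frame, dtype=float))
        self.train_angles.append(float(orientation_deg))

    def estimate(self, depth_frame):
        # Estimation mode: compare the new frame with every stored frame and
        # return the orientation of the best match (smallest mean squared error).
        query = np.asarray(depth_frame, dtype=float)
        errors = [np.mean((query - f) ** 2) for f in self.train_frames]
        return self.train_angles[int(np.argmin(errors))]

class AccumulatedRotation:
    """Accumulates per-frame rotation and resets to a reference value whenever
    the current frame corresponds to the dominant orientation, so drift from
    the incremental estimates cannot grow without bound."""

    def __init__(self, reference_deg=0.0):
        self.reference_deg = reference_deg
        self.angle_deg = reference_deg

    def update(self, delta_deg, is_dominant_orientation):
        self.angle_deg += delta_deg              # integrate the per-frame rotation
        if is_dominant_orientation:              # e.g. the frame looks frontal
            self.angle_deg = self.reference_deg  # reset to cancel accumulated drift
        return self.angle_deg

# Example usage with toy 4x4 "depth frames":
estimator = OrientationEstimator()
estimator.add_training_frame(np.zeros((4, 4)), 0.0)    # frontal view
estimator.add_training_frame(np.ones((4, 4)), 30.0)    # turned 30 degrees
print(estimator.estimate(np.full((4, 4), 0.9)))        # -> 30.0

tracker = AccumulatedRotation(reference_deg=0.0)
print(tracker.update(5.0, is_dominant_orientation=False))  # -> 5.0
print(tracker.update(3.0, is_dominant_orientation=True))   # -> 0.0 (reset)

In the patent, the per-frame delta would come from solving for rotation/translation parameters from optical flow under depth constraints; here delta_deg simply stands in for the output of that computation.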
REFERENCES:
patent: 6671391 (2003-12-01), Zhang et al.
patent: 6804416 (2004-10-01), Bachelder et al.
patent: 6959109 (2005-10-01), Moustafa
patent: 7313266 (2007-12-01), Ishiyama
patent: 2001/0043738 (2001-11-01), Sawhney et al.
patent: 2005/0265583 (2005-12-01), Covell et al.
patent: 11-047196 (1999-02-01), None
patent: 2000-222585 (2000-08-01), None
patent: 2001-143075 (2001-05-01), None
patent: 2001-250121 (2001-09-01), None
patent: 2003-058884 (2003-02-01), None
patent: 2003-141551 (2003-05-01), None
Japanese Final Office Action, Japanese Patent Application No. 2006-516618, Mar. 19, 2010, 4 pages.
Japanese Non-Final Office Action, Japanese Patent Application No. 2006-516618, Dec. 21, 2009, 7 pages.
European Patent Office, Supplementary European Search Report, Patent Application No. EP 04769493.0, Aug. 11, 2010, 5 pages.
Hattori, K. et al., “Estimating Pose of Human Face Based on Symmetry Plane Using Range and Intensity Images,” Proceedings of the IEEE Fourteenth International Conference on Pattern Recognition, 1998, pp. 1183-1187, vol. 2, Brisbane, Australia.
Liu, X. et al., “Real-time Pose Classification for Driver Monitoring,” Proceedings of the IEEE Fifth International Conference on Intelligent Transportation Systems, Sep. 3-6, 2002, pp. 174-178, Singapore.
McKenna, S.J. et al., “Real-Time Face Pose Estimation,” Real-Time Imaging, 1998, pp. 333-347, vol. 4, Article No. ri980127.
Nanda, H. et al., “Illumination Invariant Head Pose Estimation Using Single Camera,” Proceedings of the IEEE Intelligent Vehicles Symposium, 2003, pp. 434-437.
Niyogi, S. et al., “Example-Based Head Tracking,” Proceedings of the Second IEEE International Conference on Automatic Face and Gesture Recognition, 1996, pp. 374-378.
Inventors: Fujimura, Kikuo; Zhu, Youding
Attorneys: Duell, Mark E.; Fenwick & West LLP
Assignee: Honda Motor Co. Ltd.
Examiner: Strege, John