Sign based human-machine interaction

Image analysis – Applications – Target tracking or detecting

Reexamination Certificate


Details

Classification codes: C382S107000, C382S181000, C382S259000

Type: Reexamination Certificate

Status: active

Application number: 11129164

ABSTRACT:
Communication is an important issue in man-to-robot interaction. Signs can be used to interact with machines by providing user instructions or commands. Embodiments of the present invention include human detection, human body parts detection, hand shape analysis, trajectory analysis, orientation determination, gesture matching, and the like. Many types of shapes and gestures are recognized in a non-intrusive manner based on computer vision. A number of applications become feasible with this sign-understanding technology, including remote control of home devices, mouse-less (and touch-less) operation of computer consoles, gaming, and man-robot communication for giving instructions, among others. Active sensing hardware is used to capture a stream of depth images at a video rate, which is subsequently analyzed for information extraction.
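To make the pipeline described in the abstract concrete, below is a minimal sketch, in Python with NumPy, of one way such a depth-based system could be organized: each depth frame is thresholded to segment the hand as the nearest region, the hand centroid is tracked across frames to form a trajectory, and the trajectory is matched against stored gesture templates. The depth thresholds, function names, and nearest-template matcher are illustrative assumptions for this sketch, not the method actually claimed in the patent.

import numpy as np

def segment_hand(depth_frame, near_mm=400, far_mm=900):
    # Assumed working range for the hand; pixels outside it are treated as background.
    return (depth_frame > near_mm) & (depth_frame < far_mm)

def hand_centroid(mask):
    # (row, col) centroid of the segmented region, or None if no hand pixels were found.
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return np.array([ys.mean(), xs.mean()])

def trajectory(depth_frames):
    # Per-frame centroids collected into an (N, 2) trajectory, skipping empty frames.
    points = [hand_centroid(segment_hand(f)) for f in depth_frames]
    return np.array([p for p in points if p is not None])

def _normalize(traj, n_points=16):
    # Resample to a fixed length, then remove translation and scale.
    idx = np.linspace(0, len(traj) - 1, n_points).astype(int)
    t = traj[idx] - traj[idx].mean(axis=0)
    norm = np.linalg.norm(t)
    return t / norm if norm > 0 else t

def match_gesture(traj, templates):
    # Nearest-template matching: a simple stand-in for the gesture matching stage.
    query = _normalize(traj)
    scores = {name: np.linalg.norm(query - _normalize(np.asarray(tmpl)))
              for name, tmpl in templates.items()}
    return min(scores, key=scores.get)

# Toy usage: a synthetic left-to-right swipe across 20 depth frames (values in millimeters).
frames = []
for i in range(20):
    frame = np.full((120, 160), 2000.0)          # background roughly 2 m away
    frame[50:70, 10 + 6 * i:30 + 6 * i] = 600.0  # hand patch roughly 0.6 m away
    frames.append(frame)

templates = {
    "swipe_right": np.column_stack([np.full(20, 60.0), np.linspace(20, 130, 20)]),
    "swipe_left": np.column_stack([np.full(20, 60.0), np.linspace(130, 20, 20)]),
}
print(match_gesture(trajectory(frames), templates))  # expected output: swipe_right

A real system of the kind the abstract describes would replace the synthetic frames with a live depth-sensor stream and the nearest-template matcher with a more robust technique such as dynamic time warping or hidden Markov models, as used in several of the references cited below.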

REFERENCES:
patent: 5454043 (1995-09-01), Freeman
patent: 5581276 (1996-12-01), Cipolla et al.
patent: 5594469 (1997-01-01), Freeman et al.
patent: 6002808 (1999-12-01), Freeman
patent: 6128003 (2000-10-01), Smith et al.
patent: 6215890 (2001-04-01), Matsuo et al.
patent: 6720949 (2004-04-01), Pryor et al.
patent: 6788809 (2004-09-01), Grzeszczuk et al.
patent: 6819782 (2004-11-01), Imagawa et al.
patent: 2002/0041327 (2002-04-01), Hildreth et al.
patent: 2002/0181773 (2002-12-01), Higaki et al.
patent: 2003/0113018 (2003-06-01), Nefian et al.
patent: 2003/0156756 (2003-08-01), Gokturk et al.
patent: 2004/0151366 (2004-08-01), Nefian et al.
patent: 2004/0189720 (2004-09-01), Wilson et al.
patent: 2004/0193413 (2004-09-01), Wilson et al.
patent: 2006/0033713 (2006-02-01), Pryor
patent: WO 00/30023 (2000-05-01), None
patent: WO 2004/097612 (2004-11-01), None
Gvili et al.; “Depth Keying”, SPIE Elec. Imaging, 2003.
Ishibuchi et al.; “Real Time Vision-Based Hand Gesture Estimation for a Human-Computer Interface”, Systems and Computers in Japan, vol. 28, No. 7, 1997.
Redert et al.; “ATTEST: Advanced Three-dimensional Television System technologies”, 3DPVT, 2002.
Zhu et al.; “3D Head Pose Estimation with Optical Flow and Depth Constraints”, Proceedings of the Fourth International Conference on 3D Digital Imaging and Modeling, 2003.
Athitsos, V. et al., “An Appearance-Based Framework for 3D Hand Shape Classification and Camera Viewpoint Estimation,” Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition (FGR'02), IEEE, 2002, 8 pages.
Bretzner, L. et al., “Hand Gesture Recognition using Multi-Scale Colour Features, Hierarchical Models and Particle Filtering,” Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition (FGR'02), IEEE, 2002, 8 pages.
Gavrila, D.M. et al., “Real-Time Object Detection for ‘Smart’ Vehicles,” The Proceedings of the Seventh IEEE International Conference on Computer Vision, Sep. 20-27, 1999, 10 pages.
Iddan, G.J. et al., “3D Imaging in the Studio (and Elsewhere . . . )” Proceedings of the SPIE, Jan. 24-25, 2001, pp. 48-55, vol. 4298.
Jojic, N. et al., “Detection and Estimation of Pointing Gestures in Dense Disparity Maps,” Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Mar. 28-30, 2000, 15 pages.
Malassiotis, S. et al., “A Gesture Recognition System using 3D Data,” Proceedings of the First International Symposium on 3D Data Processing Visualization and Transmission (3DPVT'02), IEEE, 2002, pp. 1-4.
Oka, K. et al., “Real-Time Tracking of Multiple Fingertips and Gesture Recognition for Augmented Desk Interface System,” Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition (FGR'02), IEEE, 2002, 12 pages.
Pavlovic, V.I. et al., “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Jul. 1997, vol. 19, No. 7.
Polat, E. et al., “Robust Tracking of Human Body Parts for Collaborative Human Computer Interaction,” Computer Vision and Image Understanding, 2003, pp. 44-69, vol. 89.
Sakagami, Y. et al., “The Intelligent ASIMO: System Overview and Integration,” Proceedings of the 2002 IEEE/RSJ, Intl. Conference on Intelligent Robots and Systems, IEEE, 2002, pp. 2478-2483.
Starner, T. et al., “Real-Time American Sign Language Recognition Using Desk and Wearable Computer Based Video,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Dec. 1998, pp. 1371-1375, vol. 20, No. 12.
Vogler, C. et al., “ASL Recognition Based on a Coupling Between HMMs and 3D Motion Analysis,” Sixth International Conference on Computer Vision, Jan. 4-7, 1998, pp. 363-369.
Wilson, A.D. et al., “Parametric Hidden Markov Models for Gesture Recognition,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Sep. 1999, pp. 884-900, vol. 21, No. 9.
Zhu, X. et al., “Segmenting Hands of Arbitrary Color,” Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Mar. 28-30, 2000, pp. 446-453.
Zhu, Y. et al., “A Real-Time Approach to the Spotting, Representation, and Recognition of Hand Gestures for Human-Computer Interaction,” Computer Vision and Image Understanding, 2002, pp. 189-208, vol. 85.
Lee et al., “Online, Interactive Learning of Gestures for Human/Robot Interfaces,” Proceedings of the 1996 IEEE International Conference on Robotics and Automation, Apr. 1996, pp. 2982-2987.


