Architecture for controlling a computer using hand gestures

Data processing: presentation processing of document, operator interface processing – Operator interface – On-screen workspace or object

Reexamination Certificate

Details

Class: C715S863000 (USPC 715/863)
Type: Reexamination Certificate
Status: active
Patent number: 07665041

ABSTRACT:
Architecture for implementing a perceptual user interface. The architecture comprises alternative modalities for controlling computer application programs and manipulating on-screen objects through hand gestures or a combination of hand gestures and verbal commands. The perceptual user interface system includes a tracking component that detects object characteristics of at least one of a plurality of objects within a scene, and tracks the respective object. Detection of object characteristics is based at least in part upon image comparison of a plurality of images relative to a coarse mapping of the images. A seeding component iteratively seeds the tracking component with object hypotheses based upon the presence of the object characteristics and the image comparison. A filtering component selectively removes the tracked object from the object hypotheses and/or at least one object hypothesis from the set of object hypotheses based upon predetermined removal criteria.
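The component interplay the abstract describes (a seeding component proposing object hypotheses from detected characteristics, a tracking component updating them across frames, and a filtering component pruning them by removal criteria) can be sketched in Python. This is a minimal illustration only; the class names, confidence values, thresholds, and the motion test are assumptions for the sketch, not details from the patent.

```python
from dataclasses import dataclass

@dataclass
class ObjectHypothesis:
    position: tuple    # coarse image location of the candidate object
    confidence: float  # evidence accumulated across frames

class PerceptualPipeline:
    """Toy seed -> track -> filter loop, per the abstract's components."""

    def __init__(self, removal_threshold=0.2):
        self.hypotheses = []
        self.removal_threshold = removal_threshold

    def seed(self, detections):
        # Seeding component: propose a hypothesis wherever object
        # characteristics were found by the coarse image comparison.
        for pos in detections:
            self.hypotheses.append(ObjectHypothesis(position=pos, confidence=0.5))

    def track(self, moving_regions):
        # Tracking component: reward hypotheses supported by motion
        # between frames, penalize the rest.
        for h in self.hypotheses:
            h.confidence += 0.5 if h.position in moving_regions else -0.5

    def prune(self):
        # Filtering component: drop hypotheses that meet the
        # predetermined removal criteria (low confidence, here).
        self.hypotheses = [h for h in self.hypotheses
                           if h.confidence >= self.removal_threshold]

pipe = PerceptualPipeline()
pipe.seed([(10, 20), (30, 40)])        # two candidate objects
pipe.track(moving_regions={(10, 20)})  # only the first region moved
pipe.prune()
print(len(pipe.hypotheses))            # -> 1
```

The iterative character of the claimed system would correspond to calling seed, track, and prune once per captured frame, so that stale hypotheses die out while newly detected objects keep entering the set.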

REFERENCES:
patent: 5528263 (1996-06-01), Platzker et al.
patent: 5616078 (1997-04-01), Oh
patent: 5801704 (1998-09-01), Oohara et al.
patent: 5828779 (1998-10-01), Maggioni
patent: 6002808 (1999-12-01), Freeman
patent: 6111580 (2000-08-01), Kazama et al.
patent: 6215890 (2001-04-01), Matsuo et al.
patent: 6222465 (2001-04-01), Kumar et al.
patent: 6226388 (2001-05-01), Qian et al.
patent: 6377296 (2002-04-01), Zlatsin et al.
patent: 6421453 (2002-07-01), Kanevsky et al.
patent: 6476834 (2002-11-01), Doval et al.
patent: 6591236 (2003-07-01), Lewis et al.
patent: 6594616 (2003-07-01), Zhang et al.
patent: 6750848 (2004-06-01), Pryor
patent: 6868383 (2005-03-01), Bangalore et al.
patent: 7007236 (2006-02-01), Dempski et al.
patent: 7036094 (2006-04-01), Cohen et al.
patent: 7227526 (2007-06-01), Hildreth et al.
patent: 2002/0041327 (2002-04-01), Hildreth et al.
patent: 2004/0056907 (2004-03-01), Sharma et al.
patent: 2004/0113933 (2004-06-01), Guler
Sharma et al., “Method of Visual and Acoustic Signal Co-Analysis for Co-Verbal Gesture Recognition,” Sep. 19, 2002, USPTO U.S. Appl. No. 60/413,998.
Azoz et al., “Reliable tracking of human arm dynamics by multiple cue integration and constraint fusion”, IEEE Conference on Computer Vision and Pattern Recognition, 1998.
Rigoll et al., “High Performance Real-Time Gesture Recognition Using Hidden Markov Models,” Gesture and Sign Language in Human-Computer Interaction, vol. LNAI 1371, Frohlich, ed., pp. 69-80, 1997.
Wilson et al., “Hidden Markov Models for Modeling and Recognizing Gesture Under Variation,” Hidden Markov Models:Applications in Computer Vision., T. Caelli, ed., World Scientific, pp. 123-160, 2001.
Guler, Sadiye Zeyno,“Split and Merge Behavior Analysis and Understanding Using Hidden Markov Models”, Oct. 8, 2002 , all pages.
Sharon Oviatt. Ten Myths of Multimodal Interaction. Communications of the ACM, vol. 42, No. 11, Nov. 1999, 8 pages.
Thomas Baudel and Michel Beaudouin-Lafon. Charade: Remote Control of Objects using Free-Hand Gestures. Communications of the ACM, vol. 36, No. 7, Jul. 1993, 10 pages.
Claudette Cedras and Mubarak Shah. Motion-based Recognition: A Survey. IEEE Proceedings, Image and Vision Computing, vol. 13, No. 2, pp. 129-155, Mar. 1995.
Michael Nielsen, Moritz Storring, Thomas B. Moeslund, and Erik Granum. A Procedure For Developing Intuitive And Ergonomic Gesture Interfaces For Man-Machine Interaction. Technical Report CVMT 03-01, ISSN 1601-3646, CVMT, Aalborg University, Mar. 2003, 12 pages.
Thomas B. Moeslund and Erik Granum. A Survey of Computer Vision-Based Human Motion Capture. Computer Vision and Image Understanding: CVIU, vol. 81, No. 3, pp. 231-268, 2001.
R. Sharma, M. Yeasin, N. Krahnstoever, I. Rauschert, G. Cai, I. Brewer, A. MacEachren, and K. Sengupta. Speech-Gesture Driven Multimodal Interfaces for Crisis Management. Proceedings of IEEE special issue on Multimodal Human-Computer Interface, 48 pages.
Will Fitzgerald and R. James Firby. Multimodal Event Parsing for Intelligent User Interfaces. IUI Conference, Jan. 2003. 8 pages.
Andrew Wilson, et al., GWindows: Towards Robust Perception-Based UI, Microsoft Research, 2003, pp. 1-8.
Zhengyou Zhang, Flexible Camera calibration by Viewing a Plane from Unknown Orientations, Microsoft Research, 1999, 8 pages.
Ikushi Yoda, et al., Utilization of Stereo Disparity and Optical Flow Information for Human Interaction, Proceedings of the Sixth International Conference on Computer Vision, 1998, 5 pages, IEEE Computer Society, Washington, D.C., USA.
Aileen Worden, et al., Making Computers Easier for Older Adults to Use: Area Cursors and Sticky Icons, CHI 97, 1997, pp. 266-271, Atlanta, Georgia, USA.
Shumin Zhai, et al., The “Silk Cursor”: Investigating Transparency for 3D Target Acquisition, CHI'94, 1994, pp. 273-279.
Kanade, et al. Development of Video-Rate Stereo Machine, Proceedings of 94 ARPA Image Understanding Workshop, 1994, pp. 549-558. Last accessed Sep. 30, 2008, 4 pages.
Darrell, et al. Integrated Person Tracking Using Stereo, Color and Pattern Detection, Proceedings of the Conference on Computer Vision and Pattern Recognition, 1998, pp. 601-609. Last accessed Jul. 8, 2005, 10 pages.
Zhang, A Flexible New Technique for Camera Calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence, Nov. 2000, pp. 1330-1334, vol. 22, No. 11. Last accessed Nov. 23, 2005, 5 pages.
GWindows: Light-Weight Stereo Vision for Interaction. http://research.microsoft.com/˜nuria/gwindows.htm. Last accessed Jul. 8, 2005, 2 pages.
Long, Jr., et al. Implications for a Gesture Design Tool, Proceedings of CHI'99, 1999, pp. 40-47. Last accessed Jul. 8, 2005, 8 pages.
Moyle, et al. Gesture Navigation: An Alternative ‘Back’ for the Future, Proceedings of CHI'02, 2002, pp. 822-823.
Oh, et al. Evaluating Look-to-talk: A Gaze-Aware Interface in a Collaborative Environment, CHI'02, 2002, pp. 650-651. Last accessed Jul. 8, 2005, 3 pages.
Guiard. Asymmetric Division of Labor in Human Skilled Bimanual Action: The Kinematic Chain as a Model, Journal of Motor Behavior, 1987, pp. 486-517, vol. 19 Issue 4.
Buxton, et al. A Study of Two-Handed Input, Proceedings of CHI'86, 1986, pp. 321-326. Last accessed Jul. 8, 2005, 6 pages.
Kabbash, et al. The “Prince” Technique: Fitts' Law and Selection Using Area Cursors, Proceedings of CHI'95, 1995, pp. 273-279. http://www.billbuxton.com/prince.html. Last accessed Jul. 8, 2005, 11 pages.
Welford. Signal, Noise, Performance, and Age. Human Factors, 1981, pp. 97-109, vol. 23, Issue 1. http://www.ingentaconnect.com/content/hfes/hf/1981/00000023/00000001/art0009.
Walker, et al. Age Related Differences in Movement Control: Adjusting Submovement Structure to Optimize Performance, Journals of Gerontology, Jan. 1997, pp. p40-p52. http://psychsoc.gerontologyjournals.org/cgi/content/abstract/52/1/p40.
Ali Azarbayejani and Alex Pentland, Real-Time Self-Calibrating Stereo Person Tracking Using 3-D Shape Estimation from Blob Features, Proceedings of ICPR, Aug. 1996, pp. 627-632, Vienna, Austria.
Christopher Mignot, Claude Valot, and Noelle Carbonell, An Experimental Study of Future ‘Natural’ Multimodal Human-Computer Interaction, Proceedings of INTERCHI'93, 1993, pp. 67-68.
Eric Horvitz and Tim Paek, A Computational Architecture for Conversation, Proceedings of the Seventh International Conference on User Modeling, 1999, pp. 201-210.
Eric Horvitz, Principles of Mixed-Initiative User Interfaces, Proceedings of CHI, 1999.
Francois Berard, The Perceptual Window: Head Motion as a New Input Stream, Proceedings of the Seventh IFIP Conference on Human-Computer Interaction, 1999, pp. 238-244.
Frederik C. M. Kjeldsen, Visual Interpretation of Hand Gestures as Practical Interface Modality, Ph.D. Dissertation, 1997, Columbia University Department of Computer Science, 168 pages.
Nebojsa Jojic, Barry Brumitt, Brian Meyers, Steve Harris and Thomas Huang, Detection and Estimation of Pointing Gestures in Dense Disparity Maps, Proceedings of IEEE International Conference on Autom
