Asynchronous and synchronous gesture recognition

Data processing: presentation processing of document – Operator interface – Gesture-based

Reexamination Certificate


Details


Status: active
Patent number: 07614019

ABSTRACT:
A system and method for determining whether a flick gesture has occurred is described. A flick gesture is a simple gesture that may be easily detected and is characterized by minimal interference with other applications or gestures.
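The abstract does not disclose the detection criteria, but a flick is conventionally distinguished from other strokes by being short in duration, fast, and nearly straight. The sketch below illustrates that general idea only; the `PointerSample` type, the `is_flick` helper, and all threshold values are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class PointerSample:
    x: float
    y: float
    t: float  # timestamp in seconds

def is_flick(samples, min_speed=500.0, max_duration=0.3, min_straightness=0.9):
    """Return True if the stroke looks like a flick: short, fast, and straight.

    Thresholds are illustrative placeholders, not values from the patent.
    """
    if len(samples) < 2:
        return False
    first, last = samples[0], samples[-1]
    duration = last.t - first.t
    if duration <= 0 or duration > max_duration:
        return False  # too slow a stroke to be a flick
    net = hypot(last.x - first.x, last.y - first.y)
    if net / duration < min_speed:
        return False  # not fast enough
    # Straightness: net displacement compared to total path length
    path = sum(
        hypot(b.x - a.x, b.y - a.y)
        for a, b in zip(samples, samples[1:])
    )
    return path > 0 and net / path >= min_straightness
```

Because the test rejects slow or curved strokes early, ordinary pointing, dragging, and handwriting strokes fail one of the checks, which is one plausible way a flick detector can avoid interfering with other gestures.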

REFERENCES:
patent: 4727588 (1988-02-01), Fox et al.
patent: 5347295 (1994-09-01), Agulnick et al.
patent: 5428805 (1995-06-01), Morgan
patent: 5511135 (1996-04-01), Rhyne et al.
patent: 5689667 (1997-11-01), Kurtenbach et al.
patent: 5694150 (1997-12-01), Sigona et al.
patent: 5717939 (1998-02-01), Bricklin et al.
patent: 5730602 (1998-03-01), Gierhart et al.
patent: 5768418 (1998-06-01), Berman et al.
patent: 5862256 (1999-01-01), Zetts et al.
patent: 6104317 (2000-08-01), Panagrossi et al.
patent: 6249606 (2001-06-01), Kiraly et al.
patent: 6340967 (2002-01-01), Maxted
patent: 2001/0036619 (2001-11-01), Kerwin
patent: 2002/0130839 (2002-09-01), Wallace et al.
patent: 2003/0177286 (2003-09-01), Gould
patent: 2004/0189720 (2004-09-01), Wilson et al.
patent: 2004/0193413 (2004-09-01), Wilson et al.
patent: 2005/0088420 (2005-04-01), Dodge et al.
patent: 2005/0210418 (2005-09-01), Marvit et al.
patent: 2006/0001656 (2006-01-01), LaViola et al.
patent: 2008/0036743 (2008-02-01), Westerman et al.
patent: 1335272 (2003-03-01), None
Internet Printout: http://www.alias.com/eng/support/studiotools/documentation/Using/Interfacell.html, Use marking menus, dated Sep. 8, 2004.
Shrinath Shanbhag et al., “An Intelligent Multi-layered Input Scheme for Phonetic Scripts”, ACM 2002, pp. 35-38.
Michael Moyle et al., “The Design and Evaluation of a Flick Gesture for ‘Back’ and ‘Forward’ in Web Browsers”, Human-Computer Interaction Lab, Department of Computer Science.
Michael Moyle et al., “Gesture Navigation: An Alternative ‘Back’ for the Future”, CHI 2002, Apr. 20-25, 2002, Minneapolis, Minnesota.
Mike Wu et al., “Multi-Finger and Whole Hand Gestural Interaction Techniques for Multi-User Tabletop Displays”, 2003 ACM, pp. 193-202.
Dan Venolia et al., “T-Cube: A Fast, Self-Disclosing Pen-Based Alphabet”, 1994 ACM, pp. 265-270.
Kenneth P. Fishkin et al., “Embodied User Interfaces for Really Direct Manipulation”, pp. 1-11, Submitted to Communications of the ACM, Version 9 (Jul. 3, 1999).
Margaret R. Minsky, Manipulating Simulated Objects with Real-world Gestures using a Force and Position Sensitive Screen, 1984 ACM, pp. 195-203.
André Meyer, “Pen Computing: A Technology Overview and a Vision”, SIGCHI Bulletin, vol. 27, no. 3, Jul. 1995.
Wanted Features for Berlin, Wanted Features for Berlin for the Warsaw and Moscow APIs, modified Apr. 14, 1999, printed from ANOQ of the Sun homepage on Sep. 1, 2004, 4 pages.
Dulberg, M.S. et al, “An Imprecise Mouse Gesture for the Fast Activation of Controls”, Proceedings of Interact '99, 1999, pp. 1-10.
Moyle, M., “A Flick in the Right Direction: An Evaluation of Simple Gesture Based Controls”, University of Canterbury, Christchurch, New Zealand, Nov. 2, 2001, pp. 1-43.
Extended EP Search Report dated Jun. 30, 2006, European Patent Application No. 05108158.6.


Profile ID: LFUS-PAI-O-4074932
