Environmental modeling for motion controlled handheld devices

Computer graphics processing and selective visual display system – Display peripheral interface input device

Reexamination Certificate


Details

Current U.S. Classification: C345S158000 (class 345, subclass 158)
Type: Reexamination Certificate
Status: active
Application Number: 10/807,571

ABSTRACT:
A motion controlled handheld device includes a user interface comprising a display having a viewable surface and operable to generate a current image, and a motion detection module operable to detect motion of the device within three dimensions and to identify components of the motion in relation to the viewable surface. The device includes a device state tracking module operable to analyze the components to determine an environmental state of the device. The environmental state comprises a motion state and an orientation of the device with respect to gravity. The device also includes a controller operable to execute an application and to perform an operation of the application based on the environmental state.
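The abstract sketches a three-part architecture: a motion detection module that senses motion in three dimensions, a device state tracking module that reduces the sensed motion components to an environmental state (a motion state plus an orientation with respect to gravity), and a controller that selects an application operation from that state. The snippet below is a minimal illustrative sketch of such a pipeline, driven by a single accelerometer sample; the function and class names, the threshold value, and the example operations are hypothetical and are not taken from the patent itself.

```python
import math
from dataclasses import dataclass


@dataclass
class EnvironmentalState:
    """Environmental state per the abstract: a motion state plus the
    device's orientation with respect to gravity."""
    motion_state: str   # e.g. "at_rest" or "in_motion"
    orientation: str    # e.g. "face_up", "face_down", "vertical"


def classify_environment(ax, ay, az, motion_threshold=0.15):
    """Derive an environmental state from one accelerometer sample (in g).

    Assumption: if the measured magnitude deviates from 1 g by more than
    the threshold, the device is treated as moving; otherwise the dominant
    axis of the gravity vector gives the orientation.
    """
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    motion_state = "in_motion" if abs(magnitude - 1.0) > motion_threshold else "at_rest"

    if abs(az) >= max(abs(ax), abs(ay)):
        orientation = "face_up" if az > 0 else "face_down"
    else:
        orientation = "vertical"
    return EnvironmentalState(motion_state, orientation)


class Controller:
    """Toy controller that performs an application operation based on the
    environmental state, mirroring the abstract's description."""

    def on_sample(self, ax, ay, az):
        state = classify_environment(ax, ay, az)
        if state.motion_state == "at_rest" and state.orientation == "face_down":
            return "mute_ringer"              # hypothetical operation
        if state.motion_state == "in_motion":
            return "suppress_notifications"   # hypothetical operation
        return "no_op"


if __name__ == "__main__":
    # Device lying still, screen down: gravity reads roughly -1 g on z.
    print(Controller().on_sample(0.02, -0.01, -0.98))  # -> mute_ringer
```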

REFERENCES:
patent: 4812831 (1989-03-01), Laier
patent: 5112785 (1992-05-01), Brun et al.
patent: 5142655 (1992-08-01), Drumm
patent: 5506605 (1996-04-01), Paley
patent: 5543588 (1996-08-01), Bisset et al.
patent: 5602566 (1997-02-01), Motosyuku et al.
patent: 5734371 (1998-03-01), Kaplan
patent: 6008810 (1999-12-01), Bertram et al.
patent: 6057554 (2000-05-01), Plesko
patent: 6088023 (2000-07-01), Louis et al.
patent: 6121960 (2000-09-01), Carroll et al.
patent: 6184847 (2001-02-01), Fateh et al.
patent: 6201554 (2001-03-01), Lands
patent: 6245014 (2001-06-01), Brainard, II
patent: 6288704 (2001-09-01), Flack et al.
patent: 6466198 (2002-10-01), Feinstein
patent: 2002/0190947 (2002-12-01), Feinstein
patent: 2004/0027330 (2004-02-01), Bradski
patent: 2378878 (2003-02-01), None
patent: WO 01/86920 (2001-11-01), None
patent: WO 03/001340 (2003-01-01), None
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, mailed Nov. 3, 2005, re PCT/US2005/007409 filed Mar. 7, 2005, 13 pages.
Westeyn et al., “Georgia Tech Gesture Toolkit: Supporting Experiments in Gesture Recognition”, ICMI '03, Vancouver, British Columbia, Canada, 8 pages, Nov. 5, 2003.
Yee, Ka-Ping, “Peephole Displays: Pen Interaction on Spatially Aware Handheld Computers”, CHI 2003, Ft. Lauderdale, Florida, 8 pages, Apr. 5, 2003.
Patent Application entitled, “Distinguishing Tilt and Translation Motion Components in Handheld Devices”. 68 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Handheld Device With Preferred Motion Selection”, 64 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Motion Sensor Engagement for a Handheld Device”, 68 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Selective Engagement of Motion Detection”, 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Gesture Based Navigation of a Handheld User Interface”, 69 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Translation Controlled Cursor”, 66 pages specification, claims and abstract, 11 pages of drawings, inventors Reinhardt et al.
Patent Application entitled, “Selective Engagement of Motion Input Modes”, 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Feedback Based User Interface for Motion Controlled Handheld Devices”, 66 pages specification, claims and abstract, 11 pages of drawings, inventor Marvit.
Patent Application entitled, “Spatial Signatures”, 64 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Motion Controlled Remote Controller”, 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Gesture Identification of Controlled Devices”, 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Gesture Based User Interface Supporting Preexisting Symbols”, 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Context Dependent Gesture Response”, 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Customizable Gesture Mappings for Motion Controlled Handheld Devices”, 67 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “User Definable Gestures for Motion Controlled Handheld Devices”, 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Non-Uniform Gesture Precision”, 66 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
Patent Application entitled, “Dynamic Adaptation of Gestures for Motion Controlled Handheld Devices”, 65 pages specification, claims and abstract, 11 pages of drawings, inventors Marvit et al.
