Method and apparatus using multiple sensors in a device with...

Computer graphics processing and selective visual display system – Display peripheral interface input device

Reexamination Certificate


Details

Classification (U.S. Cl.): C345S157000, C345S158000, C345S160000, C345S161000, C345S163000, C345S173000

Type: Reexamination Certificate

Status: active

Application serial number: 09875477

ABSTRACT:
In a device having a display, at least one sensor signal is generated from a sensor in the device. One or more context values are then generated from the sensor signal. The context values indicate how the device is situated relative to one or more objects. At least one of the context values is then used to control the operation of one or more aspects of the device.
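The abstract describes a staged pipeline: a raw sensor signal is converted into context values (how the device sits relative to nearby objects), and a context value then controls some aspect of the device. A minimal sketch of that flow is below; the tilt sensor, thresholds, and function names are illustrative assumptions, not taken from the patent itself.

```python
def context_from_tilt(tilt_degrees: float) -> str:
    """Map a raw tilt-sensor reading to a coarse orientation context value.

    The +/-45 degree threshold is a hypothetical choice for illustration.
    """
    if -45.0 <= tilt_degrees <= 45.0:
        return "portrait"
    return "landscape"


def apply_context(context: str) -> str:
    """Use a context value to control one aspect of the device
    (here, the display orientation)."""
    return f"display set to {context}"


# A small tilt keeps the display in portrait mode; a large tilt
# switches it to landscape.
print(apply_context(context_from_tilt(10.0)))
print(apply_context(context_from_tilt(80.0)))
```

In the patent's terms, `context_from_tilt` plays the role of generating a context value from the sensor signal, and `apply_context` is the step that controls device operation from that value.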

REFERENCES:
patent: 4504701 (1985-03-01), Lucchesi
patent: 5329577 (1994-07-01), Norimatsu
patent: 5337353 (1994-08-01), Boi et al.
patent: 5481595 (1996-01-01), Ohashi et al.
patent: 5602566 (1997-02-01), Motosyuku et al.
patent: 5657372 (1997-08-01), Ahlberg et al.
patent: 5661632 (1997-08-01), Register
patent: 5689665 (1997-11-01), Mitsui et al.
patent: 5705997 (1998-01-01), Park
patent: 5712911 (1998-01-01), Her
patent: 5714997 (1998-02-01), Anderson
patent: 5761071 (1998-06-01), Bernstein et al.
patent: 5860016 (1999-01-01), Nookala et al.
patent: 5910882 (1999-06-01), Burrell
patent: 5924046 (1999-07-01), Martensson
patent: 5963952 (1999-10-01), Smith
patent: 5995852 (1999-11-01), Yasuda et al.
patent: 6137468 (2000-10-01), Martinez et al.
patent: 6201554 (2001-03-01), Lands
patent: 6216016 (2001-04-01), Cronin
patent: 6216106 (2001-04-01), John
patent: 6246862 (2001-06-01), Grivas et al.
patent: 6259787 (2001-07-01), Schulze
patent: 6288704 (2001-09-01), Flack et al.
patent: 6292674 (2001-09-01), David
patent: 6304765 (2001-10-01), Cosgrove et al.
patent: 6310955 (2001-10-01), Reeves
patent: 6374145 (2002-04-01), Lignoul
patent: 6381540 (2002-04-01), Beason et al.
patent: 6408187 (2002-06-01), Merriam
patent: 6426736 (2002-07-01), Ishihara
patent: 6449363 (2002-09-01), Kielsnia
patent: 6466198 (2002-10-01), Feinstein
patent: 6509907 (2003-01-01), Kuwabara
patent: 6516202 (2003-02-01), Hawkins et al.
patent: 6532447 (2003-03-01), Christensson
patent: 6542436 (2003-04-01), Myllyla
patent: 6560466 (2003-05-01), Skorko
patent: 6567068 (2003-05-01), Rekimoto
patent: 6567101 (2003-05-01), Thomas
patent: 6573883 (2003-06-01), Bartlett
patent: 6597384 (2003-07-01), Harrison
patent: 6621508 (2003-09-01), Shiraishi et al.
patent: 6621800 (2003-09-01), Klein
patent: 6624824 (2003-09-01), Tognazzini et al.
patent: 6631192 (2003-10-01), Fukiharu
patent: 6658272 (2003-12-01), Lenchik et al.
patent: 6822683 (2004-11-01), Torikai
patent: 6931592 (2005-08-01), Ramaley et al.
patent: 6970182 (2005-11-01), Schultz et al.
patent: 2001/0044318 (2001-11-01), Mantyjarvi et al.
patent: 2002/0140675 (2002-10-01), Ali et al.
patent: 2003/0055655 (2003-03-01), Suominen
patent: 2003/0104800 (2003-06-01), Zak
patent: 2003/0176205 (2003-09-01), Oota et al.
patent: 8-292826 (1996-11-01), None
patent: 2000124970 (2000-04-01), None
patent: 2001094636 (2001-04-01), None
patent: WO 98/14863 (1998-04-01), None
patent: WO 99/22338 (1999-05-01), None
Bartlett, J.F., “Rock'n'Scroll Is Here to Stay,” IEEE Computer Graphics and Applications, pp. 40-45, (May/Jun. 2000).
Harrison, Beverly L. et al., “Squeeze Me, Hold Me, Tilt Me! An Exploration of Manipulative User Interfaces,” pp. 17-24 (Apr. 18-23, 1998), CHI '98.
Rekimoto, Jun, “Tilting Operations for Small Screen Interfaces (Tech Note),” pp. 167-168, UIST '96.
Schmidt, Albrecht, “Implicit Human Computer Interaction Through Context,” pp. 1-5, 2nd Workshop on Human Computer Interaction with Mobile Devices, 1999.
Schmidt, Albrecht et al., “There Is More to Context Than Location,” Environment Sensing Technologies for Adaptive Mobile User Interfaces, 5 pages, IMC '98.
Small, David et al., “Design of Spatially Aware Graspable Displays,” Extended Abstracts of CHI '97, pp. 1-2 (Mar. 22-27, 1997).
Schmidt, Albrecht et al., “Advanced Interaction in Context,” 13 pages, HUC '00.
Office Action (Jun. 13, 2005) and Response (Sep. 13, 2005) from U.S. Appl. No. 10/162,487, filed Jun. 3, 2002.
Office Action (Jul. 22, 2005) and Response (Sep. 27, 2005) from U.S. Appl. No. 10/294,286, filed Nov. 14, 2002.
Office Action (Apr. 17, 2006) from U.S. Appl. No. 10/162,487, filed Jun. 3, 2002.
Hinckley et al., “Sensing Techniques for Mobile Interaction,” CHI Letters vol. 2, 2; Copyright 2000, ACM 1-58113-232-3, pp. 91-100.
Office Action (Apr. 6, 2006) from U.S. Appl. No. 10/294,286, filed Nov. 14, 2002.
Office Action (Dec. 1, 2005) and Response (Apr. 21, 2005; Jan. 25, 2006) from U.S. Appl. No. 10/162,487, filed Jun. 3, 2002.
Office Action (Nov. 2, 2005) and Response (Oct. 11, 2005; Dec. 9, 2005; Feb. 2, 2006) from U.S. Appl. No. 10/294,286, filed Nov. 14, 2002.
Office Action (Jan. 7, 2005) and Response (Mar. 14, 2005) from U.S. Appl. No. 10/162,487, filed Jun. 3, 2002.
RotoView™ By Innoventions, How It Works.
RotoView™ By Innoventions, The Intuitive Display Navigation Solution for Hand Held Devices.
RotoView™ By Innoventions, Features and Specifications.
Innoventions' RotoView™, The Intuitive Display Navigation Solution for Hand Held Devices, Background and Problem Definition.
