Intuitive mobile device interface to virtual spaces


Details

Type: Reexamination Certificate
Status: active
Patent number: 6798429
U.S. Classification: C345S156000; C345S215000

ABSTRACT:

FIELD
The present invention relates generally to an intuitive mobile device interface and more particularly to a method and apparatus to enable the relative and/or absolute motion of a mobile device to serve as a one-dimensional or multi-dimensional input to the mobile device.
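By way of illustration only, and not as the patent's own implementation, the following Python sketch shows one way device tilt could serve as a two-dimensional input: gravity components from a hypothetical read_accelerometer() call are normalized and scaled into a scroll delta. The sensor function, gain constants, and all names here are assumptions.

    GRAVITY = 9.81          # m/s^2; magnitude of gravity, used to normalize tilt
    SCROLL_GAIN = 200.0     # pixels of scroll per unit of normalized tilt

    def read_accelerometer() -> tuple[float, float, float]:
        # Hypothetical placeholder; a real device would use its
        # platform-specific motion-sensor API here.
        raise NotImplementedError("platform-specific sensor access")

    def tilt_to_scroll(ax: float, ay: float) -> tuple[float, float]:
        """Map device tilt (gravity components ax, ay) to a 2-D scroll delta."""
        tilt_x = max(-1.0, min(1.0, ax / GRAVITY))   # clamp to [-1, 1]
        tilt_y = max(-1.0, min(1.0, ay / GRAVITY))
        return SCROLL_GAIN * tilt_x, SCROLL_GAIN * tilt_y

Under such a scheme, tilting the device left or right pans horizontally and tilting it toward or away from the user pans vertically, leaving the second hand free.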
BACKGROUND OF THE INVENTION
As the demand for information and connectivity has grown, mobile computing devices have been increasingly deployed to provide convenient access to information. The term mobile computing devices, or mobile devices, as used herein, includes mobile phones, beepers, handheld computers and/or devices, personal digital assistants, and any other type of mobile user electronic device with a display area of some form.
The small size and light weight of mobile computing devices give the user a sense of intimacy and control. However, these same advantages require that the screens of mobile devices be small enough for the devices to remain handheld. This leads to cumbersome user input interfaces, since conventional interfaces, such as keyboards and mouse devices, usually hinder mobility.
Typically, users are limited to using touch screens, styluses, or buttons as input interfaces to mobile devices. Such input interfaces are cumbersome, requiring the use of both hands: one to hold the mobile device and the other to enter data.
Another difficulty with the small display screens of mobile devices is controlling the view and/or movement of representations of data and/or objects, also referred to as the virtual space. Indicating the desired movement in the virtual space may be cumbersome and slow using a stylus or touch screen. For example, indicating the desired motion in a three-dimensional virtual space may be awkward using two-dimensional interfaces such as styluses or touch screens. Moreover, controlling movement in a virtual space with a stylus or touch screen may conflict with other modes of operation of the input interface.
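To make the contrast concrete, here is a hedged Python sketch (illustrative names and gain; not the patent's method) in which each axis of the device's physical displacement, as estimated by motion sensors, drives one axis of a three-dimensional virtual camera, so that no two-dimensional gesture has to encode a third dimension.

    from dataclasses import dataclass

    @dataclass
    class Camera3D:
        x: float = 0.0
        y: float = 0.0
        z: float = 0.0

    MOTION_GAIN = 50.0  # virtual-space units per unit of device displacement

    def apply_device_motion(cam: Camera3D, dx: float, dy: float, dz: float) -> None:
        """Translate the virtual camera by the device's own physical motion.

        dx, dy, dz are displacement estimates from the device's motion
        sensors (hypothetical here); each physical axis maps directly to
        one axis of the three-dimensional virtual space.
        """
        cam.x += MOTION_GAIN * dx
        cam.y += MOTION_GAIN * dy
        cam.z += MOTION_GAIN * dz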
The small screen sizes and cumbersome interfaces limit the display of larger data sets. These data sets may include two-dimensional data, such as text; three-dimensional data, such as visual objects; or four-dimensional data, such as visual objects that change over time. The user may be limited to viewing small documents or objects, or small portions of a larger document or object.
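The viewport limitation can be pictured as a small window clamped inside a much larger document. The minimal Python sketch below (an illustration under assumed names, not the patent's method) pans such a window, for example in response to motion input as sketched above, while keeping it within the document's bounds.

    def pan_viewport(view_x, view_y, dx, dy, view_w, view_h, doc_w, doc_h):
        """Pan a small viewport across a large document, clamped to its edges.

        (view_x, view_y) is the viewport's top-left corner; (dx, dy) is
        the requested pan, e.g. a scroll delta derived from device motion.
        """
        new_x = max(0, min(doc_w - view_w, view_x + dx))
        new_y = max(0, min(doc_h - view_h, view_y + dy))
        return new_x, new_y

    # Example: a 320x240 screen panning over a 2000x3000 document.
    x, y = pan_viewport(0, 0, 500, 125, 320, 240, 2000, 3000)  # -> (500, 125)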


REFERENCES:
patent: 5602566 (1997-02-01), Motosyuku et al.
patent: 6005548 (1999-12-01), Latypov et al.
patent: 6115025 (2000-09-01), Buxton et al.
patent: 6137468 (2000-10-01), Martinez et al.
patent: 6201554 (2001-03-01), Lands
patent: 6288704 (2001-09-01), Flack et al.
patent: 6340957 (2002-01-01), Adler et al.
patent: 6347290 (2002-02-01), Bartlett
patent: 6400376 (2002-06-01), Singh et al.
patent: 6466198 (2002-10-01), Feinstein
patent: 6509907 (2003-01-01), Kuwabara
patent: 6556185 (2003-04-01), Rekimoto
patent: 6567101 (2003-05-01), Thomas
patent: 6597384 (2003-07-01), Harrison
patent: 6603420 (2003-08-01), Lu
Hinckley, Pierce, Sinclair, and Horvitz (Microsoft Research, Redmond, WA), "Sensing Techniques for Mobile Interaction," UIST '00, Nov. 5-8, 2000, pp. 91-100.
