Virtual environment viewpoint control

Amusement devices: games – Including means for processing electronic data – Perceptible output or display


Details

Classification: C463S032000, C345S419000
Type: Reexamination Certificate
Status: active
Patent Number: 06241609

ABSTRACT:

BACKGROUND OF THE INVENTION
The present invention relates to interactive environment systems, such as immersive games and virtual reality or shared (multi-user) virtual environment systems, which provide the user or users with a view of a virtual world within which the user's computer-generated virtual presence appears and may interact with other such user virtual presences as well as, optionally, features of the environment itself. In particular, the present invention relates to such systems having means for controllably varying the viewpoint from which the image of the environment (as presented to a user) is rendered, a feature referred to herein as “virtual camera” control.
A description of a system providing a virtual environment (or cyberspace) accessible by remote users is given in European patent application EP-A-0 697 613 (Sony Corp.). The system described includes a server providing a virtual reality space, and user terminals connected to the server via a high-speed communications network (using optical fibres or the like). In operation, the server maintains a number of virtual environments and supports many differing terminal types by the use of conversion objects between information objects and user objects: the conversion objects provide individually tailored translation for communications back and forth between each type of terminal and each configuration of virtual environment supported.
At each user terminal, the user is presented with a two-dimensional view of the three-dimensional virtual environment from their own particular viewpoint location within the three-dimensional environment, with computer-generated representations of any other users who may at that time be within the same area of the virtual environment as the viewing user. Rather than generating a representation of a whole or part of the viewing user in the image seen by that user, the system of EP-A-0 697 613 takes the first-person view (i.e. the image is that which would be seen through the “eyes” of the user's computer-generated character) but provides a simple arrow-shaped cursor which the user may utilise to indicate or select items from within the virtual environment by up/down/left/right movements of the cursor in the presented two-dimensional image of the environment or, by clicking on the virtual character of a further user, to initiate a conversation or other interaction between the two users. This technique is used in EP-A-0 697 613 as an improvement to a described prior art system in which the user is represented by a rendered character always appearing in the centre of the image presented to the user, such that the user takes a third-person view of their representation within the virtual environment.
Although rendering from the first-person point of view enhances a user's feeling of immersion within the virtual environment, it can prove less than satisfactory when it comes to interacting with the virtual representations of other users, where the third-person point of view provides the user with more contextual information for the interaction. Being able to select the viewpoint (virtual camera position) relative to one's own representation would be an asset, but the requirement to do so manually could become a distraction over time.
SUMMARY OF THE INVENTION
It is accordingly an object of the present invention to provide a system configured to automatically adjust the virtual camera position so as to provide an appropriate viewpoint in dependence on, amongst other factors, whether or not interaction between the user's virtual presence and the representation of another user is taking place.
In accordance with a first aspect of the present invention there is provided a multi-user interactive virtual environment system comprising: a first data store containing data defining a virtual environment; a second data store containing data defining the external appearance of a plurality of characters; and a processor coupled to receive input commands from a plurality of separate users and arranged to access the first and second stores and generate for each user a respective image of the virtual environment and characters therein, including an assigned character particular to that individual user, from a respective viewpoint at a position and orientation within the virtual environment determined at least partially by the user-directed motion of the user's assigned character, characterised by: interaction zone generation means arranged to maintain updated coordinates for a respective zone of predetermined size and shape about the current virtual environment location for each character; and monitoring means coupled with the zone generation means and arranged to determine when the respective interaction zones of two or more user-assigned characters overlap and to signal the same to said processor, the determination of respective viewpoint location and orientation for each such user-assigned character being based at least partially on a predetermined set of rules applied by the processor for as long as the overlap remains.
By the provision of interaction zones (which will preferably be invisible to the user), a trigger mechanism is provided for switching the user's virtual camera location. As will be described in examples hereinafter, the virtual camera may simply follow its character at a position effectively looking “over its shoulder” while there is no interaction (no zone overlap) and then swing around to a third-person view to give a more informative view of the two interacting representations.
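By way of illustration only, the zone-overlap trigger described above might be sketched as follows. This is a minimal sketch in Python, assuming circular interaction zones in a two-dimensional plane and two hypothetical camera presets ("over_the_shoulder" and "third_person"); none of the names or values below come from the patent itself.

    import math
    from dataclasses import dataclass

    @dataclass
    class Character:
        """A user-assigned character with an (invisible) circular interaction zone."""
        name: str
        x: float
        y: float
        zone_radius: float = 2.0  # predetermined zone size (hypothetical units)

    def zones_overlap(a: Character, b: Character) -> bool:
        """Two circular zones overlap when the distance between their centres
        is less than the sum of their radii."""
        return math.hypot(a.x - b.x, a.y - b.y) < a.zone_radius + b.zone_radius

    def camera_mode(viewer: Character, others: list[Character]) -> str:
        """Default to an over-the-shoulder follow camera; swing round to a
        third-person view for as long as any zone overlap persists."""
        if any(zones_overlap(viewer, other) for other in others):
            return "third_person"       # overlap: more context for the interaction
        return "over_the_shoulder"      # no interaction: simply follow the character

For example, with alice at (0, 0) and bob at (3, 0), camera_mode(alice, [bob]) would return "third_person" because the two 2-unit zones overlap; move bob to (5, 0) and the view reverts to "over_the_shoulder".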
As the power available to implement complex user virtual presences increases, so do the size and/or complexity of the virtual world in which they may be modelled and the number of different users who may simultaneously visit the same part of the virtual world at any given time. The effect of this is that a large-scale overlapping of interaction zones may occur in a small area, leading to unacceptable processor loading as camera positions are calculated. To avoid this potential problem, the processor suitably maintains at least one further interaction zone at a fixed location within the virtual environment, this fixed interaction zone or zones being independent of any particular character within the virtual environment. Where such fixed zones are provided at popular and/or crowded locations within the virtual environment, each with a particular set of rules governing camera positioning, a global camera view covering the area may be specified for all characters within it, avoiding individual camera position calculations. This feature may be pre-set for a particular location within the virtual environment (regardless of the number of characters/cursors in that location), or it may be applied dynamically at any location where it is determined that, for example, the interaction zones of five or more characters have overlapped.
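Continuing the sketch above (again with hypothetical names and values), a fixed, character-independent zone could take precedence over the per-character camera calculation roughly as follows; the five-character threshold mentioned above would simply be the condition under which such a zone is created dynamically.

    @dataclass
    class FixedZone:
        """A character-independent interaction zone at a fixed, popular location."""
        x: float
        y: float
        radius: float
        global_camera: str = "area_overview"  # one shared view for everyone in the area

    def camera_mode_with_fixed_zones(viewer: Character,
                                     others: list[Character],
                                     fixed_zones: list[FixedZone]) -> str:
        """A fixed zone takes precedence over the per-character rules: every character
        whose own zone overlaps it shares the zone's global camera view, so no
        individual camera position needs to be calculated for that area."""
        for zone in fixed_zones:
            if math.hypot(viewer.x - zone.x, viewer.y - zone.y) < viewer.zone_radius + zone.radius:
                return zone.global_camera
        return camera_mode(viewer, others)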
By way of a refinement, these fixed interaction zones may be made up from a concentric arrangement of at least two partial zones, with only a part of the set of rules being applied by the processor when a character's interaction zone overlaps only the outer partial zone. In other words, the extent to which a character's respective camera movements are determined by the movements of the character diminishes as the inner partial zone (with global camera positioning) is approached.
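One hypothetical way to express this diminishing influence, continuing the sketch above, is as a weight that falls from 1 (the character's movements fully drive the camera) at the edge of the outer partial zone to 0 (global camera only) at the inner partial zone, here by simple linear interpolation:

    def character_camera_weight(viewer: Character, zone: FixedZone,
                                inner_radius: float, outer_radius: float) -> float:
        """How strongly the character's own movements still determine its camera:
        1.0 outside the outer partial zone, 0.0 at or inside the inner partial zone
        (global camera positioning only), interpolated linearly in between."""
        d = math.hypot(viewer.x - zone.x, viewer.y - zone.y)
        if d >= outer_radius:
            return 1.0
        if d <= inner_radius:
            return 0.0
        return (d - inner_radius) / (outer_radius - inner_radius)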


REFERENCES:
patent: 5347306 (1994-09-01), Nitta
patent: 5491743 (1996-02-01), Shiio et al.
patent: 5736982 (1998-04-01), Suzuki et al.
patent: 5913727 (1999-06-01), Ahdoot
patent: 6139434 (2000-10-01), Miyamoto et al.
patent: 0697613A2 (1996-02-01), None

