Method and device for generating display frames from a...

Image analysis – Image transformation or preprocessing – Changing the image coordinates


Details

U.S. Classification: 382/295; 345/606; 348/456

Type: Reexamination Certificate

Status: active

Patent number: 6442303

ABSTRACT:

BACKGROUND OF THE INVENTION
The invention relates to a method as recited in the preamble of claim 1. U.S. Pat. No. 4,736,248 discloses how to generate display frames by interpolating between source frame pairs. The transformation algorithm is derived from point pairs that occur in both of two successive source frames. The interpolation applies the same transformation to other pixel pairs that occur in both of these source frames. Sometimes a particular source pixel is present in only one of the two source frames, so that extrapolation must be used for that particular pixel. The reference aims to improve picture rendering in dynamic aerial surveying.
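The reference's procedure can be illustrated with a minimal sketch, assuming (as one common choice; the text here does not specify the model) a 2-D affine transformation fitted by least squares to the point pairs. All function names are illustrative, not taken from the patent:

    import numpy as np

    def fit_affine(src_pts, dst_pts):
        # Least-squares 2-D affine transform mapping src_pts -> dst_pts,
        # where both are (N, 2) arrays of matching points found in two
        # successive source frames. Returns a 2x3 matrix A such that
        # dst ~= A @ [x, y, 1].
        n = len(src_pts)
        X = np.hstack([src_pts, np.ones((n, 1))])        # (N, 3)
        B, *_ = np.linalg.lstsq(X, dst_pts, rcond=None)  # (3, 2)
        return B.T                                       # (2, 3)

    def interpolate_position(p, A, t):
        # Position of pixel p at fraction t (0..1) between the two source
        # frames, applying the same transform to pixels that occur in both.
        p_next = A @ np.array([p[0], p[1], 1.0])
        return (1.0 - t) * np.asarray(p, float) + t * p_next

Note that the interpolated position only becomes computable once the later source frame has arrived, which is the latency drawback discussed below.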
A different field of use pertains to highly interactive computer games and similar multimedia environments. Here a high frame rate is important, so that displayed motion is as smooth as possible. In principle this also allows minimal latency between user-initiated events and the visual feedback connected with them. This is especially important for navigation control, as in flight-simulation games. It has been suggested to raise the frame rate to a value comparable to the display refresh rate, which may be 60-72 Hz. It has furthermore been found that Virtual Reality (VR) systems need low latency to protect the user against motion sickness. However, interpolating in the general manner of the reference introduces additional latency, because the various interpolation parameters are only known after reception of the later source frame, even though certain display frame pixels depend only on past source frame pixels.
SUMMARY OF THE INVENTION
In consequence, amongst other things, it is an object of the present invention to avoid the latency increase caused by overall interpolation. Now therefore, according to one of its aspects, the invention is characterized according to the characterizing part of claim 1. The Z-buffer is generated during the rendering of the source frames, which itself is not part of the invention. The Z-buffer may be used to convert the 2-D frames into 3-D space, so that changes in perspective as well as arbitrary 3-D camera rotations and translations may be implemented (Eq. 8). These two transform types are the main causes of change in a scene.
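Eq. 8 itself is not reproduced in this text. As an illustration only, a common form of such a 2-D-to-3-D conversion under a pinhole camera model might look as follows; the function names and the parameters f, cx, cy are assumptions, not taken from the patent:

    import numpy as np

    def unproject(u, v, z, f, cx, cy):
        # Lift screen pixel (u, v) with Z-buffer depth z into 3-D view
        # space, assuming a pinhole camera with focal length f and
        # principal point (cx, cy). Illustrative stand-in for Eq. 8.
        return np.array([(u - cx) * z / f, (v - cy) * z / f, z])

    def transform(p_view, R, t):
        # Apply an arbitrary 3-D camera rotation R (3x3 matrix) and
        # translation t (3-vector) to the reconstructed point, so that
        # perspective changes follow the new viewpoint.
        return R @ p_view + t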
A specific problem caused by extrapolating is that a scene part which was obscured in an earlier source frame may subsequently be uncovered in an extrapolated frame, because the obscuring object has moved in a transverse manner with respect to the obscured part. A solution is attained through lateral extrapolation from adjacent pixels that had not been subject to obscuring in the preceding source frame. The extrapolation may go along with or counter to the scanning direction. If the extrapolation operates on a background pattern or another entity of coarse granularity, the result is usually correct or nearly correct. On the other hand, the effect of a small item that would suddenly be uncovered from behind an obscuring object will be ignored until arrival of the next source frame. Usually the effect of this approximation is allowable. By contrast, restricting the extrapolation to a 2-D affine transform would often create unreasonable distortions in the extrapolated frames, because a part that had been obscuring but has since moved would be stretched a great deal. This problem is solved by making the extrapolation specifically dependent on the depth (z_v2) expressed in the view coordinates of Eq. 12.
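As a sketch of this idea only (not the patent's exact procedure), a scan-line hole filler that extends the deeper, background neighbour rather than the moved foreground object might look like this; all names are illustrative:

    import numpy as np

    def fill_holes_scanline(color, depth, hole):
        # color: (H, W, 3) extrapolated frame; depth: (H, W) view-space
        # depths; hole: (H, W) bool mask of pixels left uncovered by a
        # moved foreground object. Each hole pixel is copied from the
        # nearest valid neighbour on its scan line, along or counter to
        # the scanning direction; where both sides exist, the deeper
        # (background) neighbour wins, so the background rather than the
        # moved foreground object is extended.
        h, w = depth.shape
        out = color.copy()
        for y in range(h):
            valid = np.where(~hole[y])[0]   # valid columns on this line
            if valid.size == 0:
                continue
            for x in np.where(hole[y])[0]:
                left = valid[valid < x]
                right = valid[valid > x]
                cands = ([left[-1]] if left.size else []) + \
                        ([right[0]] if right.size else [])
                src = max(cands, key=lambda c: depth[y, c])
                out[y, x] = color[y, src]
        return out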
A display sequence may consist of the source frames together with intermediate frames. Alternatively, some or all source frames may be left undisplayed, provided that the intermediate frames occur often enough. The inventive technology keeps display frame latency small, whereas interpolating display frames may often increase latency to an unwanted extent.
The invention also relates to a frame-based device arranged for practising the recited method. Further advantageous aspects of the invention are recited in dependent claims.
The invention makes it possible to calculate pixel displacements for producing intermediate frames between the frames generated by the 3-D rendering pipeline itself. The displacements may be calculated incrementally during scan conversion of the source frames, by consistently using results attained for an immediately adjacent pixel along the scan line. For each next pixel this requires only a small number of arithmetic operations on the basis of pixels treated earlier.
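To illustrate the incremental evaluation only: assuming, for the sake of the sketch, that the displacement varies linearly along a span, each pixel's displacement follows from its left neighbour with one addition per component (the names and the linear-per-span model are assumptions made for this illustration):

    def scanline_displacements(d0, ddx, n):
        # Displacements for n consecutive pixels of one scan line,
        # obtained incrementally: each value follows from its left
        # neighbour with one addition per component (forward
        # differencing). d0 = (dx, dy) at the first pixel of the span,
        # ddx = the per-pixel increment.
        dx, dy = d0
        ix, iy = ddx
        out = []
        for _ in range(n):
            out.append((dx, dy))
            dx += ix
            dy += iy
        return out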
The invention makes it possible to use known camera motion with respect to the most recent source frame for producing instantaneous coherence between pixels of this source frame and pixels of an immediately following synthesized display frame according to Eq. 9.
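Eq. 9 is likewise not reproduced here. An end-to-end sketch of such a coherence mapping, combining the assumed pinhole model above with a known camera motion (R_delta, t_delta) since the latest source frame, might read as follows; it is an illustration of the principle, not the patent's exact relation:

    import numpy as np

    def warp_pixel(u, v, z, R_delta, t_delta, f, cx, cy):
        # Map a pixel of the most recent source frame directly to its
        # position in the next synthesized display frame, given the known
        # camera motion since that source frame. Illustrative stand-in
        # for the coherence relation of Eq. 9; f, cx, cy are assumed
        # pinhole parameters.
        p = np.array([(u - cx) * z / f, (v - cy) * z / f, z])  # to view space
        q = R_delta @ p + t_delta                              # move camera
        return f * q[0] / q[2] + cx, f * q[1] / q[2] + cy      # to screen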


REFERENCES:
patent: 4736248 (1988-04-01), Rosebrock
patent: 5550543 (1996-08-01), Chen et al.
patent: 5604856 (1997-02-01), Guenter
patent: 5706417 (1998-01-01), Adelson
A. Kaup et al., “Efficient Prediction of Uncovered Background in Interframe Coding Using Spatial Extrapolation”, Proc. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-94), Apr. 19-22, 1994, ISBN 0-7803-1175-0.
