Method for interactively viewing full-surround image data...

Computer graphics processing and selective visual display systems – Computer graphics processing – Three-dimension

Reexamination Certificate


Details

Classification: C345S582000
Type: Reexamination Certificate
Status: active
Patent number: 07542035

ABSTRACT:
A method of modeling the visible world using full-surround image data includes steps of selecting a view point within a p-surface, selecting a direction of view within the p-surface, texture mapping the full-surround image data onto the p-surface such that the resulting texture map is substantially equivalent to projecting the full-surround image data onto the p-surface from the view point, thereby generating a texture-mapped p-surface, and displaying a predetermined portion of the texture-mapped p-surface. An apparatus for implementing the method is also described.
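The abstract describes texture mapping full-surround image data onto a p-surface and displaying a selected portion of it from an interior view point. The sketch below is a minimal illustration of that idea only, assuming the p-surface is a unit sphere, the full-surround data is an equirectangular panorama, and the view point is the sphere's center; the function name, the pinhole-camera model, and the nearest-neighbor sampling are illustrative assumptions, not the patented method.

```python
import numpy as np

def view_from_panorama(pano, yaw, pitch, fov_deg, out_w, out_h):
    """Render a perspective view from the center of a unit sphere (the
    assumed p-surface) textured with an equirectangular panorama `pano`
    (H x W x 3). yaw/pitch pick the direction of view; fov_deg is the
    horizontal field of view of the displayed portion."""
    h, w = pano.shape[:2]

    # Pinhole camera: one ray per output pixel, in camera coordinates.
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2.0)
    xs = np.arange(out_w) - out_w / 2.0 + 0.5
    ys = np.arange(out_h) - out_h / 2.0 + 0.5
    px, py = np.meshgrid(xs, ys)
    rays = np.stack([px, -py, np.full_like(px, f)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by the chosen direction of view
    # (pitch about +X, then yaw about +Y).
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    rot_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rot_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    dirs = rays @ (rot_yaw @ rot_pitch).T

    # Because the view point is the sphere's center, each ray meets the
    # sphere exactly along its own direction; converting that direction
    # to latitude/longitude and then to panorama pixel coordinates is the
    # texture-mapping step.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])        # -pi .. pi
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))   # -pi/2 .. pi/2
    u = ((lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    v = ((0.5 - lat / np.pi) * h).astype(int).clip(0, h - 1)

    # Nearest-neighbor sample of the selected portion of the texture map.
    return pano[v, u]

# Example: look 30 degrees to the right with a 90-degree field of view.
if __name__ == "__main__":
    pano = np.random.randint(0, 255, (512, 1024, 3), dtype=np.uint8)
    view = view_from_panorama(pano, yaw=np.radians(30), pitch=0.0,
                              fov_deg=90, out_w=640, out_h=480)
    print(view.shape)  # (480, 640, 3)
```

Interactivity then amounts to re-running the sampling step as the selected direction of view (or view point) changes, which is the behavior the title refers to.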

REFERENCES:
patent: 3725563 (1973-04-01), Woycechowsky
patent: 4667236 (1987-05-01), Dresdner
patent: 4728839 (1988-03-01), Coughlan et al.
patent: 4763280 (1988-08-01), Robinson et al.
patent: 4821209 (1989-04-01), Hempel et al.
patent: 4899293 (1990-02-01), Dawson et al.
patent: 5027287 (1991-06-01), Artigalas et al.
patent: 5185667 (1993-02-01), Zimmermann
patent: 5321776 (1994-06-01), Shapiro
patent: 5359363 (1994-10-01), Kuban et al.
patent: 5396284 (1995-03-01), Freeman
patent: 5434617 (1995-07-01), Bianchi
patent: 5495292 (1996-02-01), Zhang et al.
patent: 5666157 (1997-09-01), Aviv
patent: 5684937 (1997-11-01), Oxaal
patent: 5694533 (1997-12-01), Richards et al.
patent: 5923334 (1999-07-01), Luken
patent: 6028584 (2000-02-01), Chiang et al.
patent: 6049281 (2000-04-01), Osterweil
patent: 6147709 (2000-11-01), Martin et al.
patent: 6215519 (2001-04-01), Nayar et al.
patent: 6243099 (2001-06-01), Oxaal
patent: 6344852 (2002-02-01), Zhu
patent: 6509926 (2003-01-01), Mills et al.
patent: 6724421 (2004-04-01), Glatt
patent: 6757434 (2004-06-01), Miled et al.
patent: 6763068 (2004-07-01), Oktem
patent: 2003/0128756 (2003-07-01), Oktem
patent: 1 341 383 (2003-09-01), None
patent: WO 02/062056 (2002-08-01), None
Comaniciu, D., Ramesh, V., and Meer, P., "Real-Time Tracking of Non-Rigid Objects Using Mean Shift," IEEE Computer Vision and Pattern Recognition, vol. II, 2000, pp. 142-149.
Y. Yardimci, I. Yilmaz, A. E. Cetin, "Correlation Tracking Based on Wavelet Domain Information," Proceedings of SPIE vol. 5204, San Diego, Aug. 5-7, 2003.
A. M. Bagci, Y. Yardimci, A. E. Cetin, "Moving Object Detection Using Adaptive Subband Decomposition and Fractional Lower-Order Statistics in Video Sequences," Signal Processing, 82 (12): 1941-1947, Dec. 2002.
C. Stauffer, W. Grimson, “Adaptive Background Mixture Models for Real-Time Tracking.” Proc. IEEE CS Conf. on Computer Vision and Pattern Recognition, vol. 2, 1999, pp. 246-252.
“A System for Video Surveillance and Monitoring,” in Proc. American Nuclear Society (ANS) Eighth International Topical Meeting on Robotics and Remote Systems, Pittsburgh, PA, Apr. 25-29, 1999 by Collins, Lipton and Kanade.
AUBE, 12th International Conference on Automatic Fire Detection, 2001.
X. Zhou, R. Collins, T. Kanade, and P. Metes, “A Master-Slave System to Acquire Biometric Imagery of Humans at Distance”, ACM International Workshop on Video Surveillance, Nov. 2003.
