Depth information for auto focus using two pictures and...

Image analysis – Image transformation or preprocessing – Mapping 2-d image onto a 3-d surface


Details

Classification codes: C382S264000, C382S265000
Type: Reexamination Certificate
Status: active
Patent number: 07929801

ABSTRACT:
An imaging acquisition system that generates a depth map from two pictures of a three dimensional spatial scene is described. According to one aspect of the invention, the system generates the depth map based on the relative blur between the two pictures and the absolute blur contributed by the system. According to another aspect of the invention, the system calculates the depth map directly from the relative blur between the two pictures.
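The abstract outlines a two-picture depth-from-defocus scheme: the relative blur between two pictures taken at different focus settings, together with the absolute blur contributed by the optics, is mapped to depth. The sketch below is a minimal illustration of that general idea, assuming a Gaussian blur model; the function names, the re-blur matching strategy, and the calibration constants are hypothetical and are not taken from the patent.

import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def relative_blur_map(sharper, blurrier, sigmas=np.linspace(0.1, 5.0, 50), win=15):
    """Per pixel, find the Gaussian sigma that best re-blurs the sharper
    picture into the blurrier one within a local window (relative blur)."""
    best_sigma = np.zeros(sharper.shape, dtype=float)
    best_err = np.full(sharper.shape, np.inf)
    for s in sigmas:
        candidate = gaussian_filter(sharper, sigma=s)
        err = uniform_filter((candidate - blurrier) ** 2, size=win)  # local MSE
        better = err < best_err
        best_sigma[better] = s
        best_err[better] = err[better]
    return best_sigma  # relative blur in units of Gaussian sigma (pixels)

def blur_to_depth(rel_sigma, sigma_abs=0.5, k=1.0, focus_distance=1.0):
    """Toy monotonic mapping from blur to depth. sigma_abs stands in for the
    absolute blur contributed by the system; k and focus_distance are
    hypothetical calibration values a real system would have to measure."""
    return focus_distance * (1.0 + k * (rel_sigma + sigma_abs))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.random((128, 128))
    pic1 = gaussian_filter(scene, sigma=1.0)  # picture at focus setting 1
    pic2 = gaussian_filter(scene, sigma=2.0)  # picture at focus setting 2
    depth = blur_to_depth(relative_blur_map(pic1, pic2))
    print(depth.shape, float(depth.min()), float(depth.max()))

In this toy run the true relative blur is roughly sqrt(2^2 - 1^2), about 1.7 pixels everywhere, since Gaussian blur variances add; a real system would calibrate the blur-to-depth mapping against its lens model rather than the linear placeholder used here.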

REFERENCES:
patent: 4751570 (1988-06-01), Robinson
patent: 4947347 (1990-08-01), Sato
patent: 4965840 (1990-10-01), Subbarao
patent: 5148209 (1992-09-01), Subbarao
patent: 5212516 (1993-05-01), Yamada et al.
patent: 5365597 (1994-11-01), Holeva
patent: 5432331 (1995-07-01), Wertheimer
patent: 5577130 (1996-11-01), Wu
patent: 5604537 (1997-02-01), Yamazaki et al.
patent: 5703637 (1997-12-01), Miyazaki et al.
patent: 5752100 (1998-05-01), Schrock
patent: 5793900 (1998-08-01), Nourbakhsh et al.
patent: 6023056 (2000-02-01), Fiete et al.
patent: 6130417 (2000-10-01), Hashimoto
patent: 6177952 (2001-01-01), Tabata et al.
patent: 6219461 (2001-04-01), Wallack
patent: 6229913 (2001-05-01), Nayar et al.
patent: 6677948 (2004-01-01), Wasserman et al.
patent: 6683652 (2004-01-01), Ohkawara et al.
patent: 6829383 (2004-12-01), Berestov
patent: 6876776 (2005-04-01), Recht
patent: 6891966 (2005-05-01), Chen
patent: 6925210 (2005-08-01), Herf
patent: 7019780 (2006-03-01), Takeuchi et al.
patent: 7035451 (2006-04-01), Harman et al.
patent: 7303131 (2007-12-01), Carlson et al.
patent: 7340077 (2008-03-01), Gokturk et al.
patent: 2003/0067536 (2003-04-01), Boulanger et al.
patent: 2003/0231792 (2003-12-01), Zhang et al.
patent: 2004/0027450 (2004-02-01), Yoshino
patent: 2004/0036763 (2004-02-01), Swift et al.
patent: 2004/0125228 (2004-07-01), Dougherty
patent: 2004/0131348 (2004-07-01), Ohba et al.
patent: 2005/0104969 (2005-05-01), Schoelkopf et al.
patent: 2005/0105823 (2005-05-01), Aoki
patent: 2005/0220358 (2005-10-01), Blonde et al.
patent: 2005/0265580 (2005-12-01), Antonucci et al.
patent: 2006/0120706 (2006-06-01), Cho et al.
patent: 2006/0221179 (2006-10-01), Seo et al.
patent: 2006/0285832 (2006-12-01), Huang
patent: 2007/0040924 (2007-02-01), Cho et al.
patent: 10108152 (1998-04-01), None
patent: 2004048644 (2004-12-01), None
Darrell et al., "Pyramid Based Depth from Focus", 1988, pp. 504-509.
Ohba, et al., “Real-Time Micro Environmental Observation with Virtual Reality”, 2000, International Conference on Pattern Recognition, vol. 4, pp. 487-490.
Ohba, et al., “Microscopic Vision System with All-in-focus and Depth Images”, Dec. 2003, Machine Vision and Applications, vol. 15, Issue 2, pp. 55-62.
Pettersson, Niklas, "Online Stereo Calibration using FPGAs", http://ice.se/publications/pettersson_thesis_2005.pdf, 2005, pp. 1-6.
Ghita, Ovidiu and Whelan, Paul, "Real-Time 3D Estimation Using Depth from Defocus", www.vsq.dcu.ie/papers/vision_2000.pdf, 2000, Third Quarter, vol. 16, No. 3, pp. 1-6.
Blickgesteuerte PC-Steuerung [Gaze-Controlled PC Control], Forschung & Entwicklung, XP-000725834, Publication Date: Aug. 4, 1997, pp. 76-78.
21st Century 3D: 3DVX3 Press Release, "21st Century 3D Introduces Uncompressed 4:4:4 Stereoscopic Camera System 3DVX3", San Jose Convention Center, Jan. 18, 2006, SPIE Stereoscopic Displays and Applications Conference.
Iizuka, Keigo, "Three-dimensional Camera Phone", Applied Optics IP, vol. 43, pp. 6285-6292, Dec. 2004, Smithsonian/NASA ADS Physics Abstract Service, Bibliographic Code: 2004ApOpt..43.6285I.
Hamaguchi, Tadahiko; Fujii, Toshiaki; Honda, Toshio (Telecommunications Advancement Organization of Japan; Nagoya Univ.; Chiba Univ.), "Real-time view interpolation system for a super multiview 3D display: processing implementation and evaluation", Proc. SPIE vol. 4660, pp. 105-115, Stereoscopic Displays and Virtual Reality Systems IX, Andrew J. Woods, John O. Merritt, Stephen A. Benton, Mark T. Bolas, Eds., May 2002, Bibliographic Code: 2002SPIE.4660..105H.
Iizuka, Keigo, "Using cellophane to convert a liquid crystal display screen into a three-dimensional display (3D laptop computer and 3D camera phone)", 3D Displays, Department of Electrical & Computer Engineering, University of Toronto, 35 St. George Street, Toronto, Ontario, Canada M5S 1A4, available online since Aug. 2003.
Eugene Hecht, Optics 3rd Edition, Addison-Wesley, The Propagation of Light, Chapter 4, p. 126.
Berthold Klaus Paul Horn, "Robot Vision", 1986, pp. 1-509.
Tony Lindeberg, “Scale-Space Theory: A Basic Tool for Analysing Structures At Different Scales”, Journal of Applied Statistics, vol. 21, No. 2, pp. 225-270, 1994.
Alex Paul Pentland, “A New Sense for Depth of Field”, 1987, pp. 1-15, IEEE.
Shang-Hong Lai, Chang-Wu Fu and Shyang Chang, “A Generalized Depth Estimation Algorithm with a Single Image”, 1992, pp. 405-411, IEEE.
John Ens and Peter Lawrence, “An Investigation of Methods for Determining Depth from Focus”, 1993, pp. 97-108, IEEE.
Gopal Surya and Murali Subbarao, “Depth from Defocus by Changing Camera Aperture: A Spatial Domain Approach”, 1993, pp. 61-67, IEEE.
Mats Gokstorp, “Computing depth from out-of-focus blur using a local frequency representation”, 1994, pp. 153-158, IEEE.
Gunther Schneider, Bernard Heit, Johannes Honig, and Jacques Brémont, “Monocular Depth Perception by Evaluation of the Blur in Defocused Images”, 1994, pp. 116-119, IEEE.
Murali Subbarao, Tse-Chung Wei, and Gopal Surya, “Focused Image Recovery from Two Defocused Images Recorded with Different Camera Settings”, 1995, pp. 1613-1628, IEEE.
Shree K. Nayar, Masahiro Watanabe and Minori Noguchi, "Real-Time Focus Range Sensor", 1996, pp. 1186-1198.
Masahiro Watanabe and Shree K. Nayar, “Minimal Operator Set for Passive Depth from Defocus”, 1996, pp. 431-438, IEEE.
Johannes Honig, Bernard Heit and Jacques Bremont, “Visual Depth Perception Based on Optical Blur”, 1996, pp. 721-724, IEEE.
A.N. Rajagopalan and S. Chaudhuri, “A Variational Approach to Recovering Depth From Defocused Images”, 1997, pp. 1158-1164, IEEE.
D. Ziou, S. Wang and J. Vaillancourt, “Depth from Defocus Using the Hermite Transform”, 1998, pp. 958-962, IEEE.
Shinsaku Hiura and Takashi Matsuyama, "Depth Measurement by the Multi-Focus Camera", 7 pgs., Department of Science and Technology, Japan, 1998.
Christophe Simon and Frederique Bicking, “Estimation of depth on thick edges from sharp and blurred images”, 2002, pp. 323-328, IEEE.
Ovidiu Ghita, Paul F. Whelan and John Mallon, "Computational approach for depth from defocus", Apr.-Jun. 2005, pp. 023021-1 to 023021-8, vol. 14(2), Journal of Electronic Imaging.
Tony Lindeberg, "On the axiomatic foundations of linear scale-space: Combining semi-group structure with causality vs. scale invariance", pp. 1-24, 1994, Computational Vision and Active Perception Laboratory (CVAP), Kluwer Academic, Sweden.

Profile ID: LFUS-PAI-O-2634410
