Focus condition detecting device

Patent


Details

Classification: G03B 13/36

Type: Patent

Status: active

Patent number: 055394949

ABSTRACT:
An image sensor having a plurality of photoelectric devices outputs signal trains representing an incoming image of an object. A microprocessor divides each of the output signal trains of the image sensor into a plurality of ranges. A plurality of assembly errors, corresponding respectively to the plurality of ranges, are pre-stored in a memory. On the basis of the output signal trains, the microprocessor calculates the center-of-gravity position of the contrast of the object image for each of the plurality of ranges, and from it a compensation amount corresponding to each range. The microprocessor then determines a defocus amount of the photographing lens in each of the plurality of ranges on the basis of the output signal trains and the corresponding compensation amount.
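The per-range scheme in the abstract can be sketched in code. This is an illustrative reading, not the patented implementation: the contrast measure (absolute difference of adjacent sensor outputs), the compensation model (assembly error scaled by the centroid's offset from the range center), and all function and parameter names below are assumptions.

```python
def contrast_centroid(signals):
    """Center-of-gravity position of contrast within one range.

    Contrast at each position is taken as the absolute difference
    between adjacent sensor outputs (an assumed measure).
    """
    contrasts = [abs(b - a) for a, b in zip(signals, signals[1:])]
    total = sum(contrasts)
    if total == 0:
        return None  # no contrast in this range: centroid undefined
    return sum(i * c for i, c in enumerate(contrasts)) / total


def defocus_per_range(signal_train, range_bounds, assembly_errors, raw_defocus):
    """Apply a range-specific compensation to each raw defocus value.

    signal_train    -- one output signal train from the image sensor
    range_bounds    -- (start, end) index pairs dividing the train into ranges
    assembly_errors -- pre-stored assembly error per range (from memory)
    raw_defocus     -- uncompensated defocus estimate per range
    """
    results = []
    for (start, end), error, defocus in zip(range_bounds,
                                            assembly_errors, raw_defocus):
        centroid = contrast_centroid(signal_train[start:end])
        if centroid is None:
            results.append(None)
            continue
        # Assumed model: compensation grows with the contrast centroid's
        # distance from the geometric center of the range.
        center = (end - start - 1) / 2.0
        compensation = error * (centroid - center)
        results.append(defocus - compensation)
    return results
```

Under these assumptions, each range gets its own compensation derived from where the object's contrast actually falls within it, so a single lens can yield different defocus corrections across the field.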

REFERENCES:
patent: 4561749 (1985-12-01), Utagawa
patent: 4748321 (1988-05-01), Ishida et al.
patent: 4768052 (1988-08-01), Hamada et al.
patent: 4922279 (1990-05-01), Hamada et al.
patent: 4974007 (1990-11-01), Yoshida
patent: 4977311 (1990-12-01), Kusaka et al.
patent: 5068682 (1991-11-01), Utagawa
patent: 5138357 (1992-08-01), Utagawa
patent: 5138360 (1992-08-01), Yamasaki
patent: 5410383 (1995-04-01), Kusaka et al.


Profile ID: LFUS-PAI-O-717073
