Method of and apparatus for discriminating sharp edge transition

Radiant energy – Photocells; circuits and apparatus – Optical or pre-photocell system

Patent


Details

Class: 356/386 (US); G01B 11/10 (IPC)

Type: Patent

Status: active

Number: 046970886

ABSTRACT:
A method of and apparatus for discriminating sharp edge transitions produced during optical scanning, as by a CCD or the like, of differently reflective regions of a surface, such as copper conductors and resist background on printed circuit boards and in similar applications. The slope and amplitude of the desired edge-transition regions are distinguished from unwanted reflections from other regions by delaying the camera sampling signals by N successive samples, subtracting each camera scan sampling signal from the Nth-previous sample signal to produce a large difference signal only at the edge transitions, and adding that difference to the camera output signals to provide a distinctive boost to the edge-transition signals. The boosted signals are then thresholded, in a manner that ensures rejection of signals from unwanted or spurious reflection regions, into binary output signals unambiguously identifying the edge transitions.
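The delay-subtract-boost-threshold scheme described in the abstract can be sketched in a few lines. This is a minimal illustration, not the patented circuit: the function name, the parameter names `n` and `threshold`, and the clamping of the delay line at the start of the scan are all assumptions made for the example.

```python
def boost_edges(scan, n=4, threshold=1.5):
    """Sketch of the abstract's method: delay the scan by N samples,
    subtract to form a difference that is large only across a sharp
    edge, add the difference back as a boost, then threshold to binary.
    Parameter names and start-of-scan clamping are illustrative choices."""
    # Delay line: the Nth-previous sample (clamped at the scan start).
    delayed = [scan[max(i - n, 0)] for i in range(len(scan))]
    # Difference is large only where an edge falls within the N-sample window;
    # slowly varying reflectance from other regions yields a small difference.
    diff = [s - d for s, d in zip(scan, delayed)]
    # Add the difference back to boost edge samples, then threshold to binary.
    return [1 if s + d > threshold else 0 for s, d in zip(scan, diff)]
```

For a scan stepping from a dark resist level (0.2) to a bright copper level (1.0), only the samples at the transition exceed the threshold, since elsewhere the difference term is zero and the raw level alone stays below it.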

REFERENCES:
patent: 4269515 (1981-05-01), Altman
patent: 4332475 (1982-06-01), Demarest
patent: 4430750 (1984-02-01), Koellensperger

Profile ID: LFUS-PAI-O-1590744
