Neural networks for machine vision

Image analysis – Histogram processing – For setting a threshold

Details

Classification: 382/54, G06K 9/48

Type: Patent
Status: active
Number: 048037362

ABSTRACT:
Network interactions within a Boundary Contour (BC) System, a Feature Contour (FC) System, and an Object Recognition (OR) System are employed to provide a computer vision system capable of recognizing emerging segmentations. The BC System is defined by a hierarchy of orientationally tuned interactions, which can be divided into two successive subsystems called the OC filter and the CC loop. The OC filter contains oriented receptive fields or masks, which are sensitive to different properties of image contrasts. The OC filter generates inputs to the CC loop, which contains successive stages of spatially short-range competitive interactions and spatially long-range cooperative interactions. Feedback between the competitive and cooperative stages synthesizes a global context-sensitive segmentation from among the many possible groupings of local featural elements.
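The abstract describes a staged architecture: oriented contrast masks (the OC filter) feed a loop that alternates spatially short-range competition with spatially long-range cooperation (the CC loop), with feedback between the two stages. The Python/NumPy sketch below only illustrates that staged structure; the mask profiles, pooling kernel sizes, divisive-normalization form, and iteration count are assumptions chosen for readability, not values or equations taken from the patent.

# Illustrative sketch of the staged OC-filter / CC-loop architecture described
# in the abstract.  Kernel shapes, sizes, and iteration counts are assumptions.

import numpy as np
from scipy.ndimage import convolve

def oriented_masks(size=9, n_orient=4):
    """Build simple odd-symmetric (edge-like) oriented masks (assumed Gabor-like)."""
    ys, xs = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
    masks = []
    for k in range(n_orient):
        theta = np.pi * k / n_orient
        u = xs * np.cos(theta) + ys * np.sin(theta)       # along the orientation
        v = -xs * np.sin(theta) + ys * np.cos(theta)      # across the orientation
        env = np.exp(-(u**2 + v**2) / (2 * (size / 4) ** 2))
        masks.append(env * np.sin(2 * np.pi * v / size))  # odd-symmetric profile
    return masks

def oc_filter(image, masks):
    """OC-filter stage: rectified responses of oriented contrast-sensitive masks."""
    return np.stack([np.abs(convolve(image, m, mode='nearest')) for m in masks])

def cc_loop(responses, n_iter=5):
    """CC-loop stage: alternate short-range competition with long-range cooperation."""
    local = np.ones((3, 3)) / 9.0          # short-range pool (assumed 3x3)
    longrange = np.ones((15, 15)) / 225.0  # long-range pool (assumed 15x15)
    act = responses.copy()
    for _ in range(n_iter):
        # Short-range competition: divisive normalization by nearby activity
        # pooled across all orientations (a stand-in for the competitive stages).
        pooled = convolve(act.sum(axis=0), local, mode='nearest')
        competed = act / (1.0 + pooled)
        # Long-range cooperation: spatially broad oriented support, fed back to
        # the competitive stage on the next iteration.
        cooperation = np.stack([convolve(c, longrange, mode='nearest')
                                for c in competed])
        act = responses + cooperation      # feedback combines input with grouping
    return act

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((64, 64))
    img[:, 32:] += 1.0                     # a vertical luminance step to segment
    boundaries = cc_loop(oc_filter(img, oriented_masks()))
    print(boundaries.shape)                # (n_orient, 64, 64)

In this sketch the feedback is a single additive pass per iteration; the patent's feedback between competitive and cooperative stages is richer, but the sequence of stages mirrors the one named in the abstract.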

REFERENCES:
patent: 4083035 (1978-04-01), Riganati et al.
patent: 4658372 (1987-04-01), Witkin
M. A. Cohen et al., "Neural Dynamics of Brightness Perception: Features, Boundaries, Diffusion, and Resonance," Perception and Psychophysics, vol. 36, no. 5, May 1984, pp. 428-456.
