Machine vision process and apparatus for reading a plurality of separated figures



Details

U.S. classes: 364/489; 364/560; 358/101; 382/8
International class: G06K 9/00
Type: Patent
Status: Active
Patent number: 4866629

ABSTRACT:
A machine vision process and apparatus for reading a plurality of separated figures, such as the hole-position diagram of a printed-circuit board. The apparatus comprises an X-Y table that can be moved in a plane and positioned at a given location, two photographing devices such as cameras mounted over the X-Y table, and an image processing unit. The X-Y table carries the diagram so that a first camera can photograph a large area while a second camera photographs a small, detailed separated figure. The image signals are transmitted to the image processing unit, which finds the center coordinates of every separated figure and classifies each figure by size and shape, thereby deriving the control data of size, position, and moving path for the actual production process.
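The abstract describes the processing pipeline only in outline; the patent text here does not disclose the actual algorithms. As a minimal illustrative sketch of the step that finds the center coordinates of every separated figure and classifies figures by size and shape, the Python snippet below uses OpenCV connected-component analysis, then orders the detected hole centers into a simple moving path. The file name hole_diagram.png, the Otsu thresholding, the shape-classification rules, and the nearest-neighbour path heuristic are all assumptions made for illustration, not the patented method.

import cv2
import numpy as np

def read_separated_figures(binary: np.ndarray):
    """Return (cx, cy, area, shape) for each separated (white-on-black) figure."""
    # Connected-component labelling yields per-figure bounding boxes,
    # pixel areas, and centroids in one call.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(binary, connectivity=8)
    figures = []
    for i in range(1, n):                          # label 0 is the background
        x, y, w, h, area = stats[i]
        cx, cy = centroids[i]
        extent = area / float(w * h)               # fill ratio of the bounding box
        square_bbox = abs(w - h) <= 0.15 * max(w, h)
        # A filled circle covers ~pi/4 of a square bounding box; a filled
        # rectangle covers ~all of it. These thresholds are guesses.
        if square_bbox and abs(extent - np.pi / 4) < 0.10:
            shape = "circle"                       # e.g. a drilled hole
        elif extent > 0.90:
            shape = "rectangle"
        else:
            shape = "other"
        figures.append((float(cx), float(cy), int(area), shape))
    return figures

def greedy_moving_path(centers, start=(0.0, 0.0)):
    """Order centers by a nearest-neighbour heuristic, as a stand-in for the
    'moving path' control data (the patent does not say how it is computed)."""
    remaining = list(centers)
    path, pos = [], np.asarray(start)
    while remaining:
        dists = [np.hypot(*(np.asarray(c) - pos)) for c in remaining]
        nxt = remaining.pop(int(np.argmin(dists)))
        path.append(nxt)
        pos = np.asarray(nxt)
    return path

if __name__ == "__main__":
    # hole_diagram.png is a hypothetical scan of a hole-position diagram.
    img = cv2.imread("hole_diagram.png", cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    figures = read_separated_figures(binary)
    holes = [(cx, cy) for cx, cy, _, shape in figures if shape == "circle"]
    for cx, cy in greedy_moving_path(holes):
        print(f"move table to ({cx:.1f}, {cy:.1f})")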

REFERENCES:
patent: 4596037 (1986-06-01), Bouchard et al.
patent: 4680627 (1987-07-01), Sase et al.
patent: 4720870 (1988-01-01), Billiotte et al.
patent: 4754329 (1988-06-01), Lindsay et al.
patent: 4757550 (1988-07-01), Uga
