System and method for reading and decoding optical codes...

Registers – Coded record sensors – Particular sensor structure

Reexamination Certificate


Details

Classification: C235S462060, C235S462110

Status: active

Patent number: 07028901

ABSTRACT:
An optical code reading system and method are provided for reading and decoding an optical code. The system includes a plurality of light sources, a color image sensor, a processor and a decoder. Each light source produces a unique wavelength/color of light to illuminate the optical code, such as a direct mark optical code. The image sensor detects the reflected light from the optical code and generates an integrated multi-colored image. The processor separates the integrated image into individual color channels, where each color channel includes data representative of the imaged optical code in one color. The processor analyzes the contrast for each color channel and determines which color channel has the optimum contrast. The data corresponding to the color channel having the optimum contrast is then decoded by a decoder.
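The channel-selection step described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the function name `best_contrast_channel` is hypothetical, NumPy is assumed for the image data, and Michelson contrast is used as the per-channel metric since the abstract does not specify how contrast is measured.

```python
import numpy as np

def best_contrast_channel(rgb_image):
    """Return (channel_index, channel_data) for the channel with
    the highest contrast in an H x W x C multi-color image.

    Contrast here is Michelson contrast, (max - min) / (max + min),
    a simple stand-in for whatever metric the actual system uses.
    """
    contrasts = []
    for c in range(rgb_image.shape[2]):
        channel = rgb_image[:, :, c].astype(float)
        lo, hi = channel.min(), channel.max()
        # Guard against a uniformly black channel (divide by zero).
        contrasts.append((hi - lo) / (hi + lo) if (hi + lo) > 0 else 0.0)
    best = int(np.argmax(contrasts))
    return best, rgb_image[:, :, best]
```

A decoder would then be handed only the winning channel's data, e.g. `idx, data = best_contrast_channel(img)` followed by decoding `data`.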

REFERENCES:
patent: 5361158 (1994-11-01), Tang
patent: 5773808 (1998-06-01), Laser
patent: 2003/0062413 (2003-04-01), Gardiner et al.
patent: 0516927 (1992-03-01), None
