Synthesis decoding and methods of use thereof

Registers – Coded record sensors – Particular sensor structure

Reexamination Certificate


Details

U.S. Classes: C235S462010, C235S462150
Status: active
Patent number: 07416125

ABSTRACT:
Systems and methods for decoding optical images of objects of interest when such objects lie outside the conventional working range of an imaging system, or when such objects are degraded. The systems and methods compare computed signals, corresponding to the distorted images that known objects such as barcodes would produce when viewed outside the working range or when degraded, with actual signals obtained from the object of interest. In one embodiment the computed signals are recorded in a lookup table. An advantage of the disclosed systems and methods is a smooth, monotonic probability of correctly decoding the image of interest, even beyond the working distance. A handheld or portable apparatus useful for practicing the invention is also disclosed.
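The lookup-table idea in the abstract can be sketched as follows. This is a toy illustration, not the patent's actual method: the bar/space patterns, the Gaussian-blur distortion model, and the correlation-based matching are all hypothetical stand-ins chosen to show the precompute-then-compare structure.

```python
import numpy as np

# Hypothetical bar/space patterns (1 = bar, 0 = space). Real symbologies
# such as UPC/EAN define their own per-digit module patterns.
IDEAL_PATTERNS = {
    "0": np.array([0, 0, 0, 1, 1, 0, 1], dtype=float),
    "1": np.array([0, 0, 1, 1, 0, 0, 1], dtype=float),
    "2": np.array([0, 0, 1, 0, 0, 1, 1], dtype=float),
}

def blur(signal, sigma):
    """Model out-of-focus distortion as convolution with a Gaussian PSF."""
    radius = min(len(signal) // 2, max(1, int(3 * sigma)))
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(signal, kernel, mode="same")

def build_lookup_table(sigmas=(0.5, 1.0)):
    """Record the computed (distorted) signal for each symbol and blur level."""
    return {
        (symbol, sigma): blur(pattern, sigma)
        for symbol, pattern in IDEAL_PATTERNS.items()
        for sigma in sigmas
    }

def decode(observed, table):
    """Pick the symbol whose precomputed signal best correlates with `observed`."""
    def score(entry):
        a, b = entry - entry.mean(), observed - observed.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b) / denom if denom else -1.0
    best_symbol, _ = max(table, key=lambda key: score(table[key]))
    return best_symbol

table = build_lookup_table()
rng = np.random.default_rng(0)
observed = blur(IDEAL_PATTERNS["2"], 1.0) + rng.normal(0.0, 0.01, 7)
print(decode(observed, table))  # recovers "2" under this toy model
```

Because the table holds the signals the symbols *would* produce when distorted, matching degrades gracefully as blur increases, consistent with the smooth, monotonic decode probability the abstract claims.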

REFERENCES:
patent: 4567610 (1986-01-01), McConnell
patent: 4685143 (1987-08-01), Choate
patent: 4757551 (1988-07-01), Kobayashi et al.
patent: 4805224 (1989-02-01), Koezuka et al.
patent: 4879456 (1989-11-01), Cherry et al.
patent: 4962423 (1990-10-01), Yamada et al.
patent: 5134272 (1992-07-01), Tsuchiya et al.
patent: 5182777 (1993-01-01), Nakayama et al.
patent: 5227617 (1993-07-01), Christopher et al.
patent: 5235167 (1993-08-01), Dvorkis et al.
patent: 5276315 (1994-01-01), Surka
patent: 5335290 (1994-08-01), Cullen et al.
patent: 5373147 (1994-12-01), Noda
patent: 5471041 (1995-11-01), Inoue et al.
patent: 5475768 (1995-12-01), Diep et al.
patent: 5481098 (1996-01-01), Davis et al.
patent: 5504319 (1996-04-01), Li et al.
patent: 5510603 (1996-04-01), Hess et al.
patent: 5524065 (1996-06-01), Yagasaki
patent: 5550363 (1996-08-01), Obata
patent: 5591952 (1997-01-01), Krichever et al.
patent: 5644765 (1997-07-01), Shimura et al.
patent: 5739518 (1998-04-01), Wang
patent: 5793899 (1998-08-01), Wolff et al.
patent: 5796868 (1998-08-01), Dutta-Choudhury
patent: 5845007 (1998-12-01), Ohashi et al.
patent: 5867277 (1999-02-01), Melen et al.
patent: 5889270 (1999-03-01), van Haagen et al.
patent: 5943441 (1999-08-01), Michael
patent: 5953130 (1999-09-01), Benedict et al.
patent: 5987172 (1999-11-01), Michael
patent: 5992753 (1999-11-01), Xu
patent: 6000612 (1999-12-01), Xu
patent: 6002978 (1999-12-01), Silver et al.
patent: 6005978 (1999-12-01), Garakani
patent: 6035066 (2000-03-01), Michael
patent: 6575367 (2003-06-01), Longacre, Jr.
patent: 6601772 (2003-08-01), Rubin et al.
patent: 2001/0015378 (2001-08-01), Watanabe et al.
patent: 2006/0113387 (2006-06-01), Baker et al.
patent: 55-115166 (1980-09-01), None
Internet article “Bar Code 1” by Adams Communications available at the website http://www.adams1.com/pub/russadam/upccode.html and dated Dec. 15, 2001 at the Internet archival site http://www.archive.org.
Jacob Rabinow, Developments in Character Recognition Machines at Rabinow Engineering Company, Optical Character Recognition, 1962, Spartan Books, USA, pp. 27-50.
George S. Blasiak, PTO/SB/24, Express Abandonment Under 37 CFR 1.138, U.S. Appl. No. 11/439,893, filed May 24, 2006. Express Abandonment filed via facsimile on Sep. 13, 2006.

