Method of identifying object using portion of random pattern...

Registers – Records – Particular code pattern

Reexamination Certificate


Details

U.S. Class: C235S375000
Type: Reexamination Certificate
Status: active
Number: 08006914

ABSTRACT:
A method of identifying an object having coded data identifying a plurality of fiducials on a surface of the object and a random pattern superimposed with the coded data. The random pattern defines a fingerprint for the object. The method includes the steps of: receiving, in a computer system, fingerprint data from a data reader interacting with the surface, the fingerprint data identifying some of the random pattern and a fiducial; using the fiducial to identify a portion of the random pattern; and identifying, using the identified portion of the random pattern, an object identity.
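The abstract's three steps (receive reader data, use the fiducial to locate a portion of the random pattern, match that portion to an object identity) can be illustrated with a minimal sketch. All names, the 4×4 window size, and the digest-based registry below are illustrative assumptions, not the patented implementation:

```python
import hashlib

# Hypothetical registry mapping digests of known pattern portions to object IDs.
REGISTRY = {}

def register_object(object_id, portion_bits):
    """Store the digest of a known portion of an object's random pattern."""
    digest = hashlib.sha256(bytes(portion_bits)).hexdigest()
    REGISTRY[digest] = object_id

def identify(fingerprint_data):
    """Identify an object from reader data containing a fiducial position
    and nearby random-pattern bits (step order follows the abstract)."""
    # Step 1: the fiducial fixes a reference position, letting us pick out
    # a well-defined portion of the random pattern (here, a 4x4 window).
    anchor = fingerprint_data["fiducial"]    # e.g. (row, col) of the fiducial
    pattern = fingerprint_data["pattern"]    # dict: (row, col) -> bit
    portion = [pattern[(anchor[0] + dr, anchor[1] + dc)]
               for dr in range(4) for dc in range(4)]
    # Step 2: match the identified portion against registered objects.
    digest = hashlib.sha256(bytes(portion)).hexdigest()
    return REGISTRY.get(digest)
```

The fiducial matters because the random pattern alone has no coordinate frame: without a known anchor, the reader could not tell which portion of the pattern it captured, and the same bits could not be matched consistently against the registry.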

REFERENCES:
patent: 4864618 (1989-09-01), Sekendur
patent: 5051736 (1991-09-01), Djuknic et al.
patent: 5396559 (1995-03-01), McGrew
patent: 5442147 (1995-08-01), Burns et al.
patent: 5477012 (1995-12-01), Sekendur
patent: 5652412 (1997-07-01), Lazzouni et al.
patent: 5661506 (1997-08-01), Lazzouni et al.
patent: 5692083 (1997-11-01), Bennett et al.
patent: 5757918 (1998-05-01), Hopkins
patent: 5852434 (1998-12-01), Sekendur
patent: 5937110 (1999-08-01), Petrie et al.
patent: 6076734 (2000-06-01), Dougherty et al.
patent: 6259790 (2001-07-01), Takagi et al.
patent: 6330976 (2001-12-01), Dymetman et al.
patent: 6444377 (2002-09-01), Jotcham et al.
patent: 6964374 (2005-11-01), Djuknic et al.
patent: 7201323 (2007-04-01), Kotovich et al.
patent: 7392950 (2008-07-01), Walmsley et al.
patent: 7874494 (2011-01-01), Lapstun et al.
patent: 2002/0052794 (2002-05-01), Bhadra
patent: 2002/0194476 (2002-12-01), Lewis et al.
patent: 2003/0115162 (2003-06-01), Konick
patent: 2003/0195820 (2003-10-01), Silverbrook et al.
patent: 2004/0053011 (2004-03-01), Behm et al.
patent: 2004/0128190 (2004-07-01), Campo et al.
patent: 2004/0195341 (2004-10-01), Lapstun et al.
patent: 2004/0195342 (2004-10-01), Silverbrook et al.
patent: 2004/0229597 (2004-11-01), Patel
patent: 2005/0038758 (2005-02-01), Hilbush et al.
patent: 2005/0086585 (2005-04-01), Robert et al.
patent: 2005/0149442 (2005-07-01), Adams et al.
patent: 2005/0162455 (2005-07-01), Silverbrook
patent: 2005/0167480 (2005-08-01), Silverbrook et al.
patent: 2005/0185198 (2005-08-01), Silverbrook
patent: 2005/0247793 (2005-11-01), Silverbrook et al.
patent: 2005/0257045 (2005-11-01), Bushman et al.
patent: 2005/0289061 (2005-12-01), Kulakowski et al.
patent: 2006/0095778 (2006-05-01), He et al.
patent: 2006/0124726 (2006-06-01), Kotovich et al.
patent: 2006/0238334 (2006-10-01), Mangan et al.
patent: 2006/0282330 (2006-12-01), Frank et al.
patent: 10304805 (2004-08-01), None
patent: 2306669 (1997-05-01), None
patent: 2004-094907 (2004-03-01), None
patent: WO 99/18487 (1999-04-01), None
patent: WO 99/50787 (1999-10-01), None
Dymetman, M., and Copperman, M., "Intelligent Paper", in Electronic Publishing, Artist Imaging, and Digital Typography, Proceedings of EP '98, Mar./Apr. 1998, Springer Verlag LNCS 1375, pp. 392-406.
"TruckingMall, Inc. launches new fleet management systems for container trucking industry (New Services) (ModalTrack) (Brief Article)", Transport Technology Today, 41(1), May 2002.


Profile ID: LFUS-PAI-O-2781550
