Apparatus and method for image-classifying, and recording...

Image analysis – Pattern recognition – Classification


Details

Type: Reexamination Certificate
Status: active
Patent number: 07991234

ABSTRACT:
An image information-inputting unit inputs image information including position information indicating a position where an image was captured. A reference distance-calculating unit calculates a reference distance from a predetermined reference position utilizing the image information. An image information-classifying unit classifies the image information based on the reference distance.
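
The classification flow described in the abstract (read the capture position from each image, compute its distance to a predetermined reference position, group images by that distance) can be illustrated with a short sketch. The code below is only an illustration under stated assumptions, not the patented implementation: it assumes GPS latitude/longitude as the position information and a hypothetical set of distance thresholds, neither of which the abstract specifies.

    from dataclasses import dataclass
    from math import radians, sin, cos, asin, sqrt

    @dataclass
    class ImageInfo:
        """Image information including the position where the image was captured."""
        filename: str
        lat: float   # capture latitude in degrees (assumed GPS metadata)
        lon: float   # capture longitude in degrees (assumed GPS metadata)

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two lat/lon points."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    def classify_by_reference_distance(images, ref_lat, ref_lon, thresholds_km=(1.0, 10.0, 100.0)):
        """Group images into distance bands around a predetermined reference position.

        Each image's reference distance is the distance from (ref_lat, ref_lon) to the
        image's capture position; the image goes into the first band whose upper bound
        exceeds that distance. The thresholds are illustrative, not from the patent.
        """
        bands = {i: [] for i in range(len(thresholds_km) + 1)}
        for img in images:
            d = haversine_km(ref_lat, ref_lon, img.lat, img.lon)
            band = next((i for i, t in enumerate(thresholds_km) if d < t), len(thresholds_km))
            bands[band].append((img.filename, d))
        return bands

    if __name__ == "__main__":
        photos = [
            ImageInfo("home.jpg", 35.6586, 139.7454),
            ImageInfo("daytrip.jpg", 35.3606, 138.7274),
            ImageInfo("vacation.jpg", 43.0621, 141.3544),
        ]
        # Reference position chosen arbitrarily for the example (Tokyo Tower).
        groups = classify_by_reference_distance(photos, 35.6586, 139.7454)
        for band, items in groups.items():
            print(band, items)

Banding by the first threshold exceeded keeps the grouping stable as new images are added, since each image's class depends only on its own distance to the reference position.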

REFERENCES:
patent: 5057607 (1991-10-01), Zmijewski et al.
patent: 5966698 (1999-10-01), Pollin
patent: 6437797 (2002-08-01), Ota
patent: 6452612 (2002-09-01), Holtz et al.
patent: 6606411 (2003-08-01), Loui et al.
patent: 6819356 (2004-11-01), Yumoto
patent: 7145695 (2006-12-01), Endo et al.
patent: 7231088 (2007-06-01), Echigo et al.
patent: 10-254746 (1998-09-01), None
patent: 2000-217057 (2000-08-01), None
patent: 2001-228528 (2001-08-01), None
patent: 2002-183206 (2002-06-01), None
patent: 2002-191015 (2002-07-01), None
patent: 2003-58867 (2003-02-01), None
patent: 2004-120486 (2004-04-01), None
patent: 2005-039359 (2005-02-01), None
