Unsupervised learning of object categories from cluttered...

Image analysis – Pattern recognition – Classification


Details

Classification codes: C382S224000, C382S190000
Type: Reexamination Certificate
Status: active
Application number: 10066318

ABSTRACT:
Unsupervised learning of object categories from images is carried out using an automatic image recognition system. A plurality of training images are automatically analyzed using an interest operator that produces an indication of features. Those features are clustered using a vector quantizer. A model is learned from the clustered features using expectation maximization, which assesses a joint probability indicating which features are most relevant.
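The abstract describes a three-stage pipeline: an interest operator extracts candidate features from the training images, a vector quantizer clusters them, and expectation maximization fits a probabilistic model over the quantized features. The following Python sketch illustrates that pipeline using common stand-ins rather than the patent's actual components: a Harris corner detector (scikit-image) as the interest operator, k-means (scikit-learn) as the vector quantizer, and a Gaussian mixture fit by EM as the learned joint model. The helper names extract_patches and learn_category_model, and all parameter values, are illustrative assumptions, not taken from the patent.

import numpy as np
from skimage.feature import corner_harris, corner_peaks
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture


def extract_patches(image, patch_size=11):
    """Interest-operator stage: detect Harris corners and cut a fixed-size
    patch around each detection, flattened into a feature vector."""
    half = patch_size // 2
    peaks = corner_peaks(corner_harris(image), min_distance=half)
    patches = [
        image[r - half:r + half + 1, c - half:c + half + 1].ravel()
        for r, c in peaks
        if half <= r < image.shape[0] - half and half <= c < image.shape[1] - half
    ]
    return np.array(patches).reshape(-1, patch_size * patch_size)


def learn_category_model(images, n_clusters=50, n_parts=4):
    """Learn a category model from unlabeled training images."""
    # Stage 1: run the interest operator over every training image.
    features = np.vstack([extract_patches(im) for im in images])
    # Stage 2: vector-quantize the patches against a k-means codebook.
    codebook = KMeans(n_clusters=n_clusters, n_init=10).fit(features)
    quantized = codebook.cluster_centers_[codebook.predict(features)]
    # Stage 3: expectation maximization fits a joint probability model;
    # its responsibilities indicate which features are most relevant.
    model = GaussianMixture(n_components=n_parts, covariance_type="diag")
    model.fit(quantized)
    return codebook, model

After fitting, model.predict_proba(quantized) returns the EM responsibilities, a reasonable analogue of the relevance assessment the abstract mentions.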

REFERENCES:
patent: 5577135 (1996-11-01), Grajski et al.
patent: 5774576 (1998-06-01), Cox et al.
patent: 6111983 (2000-08-01), Fenster et al.
patent: 6633670 (2003-10-01), Matthews
patent: 6701016 (2004-03-01), Jojic et al.
M.C. Burl, P. Perona, “Recognition of Planar Object Classes,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, San Francisco, CA, Jun. 1996, pp. 223-230.
Castleman, Kenneth, Digital Image Processing, Prentice Hall, Englewood Cliffs, NJ, 1996.
Basri et al., "Clustering Appearances of 3D Objects," IEEE, Jun. 1998, entire document.


Profile ID: LFUS-PAI-O-3833563
