Feature-based detection and context discriminate...

Image analysis – Pattern recognition – Classification



Details

Classification codes: C382S103000, C382S132000, C382S228000
Type: Reexamination Certificate
Status: active
Patent number: 06990239

ABSTRACT:
A method is provided for the detection and classification of targets in a digital image of a structure having known characteristics. In general, windowed portions of the image are evaluated in context with the entire image and in terms of their location in the image. More specifically, a scoring scheme is used to identify relevant windows, with the relevance of each window evaluated in terms of its location in the image and the known characteristics of the structure being imaged. Relevant windows satisfying a threshold criterion are grouped based on their relative location in the image. A group scoring scheme is then applied to each group to identify and classify targets.
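The abstract outlines a window-scoring pipeline: score each windowed portion in the context of the whole image, keep windows that exceed a threshold, group the surviving windows by relative location, and score each group to classify targets. The Python sketch below illustrates one way such a pipeline could be organized; it is not the patented method, and the helper names (score_window, classify_group), the group_radius parameter, and the simple proximity-based grouping are assumptions introduced here for illustration.

# Minimal sketch of a window-scoring detection pipeline, assuming hypothetical
# scoring and classification callables supplied by the caller.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Window:
    x: int          # window location in the image (column)
    y: int          # window location in the image (row)
    score: float    # relevance score for this window

def detect_targets(
    image,                      # 2-D array (list of rows) of pixel values
    window_size: int,
    score_window: Callable,     # hypothetical: scores a window given its patch,
                                # location, and the full image for context
    threshold: float,
    group_radius: float,        # hypothetical: max distance for grouping windows
    classify_group: Callable,   # hypothetical: maps a group of windows to a label
) -> list:
    """Score windows in context, keep those above threshold,
    group nearby windows, and classify each group."""
    h, w = len(image), len(image[0])
    relevant: List[Window] = []

    # Evaluate each windowed portion in terms of its location in the image.
    for y in range(0, h - window_size + 1, window_size):
        for x in range(0, w - window_size + 1, window_size):
            patch = [row[x:x + window_size] for row in image[y:y + window_size]]
            s = score_window(patch, x, y, image)
            if s >= threshold:
                relevant.append(Window(x, y, s))

    # Group relevant windows by relative location (simple proximity grouping).
    groups: List[List[Window]] = []
    for win in relevant:
        for group in groups:
            if any((win.x - g.x) ** 2 + (win.y - g.y) ** 2 <= group_radius ** 2
                   for g in group):
                group.append(win)
                break
        else:
            groups.append([win])

    # Apply a group scoring scheme to classify each group of windows.
    return [classify_group(group) for group in groups]

In such a sketch, score_window is where the known characteristics of the imaged structure and the window's position would be brought to bear, which is the contextual evaluation the abstract emphasizes.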

REFERENCES:
patent: 5787201 (1998-07-01), Nelson et al.
patent: 5982921 (1999-11-01), Alumot et al.
patent: 6026174 (2000-02-01), Palcic et al.
patent: 6137898 (2000-10-01), Broussard et al.
patent: 6353674 (2002-03-01), Dewaele
patent: 6493460 (2002-12-01), MacAulay et al.
patent: 6687397 (2004-02-01), DeYong et al.
patent: 2001/0031076 (2001-10-01), Campanini et al.
