Method and apparatus for determining feature points

Image analysis – Pattern recognition – Feature extraction

Details

U.S. classes: 382/199, 382/236, 348/413, 348/416; IPC: G06K 9/46
Type: Patent
Status: active
Patent number: 056944870

ABSTRACT:
A method for determining feature points comprises the steps of: (a) providing directional gradients and a gradient magnitude for each pixel in a video frame; (b) normalizing the directional gradients by dividing them by the gradient magnitude; (c) generating a first edge map containing the gradient magnitude for each pixel; (d) generating a second edge map containing the normalized directional gradients for each pixel; (e) dividing the first edge map into a plurality of blocks of an identical size; (f) providing, for each of the pixels included in each of the blocks, normalized directional gradients for a set of a predetermined number of pixels from the second edge map; (g) obtaining a variance for each of the pixels included in each of the blocks based on the provided normalized directional gradients; and (h) determining a feature point for each of the blocks based on the gradient magnitude and the variance corresponding to each of the pixels therein.
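
The abstract leaves several details open: the gradient operator used in step (a), the "predetermined set of pixels" in step (f), and how magnitude and variance are combined in step (h). The Python sketch below fills those gaps with assumptions (central differences via numpy.gradient, a square 5x5 neighborhood, and a magnitude-times-variance score) to make the pipeline concrete. It is an illustration of the claimed steps, not the patented implementation.

```python
import numpy as np

def determine_feature_points(frame, block_size=16, neighborhood=5, eps=1e-8):
    """Sketch of claimed steps (a)-(h): select one feature point per
    block from the gradient magnitude and the variance of normalized
    directional gradients around each pixel."""
    frame = np.asarray(frame, dtype=np.float64)

    # (a) directional gradients (central differences) and gradient magnitude
    gy, gx = np.gradient(frame)
    mag = np.sqrt(gx ** 2 + gy ** 2)          # (c) first edge map

    # (b), (d) second edge map: gradients normalized by the magnitude
    # (eps avoids division by zero in flat regions)
    nx = gx / (mag + eps)
    ny = gy / (mag + eps)

    h, w = frame.shape
    r = neighborhood // 2
    feature_points = []

    # (e) divide the first edge map into equal-size blocks
    for by in range(0, h - h % block_size, block_size):
        for bx in range(0, w - w % block_size, block_size):
            best_score, best_point = -np.inf, None
            for y in range(by, by + block_size):
                for x in range(bx, bx + block_size):
                    # (f) gather normalized gradients of the predetermined
                    # pixel set (here an assumed square neighborhood)
                    y0, y1 = max(0, y - r), min(h, y + r + 1)
                    x0, x1 = max(0, x - r), min(w, x + r + 1)
                    # (g) variance of the normalized directional gradients
                    var = np.var(nx[y0:y1, x0:x1]) + np.var(ny[y0:y1, x0:x1])
                    # (h) combine magnitude and variance; the product is an
                    # assumption, the abstract does not fix the combination
                    score = mag[y, x] * var
                    if score > best_score:
                        best_score, best_point = score, (y, x)
            feature_points.append(best_point)
    return feature_points

if __name__ == "__main__":
    # Example: one feature point per 16x16 block of a grayscale frame
    frame = np.random.rand(64, 64)
    print(determine_feature_points(frame)[:4])
```

Because the normalized gradients are unit-length direction vectors, a high variance in the neighborhood indicates diverse edge directions (corner-like structure), so the score favors strong, corner-like pixels over pixels on straight edges.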

REFERENCES:
patent: 4817174 (1989-03-01), Nakatani
patent: 4910786 (1990-03-01), Eichel
patent: 5144688 (1992-09-01), Bovik et al.

