Method for aligning gesture features of image

Image analysis – Pattern recognition

Reexamination Certificate

Details

US classification: C382S203000
Type: Reexamination Certificate
Status: active
Application number: 10628511

ABSTRACT:
A method for aligning gesture features of an image is disclosed. An input gesture image is captured, and a closed curve formed by the binary contour image of the gesture image is determined by processing the gesture image. A curvature scale space (CSS) image of the gesture image is drawn based on the closed curve. A convolution operation is performed on the sequence of the coordinate-peak set formed by the CSS image with a predefined function, and the coordinate with the maximal integration value is designated as a basis point for obtaining a feature parameter of the gesture image. Finally, the feature parameter of the gesture image is compared with the feature parameter of each of a plurality of reference gesture shapes, thereby determining the gesture shape corresponding to the gesture image.
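
The pipeline in the abstract can be unpacked step by step. The following is a minimal sketch, not the patented implementation: it assumes NumPy/SciPy, input contours resampled to a fixed number of points, a triangular (Bartlett) window standing in for the unspecified "predefined function", and Euclidean distance for the final comparison; all function names are hypothetical.

import numpy as np
from scipy.ndimage import gaussian_filter1d

def curvature_zero_crossings(contour, sigma):
    # Smooth the closed (x, y) contour with a wrap-around Gaussian, then
    # return the indices where the signed curvature changes sign.
    x = gaussian_filter1d(contour[:, 0].astype(float), sigma, mode="wrap")
    y = gaussian_filter1d(contour[:, 1].astype(float), sigma, mode="wrap")
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    kappa = (dx * ddy - dy * ddx) / ((dx * dx + dy * dy) ** 1.5 + 1e-12)
    sign = np.sign(kappa)
    return np.nonzero(sign != np.roll(sign, 1))[0]

def css_peaks(contour, sigmas):
    # Approximate CSS peak heights: for each arc-length position, the largest
    # smoothing scale at which a curvature zero-crossing still occurs there.
    peaks = np.zeros(len(contour))
    for sigma in sigmas:
        for idx in curvature_zero_crossings(contour, sigma):
            peaks[idx] = max(peaks[idx], sigma)
    return peaks

def basis_point(peaks, window):
    # Circular convolution of the peak sequence with the predefined window;
    # the shift with the maximal integration value becomes the basis point.
    scores = [np.dot(np.roll(peaks, -s)[:len(window)], window)
              for s in range(len(peaks))]
    return int(np.argmax(scores))

def feature_parameter(contour, sigmas=np.arange(1.0, 16.0), window=np.bartlett(9)):
    # Start-point-invariant feature: the CSS peak sequence circularly
    # shifted so that the basis point comes first.
    peaks = css_peaks(contour, sigmas)
    return np.roll(peaks, -basis_point(peaks, window))

def match_gesture(contour, references):
    # Compare the aligned feature against each reference gesture's feature
    # (a dict mapping name -> feature vector of the same length).
    feat = feature_parameter(contour)
    return min(references, key=lambda name: np.linalg.norm(feat - references[name]))

Circularly shifting the peak sequence so the basis point comes first is what makes the comparison insensitive to where the contour trace begins; the window, scales, and distance used here are illustrative choices, not values taken from the patent.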

REFERENCES:
patent: 5454043 (1995-09-01), Freeman
patent: 5594810 (1997-01-01), Gourdol
patent: 6456728 (2002-09-01), Doi et al.
patent: 7068843 (2006-06-01), Chang et al.
Mokhtarian et al., "Robust and Efficient Shape Indexing through Curvature Scale Space," 1996, Dept. of Electrical Engineering, University of Surrey, England.

Profile ID: LFUS-PAI-O-3883493
