Systems and methods for processing boundary information of a...

Image analysis – Pattern recognition – Feature extraction

Reexamination Certificate


Details

C382S242000

active

07388988

ABSTRACT:
In one embodiment, the present invention is directed to a method for processing boundary information of a graphical object. The method may comprise: receiving a graphical image that comprises the graphical object, wherein the graphical object is defined by at least the boundary information; determining a plurality of vertices from the boundary information; and creating an approximated boundary utilizing at least the plurality of vertices, the graphical image, and a predetermined function that is operable to detect a contour between a pair of vertices by analyzing the graphical image.
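The claimed pipeline has two stages: select a sparse set of vertices from the dense boundary information, then reconstruct each inter-vertex span with a function that detects a contour between a pair of vertices by analyzing the image. The sketch below is a minimal illustration under stated assumptions, not the patented method itself: the vertex-selection step here uses Douglas-Peucker simplification (one common choice; the claim does not mandate a particular one), and `detect_contour` is a straight-line placeholder for the image-driven "predetermined function," which in practice might be a minimum-cost path search over gradient magnitude in the spirit of the Dijkstra and intelligent-scissors references cited below. All function names are hypothetical.

```python
import math

def _perp_dist(pt, a, b):
    """Perpendicular distance from point pt to segment a-b."""
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    t = ((x - x1) * dx + (y - y1) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(x - (x1 + t * dx), y - (y1 + t * dy))

def determine_vertices(boundary, tolerance=2.0):
    """Keep only vertices that deviate from a straight chord by more
    than `tolerance` pixels (Douglas-Peucker simplification)."""
    if len(boundary) < 3:
        return list(boundary)
    dmax, idx = 0.0, 0
    for i in range(1, len(boundary) - 1):
        d = _perp_dist(boundary[i], boundary[0], boundary[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= tolerance:
        return [boundary[0], boundary[-1]]
    left = determine_vertices(boundary[:idx + 1], tolerance)
    right = determine_vertices(boundary[idx:], tolerance)
    return left[:-1] + right  # drop shared middle vertex

def detect_contour(image, v1, v2):
    """Placeholder for the claim's 'predetermined function': returns a
    straight segment from v1 to v2. A real system would instead search
    `image` for the best path (e.g. lowest cumulative edge cost)."""
    (x1, y1), (x2, y2) = v1, v2
    n = max(abs(x2 - x1), abs(y2 - y1), 1)
    return [(round(x1 + (x2 - x1) * t / n), round(y1 + (y2 - y1) * t / n))
            for t in range(n + 1)]

def approximate_boundary(image, boundary, tolerance=2.0):
    """Full pipeline: vertices from boundary info, then a detected
    contour between each successive vertex pair."""
    vertices = determine_vertices(boundary, tolerance)
    approx = []
    for v1, v2 in zip(vertices, vertices[1:]):
        approx.extend(detect_contour(image, v1, v2)[:-1])
    approx.append(vertices[-1])
    return vertices, approx
```

For a boundary running straight along ten pixels, the vertex step collapses it to its two endpoints, and the contour step regenerates the span between them; a corner in the boundary survives as an extra vertex when the tolerance is tight enough.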

REFERENCES:
patent: 4843630 (1989-06-01), Catros et al.
patent: 5471535 (1995-11-01), Ikezawa et al.
patent: 5774595 (1998-06-01), Kim
patent: 5974175 (1999-10-01), Suzuki
patent: 6055337 (2000-04-01), Kim
patent: 6332034 (2001-12-01), Makram-Ebeid et al.
Huitao Luo and Alexandros Eleftheriadis, “Designing an Interactive Tool for Video Object Segmentation and Annotation,” ADVENT Group, Columbia University, Jul. 12, 1999.
Guido M. Schuster and Aggelos K. Katsaggelos, “An Optimal Polygonal Boundary Encoding Scheme in the Rate Distortion Sense,” IEEE Transactions on Image Processing, vol. 7, no. 1, Jan. 1998.
Schuster, G. M. et al., “Operationally Optimal Vertex-Based Shape Coding,” IEEE Signal Processing Magazine, IEEE, New York, NY, USA, vol. 15, no. 6, Nov. 1998, pp. 91-108, XP001066520.
Orange, C. M. et al., “Magnetic Contour Tracing,” Proceedings of the IEEE Workshop on Visualization and Machine Vision, Seattle, WA, USA, Jun. 24, 1994, IEEE Computer Society, Los Alamitos, CA, USA, pp. 33-44, XP 0100099598.
Luo, H. et al., “An Interactive Authoring System for Video Object Segmentation and Annotation,” Signal Processing: Image Communication, Elsevier Science Publishers, Amsterdam, NL, vol. 17, no. 7, Aug. 2002, pp. 559-572, XP004372662.
“AMOS: An Active System for MPEG-4 Video Object Segmentation;” Zhong et al.; IEEE International Conference on Image Processing; Oct. 1998; 5 pp.
“Interactive Segmentation with Intelligent Scissors;” Mortensen et al.; Graphical Models and Image Processing, vol. 60, Jun. 1998; pp. 349-384.
“Partition-Based Image Representation as Basis for User-Assisted Segmentation;” Marques et al.; IEEE International Conference on Image Processing; Oct. 2001; 4 pp.
“Dynamic Programming for Detecting, Tracking, and Matching Deformable Contours;” Geiger et al.; IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 7, No. 3; Mar. 1995; pp. 294-302.
“Using Dynamic Programming for Solving Variational Problems in Vision;” Amini et al.; IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, No. 9; Sep. 1990; pp. 855-867.
“Snakes: Active Contour Models;” Kass et al.; International Journal of Computer Vision, vol. 1, no. 4, 1987; pp. 321-331.
“Introduction to Algorithms;” Cormen et al.; “Dijkstra's Algorithm;” Section 24.3; MIT Press; 2001; pp. 595-599.

