System and method for detecting a list in ink input

Image analysis – Pattern recognition


Details

Type: Reexamination Certificate

Status: active

Application number: 10850680

ABSTRACT:
A system and method for detecting a list in ink input is provided. A detector may detect a list, such as a bulleted or numbered list of items, in ink input. A group of lines may first be selected as a candidate list. Indentation level clustering and bullet detection may then be performed to determine the structure of the list. Bullet detection may be performed by detecting bullet partners, which are pairs of lines at the same indentation level that begin with bullet candidates having similar features. The features of the bullet candidates in a pair of lines may be used to determine the likelihood that the pair of lines are bullet partners. Finally, the structure of the list may be determined, including the relationship among the list items.
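The abstract outlines a pipeline of candidate selection, indentation level clustering, bullet-partner detection, and structure determination. Below is a minimal sketch of that pipeline in Python, assuming a simplified line representation (an x-offset plus a few bullet-candidate features); the names `InkLine`, `cluster_indentation_levels`, and `are_bullet_partners`, as well as the thresholds used, are illustrative assumptions and not the patent's actual implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class InkLine:
    indent: float        # x-offset of the line's first stroke (assumed feature)
    bullet_width: float  # width of the leading symbol, the bullet candidate
    bullet_strokes: int  # stroke count of the leading symbol

def cluster_indentation_levels(lines: List[InkLine], tol: float = 15.0) -> List[int]:
    """Greedy 1-D clustering: lines whose x-offsets fall within `tol`
    of an existing cluster center share an indentation level."""
    centers: List[float] = []
    levels: List[int] = []
    for line in lines:
        for i, c in enumerate(centers):
            if abs(line.indent - c) <= tol:
                levels.append(i)
                break
        else:
            centers.append(line.indent)
            levels.append(len(centers) - 1)
    return levels

def are_bullet_partners(a: InkLine, b: InkLine, width_tol: float = 0.3) -> bool:
    """Treat two lines as bullet partners when their leading bullet
    candidates have similar features (here, width and stroke count)."""
    max_w = max(a.bullet_width, b.bullet_width, 1e-6)
    return (abs(a.bullet_width - b.bullet_width) <= width_tol * max_w
            and a.bullet_strokes == b.bullet_strokes)

def detect_list(lines: List[InkLine]) -> List[Tuple[int, int]]:
    """Return (line_index, indentation_level) for lines judged to be list items."""
    levels = cluster_indentation_levels(lines)
    items = set()
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            # Only lines at the same indentation level can be bullet partners.
            if levels[i] == levels[j] and are_bullet_partners(lines[i], lines[j]):
                items.update((i, j))
    return [(idx, levels[idx]) for idx in sorted(items)]

if __name__ == "__main__":
    candidate = [
        InkLine(indent=10.0, bullet_width=8.0, bullet_strokes=1),  # top-level item
        InkLine(indent=42.0, bullet_width=5.0, bullet_strokes=1),  # nested item
        InkLine(indent=40.0, bullet_width=5.5, bullet_strokes=1),  # nested item
        InkLine(indent=12.0, bullet_width=8.2, bullet_strokes=1),  # top-level item
    ]
    print(detect_list(candidate))  # [(0, 0), (1, 1), (2, 1), (3, 0)]
```

In the abstract's terms, the detected items and their indentation levels would then feed the final step of determining the list structure, i.e., the relationship among the list items.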

REFERENCES:
patent: 5038382 (1991-08-01), Lipscomb
patent: 5517578 (1996-05-01), Altman et al.
patent: 5544265 (1996-08-01), Bozinovic
patent: 5615283 (1997-03-01), Donchin
patent: 5864635 (1999-01-01), Zetts et al.
patent: 6525749 (2003-02-01), Moran et al.
patent: 7136082 (2006-11-01), Saund
patent: 7139004 (2006-11-01), Saund et al.
patent: 2004/0090439 (2004-05-01), Dillner
patent: 2005/0063592 (2005-03-01), Li
patent: 2005/0063594 (2005-03-01), Li
patent: 1331592 (2003-07-01), None
Fonseca, et al., "Experimental evaluation of an on-line scribble recognizer", Pattern Recognition Letters, pp. 1311-1319, 2001.
Apte, et al., "Recognizing multistroke geometric shapes: an experimental evaluation", Washington University, pp. 122-128, 1993.
Lank, et al., "An interactive system for recognizing hand drawn UML diagrams", ACM, pp. 1-15, 2000.
Kojima, H., et al., "On-line hand-drawn line-figure recognition and its application", Proceedings of the International Conference on Pattern Recognition, pp. 1138-1142, 1988.
Galindo, D., et al., "Perceptually-based representation of network diagrams", Proceedings of the 4th International Conference on Document Analysis and Recognition, pp. 352-356, 1997.
Copy of International Search Report in corresponding EP Application No. 04019840.2218, Jan. 2006.
Office Action mailed Jun. 13, 2007, cited in related application, Serial No. 10/850,948.
Office Action mailed Jun. 15, 2007, cited in related application, Serial No. 10/850,718.
Notice of Allowance mailed Aug. 10, 2007, cited in related application, Serial No. 10/850,948.
