Method and system for collecting data from a plurality of...

Image analysis – Image segmentation – Distinguishing text from other regions



Details

Patent number: 07668372
Type: Reexamination Certificate
Status: active
U.S. classifications: C382S175000, C382S173000 (382/175; 382/173)

ABSTRACT:
In a method and system for collecting data from documents present in machine-readable form, at least one already-processed document, stored as a template and designated the template document, is associated with a document to be processed, designated the read document. Fields for the data to be extracted are defined in the template document. Data contained in the read document are automatically extracted from the regions that correspond to those fields. If an error occurs during the automatic extraction of the data, or if no suitable template document can be associated, the read document is shown on a screen and the fields from which the data are extracted are input manually. After this manual input, the read document together with its field specifications is stored as a new template document, or the previous template document is corrected to match the newly input fields.
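The abstract describes a template-matching extraction loop with a manual fallback and template learning. The following is a minimal Python sketch of that flow, not the patent's actual implementation; Template, match, extract, and ask_user_for_fields are hypothetical names standing in for the template matcher, the field extractor, and the manual field-input screen.

from dataclasses import dataclass

@dataclass
class Template:
    # A previously processed document stored as a template: each named
    # field maps to a region (x, y, width, height) on the page.
    doc_id: str
    fields: dict

def process_read_document(read_doc, templates, match, extract, ask_user_for_fields):
    # match, extract, and ask_user_for_fields are hypothetical callables
    # supplied by the caller (see lead-in above).
    template = match(read_doc, templates)  # None if no suitable template
    if template is not None:
        data, ok = extract(read_doc, template.fields)
        if ok:
            return data
    # Extraction failed or no template matched: show the document and
    # let the operator input the fields manually.
    manual_fields = ask_user_for_fields(read_doc)
    data, _ = extract(read_doc, manual_fields)
    if template is not None:
        # Correct the previous template to the newly input fields.
        template.fields = manual_fields
    else:
        # Store the read document with its field specifications as a
        # new template document.
        templates.append(Template(read_doc.doc_id, manual_fields))
    return data

The design point the abstract makes is that the manual fallback feeds back into the template store, so repeated document layouts are only ever corrected by hand once.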

REFERENCES:
patent: 3611291 (1971-10-01), Frank
patent: 3925760 (1975-12-01), Mason et al.
patent: 4272756 (1981-06-01), Kakumoto et al.
patent: 4933979 (1990-06-01), Suzuki et al.
patent: 5140650 (1992-08-01), Casey et al.
patent: 5317646 (1994-05-01), Sang, Jr. et al.
patent: 5448375 (1995-09-01), Cooper et al.
patent: 5594809 (1997-01-01), Kopec et al.
patent: 5666549 (1997-09-01), Tsuchiya et al.
patent: 5680223 (1997-10-01), Cooper et al.
patent: 5689620 (1997-11-01), Kopec et al.
patent: 5835712 (1998-11-01), DuFresne
patent: 5923792 (1999-07-01), Shyu et al.
patent: 5963966 (1999-10-01), Mitchell et al.
patent: 5966473 (1999-10-01), Takahashi et al.
patent: 6028970 (2000-02-01), DiPiazza et al.
patent: 6131102 (2000-10-01), Potter
patent: 6353840 (2002-03-01), Saito et al.
patent: 2002/0034328 (2002-03-01), Naoi et al.
patent: 2002/0141660 (2002-10-01), Bellavita et al.
patent: 2003/0115189 (2003-06-01), Srinivasa et al.
patent: 2004/0243552 (2004-12-01), Titemore et al.
patent: 2007/0201768 (2007-08-01), Schiehlen
patent: 2008/0195968 (2008-08-01), Schacht
patent: 10342594 (2005-04-01), None
patent: 1510962 (2005-03-01), None
patent: 09062758 (1997-03-01), None
patent: WO 98/47098 (1998-10-01), None
patent: WO-2005/043452 (2005-05-01), None
patent: WO-2007/006687 (2007-01-01), None
Casey et al., "Intelligent Forms Processing," IBM Systems Journal, vol. 29, no. 3, Armonk, NY, US, 1990.
Indexing and Searching, pp. 192-219, 2003.
International Search Report and Written Opinion; International Application No. PCT/EP2004/009539; Mailed Mar. 31, 2005; 2 pages.
International Search Report and Written Opinion; International Application No. PCT/EP2004/009538; Mailed Jan. 25, 2005; 3 pages.
Oce Document Technologies, XP-002399738, Feb. 18, 2005.
Oce Document Technologies, XP-002399736, DOKuStar product family, 2004.
Oce Document Technologies, XP-002399735, "Single Click Entry": the new tool from Oce Document Technologies makes post-processing of documents even faster, 2004.
Single Click Entry, or How the Mouse Becomes Socially Acceptable in Data Capture, XP-002399747, Version 1.2, Jan. 2005.


