Graphic symbols and method and system for identification of...

Registers – Coded record sensors – Particular sensor structure

Reexamination Certificate


Details

U.S. classifications: C235S453000, C235S494000, C235S462010, C235S472010
Type: Reexamination Certificate
Status: active
Patent number: 06460766

ABSTRACT:

TECHNICAL FIELD
The present invention relates to a method for taking notes from documents. In particular, the invention relates to graphics containing machine-readable spatial symbology.
BACKGROUND ART
Non-electronic methods for taking notes from books, documents, and other printed materials are generally slow or unreliable. Students, lawyers, and other people dependent upon prior authority have limited resources by which they may select desired portions of a document and reproduce them quickly and accurately. Generally, this requires marking selected passages of printed documents with tabs or highlighters and reproducing them manually at a later time. For example, U.S. Pat. No. 3,958,816 to Remmey, III discloses notation-related book markers wherein double-sided tabs may be adhered to pages of books. Numbers are written in opposite directions on opposite sides of the tab so that the tab may be affixed to a left or right page of a book to mark the page, and the point on the page, to which it refers. Key cards have numbers corresponding to the tabs. Notes are handwritten on lines on the cards, and the cards are attached to the front leaf of a book.
Recent developments in the area of optical character recognition (OCR) scanning, such as that disclosed in U.S. Pat. No. 5,920,877 to Kolster, make it possible to acquire discrete text strings and organize those strings in a preselected format for later use. Such devices, however, depend upon the scanner's ability to recognize a character based on known OCR techniques, such as geometric OCR, which detects the printed character's shape. Variations in fonts make it difficult to achieve total accuracy with geometric OCR.
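To make the shape dependence concrete, here is a minimal sketch of geometric OCR as plain template matching (Python; the function, the bitmap format, and the template dictionary are assumptions for illustration and are not taken from the Kolster patent):

    import numpy as np

    def match_glyph(glyph, templates):
        """Return the template character whose bitmap best matches the
        binarized glyph, with the fraction of agreeing pixels."""
        best_char, best_score = "?", -1.0
        for char, template in templates.items():
            # Pixel-level shape agreement between glyph and stored template.
            score = float(np.mean(glyph == template))
            if score > best_score:
                best_char, best_score = char, score
        return best_char, best_score

Because the score is driven entirely by pixel-level shape agreement, a glyph printed in an unfamiliar font can score poorly against every stored template, which is the font-variation limitation noted above.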
U.S. Pat. No. 5,875,261 to Fitzpatrick et al. discloses a method of enhancing spatial character recognition by combining it with color-coded OCR. Color-coded OCR is a technique that attempts to recognize a character based on color embedded in the character. For example, red may denote the letter “A”, whereas pink denotes the letter “a” and yellow indicates the letter “Y”. Color coding, by itself, may be subject to color processing errors involving color intensity, color density, color shifts, and color scanner misalignment. Anything affecting the spectrum, such as saturation or changes in hue, could also affect the processing. Consequently, Fitzpatrick et al. combines color coding with geometric OCR and utilizes a hypothesis testing technique wherein the geometric OCR is used to generate a null hypothesis that a particular character has been recognized, and an alternative hypothesis that the character has not been recognized. Color-coded OCR is then employed to select either the null hypothesis or the alternative hypothesis. This method increases the accuracy of the scanning device; however, it is time-consuming and requires color printing.
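A minimal sketch of that hypothesis test, assuming a geometric OCR stage has already produced a candidate character; the color map and function name are illustrative and are not the Fitzpatrick et al. implementation:

    # Example color coding taken from the passage above.
    COLOR_TO_CHAR = {"red": "A", "pink": "a", "yellow": "Y"}

    def test_hypothesis(geometric_guess, dominant_color):
        """Null hypothesis: the geometric stage recognized the character.
        The independent color-coded reading either confirms it or selects
        the alternative hypothesis (character not recognized)."""
        color_guess = COLOR_TO_CHAR.get(dominant_color)
        if color_guess == geometric_guess:
            return geometric_guess  # null hypothesis accepted
        return None                 # alternative hypothesis: not recognized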
Barcode recognition and processing methods, such as those disclosed in U.S. Pat. No. 4,992,650 to Somerville, U.S. Pat. No. 5,227,616 to Lee, U.S. Pat. No. 5,229,584 to Erickson, and U.S. Pat. No. 5,451,760 to Renvall, provide optical scanning techniques with both speed and accuracy. Barcodes can be used to reproduce strings of printed marks and characters automatically; however, barcodes detract from the readability of a printed page intended for humans.
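As a rough illustration of why barcode scanning can be both fast and accurate, the sketch below decodes a toy two-width barcode by thresholding measured bar widths into bits; the nominal widths and the codebook are invented for the example and match none of the cited symbologies:

    NARROW, WIDE = 1, 3  # nominal dark-bar widths in pixels (assumed)
    CODEBOOK = {"0101": "A", "1010": "B", "0110": "C"}  # hypothetical symbology

    def decode_bars(bar_widths):
        """Classify each dark bar as narrow (0) or wide (1) by width
        threshold, then look up the bit string in the codebook."""
        threshold = (NARROW + WIDE) / 2
        bits = "".join("1" if w > threshold else "0" for w in bar_widths)
        return CODEBOOK.get(bits, "?")

    print(decode_bars([1, 3, 1, 3]))  # -> "A"

Width thresholding of this kind tolerates print and scan noise as long as narrow and wide bars remain separable, which is one reason barcodes read reliably where character shapes do not.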


REFERENCES:
patent: 3832686 (1974-08-01), Bilgutay
patent: 3958816 (1976-05-01), Remmey, III
patent: 4183465 (1980-01-01), Dobras
patent: 4954699 (1990-09-01), Coffey et al.
patent: 4992650 (1991-02-01), Somerville
patent: 5038393 (1991-08-01), Nanba
patent: 5062666 (1991-11-01), Mowry et al.
patent: 5227616 (1993-07-01), Lee
patent: 5229584 (1993-07-01), Erickson
patent: 5301243 (1994-04-01), Olschafskie et al.
patent: 5324922 (1994-06-01), Roberts
patent: 5380998 (1995-01-01), Bossen et al.
patent: 5395181 (1995-03-01), Dezse et al.
patent: 5396564 (1995-03-01), Fitzpatrick et al.
patent: 5412188 (1995-05-01), Metz
patent: 5430558 (1995-07-01), Sohaei et al.
patent: 5451760 (1995-09-01), Renvall
patent: 5465291 (1995-11-01), Barrus et al.
patent: 5480306 (1996-01-01), Liu
patent: 5486686 (1996-01-01), Zdybel, Jr. et al.
patent: 5507527 (1996-04-01), Tomioka et al.
patent: 5521368 (1996-05-01), Adachi
patent: 5574804 (1996-11-01), Olschafskie et al.
patent: 5596652 (1997-01-01), Piatek et al.
patent: 5640193 (1997-06-01), Wellner
patent: 5754308 (1998-05-01), Lopresti et al.
patent: 5760382 (1998-06-01), Li et al.
patent: 5781914 (1998-07-01), Stork et al.
patent: 5835625 (1998-11-01), Fitzpatrick et al.
patent: 5869819 (1999-02-01), Knowles et al.
patent: 5875261 (1999-02-01), Fitzpatrick et al.
patent: 5897648 (1999-04-01), Henderson
patent: 5899700 (1999-05-01), Williams et al.
patent: 5920877 (1999-07-01), Kolster
patent: 5945656 (1999-08-01), Lemelson et al.
patent: 5999666 (1999-12-01), Gobeli et al.
patent: 6036094 (2000-03-01), Goldman et al.
patent: 6095418 (2000-08-01), Swartz et al.
patent: 6134338 (2000-10-01), Solberg et al.
patent: 2 494 873 (1980-11-01), None
patent: WO-96/37861 (1996-11-01), None
U.S. Ser. No. 09/565,799, J. A. Franklin, Mar. 4, 2002.
