Database engines for processing ideographic characters and...

Image analysis – Pattern recognition – Limited to specially coded – human-readable characters

Reexamination Certificate


Details

Current U.S. Class: C382S185000
Type: Reexamination Certificate
Status: active
Patent Number: 06956968

ABSTRACT:
A computer-implemented method for encoding a handwritten stroke set, each stroke of which represents a constituent stroke of an ideographic character, to obtain an encoded input sequence. The method includes ascertaining the shape of a first stroke of the handwritten stroke set and ascertaining one of a location information and a size information pertaining to the first stroke. The method further includes assigning a first code to the encoded input sequence responsive to the determined shape of the first stroke and the determined location or size information of the first stroke. The first code is predefined to represent both the shape of the first stroke and its location or size information, and is sufficiently unique to distinguish it from other codes representing other permutations of shape and location or size information.

REFERENCES:
patent: 4379288 (1983-04-01), Leung et al.
patent: 4505602 (1985-03-01), Wong
patent: 4879653 (1989-11-01), Shinoto
patent: 5109352 (1992-04-01), O'Dell
patent: 5212769 (1993-05-01), Pong
patent: 5223831 (1993-06-01), Kung et al.
patent: 5257938 (1993-11-01), Tien
patent: 5410306 (1995-04-01), Ye
patent: 5475767 (1995-12-01), Du
patent: 5579408 (1996-11-01), Sakaguchi et al.
patent: 5586198 (1996-12-01), Lakritz
patent: 5734750 (1998-03-01), Arai et al.
patent: 5790055 (1998-08-01), Yu
patent: 5831636 (1998-11-01), Merchant et al.
patent: 6052482 (2000-04-01), Arai et al.
patent: 6801659 (2001-06-01), O'Dell
patent: 6373473 (2002-04-01), Sakaguchi et al.
patent: 6411948 (2002-06-01), Hetherington et al.
patent: 6766179 (2004-07-01), Shiau et al.
patent: 2100899 (1983-01-01), None
patent: 2161004 (1986-01-01), None
Leon, N.H.,Character Indexes of Modern Chinese, Institute of Asian Studies Monograph Series, No. 42, Curzon Press. (Physical copy of document furnished on Nov. 24, 2004).
Darragh, J. J. et al., ‘The Reactive Keyboard: A Predictive Typing Aid’, http://pharos.cpsc.ucalgary.ca/Dienst/Repository/2.0/Body/cstrl.ucalgary cs/1989-371-33/pdf. Originally published: Computer, US, IEEE Computer Society, Long Beach, CA, vol. 23, no. 11, Nov. 1, 1990. (Physical copy of document furnished on Nov. 24, 2004).
U.S. Appl. No.: 10/054,673, Title: Systems and methods for inputting ideographic characters, Filed: Jan. 1, 2002, First Inventor Name: O'Dell.
U.S. Appl. No.: 10/055,707, Title: Systems and methods for inputting ideographic characters using arbitrary element strokes, Filed: Jan. 22, 2002, First Inventor Name: O'Dell.
U.S. Appl. No.: 10/078,254, Title: Systems and methods for inputting ideographic characters using alternative language inputs, Filed: Feb. 15, 2002, First Inventor Name: O'Dell.
Masui T: ‘An Efficient Text Input Method for Pen-Based Computers’ CHI Conference Proceedings, Human Factors in Computing Systems, US, NY, NY: ACM, 1998, pp. 328-335. (Physical Copy of document furnished on Nov. 24, 2004).
