Character generation using data of plural types to which...

Computer graphics processing and selective visual display system – Computer graphics processing – Character generating

Reexamination Certificate


Details

C345S467000, C345S472000


active

06577314

ABSTRACT:

BACKGROUND OF THE INVENTION
Field of the Invention
The present invention relates to a character generating method having a function to convert characters or the like, which have been coded in a vector format, into data in a dot format.
Related Background Art
Hitherto, when a plurality of type styles or families are output, they are stored in a ROM on a type style unit basis (i.e., type style by type style). In the outputting mode, the necessary data is read out from the ROM address corresponding to the requested type style, and bit-map data is formed from the vector-format data and output.
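The conventional scheme described above can be sketched as follows. This is an illustrative model only, not the patent's implementation: each type style keeps its own complete set of vector-format glyphs in ROM, and output rasterizes the requested glyph into a dot pattern. All names and stroke coordinates here are hypothetical.

```python
# Illustrative model of the conventional scheme: one full vector glyph
# set per type style, so ROM size grows with the number of styles.
ROM = {
    "mincho": {"A": [((1, 9), (5, 1)), ((5, 1), (9, 9)), ((3, 5), (7, 5))]},
    "gothic": {"A": [((0, 9), (5, 0)), ((5, 0), (9, 9)), ((2, 5), (8, 5))]},
}

def rasterize(strokes, size=10):
    """Convert vector strokes into a size x size dot (bit-map) pattern."""
    grid = [[0] * size for _ in range(size)]
    for (x0, y0), (x1, y1) in strokes:
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            x = round(x0 + (x1 - x0) * i / steps)
            y = round(y0 + (y1 - y0) * i / steps)
            grid[y][x] = 1
    return grid

def generate(style, char):
    """Look up the requested type style in ROM and develop a bit map."""
    return rasterize(ROM[style][char])
```

Because every style stores every glyph independently, adding a style duplicates the whole character set; this is the memory-capacity problem the patent addresses.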
In the above conventional example, however, the required ROM capacity is not excessive only so long as the number of type styles is small. When the number of type styles increases and a family must be prepared for each of them, the required ROM capacity becomes extremely large, and such a large ROM is impractical for inclusion in a marketable product.
There is also a problem in that, when the data type on the development requesting side is of one type, e.g., outline type or stroke type, an apparatus prepared to handle only another type (e.g., stroke type or outline type, respectively) may be unable to accommodate the development request.
SUMMARY OF THE INVENTION
In consideration of the above points, and to solve the above problems, it is an object of the invention to provide a system that can handle both base stroke data and type style data. With such a system it is sufficient to prepare only one kind of data, e.g., base stroke data, irrespective of the number of type styles; by providing only as many sets of type style data as the number of type styles to be accommodated, the capacity of the ROM installed in a product can be reduced without a concurrent reduction in output quality.
Another object of the invention is to make it possible to perform character development according to a request, covering data types from stroke type data to outline type data.
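One way a stroke-type glyph can be developed into outline-type data is to expand each stroke (a centerline segment with a width) into a closed outline polygon by offsetting perpendicular to the segment. This is a hypothetical sketch of that general technique; the patent does not specify this particular method, and the function name and parameters are illustrative.

```python
import math

def stroke_to_outline(p0, p1, width):
    """Expand one stroke segment into a four-point outline polygon."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    # Unit normal, perpendicular to the stroke direction.
    nx, ny = -dy / length, dx / length
    h = width / 2
    # Offset both endpoints to each side of the centerline.
    return [
        (x0 + nx * h, y0 + ny * h),
        (x1 + nx * h, y1 + ny * h),
        (x1 - nx * h, y1 - ny * h),
        (x0 - nx * h, y0 - ny * h),
    ]
```

For example, a horizontal stroke from (0, 0) to (10, 0) with width 2 yields a 2-unit-tall rectangle around the centerline.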
Still another object of the invention is to provide a character generating method whereby, when characters are generated from data stored in a vector format, providing two kinds of data, namely base skeleton data and data which depends on the type style, permits generation of a plurality of type styles.
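The approach summarized above can be sketched as follows: a single set of base skeleton (stroke) data per character is shared by every type style, and a small per-style record tailors the output. The style parameters here (a stroke width and a slant) and all names are illustrative assumptions, not taken from the patent.

```python
# Shared base skeleton data: one vector glyph per character,
# independent of type style.
BASE_SKELETON = {
    "A": [((1, 9), (5, 1)), ((5, 1), (9, 9)), ((3, 5), (7, 5))],
}

# Small per-style records replace full per-style glyph sets.
STYLE_DATA = {
    "regular": {"width": 1, "slant": 0.0},
    "bold":    {"width": 2, "slant": 0.0},
    "italic":  {"width": 1, "slant": 0.3},
}

def rasterize(strokes, width, size=12):
    """Sample each stroke and thicken every dot by the style's width."""
    grid = [[0] * size for _ in range(size)]
    for (x0, y0), (x1, y1) in strokes:
        for i in range(17):
            t = i / 16
            x = int(round(x0 + (x1 - x0) * t))
            y = int(round(y0 + (y1 - y0) * t))
            for dy in range(width):
                for dx in range(width):
                    if 0 <= y + dy < size and 0 <= x + dx < size:
                        grid[y + dy][x + dx] = 1
    return grid

def generate(char, style):
    """Apply the style record to the shared skeleton, then rasterize."""
    p = STYLE_DATA[style]
    slanted = [((x0 + p["slant"] * (9 - y0), y0),
                (x1 + p["slant"] * (9 - y1), y1))
               for (x0, y0), (x1, y1) in BASE_SKELETON[char]]
    return rasterize(slanted, p["width"])
```

Under this model, ROM holds one skeleton set plus a few bytes per style, so capacity grows with the character set rather than with the product of characters and styles.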


REFERENCES:
patent: 4298945 (1981-11-01), Kyte et al.
patent: 4897638 (1990-01-01), Kokunishi et al.
patent: 4931953 (1990-06-01), Uehara et al.
patent: 5105471 (1992-04-01), Yoshida et al.
patent: 5113491 (1992-05-01), Yamazaki
patent: 5155805 (1992-10-01), Kaasila
patent: 5159668 (1992-10-01), Kaasila
patent: 5257016 (1993-10-01), Fujii et al.
patent: 5289169 (1994-02-01), Corfield et al.
patent: 228581 (1994-08-01), None
patent: 1-166967 (1989-06-01), None
patent: 78106819 (1990-08-01), None
English translation of claims 1-7 of Chinese document 228581, dated Aug. 1994.
