Method and apparatus for recording characters



Details

Classifications: 358/284, 340/728, G09G 1/06
Type: Patent
Status: active
Patent number: 046270027

ABSTRACT:
A method and apparatus for element-by-element and image-line-by-image-line recording of print characters on a full page by means of an exposing beam which is deflectable over the full page and is switched on and off by a control signal. The characters are stored in a character memory in the form of contour-coded character data. The contour-coded character data are read out of the character memory in the sequence required for exposure and are converted into control signals suitable for the image-line-by-image-line recording of the characters.
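
The conversion step described above can be pictured with a short sketch. If the contour-coded data are taken to be straight contour segments of the character outline (an assumption made here for illustration; the abstract does not specify the encoding), each image line can be intersected with those segments and the exposing beam switched on between successive pairs of crossings. The function name scanline_control_signals, the segment representation, and the even-odd fill rule below are illustrative assumptions, not the patented method itself.

# Minimal sketch, assuming contours are stored as straight segments
# ((x0, y0), (x1, y1)) and filled with an even-odd rule.

def scanline_control_signals(contour_segments, num_lines):
    """For each image line, return the (start_x, end_x) spans during which
    the exposing beam would be switched on."""
    signals = []
    for y in range(num_lines):
        crossings = []
        for (x0, y0), (x1, y1) in contour_segments:
            # Skip horizontal segments and segments that do not cross this line.
            if y0 == y1:
                continue
            if min(y0, y1) <= y < max(y0, y1):
                # X coordinate where the segment crosses image line y.
                crossings.append(x0 + (x1 - x0) * (y - y0) / (y1 - y0))
        crossings.sort()
        # Even-odd rule: the beam is on between successive pairs of crossings.
        signals.append([(crossings[i], crossings[i + 1])
                        for i in range(0, len(crossings) - 1, 2)])
    return signals


if __name__ == "__main__":
    # A small triangle stands in for a character contour.
    triangle = [((2, 1), (8, 1)), ((8, 1), (5, 7)), ((5, 7), (2, 1))]
    for line, spans in enumerate(scanline_control_signals(triangle, 8)):
        print(line, spans)

Running the example prints, for each of eight image lines, the hypothetical beam-on spans for the triangular test contour; an empty list means the beam stays off for that line.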

REFERENCES:
patent: Re30679 (1981-07-01), Evans et al.
patent: 3305841 (1967-02-01), Schwartz
patent: 3480943 (1969-11-01), Manber
patent: 4199815 (1980-04-01), Kyte et al.
patent: 4511893 (1985-04-01), Fukuda
patent: 4553172 (1985-11-01), Yamada et al.
IEEE Transactions on Computers, vol. C-22, No. 12, Dec. 1973, "An Improved Algorithm for the Generation of Nonparametric Curves", by Bernard W. Jordan, Jr. et al., pp. 1052-1060.
IEEE Transactions on Computers, vol. C-28, No. 10, Oct. 1979, "A High-Speed Algorithm for the Generation of Straight Lines and Circular Arcs", by Yasuhito Suenaga et al., pp. 728-736.
