Device and method for generating read addresses for video memory

Computer graphics processing and selective visual display system – Display driving control circuitry – Physically integral with display elements

Details

G09G 5/36

Patent

active

056253834

ABSTRACT:
A read address generator for generating read addresses for a video memory, wherein input read addresses indicated using an orthogonal coordinate system are converted into equivalent addresses in a polar coordinate system; the polar coordinate system is divided into concentric blocks according to distances from the pole of the polar coordinate system; vectors having sequentially increasing or decreasing angles are set for the concentric blocks; the coordinate values indicated by the vectors are converted into second, equivalent coordinate values in the orthogonal coordinate system; and output read addresses are generated by subtracting the second coordinate values from the input read addresses indicated in the orthogonal coordinate system, the resulting addresses being supplied to the video memory.
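As a rough illustration of the addressing scheme described in the abstract, the Python sketch below converts each input read address to polar coordinates, selects a concentric block from the radius, assigns that block a vector whose angle grows (or shrinks) with the block index, converts the vector back to orthogonal components, and subtracts those components from the input address. The parameter names (center_x, block_width, angle_step, vector_length and so on) are illustrative assumptions rather than terms taken from the patent, and the sketch models the behaviour in software only; the patent itself describes address-generating circuitry.

    import math

    def generate_read_address(x_in, y_in, center_x, center_y,
                              block_width, base_angle, angle_step,
                              vector_length):
        # Translate the input address so the pole of the polar
        # coordinate system sits at the chosen centre of the effect.
        dx = x_in - center_x
        dy = y_in - center_y

        # Orthogonal -> polar: the radius decides which concentric
        # block the input address falls into.
        radius = math.hypot(dx, dy)
        block = int(radius // block_width)

        # Each concentric block is assigned a vector whose angle
        # increases (or decreases, for a negative angle_step)
        # sequentially with the block index.
        angle = base_angle + block * angle_step

        # Polar -> orthogonal: coordinate values of the per-block vector.
        vx = vector_length * math.cos(angle)
        vy = vector_length * math.sin(angle)

        # Output read address = input address minus the vector components.
        return round(x_in - vx), round(y_in - vy)

    # Usage sketch: build a remapping table for a hypothetical 256x256
    # frame buffer centred on the screen; clamping keeps every generated
    # address inside video memory.
    WIDTH = HEIGHT = 256
    table = {}
    for y in range(HEIGHT):
        for x in range(WIDTH):
            sx, sy = generate_read_address(x, y, WIDTH // 2, HEIGHT // 2,
                                           block_width=8, base_angle=0.0,
                                           angle_step=0.2, vector_length=10.0)
            table[(x, y)] = (min(max(sx, 0), WIDTH - 1),
                             min(max(sy, 0), HEIGHT - 1))

Because the angle of the subtracted vector changes from one concentric block to the next, the displacement of the fetched pixel rotates with distance from the pole, which is what produces a swirl-like read-out pattern from an unmodified frame buffer.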

REFERENCES:
patent: 4648049 (1987-03-01), Dines et al.
patent: 5051734 (1991-09-01), Lake, Jr.
patent: 5225824 (1993-07-01), Yamamoto et al.
WPI Abstract Acc. No. 93-023134/03 and JP 04-351078 A (NEC); see abstract.

Profile ID: LFUS-PAI-O-709619
