Method and apparatus of rendering a video image

Image analysis – Image transformation or preprocessing – Changing the image coordinates

Patent


Details

Type: Patent
Status: active
Serial number: 058646396
Classification: 348/580, 348/590, 345/441, 345/474, G06K 9/32

ABSTRACT:
A method of rendering at least part of an image from at least one external source into a two-dimensional memory array having integer x and y pixel coordinates, based upon a set of integer coordinates of an outline of the rendered image. The method includes the steps of forming a parallelogram defined by a parallel series of effect lines forming an angle &theta; with either the x or y axis of the memory array and bounded on opposing ends of the effect lines by a scan axis, the parallelogram surrounding at least part of the two-dimensional memory array, and rendering the image from the at least one external source into the memory array while moving unidirectionally along the effect lines and the scan axis, using the outline coordinates as transition points for selection and non-selection of the image.
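The core idea of the abstract can be illustrated for the simplest case, where the effect lines are parallel to the x axis (&theta; = 0) and each effect line is one row of the memory array. The sketch below is a hypothetical illustration, not the patented implementation: the function name `render_row` and the representation of the outline as a sorted list of x crossings per row are assumptions made for clarity. Each outline crossing acts as a transition point that toggles between selecting and not selecting source pixels, as the abstract describes.

```python
def render_row(dest_row, src_row, crossings):
    """Copy source pixels into the destination row between successive
    pairs of outline crossings (a toggle-based scan-line fill).

    dest_row, src_row -- lists of pixel values of equal length
    crossings -- sorted integer x coordinates where the outline
                 intersects this effect line (row)
    """
    inside = False  # selection state; flips at each transition point
    k = 0           # index of the next crossing to consume
    for x in range(len(dest_row)):
        # Toggle selection each time we pass an outline crossing.
        while k < len(crossings) and x == crossings[k]:
            inside = not inside
            k += 1
        if inside:
            dest_row[x] = src_row[x]
    return dest_row
```

For example, with crossings at x = 2 and x = 5, only pixels 2 through 4 are taken from the source. A full implementation along the lines of the claim would repeat this per effect line while advancing unidirectionally along the scan axis, with the effect lines tilted at angle &theta; for the general case.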

REFERENCES:
patent: 4831445 (1989-05-01), Kawabe
patent: 5046165 (1991-09-01), Pearman et al.
patent: 5053762 (1991-10-01), Sarra
patent: 5214511 (1993-05-01), Tanaka
patent: 5233332 (1993-08-01), Watanabe et al.
patent: 5295199 (1994-03-01), Shino


Profile ID: LFUS-PAI-O-1456161
