Method and apparatus for dividing an input image into a plurality …

Image analysis – Image enhancement or restoration – Image filter

Patent

Details

382/275, 248/606, G06K 9/40

Patent

active

059206527

ABSTRACT:
A method and apparatus for processing an input image by converting an input frame of a predetermined number of pixels into a plurality of output frames, where the spatial frequency spectrum of each output frame is that of a respective predetermined frequency band of the spatial frequency spectrum of the input frame. The processing includes filtering the input frame in one direction to produce a first output frame of the predetermined number of pixels, the first output frame having only one of the high and the low spatial frequencies of the input frame in that direction, and subtracting the first output frame from the input frame to produce a second output frame of the predetermined number of pixels, the second output frame having the other of the high and the low spatial frequencies of the input frame in that direction.
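
The abstract amounts to a two-band split along one spatial direction: filter the frame to keep one band, then subtract the filtered frame from the original to obtain the complementary band. Below is a minimal illustrative sketch in Python/NumPy; the patent does not specify a filter kernel or a direction, so the moving-average low-pass kernel, the horizontal direction, and the names split_horizontal_bands and kernel_width are assumptions made for clarity, not terms from the patent.

```python
# Illustrative sketch only: the abstract does not name a concrete filter,
# so a moving-average low-pass kernel is assumed here.
import numpy as np

def split_horizontal_bands(frame: np.ndarray, kernel_width: int = 5):
    """Split `frame` into a low-frequency and a high-frequency output frame
    along one direction (here, horizontal). Both outputs keep the input's
    pixel count, as the abstract requires."""
    frame = frame.astype(float)
    # 1-D low-pass kernel applied along rows only ("filtering in one direction").
    kernel = np.ones(kernel_width) / kernel_width
    low = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, frame
    )
    # Subtracting the first output frame from the input frame yields the
    # complementary band: the high horizontal frequencies.
    high = frame - low
    return low, high

# Usage: decompose a synthetic 8-bit frame; the two outputs sum back to the input.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
low_band, high_band = split_horizontal_bands(frame)
assert np.allclose(low_band + high_band, frame)
```

Because the second output frame is formed by subtraction, the two frames always sum back to the input frame, whichever concrete low-pass (or high-pass) filter is used in the first step.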

REFERENCES:
patent: 5343309 (1994-08-01), Roetling
patent: 5384869 (1995-01-01), Wilkinson et al.
