Method to derive edge extensions for wavelet transforms and inverse wavelet transforms

Image analysis – Histogram processing – For setting a threshold


Details

U.S. classifications: 382/232, 382/248, 382/266, 348/384
International classification: H04N 1/41
Type: Patent
Status: active
Document number: 058288490

ABSTRACT:
A method derives edge extensions for wavelet transforms and inverse wavelet transforms of two-dimensional images. The method eliminates the need for side computations by treating the two-dimensional matrix of values as a single one-dimensional array. Using a one-dimensional array reduces the required flushing and loading of registers, because those operations can be performed between frames rather than between rows or columns of the matrix.
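To make the idea in the abstract concrete, the sketch below is a hypothetical illustration, not the patented method: it flattens a frame into one long array, applies a symmetric edge extension only at the two ends of that flat array, and runs one level of 1-D wavelet analysis over the whole frame. The LeGall 5/3 filters, the symmetric extension, and all function names are assumptions chosen for the example; the point is only that the boundary setup happens once per frame rather than once per row or column.

```python
import numpy as np

# LeGall 5/3 analysis filters, assumed here for illustration; the patent
# does not mandate a particular filter bank.
LOWPASS = np.array([-1.0, 2.0, 6.0, 2.0, -1.0]) / 8.0
HIGHPASS = np.array([-1.0, 2.0, -1.0]) / 2.0


def symmetric_extend(signal, pad):
    """Mirror boundary samples so the filters have neighbors at both ends."""
    left = signal[pad:0:-1]
    right = signal[-2:-pad - 2:-1]
    return np.concatenate([left, signal, right])


def analyze_flat(flat, pad=2):
    """One level of 1-D wavelet analysis over an already-flattened frame."""
    ext = symmetric_extend(flat, pad)
    low = np.convolve(ext, LOWPASS, mode="same")[pad:-pad:2]
    high = np.convolve(ext, HIGHPASS, mode="same")[pad:-pad:2]
    return low, high


def transform_frame(frame):
    # Treat the 2-D frame as one long 1-D array: the edge extension (and the
    # register flushing and loading it implies in hardware) is performed once
    # per frame instead of once per row or column.
    flat = frame.astype(np.float64).ravel()
    return analyze_flat(flat)


if __name__ == "__main__":
    frame = np.arange(16, dtype=np.float64).reshape(4, 4)
    low, high = transform_frame(frame)
    print("lowpass subband: ", low)
    print("highpass subband:", high)
```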

REFERENCES:
patent: 5101446 (1992-03-01), Resnikoff et al.
patent: 5381354 (1995-01-01), Soloff
patent: 5398067 (1995-03-01), Sakamoto
patent: 5546477 (1996-08-01), Knowles et al.
patent: 5586200 (1996-12-01), Devaney et al.
patent: 5600373 (1997-02-01), Chui et al.
patent: 5615287 (1997-03-01), Fu et al.
Hilton, Michael L., et al., "Compressing Still and Moving Images with Wavelets," Multimedia Systems, vol. 2, no. 3, Apr. 18, 1994, pp. 1-17.
Gray, Robert M., "Vector Quantization," IEEE ASSP Magazine, Apr. 1984, pp. 4-29.

