System and method for synthesizing background texture in an image

Image analysis – Image enhancement or restoration

Patent


Details

382/280, 358/448, 358/452, G06K 9/40, H04N 1/40

Patent

active

057844984

ABSTRACT:
A method backfills areas of an image with synthesized texture. A texture exemplar is selected from the image and an area of the image to be backfilled is identified. An estimate of the synthesized texture is generated, and predetermined spatial domain constraints are applied to the estimate. The spatially constrained estimate is operated upon by a Fourier transform to create a spectral function. Predetermined spectral domain constraints are applied to the spectral function, and an inverse Fourier transform is performed thereon to produce a synthesized texture. If the synthesized texture is determined to be adequate, it is inserted into the area of the image to be backfilled. If it is inadequate, the synthesized texture is put through the synthesis process again until it is adequate for backfilling.
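The abstract describes an iterative, transform-domain synthesis loop in the spirit of the generalized-projection methods cited in the references below (Levi and Stark). The following Python/NumPy sketch illustrates one plausible reading of that loop, assuming the spectral-domain constraint is matching the exemplar's Fourier magnitude and the spatial-domain constraint is clamping pixel values to the exemplar's range; the function name, the specific constraints, and the adequacy test are illustrative assumptions, not taken from the patent itself.

```python
import numpy as np

def synthesize_texture(exemplar, shape, iterations=50, tol=1e-3):
    """Sketch of texture synthesis by alternating spatial- and
    spectral-domain constraints (generalized-projection style).

    exemplar : 2-D array holding the sample texture.
    shape    : (rows, cols) of the synthesized patch.
    """
    rng = np.random.default_rng(0)

    # Assumed spectral constraint: match the Fourier magnitude of the
    # exemplar, tiled out to the target shape.
    reps = (int(np.ceil(shape[0] / exemplar.shape[0])),
            int(np.ceil(shape[1] / exemplar.shape[1])))
    tiled = np.tile(exemplar, reps)[:shape[0], :shape[1]]
    target_magnitude = np.abs(np.fft.fft2(tiled))

    # Initial estimate of the synthesized texture.
    estimate = rng.random(shape)
    lo, hi = exemplar.min(), exemplar.max()

    for _ in range(iterations):
        # Spatial-domain constraint: clamp values to the exemplar's range.
        constrained = np.clip(estimate, lo, hi)

        # Forward transform, impose the spectral-domain (magnitude)
        # constraint while keeping the current phase, then invert.
        spectrum = np.fft.fft2(constrained)
        phase = np.angle(spectrum)
        new_estimate = np.real(np.fft.ifft2(target_magnitude * np.exp(1j * phase)))

        # Adequacy test (illustrative): stop when the update is small.
        if np.max(np.abs(new_estimate - estimate)) < tol:
            estimate = new_estimate
            break
        estimate = new_estimate

    return np.clip(estimate, lo, hi)
```

In this reading, the returned patch would then be inserted into the area identified for backfilling; if it were judged inadequate, the loop would simply continue for more iterations.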

REFERENCES:
patent: 5,555,194 (1996-09-01), Cok
Levi, Aharon and Henry Stark, "Restoration from Phase and Magnitude by Generalized Projections," in Image Recovery: Theory and Application, pp. 277-319.


Profile ID: LFUS-PAI-O-1655356
