Embedded and efficient low-complexity hierarchical image...

Image analysis – Image compression or coding – Pyramid, hierarchy, or tree structure


Details

Type: Reexamination Certificate

Status: active

Patent number: 06671413

ABSTRACT:

BACKGROUND OF THE INVENTION
The present invention relates generally to image coders and/or decoders (hereinafter simply coders) and corresponding methods. More specifically, the present invention relates to an embedded and highly efficient low-complexity hierarchical image coder and corresponding methods therefor. Advantageously, a corresponding software program for converting a general-purpose computer into an embedded and highly efficient low-complexity hierarchical image coder is also disclosed.
Image coding utilizing scalar quantization on hierarchical structures of transformed images has proven to be a very effective and computationally simple technique. Shapiro was the first to describe such a technique with his Embedded Zerotree Wavelet (EZW) algorithm in his paper entitled “Embedded Image Coding Using Zerotrees of Wavelet Coefficients” (IEEE Trans. Signal Processing, vol. 41, pp. 3445-3462 (December 1993)). Different variants of this technique have appeared in the literature which provide an improvement over the initial work. Said & Pearlman, in two papers entitled “An Improved Zero-tree Method for Image Compression” (IPL TR-122, ECSE Dept., Rensselaer Polytechnic Inst., Troy, N.Y. (November 1992)) and “Image Compression Using the Spatial-orientation Tree” (IEEE Int. Symposium on Circuits and Systems, Chicago, Ill., pp. 279-282 (May 1993)), successively improved the EZW algorithm by extending this coding scheme. Moreover, Said & Pearlman succeeded in presenting a different implementation based on a set-partitioning sorting algorithm, as described in “A New, Fast and Efficient Image Codec Based on Set Partitioning in Hierarchical Trees” (IEEE Trans. Circuits and Systems for Video Technology, vol. 6(3), pp. 243-250 (June 1996)). This new coding scheme, called Set Partitioning in Hierarchical Trees (SPIHT), provided even better performance than the improved version of EZW.
It will be noted that all of these scalar quantized schemes employ some kind of significance testing of sets or groups of pixels, i.e., the set is tested to determine whether the maximum magnitude in the set is above a predetermined threshold; a minimal illustrative sketch of such a test is given after the list below. The results of these significance tests determine the path taken by the coder in encoding the source samples. These significance testing schemes are all based on a few very simple principles that account for their excellent performance, among them the following:
(1) the significance testing scheme provides for the partial ordering of magnitude coefficients with a set-partitioning sorting algorithm;
(2) the significance testing scheme provides for bit plane transmission in decreasing bit plane order; and
(3) the significance testing scheme permits exploitation of self-similarity across different scales of an image wavelet transform.
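By way of illustration only, the following minimal sketch (in Python, and not taken from the patent) shows the kind of significance test these schemes share: a set of transform coefficients is treated as significant at bit plane n when its largest magnitude reaches the threshold 2^n.

# Minimal sketch of the shared significance test: a set is significant at
# bit plane n when its largest magnitude reaches the threshold 2**n.
def is_significant(coefficients, n):
    """Return True if the maximum magnitude in the set reaches 2**n."""
    threshold = 1 << n
    return max(abs(c) for c in coefficients) >= threshold

# Example: at bit plane n = 5 (threshold 32) this set is significant because
# its largest magnitude, 41, reaches 32; at n = 6 (threshold 64) it is not.
assert is_significant([3, -41, 17, 8], 5)
assert not is_significant([3, -41, 17, 8], 6)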
It will also be noted that these significance testing schemes all possess relatively low computational complexity, considering that their performance is comparable to that of the best-known image coding algorithms. This feature seems to conflict with the well-known tenet of information theory that the computational complexity of coding a stationary source (i.e., aggregates of source samples) increases as the coding efficiency for that source increases, as discussed in the book by T. Cover and J. Thomas entitled “Elements of Information Theory” (John Wiley & Sons, Inc., New York (1991)). Stated another way, these coding schemes appear to offer a respite from the usual pattern in which coding efficiency and computational complexity increase together.
It should be mentioned that an important characteristic possessed by the class of coders discussed immediately above is that of progressive transmission and embeddedness. Progressive transmission refers to the transmission of information in decreasing order of its information content; in other words, the coefficients with the highest magnitudes are transmitted first. Since all of these coding schemes transmit bits in decreasing bit plane order, the transmission is progressive. Such a transmission scheme makes it possible for the bitstream to be embedded, i.e., a single coded file can be used to decode the image at any rate less than or equal to the coded rate, giving the best reconstruction possible with the particular coding scheme.
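As a rough illustration (a simplified sketch, not the patented scheme), sending magnitude bits most-significant plane first means that a stream truncated after any plane still reconstructs every coefficient to within the weight of the last plane received:

# Illustrative sketch: emit magnitude bits in decreasing bit plane order, so
# that truncating the stream after plane n still bounds the error by 2**n.
def bit_planes(magnitudes, max_plane):
    """Yield (n, bits) for each plane n = max_plane .. 0."""
    for n in range(max_plane, -1, -1):
        yield n, [(m >> n) & 1 for m in magnitudes]

mags = [41, 7, 19, 2]                      # example coefficient magnitudes
for n, plane in bit_planes(mags, 5):
    print("plane", n, ":", plane)
# Decoding only planes 5..3 reconstructs each magnitude to within 2**3 = 8
# (e.g. 41 -> 40 and 19 -> 16); reading further planes refines the values.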
With these desirable features of excellent performance and low complexity, along with other characteristics such as embeddedness and progressive transmission, it will be appreciated that these scalar quantized significance testing schemes have recently become very popular in the search for practical, fast and efficient image coders and, in fact, have become the basis for serious consideration for future image compression standards.
It will be appreciated that while significance testing schemes employing trees permit the exploitation of the similarity across different subbands, and thus of the clustering of energy in frequency and space in hierarchical structures of transformed images, such schemes do not make use of sets in the form of blocks. Block-based coding is an efficient technique for exploiting the clustering of energy found in image transforms. It is a known fact that the statistics of an image transform vary markedly as one moves from one spatial region to another. By grouping transform source samples into blocks and coding those blocks independently, one can advantageously exploit the statistics of each block in an appropriate manner. It will be noted that this is one of the reasons that block-based coders work quite well. However, there is an increasing demand for certain desirable properties in image coders, such as embeddedness and progressive transmission, which are very useful and much needed in the fast-growing multimedia and networking markets. Both the SWEET and the AGP block-processing algorithms, which are discussed below, are very efficient but do not possess these desirable properties.
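As an illustration of how block-based coding can home in on such energy clusters, the following sketch (which assumes simple quadtree splitting and a hypothetical emit callback, and is not the SWEET or AGP algorithm) recursively splits any block that tests significant into four quadrants:

# Illustrative quadtree sketch: a block that is significant at bit plane n is
# split into four quadrants, isolating the regions where energy is clustered.
import numpy as np

def code_block(block, n, emit):
    """Recursively test a 2-D block; pass significant 1x1 blocks to emit()."""
    if block.size == 0 or np.max(np.abs(block)) < (1 << n):
        return                               # insignificant: a real coder emits one bit here
    if block.shape == (1, 1):
        emit(block)                          # a significant coefficient has been isolated
        return
    h, w = block.shape
    for quadrant in (block[:h // 2, :w // 2], block[:h // 2, w // 2:],
                     block[h // 2:, :w // 2], block[h // 2:, w // 2:]):
        code_block(quadrant, n, emit)

found = []
code_block(np.array([[0, 3], [0, 37]]), 5, found.append)   # isolates the coefficient 37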
What is needed is a low-complexity hierarchical image coder and corresponding methods for coding/decoding blocks wherein the signal is completely embedded. Moreover, what is needed is a low-complexity hierarchical image coder and corresponding methods which employ progressive transmission. Furthermore, a low-complexity hierarchical image coder and corresponding method which have low dynamic memory requirements would be extremely desirable, particularly when the low-complexity hierarchical image coder is to be employed as a discrete module in a device having limited computing power. Lastly, a low-complexity hierarchical image coder and corresponding method which can be employed for both lossy and lossless compression would be particularly advantageous.
SUMMARY OF THE INVENTION
Based on the above and foregoing, it can be appreciated that there presently exists a need in the art for an embedded and highly efficient low-complexity hierarchical image coder and corresponding method which mitigates the above-described deficiencies. The present invention was motivated by a desire to overcome the drawbacks and shortcomings of the presently available technology, and thereby fulfill this need in the art.
One aspect according to the present invention is provided by a method for use in encoding and decoding a data set corresponding to an image. Preferably, the method includes: a first subroutine for partitioning the data set into first and second sets, adding the first set to a list of insignificant sets (LIS), and initializing a list of significant pixels (LSP); a second subroutine for testing the first and second sets for significance with respect to a threshold value, partitioning significant members of the first and second sets in accordance with first and second partitioning functions, respectively, and adding significant pixels to the LSP; a third subroutine for refining the quantization of the pixels in the LSP; and a fourth subroutine for decrementing the threshold value. It will be appreciated that the term image encompasses a sequence of images which vary over time. Prefer
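As a rough, simplified sketch of the four-subroutine structure just described (and not the claimed implementation: it works on a one-dimensional list of integer coefficients, seeds the LIS with both initial sets rather than giving the second set its own partitioning function, and splits significant sets by simple halving), the control flow might look like this:

# Simplified sketch of the four-subroutine structure: initialize the LIS and
# LSP, run a sorting pass at the current threshold 2**n, run a refinement
# pass, then decrement n. The set-splitting rule here (halving an index range)
# stands in for the partitioning functions described above.
def encode(coeffs, max_plane):
    bits = []                                  # output bits, as a Python list
    half = len(coeffs) // 2                    # subroutine 1: two initial sets
    lis = [(0, half), (half, len(coeffs))]     # list of insignificant sets
    lsp = []                                   # list of significant pixels (indices)

    def process(lo, hi, n, next_lis):
        # Subroutine 2: test a set; split significant sets until single
        # significant coefficients are isolated and appended to the LSP.
        sig = any(abs(coeffs[i]) >= (1 << n) for i in range(lo, hi))
        bits.append(int(sig))
        if not sig:
            next_lis.append((lo, hi))
        elif hi - lo == 1:
            lsp.append(lo)
            bits.append(int(coeffs[lo] < 0))   # sign of a newly significant pixel
        else:
            mid = (lo + hi) // 2
            process(lo, mid, n, next_lis)
            process(mid, hi, n, next_lis)

    for n in range(max_plane, -1, -1):         # subroutine 4: threshold 2**n halves each pass
        previously_significant = list(lsp)
        next_lis = []
        for lo, hi in lis:                     # sorting pass over the LIS
            process(lo, hi, n, next_lis)
        lis = next_lis
        bits.extend((abs(coeffs[i]) >> n) & 1  # subroutine 3: refinement pass
                    for i in previously_significant)
    return bits

print(encode([41, -7, 19, 2, 0, -3, 5, 1], max_plane=5))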
