Decoding apparatus and method

Image analysis – Image compression or coding – Including details of decompression


Details

Classification: C382S238000, C382S239000
Type: Reexamination Certificate (active)
Number: 06560365

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a decoding apparatus and method for decoding compressed image data.
2. Related Background Art
Recently, there has been an increase in the number of applications that compress static images as image data and either transmit the compressed image data externally or store it in a memory. For these purposes, it is preferable that lossless compression and encoding be used, especially when the data is for static images used for medical purposes, so that no deterioration of image quality occurs.
Accordingly, various efficient lossless compression and encoding methods have been proposed. For example, one such method outputs the difference between a pixel to be encoded and a predicted value generated from its peripheral pixels, and applies Golomb-Rice coding to this difference.
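A minimal sketch of this residual coding step is shown below. The zigzag mapping of the signed difference and the choice of the parameter k are common conventions assumed here for illustration; they are not taken from the patent text.

```python
def zigzag(e):
    # Fold a signed prediction residual onto the non-negative integers,
    # a common preliminary step before Golomb-Rice coding (assumed here).
    return 2 * e if e >= 0 else -2 * e - 1

def rice_encode(n, k):
    # Golomb-Rice codeword for non-negative n with parameter k:
    # the quotient n >> k in unary (q zeros terminated by a '1'),
    # followed by the k low-order bits of n in binary.
    q = n >> k
    bits = "0" * q + "1"
    if k:
        bits += format(n & ((1 << k) - 1), "0{}b".format(k))
    return bits

# Example: residual e = -3 with k = 2
print(rice_encode(zigzag(-3), 2))  # zigzag(-3) = 5 -> "01" (unary) + "01" (remainder)
```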
With this method, when decoding is performed, the original value of the object pixel is reconstructed by adding the value of each difference to a predicted value that is generated based on the decoded values for peripheral pixels.
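Continuing the sketch above, decoding reads the unary and binary parts back, undoes the sign mapping, and adds the residual to the prediction. The predict() helper and the averaging rule it uses are placeholders for illustration only, not the predictor actually specified by the method.

```python
def rice_decode(bits, pos, k):
    # Read one Golomb-Rice codeword starting at bit offset pos;
    # return the non-negative value and the offset just past the codeword.
    q = 0
    while bits[pos] == "0":   # unary quotient: zeros up to the terminating '1'
        q += 1
        pos += 1
    pos += 1                  # skip the terminating '1'
    r = int(bits[pos:pos + k], 2) if k else 0
    return (q << k) | r, pos + k

def unzigzag(n):
    # Inverse of the signed-to-unsigned folding used at the encoder.
    return n // 2 if n % 2 == 0 else -(n + 1) // 2

def predict(a, b, c, d):
    # Placeholder predictor built from already-decoded peripheral pixels.
    return (a + b) // 2

# Reconstruct one pixel: prediction from decoded neighbors plus decoded residual.
n, _ = rice_decode("0101", 0, 2)
x = predict(a=120, b=118, c=119, d=121) + unzigzag(n)  # 119 + (-3) = 116
```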
However, a specific apparatus configuration for decoding coded data has not yet been established.
SUMMARY OF THE INVENTION
To resolve this problem, it is one objective of the present invention to provide an arrangement that employs the above-described peripheral pixels to perform fast decoding, and in particular to perform fast decoding while taking into account the timing at which the values of the peripheral pixels are obtained.
To achieve the above objective, according to the present invention a decoding apparatus, which decodes each pixel based on a plurality of peripheral pixels (corresponding to a, b, c and d in the preferred embodiments), comprises:
a plurality of memories (corresponding to memories 202 to 210), for storing a predetermined parameter (corresponding to a k parameter or a parameter Cα) necessary for decoding, which corresponds to a set of first and second statuses (corresponding to |Q3| and R3) that are obtained from a specific peripheral pixel (corresponding to pixel a);
a determination unit (corresponding to a status generating unit 103 that generates |Q2|) for determining a read address in the plurality of memories based on a third status (corresponding to |Q2|) obtained from peripheral pixels (corresponding to b, c and d) other than the specific peripheral pixel; and
a selection unit (corresponding to memories 202 to 210 and a selector 211) for inputting the first and the second statuses obtained from the specific peripheral pixel and for selecting one of the plurality of memories in accordance with the inputted first and second statuses.
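The sketch below illustrates this kind of arrangement. The status functions, memory count and table size are chosen arbitrarily for illustration; only the split of roles (statuses derived from pixel a select a memory, the status derived from b, c and d forms the read address) follows the description above.

```python
# Illustrative sizes only; they do not reproduce the embodiment's actual
# memory count or address range.
NUM_BANKS = 9          # one small memory per (|Q3|, R3) combination
TABLE_SIZE = 365       # contexts addressed by |Q2|

# Each bank holds one decoding parameter (e.g. a k value) per context.
banks = [[0] * TABLE_SIZE for _ in range(NUM_BANKS)]

def read_parameter(q3_abs, r3, q2_abs):
    # The pair (|Q3|, R3), derived from pixel a alone, selects one memory;
    # |Q2|, derived from pixels b, c and d, is the read address inside it.
    # Because the bank choice needs only pixel a, it can be made as soon as
    # a is available, before |Q2| has been computed.
    bank = min(2 * q3_abs + (1 if r3 else 0), NUM_BANKS - 1)
    return banks[bank][q2_abs]

# Example read: statuses computed elsewhere from the decoded neighbors.
k = read_parameter(q3_abs=3, r3=True, q2_abs=42)
```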


REFERENCES:
patent: 5581373 (1996-12-01), Yoshida
patent: 5751860 (1998-05-01), Su et al.
patent: 5764374 (1998-06-01), Seroussi et al.
patent: 5801650 (1998-09-01), Nakayama
patent: 5818970 (1998-10-01), Ishikawa et al.
patent: 5841381 (1998-11-01), Nakayama
patent: 5945930 (1999-08-01), Kajiwara
patent: 5986594 (1999-11-01), Nakayama et al.
patent: 6028963 (2000-02-01), Kajiwara
patent: 6031938 (2000-02-01), Kajiwara
patent: 6173078 (2001-01-01), Kadono
