Coder

Coded data generation or conversion – Digital code to digital code converters – To or from code based on probability

Reexamination Certificate


Details

C341S050000, C341S051000

Reexamination Certificate

active

07839312

ABSTRACT:
A coder has a binarizing circuit (130) for converting multivalued data into a binary symbol sequence, the multivalued data being generated from an input signal and having a plurality of contexts; an arithmetic code amount approximating circuit (200) for calculating a prediction code amount in a predetermined coding unit from the binary symbol sequence; and a coding circuit (102) for arithmetically coding the input signal on the basis of the prediction code amount. The arithmetic code amount approximating circuit (200) includes a selector (230) for dividing the binary symbol sequence into a plurality of groups based on the contexts, a plurality of code amount approximating circuits (211-214) each calculating, for its group of the divided binary symbol sequence, the prediction code amount of that group based on at least the section range in arithmetic coding, and an adder (231) for adding the prediction code amounts from all the code amount approximating circuits and outputting the prediction code amount in the predetermined coding unit.
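The selector/approximator/adder pipeline described above can be sketched in software roughly as follows. This is a minimal illustration, not the patented circuit: the adaptive Laplace-count probability model and the ideal −log2(p) code length stand in for the hardware's section-range tracking, and all function and variable names are illustrative assumptions.

```python
import math
from collections import defaultdict

def approximate_code_amount(symbols, num_groups=4):
    """Estimate the arithmetic-coded bit count of a binary symbol sequence.

    `symbols` is a list of (context_id, bit) pairs.  A selector routes
    each pair to one of `num_groups` groups by its context, each group
    accumulates an ideal code length of -log2(p) bits per symbol, and
    the group totals are summed, mirroring the selector (230),
    approximating circuits (211-214), and adder (231) roles.
    """
    counts = defaultdict(lambda: [1, 1])   # context -> [zeros, ones], Laplace-smoothed
    group_bits = [0.0] * num_groups        # per-group prediction code amounts

    for ctx, bit in symbols:
        c0, c1 = counts[ctx]
        p = (c1 if bit else c0) / (c0 + c1)  # probability of the observed bit
        group_bits[ctx % num_groups] += -math.log2(p)  # ideal code length
        counts[ctx][bit] += 1                # adapt this context's model

    return sum(group_bits)                   # total predicted code amount
```

For a strongly biased context (e.g. a long run of zeros) the estimate stays small, while an incompressible sequence costs close to one bit per symbol; this is the kind of per-unit prediction the coding circuit would consume for rate control.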

REFERENCES:
patent: 5410352 (1995-04-01), Watanabe
patent: 5870145 (1999-02-01), Yada et al.
patent: 6411231 (2002-06-01), Yanagiya et al.
patent: 6850175 (2005-02-01), Bossen
patent: 2002/0114527 (2002-08-01), Horie
patent: 2002/0114529 (2002-08-01), Horie
patent: 2005/0129122 (2005-06-01), Booth et al.
patent: 2005/0129320 (2005-06-01), Koto
patent: 2005/0156762 (2005-07-01), Tsuru
patent: 2005/0180505 (2005-08-01), Ogawa et al.
patent: 2005/0243930 (2005-11-01), Asano et al.
patent: 2005/0249289 (2005-11-01), Yagasaki et al.
patent: 2007/0033019 (2007-02-01), Yasunaga et al.
patent: 2007/0100613 (2007-05-01), Yasunaga et al.
patent: 2007/0255558 (2007-11-01), Yasunaga et al.
patent: 2008/0275698 (2008-11-01), Yasunaga et al.
patent: 2009/0012781 (2009-01-01), Yasunaga et al.
patent: 5-252403 (1993-09-01), None
patent: 4111055528 (1999-02-01), None
patent: 2003-18593 (2003-01-01), None
patent: 2004-135251 (2004-04-01), None
patent: 2004-135252 (2004-04-01), None
patent: 2005-151391 (2005-06-01), None
patent: 2005-184232 (2005-07-01), None
patent: 2005-203905 (2005-07-01), None
patent: 2005-252374 (2005-09-01), None
patent: 2005-318296 (2005-11-01), None
patent: 96/28937 (1996-09-01), None
patent: 2004/028165 (2004-04-01), None
English language abstract of JP 2005-184232 A, Jul. 7, 2005.
English language abstract of JP 2004-135252 A, Apr. 30, 2004.
English language abstract of JP 2004-135251 A, Apr. 30, 2004.
English language abstract of JP 2005-151391 A, Jun. 9, 2005.
English language abstract of JP 2003-18593 A, Jan. 17, 2003.
English language abstract of JP 2005-318296 A, Nov. 10, 2005.
English language abstract of JP 2005-203905 A, Jul. 28, 2005.
Yamakage et al., "HD DVD ni Mochiiru Dogazo Fugoka Gijutsu" [Moving Picture Coding Technology Used in HD DVD], Toshiba Review, vol. 60, No. 1, Jan. 1, 2005, pp. 17-20.
ITU-T Recommendation H.264, "Series H: Audiovisual and Multimedia Systems, Infrastructure of audiovisual services—Coding of moving video, Advanced video coding for generic audiovisual services," Nov. 2007.
English language abstract of JP 2005-252374 A, Sep. 15, 2005.
English language abstract of JP 5-252403 A, Sep. 28, 1993.
U.S. Appl. No. 12/439,021 to Tanaka, filed Feb. 26, 2009.
