Method and apparatus for recognition and tagging of multiple layered entropy coding system

Coded data generation or conversion – Digital code to digital code converters – Adaptive coding

Reexamination Certificate


Details

Classification: C341S106000
Type: Reexamination Certificate
Status: active
Application number: 11390533

ABSTRACT:
Embodiments herein provide a method and apparatus for recognition and tagging in a multiple layered entropy coding system. The method comprises receiving a signal and performing entropy average coding to produce an entropy coded bit stream. Next, the method adds a tag to a beginning portion of the entropy coded bit stream. The tag comprises a plurality of headers, wherein each header differs from the others and together the headers form an elongated bit stream. Following this, the method reads the tag when processing the entropy coded bit stream and, when the tag is present, avoids subsequent entropy averaging of that stream. Skipping this subsequent entropy averaging prevents unnecessary re-coding and re-averaging of the already entropy coded bit stream.
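The abstract describes prepending a recognizable tag to an entropy coded bit stream so that downstream stages can detect it and skip a second round of entropy averaging. The following is a minimal illustrative sketch of that idea in Python; the header byte values, the use of zlib as a stand-in entropy coder, and the function names are assumptions made for illustration, not the patent's actual scheme.

import zlib  # placeholder for the real entropy (averaging) coder


# A tag built from several distinct headers, concatenated into one marker
# sequence placed at the beginning of the coded bit stream (hypothetical values).
TAG_HEADERS = [b"\xA5\x01", b"\x5A\x02", b"\xC3\x03"]
TAG = b"".join(TAG_HEADERS)


def entropy_encode(payload: bytes) -> bytes:
    # Stand-in entropy coder; the patent's coder would be used here instead.
    return zlib.compress(payload)


def tag_and_encode(signal: bytes) -> bytes:
    # Entropy-code the signal, then add the tag to the beginning portion.
    return TAG + entropy_encode(signal)


def process(bit_stream: bytes) -> bytes:
    # Read the beginning of the stream: if the tag is present, the stream has
    # already been entropy coded, so skip further entropy averaging.
    if bit_stream.startswith(TAG):
        return bit_stream
    return tag_and_encode(bit_stream)


# Usage: the first pass codes and tags the signal; a second pass recognizes
# the tag and returns the stream unchanged instead of coding it again.
once = process(b"example signal samples")
twice = process(once)
assert once == twice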

REFERENCES:
patent: 6633242 (2003-10-01), Brown
patent: 6765510 (2004-07-01), Koyama et al.
patent: 6987890 (2006-01-01), Joshi et al.
patent: 7111094 (2006-09-01), Liu et al.


