Computer graphics processing and selective visual display system – Display driving control circuitry – Controlling the condition of display elements
Reexamination Certificate
1999-03-30
2002-12-10
Kincaid, Kristine (Department: 2174)
Computer graphics processing and selective visual display system
Display driving control circuitry
Controlling the condition of display elements
C345S215000
Reexamination Certificate
active
06493005
ABSTRACT:
BACKGROUND
The present invention relates to the digital processing of video to be displayed on a video display, and more particularly, to control of the display pipeline on a reduced instruction set processor between decoded digital video and a display output.
Techniques for digital transmission of video promise increased flexibility, higher resolution, and better fidelity. Recent industry collaborations have brought digital video closer to reality; digital video transmission and storage standards have been generated, and consumer digital video products have begun to appear. The move toward digital video has been encouraged by the commercialization of digital technologies in general, such as personal computers and compact discs, both of which have increased consumer awareness of the possibilities of digital technology.
Personal computers, which have recently become common and inexpensive, contain much of the computing hardware needed to produce digital video, including a microprocessor/coprocessor for performing numeric calculations, input and output connections, and a large digital memory for storing and manipulating image data. Unfortunately, personal computers are not suitable for consumer digital video reception, because the microprocessor in a personal computer is a general purpose processor, and typically cannot perform the calculations needed for digital video fast enough to produce full-motion, high definition video output.
Accordingly, special purpose processors, particularly suited for performing digital video-related calculations, have been developed for use in digital video receivers for consumer applications. The first attempts in the early 1990s included separate application specific integrated circuits (ASICs) for audio and for video processing. In addition, these early ASICs performed only low-level functions, and thus burdened a host processor with most of the management of the audio and video processing. These ASICs relied on the host processor to perform standard audio/video synchronization and simple error concealment techniques.
Thereafter, some audio/video processing components were introduced that provided some integration of audio and video decoding with a primitive level of features. However, these components largely shared the same drawbacks as the early ASICs, in that host processors still largely managed the audio and video processing.
Other audio/video processing components attempted to provide more features in a cost effective way by combining more firmware functionality onto the same integrated circuit (IC). However, such inflexible approaches narrowed the range of applications in which such ICs could be used and limited their functionality when used. Design choices made in firmware constrained the Application Program Interface (API).
A more flexible approach provides a specific processor with a high-speed architecture that allows programming flexibility through its open, multi-level Application Programming Interface (API). This specific processor is disclosed in commonly-assigned, copending U.S. patent application Ser. No. 08/865,749, entitled SPECIAL PURPOSE PROCESSOR FOR DIGITAL AUDIO/VIDEO DECODING, filed by Moshe Bublil et al. on May 30, 1997, which is hereby incorporated by reference herein in its entirety, and a memory controller for use therewith is disclosed in commonly-assigned, copending U.S. patent application Ser. No. 08/846,590, entitled “MEMORY ADDRESS GENERATION FOR DIGITAL VIDEO”, filed by Edward J. Paluch on Apr. 30, 1997, which is hereby incorporated by reference herein in its entirety.
The above-referenced U.S. patent applications describe an application specific integrated circuit (ASIC) for performing digital video processing, which is controlled by a reduced instruction set CPU (RISC CPU). The RISC CPU controls computations and operations of other parts of the ASIC to provide digital video reception. As is typical of CPUs of many varieties, the CPU described in the above-referenced U.S. patent applications supports flow control instructions such as BRANCH, CALL and RETURN, as well as providing hardware interrupt services.
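To make that flow control concrete, a minimal software model is sketched below. It is purely illustrative and hypothetical (names such as cpu_t and step are invented here, and this is not the actual ASIC instruction set or firmware); it shows a control loop that honors BRANCH, CALL and RETURN through a return-address stack and services a pending hardware interrupt between instructions.

/* Minimal, hypothetical model of a control CPU with BRANCH/CALL/RETURN
 * flow control and interrupt servicing between instructions.  Illustrative
 * only; it does not reflect the actual ASIC described above. */
#include <stdbool.h>
#include <stdint.h>

#define STACK_DEPTH 16

typedef enum { OP_NOP, OP_BRANCH, OP_CALL, OP_RETURN, OP_HALT } opcode_t;

typedef struct {
    opcode_t op;      /* operation to perform */
    uint16_t target;  /* branch or call destination */
} insn_t;

typedef struct {
    uint16_t pc;                      /* program counter */
    uint16_t ret_stack[STACK_DEPTH];  /* return-address stack for CALL/RETURN */
    int      sp;                      /* return-stack pointer */
    volatile bool irq_pending;        /* set by (modeled) hardware */
    uint16_t irq_vector;              /* address of interrupt service routine */
} cpu_t;

/* Execute one instruction, servicing a pending interrupt first. */
static void step(cpu_t *cpu, const insn_t *prog)
{
    if (cpu->irq_pending && cpu->sp < STACK_DEPTH) {
        cpu->ret_stack[cpu->sp++] = cpu->pc;  /* save return address */
        cpu->pc = cpu->irq_vector;            /* vector to the ISR */
        cpu->irq_pending = false;
        return;
    }

    insn_t in = prog[cpu->pc++];
    switch (in.op) {
    case OP_BRANCH:
        cpu->pc = in.target;                  /* unconditional branch */
        break;
    case OP_CALL:
        if (cpu->sp < STACK_DEPTH)
            cpu->ret_stack[cpu->sp++] = cpu->pc;
        cpu->pc = in.target;
        break;
    case OP_RETURN:
        if (cpu->sp > 0)
            cpu->pc = cpu->ret_stack[--cpu->sp];
        break;
    default:                                  /* OP_NOP, OP_HALT */
        break;
    }
}

int main(void)
{
    /* Tiny program: CALL 3, then HALT; the subroutine at 3 is NOP, RETURN. */
    const insn_t prog[] = {
        { OP_CALL, 3 }, { OP_HALT, 0 }, { OP_NOP, 0 },
        { OP_NOP,  0 }, { OP_RETURN, 0 }
    };
    cpu_t cpu = { 0 };
    while (prog[cpu.pc].op != OP_HALT)
        step(&cpu, prog);
    return 0;
}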
Due to the limitations of the RISC CPU, a number of functions are provided in the operating system rather than in hardware. A specific operating system of this kind is disclosed in commonly-assigned, copending U.S. patent application Ser. No. 08/866,419, entitled TASK AND STACK MANAGER FOR DIGITAL VIDEO DECODING, filed by Taner Ozcelik et al. on May 30, 1997, which is hereby incorporated by reference herein in its entirety. Software running under control of this operating system for controlling high-level digital video decoding functions is described in U.S. patent application Ser. No. 09/177,214, entitled “COMMAND MANAGER”, filed by Cem I. Duruoz et al. on Oct. 22, 1998, which is hereby incorporated by reference herein in its entirety, and in U.S. patent application Ser. No. 09/177,261, entitled METHOD AND APPARATUS FOR A VIRTUAL SYSTEM TIME CLOCK FOR DIGITAL/AUDIO/VIDEO PROCESSOR, filed by Cem I. Duruoz et al. on Oct. 22, 1998, which is hereby incorporated by reference herein in its entirety. Thus, certain functions, such as scheduling audio/video processing and synchronizing those processes, are handled by the digital audio/video processor, unburdening the host processor while still giving the host intimate control of those processes when desirable.
One aspect of the aforementioned digital audio/video processor is accommodating various digital video formats. For instance, the industry-sponsored Motion Pictures Expert Group (MPEG), chartered by the International Organization for Standardization (ISO), has specified a format for digital video and two-channel stereo audio signals that has come to be known as MPEG-1 and, more formally, as ISO-11172. MPEG-1 specifies formats for representing data inputs to digital decoders, that is, the syntax for data bitstreams that will carry programs in digital formats that decoders can reliably decode. In practice, the MPEG-1 standards have been used for recorded programs that are usually read by software systems. The program signals include digital data of various programs or program components, with their digitized data streams time-division multiplexed together into the program bitstreams. The programs include audio and video frames of data and other information. MPEG-1 recordings may be recorded on an optical disk referred to as a Video Compact Disc, or VCD.
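As a rough illustration of how such a multiplexed program bitstream is parsed, the sketch below scans a buffer for MPEG-1 system start codes (the prefix 0x00 0x00 0x01) and classifies pack headers, system headers, and audio/video packets by their stream ID bytes. It is a minimal, hypothetical example under those assumptions, not a complete or conformant demultiplexer; the sample byte string is invented.

/* Minimal sketch: locate MPEG-1 system start codes and classify packets.
 * Illustrative only; real demultiplexing requires full header parsing,
 * buffer management and error recovery. */
#include <stdint.h>
#include <stdio.h>
#include <stddef.h>

static void scan_start_codes(const uint8_t *buf, size_t len)
{
    for (size_t i = 0; i + 3 < len; i++) {
        /* Every MPEG start code begins with the prefix 0x00 0x00 0x01. */
        if (buf[i] != 0x00 || buf[i + 1] != 0x00 || buf[i + 2] != 0x01)
            continue;

        uint8_t code = buf[i + 3];
        if (code == 0xBA)
            printf("offset %zu: pack header\n", i);
        else if (code == 0xBB)
            printf("offset %zu: system header\n", i);
        else if (code >= 0xC0 && code <= 0xDF)
            printf("offset %zu: audio stream packet (id 0x%02X)\n", i, code);
        else if (code >= 0xE0 && code <= 0xEF)
            printf("offset %zu: video stream packet (id 0x%02X)\n", i, code);
    }
}

int main(void)
{
    /* Hypothetical fragment: a pack header followed by a video packet. */
    const uint8_t sample[] = {
        0x00, 0x00, 0x01, 0xBA, 0x21, 0x00, 0x01, 0x00, 0x01, 0x80, 0x1B, 0x93,
        0x00, 0x00, 0x01, 0xE0, 0x07, 0xEC
    };
    scan_start_codes(sample, sizeof sample);
    return 0;
}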
An enhanced standard, known colloquially as MPEG-2 and more formally as ISO-13818, has more recently been agreed upon by the ISO MPEG. Products using MPEG-2 are often provided on an optical disk referred to as a Digital Video Disc, or DVD. This enhanced standard has grown out of the need to specify data formats for broadcast and other higher-noise applications, such as high definition television (HDTV), where the programs are more likely to be transmitted than recorded and more likely to be decoded by hardware than by software. The MPEG standards define a structure for multiplexing and synchronizing coded digital video and audio data, for decoding, for example, by digital television receivers, and for random access play of recorded programs. The defined structure provides syntax for parsing and synchronizing the multiplexed stream in such applications and for identifying, decoding and timing the information in the bitstreams.
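On the timing side, the MPEG systems syntax carries 33-bit presentation time stamps (PTS) counted in units of a 90 kHz clock and spread across five bytes with interleaved marker bits. The sketch below is a minimal, illustrative decoder of that five-byte field only (marker-bit checking and all surrounding header parsing are omitted), and the sample bytes are an invented encoding of the value 900000 (10 seconds).

/* Minimal sketch: reassemble a 33-bit PTS from its five-byte encoding.
 * Illustrative only; a real parser validates marker bits and flags. */
#include <stdint.h>
#include <stdio.h>

static uint64_t parse_pts(const uint8_t p[5])
{
    uint64_t pts = 0;
    pts |= ((uint64_t)(p[0] >> 1) & 0x07) << 30;  /* PTS[32..30] */
    pts |= ((uint64_t) p[1])              << 22;  /* PTS[29..22] */
    pts |= ((uint64_t)(p[2] >> 1) & 0x7F) << 15;  /* PTS[21..15] */
    pts |= ((uint64_t) p[3])              << 7;   /* PTS[14..7]  */
    pts |=  (uint64_t)(p[4] >> 1) & 0x7F;         /* PTS[6..0]   */
    return pts;
}

int main(void)
{
    /* Hypothetical 5-byte PTS field encoding 900000 ticks (10 s at 90 kHz). */
    const uint8_t field[5] = { 0x21, 0x00, 0x37, 0x77, 0x41 };
    uint64_t pts = parse_pts(field);
    printf("PTS = %llu ticks (%.3f s at 90 kHz)\n",
           (unsigned long long)pts, pts / 90000.0);
    return 0;
}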
The MPEG video standard specifies a bitstream syntax designed to improve information density and coding efficiency by methods that remove spatial and temporal redundancies. For example, the transformation of blocks of 8×8 luminance pels (pixels) and corresponding chrominance data using Discrete Cosine Transform (DCT) coding is contemplated to remove spatial redundancies, while motion compensated prediction is contemplated to remove temporal redundancies. For video, MPEG contemplates Intra (I) frames, Predictive (P) frames and Bidirectionally Predictive (B) frames. The I-frames are independently coded and are the least efficiently coded of the three frame types. P-frames are coded more efficiently, using motion compensated prediction from a preceding I-frame or P-frame.
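For reference, a direct (non-optimized) form of the 8×8 forward DCT mentioned above is sketched below. Real MPEG encoders and decoders use fast integer approximations; this floating-point version, the flat test block, and the printed DC value are merely illustrative.

/* Reference forward 8x8 DCT-II:
 * F(u,v) = 1/4 * C(u)C(v) * sum_x sum_y f(x,y)
 *          * cos((2x+1)u*pi/16) * cos((2y+1)v*pi/16),
 * with C(0) = 1/sqrt(2) and C(k) = 1 otherwise.  Illustration only. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define N 8

static void fdct8x8(const double in[N][N], double out[N][N])
{
    for (int u = 0; u < N; u++) {
        for (int v = 0; v < N; v++) {
            double cu = (u == 0) ? 1.0 / sqrt(2.0) : 1.0;
            double cv = (v == 0) ? 1.0 / sqrt(2.0) : 1.0;
            double sum = 0.0;
            for (int x = 0; x < N; x++)
                for (int y = 0; y < N; y++)
                    sum += in[x][y]
                         * cos((2 * x + 1) * u * M_PI / 16.0)
                         * cos((2 * y + 1) * v * M_PI / 16.0);
            out[u][v] = 0.25 * cu * cv * sum;
        }
    }
}

int main(void)
{
    double block[N][N], coeff[N][N];

    /* A flat block of luminance pels: all energy lands in the DC term. */
    for (int x = 0; x < N; x++)
        for (int y = 0; y < N; y++)
            block[x][y] = 128.0;

    fdct8x8(block, coeff);
    printf("DC coefficient: %.1f\n", coeff[0][0]);  /* expect 1024.0 */
    return 0;
}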
Kincaid Kristine
Sony Corporation
Tran Mylinh
Wood Herron & Evans L.L.P.