Merged pipeline for color interpolation and edge enhancement...

Television – Camera – system and detail – Combined image signal generator and general image signal...


Details

Type: Reexamination Certificate
Status: active
Patent number: 06642962
Classification: C348S223100, C382S266000


FIELD OF THE INVENTION
This invention relates to digital cameras, and more particularly to digital signal processing that integrates color interpolation with edge detection.
BACKGROUND OF THE INVENTION
Digital cameras are being improved and lowered in cost at an amazing rate. In a recent year, more digital cameras were sold than traditional film cameras. Images from digital cameras can be downloaded and stored on personal computers. Digital pictures can be converted to common formats such as JPEG and sent as e-mail attachments or posted to virtual photo albums on the Internet. Video as well as still images can be captured, depending on the kind of digital camera.
FIG. 1 is a block diagram for a typical digital camera. Light focused through a lens is directed toward sensor 12, which can be a charge-coupled device (CCD) array or a complementary metal-oxide-semiconductor (CMOS) sensor array. The light falling on the array generates electrical currents, which are amplified by analog amp 14 before being converted from analog to digital values by A/D converter 16. An 8, 9, or 10-bit mono-color pixel is output to processor 10. These mono-color pixels are in a Bayer pattern, as shown in FIG. 2. Each pixel is either a red, a blue, or a green intensity.
The R, G, or B digital values in the Bayer pattern are processed by processor 10 to generate luminance-chrominance YUV pixels. The YUV pixels can then be displayed on display 19, or compressed by compressor 18 and stored on disk 17 or on a solid-state memory. YUV pixels often have a 4:4:4 format, with 8 bits for each of the two chrominance components and 8 bits for the luminance.
Sensor 12 detects red, blue, and green colors. However, each array point in sensor 12 can detect only one of the three primary colors. Rather than outputting an RGB pixel, sensor 12 can output only a single-color pixel at any given time. For example, a line of pixels output by sensor 12 might have a red pixel followed by a green pixel. Another line might have alternating green and blue pixels.
Each pixel represents the intensity of one of the primary colors at a point in the sensor array. Thus a red pixel indicates the intensity of red light at a point, while a neighboring green pixel indicates the intensity of green light at the next point in the sensor array. Each pixel contains only one-third of the total color information.
The remaining color information is obtained by interpolation. The green intensity of a red pixel is calculated by averaging the green intensities of neighboring green pixels. The blue intensity for that red pixel is calculated by averaging or interpolating the nearest blue pixels. Processor 10 performs this color interpolation, calculating the missing primary-color intensities for each pixel location.
Processor 10 also may perform other enhancements to the image. Edges may appear fuzzy because the color interpolation tends to spread out features. These edges can be sharpened by detecting the edges and enhancing the color change at the edge to make the color transition more abrupt. Color conversion from RGB to YUV is also performed by processor 10.
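As a concrete illustration of the RGB-to-YUV step, the sketch below uses the ITU-R BT.601 luminance and chrominance equations. The description above does not name a particular conversion matrix, so these coefficients are only one common choice for an 8-bit camera pipeline, and the function name is illustrative.

import numpy as np

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an (H, W, 3) RGB image to YUV using BT.601 coefficients."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue-difference chrominance
    v = 0.877 * (r - y)                     # red-difference chrominance
    return np.stack([y, u, v], axis=-1)     # one Y, U, V per pixel (4:4:4)
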
The electrical currents produced by the different primary colors can vary, depending on the sensor used and the wavelength and energy of the light photons. An adjustment known as a white-balance is often performed before processor 10, either on analog or digital values. Each primary color can be multiplied by a different gain to better balance the colors. Compensation can also be made for different lighting conditions, increasing all primary colors for dark pictures or decreasing all colors for bright pictures (overexposure).
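A minimal sketch of this per-primary gain adjustment in the digital domain follows; the gain and exposure values in the usage line are made-up illustrations, not figures from the description above.

import numpy as np

def white_balance(rgb: np.ndarray, r_gain: float, g_gain: float, b_gain: float,
                  exposure: float = 1.0, max_code: int = 255) -> np.ndarray:
    """Multiply each primary by its own gain; `exposure` scales all three
    together (>1 to brighten a dark picture, <1 to pull back an overexposed one)."""
    gains = np.array([r_gain, g_gain, b_gain], dtype=np.float32) * exposure
    return np.clip(rgb * gains, 0, max_code)

# Example with illustrative gains: boost red and blue relative to green.
# balanced = white_balance(rgb, 1.8, 1.0, 1.4)
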
Bayer Pattern—FIG. 2
FIG. 2 shows an image captured by a sensor that generates single-color pixels in a Bayer pattern. The example shows an 800×600 frame or image for display in the common super-VGA resolution. A total of 600 lines are captured by the sensor, with 800 pixels per line.
A personal computer displays full-color pixels that have all three primary-color intensities (RGB). In contrast, the sensor in a digital camera can detect only one of the three primary colors for each point in the 800×600 sensor array. Detectors for green are alternated with red detectors in the first line, while green detectors are alternated with blue detectors in the second line.
The first horizontal line and each odd line have alternating red and green detectors, so pixels output from these odd lines are in a R-G-R-G-R-G-R-G sequence. The second horizontal line and each even line have alternating green and blue detectors, so pixels output from these even lines are in a G-B-G-B-G-B-G-B sequence.
Half of the pixels are green pixels, while one-quarter of the pixels are red and the last quarter are blue. The green pixels form a checkerboard pattern, with blue and red pixels surrounded by green pixels. Since the human eye is more sensitive to green, the Bayer pattern has more green pixels than red or blue.
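The pattern just described reduces to a simple rule on the 1-based (row, column) coordinates used in this description. The short sketch below expresses that rule and confirms the stated proportions for an 800×600 frame; the function name and the counting loop are illustrative only.

def bayer_color_at(row: int, col: int) -> str:
    """Primary detected at 1-based (row, col): odd lines are R-G-R-G...,
    even lines are G-B-G-B..."""
    if row % 2 == 1:                          # odd line
        return 'R' if col % 2 == 1 else 'G'
    return 'G' if col % 2 == 1 else 'B'       # even line

counts = {'R': 0, 'G': 0, 'B': 0}
for r in range(1, 601):                       # 600 lines
    for c in range(1, 801):                   # 800 pixels per line
        counts[bayer_color_at(r, c)] += 1
print(counts)   # {'R': 120000, 'G': 240000, 'B': 120000}: half green, a quarter each red and blue
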
The green intensity for a red pixel location can be interpolated by averaging the four green pixels that surround the red pixel. For example, the green intensity for the red pixel at location (3,3) is the sum of green pixels (3,2), (3,4), (2,3), and (4,3), divided by four. Likewise, the green intensity for a blue pixel location can be interpolated by averaging the four surrounding green pixels. For blue pixel (2,4), the interpolated green intensity is the sum of green pixels (2,3), (2,5), (1,4), and (3,4), divided by four.
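This four-neighbor average is a one-line kernel. The sketch below uses 0-based NumPy indexing, so the example's location (3,3) becomes index (2,2); the 5x5 patch holds made-up sample values, and border handling is omitted.

import numpy as np

def interp_green(bayer: np.ndarray, row: int, col: int) -> float:
    """Green value at a red or blue site: average of the four green neighbors
    (left, right, above, below)."""
    return (bayer[row, col - 1] + bayer[row, col + 1] +
            bayer[row - 1, col] + bayer[row + 1, col]) / 4.0

patch = np.arange(25, dtype=np.float32).reshape(5, 5)   # made-up Bayer samples
print(interp_green(patch, 2, 2))   # green at the red site called (3,3) above
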
The red and blue values for a green pixel location can also be calculated from the two red and two blue pixels that surround each green pixel. For green pixel (2,3), the interpolated red value is the average of red pixels (1,3) and (3,3) above and below the green pixel, while the interpolated blue value is the average of blue pixels (2,2) and (2,4) to the left and right of the green pixel.
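The two-neighbor averages for a green site can be sketched the same way. The function below covers only a green pixel on an even (G-B-G-B) line, matching the (2,3) example; on an odd (R-G-R-G) line the red and blue roles are swapped, and that case, like border handling, is omitted here.

import numpy as np

def interp_red_blue_at_green(bayer: np.ndarray, row: int, col: int):
    """Red and blue at a green site on an even line: red from the pixels
    directly above and below, blue from the pixels to the left and right."""
    red = (bayer[row - 1, col] + bayer[row + 1, col]) / 2.0
    blue = (bayer[row, col - 1] + bayer[row, col + 1]) / 2.0
    return red, blue

patch = np.arange(25, dtype=np.float32).reshape(5, 5)   # made-up Bayer samples
print(interp_red_blue_at_green(patch, 1, 2))   # the green site called (2,3) above
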
Many different techniques have been used for color interpolation and white balance. See U.S. Pat. Nos. 5,504,524 and 5,260,774, which show white balance from analog signals. Sometimes a whole frame buffer is used for white balance or interpolation. Whole-frame buffers can be large, mega-pixel buffers that hold all 800×600 pixels. See U.S. Pat. No. 5,260,774, FIGS. 1-3. Color and edge enhancement are often not performed or are performed by a separate unit, perhaps also using a whole-frame buffer.
While such digital-camera processors are useful, cost reduction is desirable since digital cameras are price-sensitive consumer devices. Whole-frame buffers require large memories, and as digital-camera resolutions increase, even larger memories are needed for the larger number of pixels.
What is desired is a digital-camera processor that does not use a whole-frame buffer. It is desired to perform color interpolation of Bayer-pattern pixels without storing all the pixels in a frame. It is desired to use smaller line buffers, which store only a few lines of pixels rather than all 600 lines of an SVGA image. It is further desired to perform both color interpolation and edge detection at the same time, using integrated hardware. It is desired to merge the edge detector into the interpolator. It is also desired to perform white balance, edge enhancement, and YUV conversion without using a whole-frame buffer. It is desired to process all pixels in a frame in a single pass, without storing all the pixels.
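For a sense of scale, the arithmetic below compares a whole-frame buffer with a hypothetical five-line buffer at one byte per mono-color pixel; the five-line figure is an illustrative choice (enough for a 5x5 neighborhood), not a number taken from this description.

frame_pixels = 800 * 600           # whole-frame buffer: 480,000 pixels
line_buffer_pixels = 5 * 800       # five-line buffer: 4,000 pixels
print(line_buffer_pixels / frame_pixels)   # ~0.0083, well under the 5-percent bound in the summary below
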
SUMMARY OF THE INVENTION
A digital-image processor has a line buffer that receives mono-color pixels captured by an image sensor. The line buffer stores only a fraction of a whole frame of an image. The fraction is less than 5 percent of a number of pixels in the whole frame.
A merged pipeline receives an array of mono-color pixels from the line buffer. It generates missing color values for a middle pixel. The middle pixel is a mono-color pixel in a middle of the array from the line buffer.
The merged pipeline also generates an upper primary color value for an upper pixel immediately above the middle pixel and a lower primary color value for a lower pixel immediately below the middle pixel.
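A heavily simplified sketch of this single-pass, line-buffer arrangement follows. It shows only the bookkeeping: a few lines are held at a time and a result is emitted for the middle line of each window. The per-window computation is a trivial placeholder, not the merged interpolation and edge-enhancement pipeline summarized above.

import numpy as np
from collections import deque

def process_single_pass(bayer_lines, window_height: int = 3):
    """Hold only `window_height` lines at a time and emit a result for the
    middle line of each window; the computation here is a placeholder."""
    buffer = deque(maxlen=window_height)          # the small line buffer
    for line in bayer_lines:
        buffer.append(np.asarray(line, dtype=np.float32))
        if len(buffer) == window_height:
            window = np.stack(buffer)             # shape: (window_height, width)
            middle = window_height // 2
            yield (window[middle - 1] + window[middle + 1]) / 2.0  # placeholder

# 600 lines of 800 pixels processed without ever holding the whole frame.
lines = (np.random.randint(0, 256, size=800) for _ in range(600))
print(len(list(process_single_pass(lines))))   # 598 output lines (borders skipped)
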
