Method and apparatus for mapping samples in a rendering...

Computer graphics processing and selective visual display system – Computer graphic processing system – Plural graphics processors

Reexamination Certificate


Details

US Classification: C345S424000, C345S589000, C345S592000
Type: Reexamination Certificate
Status: active
Patent number: 06424346

ABSTRACT:

FIELD OF THE INVENTION
This invention relates generally to volume rendering, and more particularly, to classifying samples in a rendering pipeline.
BACKGROUND OF THE INVENTION
Volume rendering is often used in computer graphics applications where three-dimensional data need to be visualized. For example, the volume data can be scans of physical objects, or atmospheric, geophysical, or other scientific models. With volume rendering, both the internal structure and the external surface of physical objects and models can be visualized. Voxels are usually the fundamental data items used in volume rendering. A voxel is a data item that represents attributes of a particular three-dimensional portion of the object or model.
A voxel represents some particular intensity value of the object or model, such as physical parameters, e.g., density, elasticity, or velocity, to name but a few, inferred by CT, MRI, PET, SPECT, ultrasound, or other scanning means. During rendering, the voxel values are converted to pixel color and opacity (RGBα) values, which can be projected onto a two-dimensional image plane for viewing.
One frequently used technique during the rendering process is ray casting. A set of imaginary rays is “traced” through the array of voxels. For a particular viewing orientation, the rays are cast to the image plane in either a back-to-front or a front-to-back order. The voxel values are sampled along the rays, and various techniques are known to reduce the sampled values to pixel values.
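The sampling step above can be sketched in software as follows. This is an illustrative sketch, not the patent's hardware: it uses nearest-neighbor lookup for brevity (a real pipeline would interpolate, as described later), and all function and parameter names are assumptions.

```python
import numpy as np

def sample_along_ray(volume, origin, direction, step, n_samples):
    """Sample a voxel volume at evenly spaced points along one ray.

    `volume` is a 3-D array of voxel values; `origin` and `direction`
    are 3-vectors in voxel coordinates.  Nearest-neighbor sampling is
    used here for brevity.
    """
    direction = np.asarray(direction, float)
    direction /= np.linalg.norm(direction)
    samples = []
    for i in range(n_samples):
        p = np.asarray(origin, float) + i * step * direction
        idx = np.round(p).astype(int)
        # Keep only sample points that fall inside the volume.
        if np.all(idx >= 0) and np.all(idx < volume.shape):
            samples.append(volume[tuple(idx)])
    return samples
```

For example, casting an axis-aligned ray through a 4×4×4 volume simply walks one voxel per step along that axis.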
Rendering Pipeline
Volume rendering can be done by software or hardware. In one prior art hardware implementation, as shown in simplified form in FIG. 1, the hardware is arranged as a multi-stage pipeline 100, see U.S. patent application Ser. No. 09/190,643 “Fast Storage and Retrieval of Intermediate Values in a Real-Time Volume Rendering System,” filed by Kappler et al. on Nov. 12, 1998. The input to the pipeline 100 is voxels stored in a voxel memory 101, and the output is pixels stored in a pixel memory or frame buffer 109. The stages can include decoding 110, interpolating 120, gradient estimating 130, classifying and shading 140, and compositing 150.
During operation of the pipeline 100, the decoder (address generator) 110 generates the addresses of voxels stored in the voxel memory 101. The addresses are generated in a suitable order. Blocks of voxels are read from the memory and presented to the pipeline for processing one at a time.
The interpolator 120 assigns values to sample points along the rays based upon voxel values in the neighborhood of the sample points. Typically, either voxel fields or color-opacity fields are interpolated using a predetermined interpolation mode, e.g., linear, probabilistic, or nearest neighbor.
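As a sketch of the linear interpolation mode mentioned above, trilinear interpolation blends the eight voxels surrounding a sample point by their partial volumes. The function name and `volume[x, y, z]` indexing are illustrative assumptions, and the point is assumed to lie in the volume's interior:

```python
import numpy as np

def trilinear(volume, p):
    """Trilinearly interpolate `volume` at a continuous interior
    point p = (x, y, z)."""
    x, y, z = p
    x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
    fx, fy, fz = x - x0, y - y0, z - z0
    c = 0.0
    # Blend the eight surrounding voxels, weighted by partial volumes.
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((fx if dx else 1 - fx)
                     * (fy if dy else 1 - fy)
                     * (fz if dz else 1 - fz))
                c += w * volume[x0 + dx, y0 + dy, z0 + dz]
    return c
```

At a grid point the result reduces to that voxel's value; at the center of a cell it is the average of the eight corners.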
During gradient estimation 130, vectors (G_uvw) representing the direction and rate of change of voxel or sample values are estimated. Gradients with large magnitudes denote surfaces, or boundaries between, for example, different types of material. The gradients are applied to shading and illumination functions to produce highlights that enhance the three-dimensional appearance.
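One common, hardware-friendly way to estimate such gradient vectors is by central differences over neighboring voxels; the text does not prescribe a particular scheme, so this is an illustrative sketch for interior voxels only:

```python
import numpy as np

def gradient_at(volume, x, y, z):
    """Estimate the gradient vector at an interior voxel (x, y, z)
    by central differences along each axis."""
    gx = (volume[x + 1, y, z] - volume[x - 1, y, z]) / 2.0
    gy = (volume[x, y + 1, z] - volume[x, y - 1, z]) / 2.0
    gz = (volume[x, y, z + 1] - volume[x, y, z - 1]) / 2.0
    return np.array([gx, gy, gz])
```

On a linear ramp the estimator recovers the slope exactly, and its magnitude grows wherever values change sharply, i.e., at material boundaries.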
During classification 140, color and opacity values are assigned to each sample point. During illumination 145, also known as shading, sample points are illuminated with highlights and shadows to produce a more realistic three-dimensional appearance. For example, Phong shading can be applied, see Phong, “Illumination for computer generated pictures,” Communications of the ACM 18(6), pp. 311-317, 1975.
The output of the shading stage 145 is a stream of color and opacity values at sample points. This stream is fed to the compositing unit 150 for accumulation into the pixel values of the rays. The pixel value of each ray must be accumulated one sample point at a time. Finally, after the color and opacity values of all of the sample points on an individual ray have been accumulated, the resulting pixel value of that ray is written to the pixel memory 109.
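The per-ray accumulation described above corresponds to the standard front-to-back "over" compositing operator. A minimal sketch, using a scalar color per sample for brevity (a real pipeline carries R, G, B, and α):

```python
def composite_front_to_back(samples):
    """Accumulate (color, alpha) samples along one ray in
    front-to-back order with the over operator."""
    acc_c, acc_a = 0.0, 0.0
    for c, a in samples:
        # Each sample contributes only through the remaining transparency.
        acc_c += (1.0 - acc_a) * a * c
        acc_a += (1.0 - acc_a) * a
        if acc_a >= 0.99:  # early ray termination once nearly opaque
            break
    return acc_c, acc_a
```

Front-to-back order permits early ray termination: once the accumulated opacity saturates, the remaining samples on the ray cannot affect the pixel.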
It will be appreciated that the order of gradient estimation, interpolation, and classification can be permuted. In fact, different prior art systems use different orders for the stages. However, in prior art ray-casting systems, the order of the stages in any particular system is fixed.
It is desired that the classification of voxels be generalized. In the prior art, voxels typically are single values or have a fixed format dedicated to a specific application. Generalizing the format will allow the pipeline to support a greater variety of applications.
SUMMARY OF THE INVENTION
A method maps samples in a rendering pipeline. The samples are stored in a sample memory. Each sample has a plurality of fields. A descriptor is stored for each field in a field format register. Each sample is read from the memory into a mapping unit. The fields are extracted from each sample according to the corresponding descriptor, and forwarded in parallel to the rendering pipeline.
The fields can overlap, and can be enumerated in any order. The fields can store intensity values or physical values, or combinations thereof. The fields are scaled to fit the data paths in the pipeline. The fields can be interpolated first and classified later, or classified first and interpolated later.
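The field-descriptor mechanism of the summary can be sketched as extracting bit ranges from a packed integer sample. The descriptor layout here (a bit position and a width per field) is an illustrative assumption, not the patent's register format; note that overlapping fields and arbitrary ordering fall out naturally.

```python
def extract_fields(sample, descriptors):
    """Split a packed integer sample into named fields.

    `descriptors` maps a field name to (bit_position, bit_width);
    fields may overlap and may appear in any order.
    """
    fields = {}
    for name, (pos, width) in descriptors.items():
        mask = (1 << width) - 1          # e.g., width 8 -> 0xFF
        fields[name] = (sample >> pos) & mask
    return fields
```

For example, a 16-bit sample can expose its low byte, high byte, and the full 16-bit value as three fields, two of which overlap the third.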


REFERENCES:
patent: 4835712 (1989-05-01), Drebin et al.
patent: 5313567 (1994-05-01), Civanlar et al.
patent: 5630034 (1997-05-01), Oikawa et al.
patent: 5831623 (1998-11-01), Negishi et al.
patent: 5956041 (1999-09-01), Koyamada et al.
patent: 5963211 (1999-10-01), Oikawa et al.
patent: 6008813 (1999-12-01), Lauer et al.
patent: 6211884 (2001-04-01), Knittel et al.
patent: 6266733 (2001-07-01), Knittel et al.
patent: 6268861 (2001-07-01), Sanz-Pastor et al.
patent: 6297799 (2001-10-01), Knittel et al.
patent: 6310620 (2001-10-01), Lauer et al.
patent: 6313841 (2001-11-01), Ogata et al.
patent: 6342885 (2002-01-01), Knittel et al.
