Statistical probability distribution-preserving accumulation...

Data processing: measuring, calibrating, or testing – Measurement system in a specific environment – Biological or biochemical



Details

C702S019000, C702S030000, C702S187000

Reexamination Certificate

active

06778910

ABSTRACT:

FIELD OF THE INVENTION
The present invention relates in general to statistical analysis of one or more data sample distributions, such as a histogram of photoluminescence data produced by a blood flow cytometer system, and is particularly directed to a signal processing operator that is operative to preserve statistical probability distribution characteristics for a quantized data set that is subjected to a dynamic range expansion transform, such as a logarithmic operator, which, without the statistical probability preservation mechanism, would introduce undesirable binning artifacts into the transformed data.
BACKGROUND OF THE INVENTION
Flow cytometry derives from the quantitative measurement (meter) of structural features of biological cells (cyto) transported by a carrier in a controlled flow through a series of primarily optical detectors. Flow cytometers have been commercially available since the early 1970s, and their use has been continuously increasing. The most numerous flow cytometers are those employed for complete blood cell counts in clinical laboratories. Flow cytometers are found in all major biological research institutions. They are also numerous in medical centers, where they are used for diagnosis as well as research. There are currently on the order of 7,000 flow cytometers in use worldwide. Chromosome count and cell cycle analysis of cancers is the major diagnostic use. Lymphomas and leukemias are intensively studied for surface markers of diagnostic and prognostic value. Flow cytometry has been the method of choice for monitoring AIDS patients.
The general architecture of a flow cytometer system is shown diagrammatically in FIG. 1(a). Cells in suspension (retained in a saline carrier reservoir 10) are caused to flow one at a time (typically at rates of over 100 cells per second) through a transport medium 12 (such as a capillary tube 12). As the stream of cells flows through the capillary, the cells pass through an illumination region or window 14, one at a time, where they are illuminated by a focused optical output beam produced by a laser 16. Distributed around the illumination window are a plurality of optical sensors 18, located so as to intercept and measure the optical response of each cell to the laser beam illumination, including forward scatter intensity (proportional to cell diameter), orthogonal scatter intensity (proportional to cell granularity), and fluorescence intensities at various wavelengths. Each optical sensor's measurement output is then digitized and coupled to a computer (signal processor) 20 for processing.
Because different cell types can be differentiated by the statistical properties of their measurements, flow cytometry can be used to separate and count different cell populations in a mixture (a blood sample, for example). In addition, the cells can be stained with fluorescent reagents or dyes that bind to specific biochemical receptors of certain cells, allowing the measurement of biological and biochemical properties. The object of flow cytometry is to separate and quantify cell populations. Typically, the acquired data is accumulated into one- or two-dimensional data distributions so that the morphological variability of the distributions can be interpreted to distinguish the cellular populations.
In the current approaches for analyzing flow cytometric histograms, a pre-processing step known as the log-transformation is employed to increase the dynamic range of the data distribution, in order to facilitate analysis and thereby enhance interpretation. Unfortunately, although this step serves to broaden the dynamic range of the data distributions, it introduces an undesirable artifact, known as the “picket fence” or “binning effect”, that undermines the very aspect of the solution it is intended to reinforce.
This may be illustrated by reference to FIGS. 1(b) and 1(c), wherein FIG. 1(b) shows two overlapped Gaussian distributions as the original data, and FIG. 1(c) shows in discontinuous vertical bold black lines the binning or picket fence effect in the histogram of the log-transformed data; the continuous gray line in FIG. 1(c) is the ideal continuous transformation. The transformed data is thus problematic: even though its dynamic range has been expanded, it contains a significant binning artifact, and the result of the discrete log transformation is not suitable for direct and meaningful statistical analysis.
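The effect is easy to reproduce outside the instrument. The following NumPy sketch (not part of the patent; the population parameters and the 10-bit ADC scale are arbitrary choices for illustration) builds two overlapped Gaussian populations, quantizes them to integer ADC channels, applies a discrete log transform into 256 bins, and counts the empty bins that make up the picket fence:

```python
# Illustrative sketch only: two overlapped Gaussian "cell populations" are
# quantized to 10-bit ADC channels, then log-transformed into 256 bins.
# Because consecutive low ADC codes map to widely separated log bins,
# some bins can never be hit, producing the "picket fence" artifact.
import numpy as np

rng = np.random.default_rng(0)
adc = np.clip(np.concatenate([
    rng.normal(40, 10, 50_000),    # first population (arbitrary parameters)
    rng.normal(120, 30, 50_000),   # second, overlapping population
]).round().astype(int), 1, 1023)

n_log_bins = 256
scale = (n_log_bins - 1) / np.log(1023.0)
log_channels = np.round(np.log(adc) * scale).astype(int)
hist = np.bincount(log_channels, minlength=n_log_bins)

nonzero = np.nonzero(hist)[0]
gaps = int(np.sum(hist[nonzero[0]:nonzero[-1] + 1] == 0))
print(f"{gaps} empty log bins inside the occupied range (the picket fence)")
```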
In an effort to counter the binning effect's undesired artifacts, which skew the analysis and interpretation of the results, a number of practitioners have relied upon filtering techniques that attempt to attenuate this undesired effect without introducing changes that might undermine the statistical values of the original data. However, these filtering approaches are fraught with an irreconcilable issue: balancing the degree to which the artifact must be filtered against the point at which the filtered data can still be considered to retain statistics similar to the original data.
More particularly, to resolve the binning effect issue, averaging schemes based on finite impulse response (FIR) filters are typically used. The most effective FIR filtering schemes are achieved using a traditional (1-2-1) 3-tap FIR filter. Such a filter has positive attributes, provided that the following constraints are met: (1) maintaining the area under the curve requires that the sum of the FIR filter coefficients equals 1 and that the appropriate boundary conditions are met; and (2) to prevent the data from being skewed, the filter coefficients must be symmetric around the center (which requires that the number of filter taps be odd), and the individual distributions must be symmetrical and non-overlapping, which is not typically true when analyzing log-transformed data from cell populations.
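For reference, a minimal sketch of this smoothing step follows. It is not taken from the patent, and it assumes simple edge-replication boundary handling, but it shows the normalized (1-2-1) kernel, whose coefficients sum to 1, applied for the repeated passes discussed below:

```python
# Minimal sketch (not from the patent) of repeated (1-2-1) 3-tap FIR smoothing.
# The kernel is normalized to sum to 1, preserving the area under the histogram,
# and is symmetric so a single pass does not shift the data. Edge replication is
# an assumed, simple boundary condition.
import numpy as np

def smooth_121(hist: np.ndarray, passes: int) -> np.ndarray:
    """Apply the normalized (0.25, 0.5, 0.25) kernel `passes` times."""
    kernel = np.array([1.0, 2.0, 1.0]) / 4.0
    out = hist.astype(float)
    for _ in range(passes):
        padded = np.pad(out, 1, mode="edge")              # boundary handling
        out = np.convolve(padded, kernel, mode="valid")   # same-length output
    return out

# e.g. compare smooth_121(hist, 20) and smooth_121(hist, 500) on a binned histogram
```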
FIGS. 2(a), 2(b), 2(c) and 2(d) respectively illustrate the effects of an FIR filtering scheme after 20, 100, 200 and 500 passes of the binned histogram of FIG. 1(c) through a traditional (1-2-1) FIR filter. In each of these Figures, the bold black curve is the filtered histogram, while the gray curve shows the ideal transformed data. From qualitative analysis of the results of the FIR smoothing, one might infer that there is an optimal number of passes that yields the closest approximation to the actual distribution. However, FIR filtering cannot be optimal, because the log transformation has been applied to initially Gaussian populations, so the resulting log-normal populations are skewed; this is clearly observable after 500 passes. Another factor that makes FIR filtering techniques inappropriate is that, depending on the physical properties of the cells and the particular sensors used, the cell populations may overlap, so any excessive filtering will skew their statistical properties.
Other practitioners have attempted to ameliorate the binning effect by making use of log-amplifiers to electronically transform the input signal in its analog form before digitization. This approach has a number of drawbacks: (1) it requires additional and expensive hardware; (2) logarithmic amplifiers are notoriously noisy and unstable; and (3) when linear data is also required, the instruments must send and store twice the amount of data.
Another proposal to overcome the binning effect is to use high-resolution analog-to-digital converters (ADCs), in an effort to prevent the log transformation from exhibiting discontinuities in the lower range histogram channels. This approach is not perfect and has the following problems: (1) high-resolution ADCs are expensive; (2) the amount of data the instruments must send and store increases proportionally with the bit resolution of the ADC, another expensive proposition; and (3) no matter what the resolution of the ADC used, the binning effect, although minimized, will still be present in the output.
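This last point can be checked directly. The short sketch below (again illustrative, not from the patent; the bit depths and 256-bin log scale are assumed values) counts how many of the log-histogram bins can never be reached by any ADC code; the count shrinks as resolution grows but does not reach zero:

```python
# Illustrative sketch: raising the ADC resolution reduces, but does not
# eliminate, log-histogram bins that no ADC code can ever map into.
import numpy as np

def unreachable_log_bins(adc_bits: int, n_log_bins: int = 256) -> int:
    """Count log bins that no ADC code from 1..(2**adc_bits - 1) maps to."""
    max_code = 2 ** adc_bits - 1
    codes = np.arange(1, max_code + 1)
    bins = np.round(np.log(codes) / np.log(max_code) * (n_log_bins - 1)).astype(int)
    return n_log_bins - np.unique(bins).size

for bits in (8, 10, 12, 16):
    print(f"{bits}-bit ADC: {unreachable_log_bins(bits)} of 256 log bins unreachable")
```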
