Method and system for using cooperative game theory to...

Data processing: measuring, calibrating, or testing – Measurement system – Measured signal processing

Details

C702S181000, C703S001000

Reexamination Certificate

active

06640204

ABSTRACT:

FIELD OF THE INVENTION
The present invention relates to the fields of cooperative game theory and statistical analysis. More specifically, it relates to a method and system for using cooperative game theory to resolve joint effects in statistical analysis.
BACKGROUND OF THE INVENTION
Many statistical procedures estimate how an outcome is affected by factors that may influence it. For example, a multivariate statistical model may represent variations of a dependent variable as a function of a set of independent variables. A limitation of these procedures is that they may not be able to completely resolve joint effects among two or more independent variables.
A “joint effect” is an effect that is the joint result of two or more factors. “Statistical joint effects” are those joint effects remaining after the application of statistical methods. Cooperative resolution is the application of cooperative game theory to resolve statistical joint effects.
A performance measure is a statistic derived from a statistical model that describes some relevant aspect of that model such as its quality or the properties of one of its variables. A performance measure may be related to a general consideration such as assessing the accuracy of a statistical model's predictions. Cooperative resolution can completely attribute the statistical model's performance, as reflected in a performance measure, to an underlying source such as the statistical model's independent variables.
Most performance measures fall into one of two broad categories. The first category gauges a model's overall "explanatory power." The explanatory power of a model is closely related to its accuracy. A typical measure of explanatory power is the percentage of the variance of the dependent variable explained by a multivariate statistical model.
The second category of performance measure gauges a “total effect.” Measures of total effect address the magnitude and direction of effects. An example of such a total effect measure is a predicted value of a dependent variable in a multivariate statistical model.
Some of the limits of the prior art with respect to the attribution of explanatory power and total effect may be illustrated with reference to a standard multivariate statistical model. A multivariate statistical model is commonly used to determine a mathematical relationship between its dependent and independent variables. One common measure of explanatory power is a model's "R²" coefficient. In linear statistical models, a common class of statistical model, this coefficient takes on values between 0% and 100%. The R² of a model is the percentage of the variance of the dependent variable (i.e., a measure of its variation) that is explained by the model. The larger the R² value, the better the model describes the dependent variable.
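As a concrete illustration, R² can be computed as one minus the ratio of the residual sum of squares to the total sum of squares. The following minimal sketch (Python with numpy; the data and variable names are synthetic and purely illustrative, not taken from the patent) fits a two-variable linear model by ordinary least squares and reports its R².

```python
# Minimal sketch: computing the R^2 of a linear model fitted by OLS.
# Synthetic data; variable names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])        # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS coefficients
y_hat = X @ beta

ss_res = np.sum((y - y_hat) ** 2)                # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)             # total sum of squares
r2 = 1.0 - ss_res / ss_tot                       # share of variance explained
print(f"R^2 = {r2:.3f}")
```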
The explanatory power of a multivariate statistical model is an example of a statistical joint effect. As is known in the art, in studies based on a single independent variable, it is common to report the percentage of variance explained by that variable. An example from the field of financial economics is E. Fama and K. French, "Common risk factors in the returns on stocks and bonds," Journal of Financial Economics, v. 33, n. 1, 1993, pp. 3-56. In multivariate statistical models, however, it may be difficult or impossible, relying only on the existing statistical arts, to isolate a total contribution of each independent variable.
The total effect of a multivariate statistical model in its estimation of a dependent variable is reflected in the estimated coefficients of its independent variables. If there are no interaction variables (independent variables that represent the joint variation of two or more other independent variables), then, under typical assumptions, it is possible to decompose this total effect into the separate effects of the independent variables. In the presence of interaction variables, however, there is no accepted method in the art for resolving the effects of the interaction variables into their component independent variables.
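A brief sketch of this decomposition (Python with numpy; the data and variable names are synthetic and purely illustrative): without an interaction term, each fitted value splits exactly into the intercept plus one contribution per independent variable, whereas an interaction term's contribution has no generally accepted split between its component variables.

```python
# Minimal sketch: decomposing a fitted value into per-variable contributions.
# The contribution of the interaction term x1*x2 is a joint effect with no
# agreed-upon attribution between x1 and x2. Synthetic, illustrative data.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 0.5 + 1.5 * x1 - 2.0 * x2 + 0.8 * x1 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = beta

i = 0                                     # decompose the first observation
contributions = {
    "intercept": b0,
    "x1":        b1 * x1[i],
    "x2":        b2 * x2[i],
    "x1*x2":     b3 * x1[i] * x2[i],      # joint effect: attribution ambiguous
}
y_hat_i = sum(contributions.values())     # equals the model's fitted value
print(contributions, y_hat_i)
```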
The theory of variance decomposition is the area of statistics that comes closest to addressing the resolution of statistical joint effects. However, the explained variance is often not decomposed with respect to the independent variables in the model. For example, David Harville, in "Decomposition of prediction error," Journal of the American Statistical Association, v. 80, n. 389, 1985, pp. 132-138, shows how the error variance of a model may be divided among different types of statistical sources of error. Typically, however, these sources are not directly associated with particular independent variables, but rather with aspects of the estimation procedure.
Variance decomposition in vector autoregression (VAR) addresses the resolution of statistical joint effects in the prediction error associated with the variables of a time series model. It is based on a model of the effects of a one-time variation, or "shock," in a single series on future variations in the time series variables in the model. This procedure is introduced by C. Sims in "Macroeconomics and Reality," Econometrica, v. 48, 1980, pp. 1-48. Resolution of joint effects by this method relies on assuming a particular causally ordered relationship between shocks and is therefore based on a different resolution principle. H. Pesaran and Y. Shin, "Generalized impulse response analysis in linear multivariate models," Economics Letters, v. 58, 1998, pp. 17-29, describe a different VAR variance decomposition method that produces unique results. This method averages joint effects rather than resolving them and does not use cooperative game theory. VAR variance decomposition is not applicable to general multivariate statistical models.
A related topic in the statistical arts is the estimation of variance components. An analysis of variance model may be understood to have “fixed” and “random” effects. Random effects may arise when observations in a sample are randomly selected from a larger population. Variance components methods take population variation into account when constructing statistical tests. These methods do not provide a way to resolve statistical joint effects between independent variables in a multivariate statistical model.
Factor analysis and principal components analysis may be the most closely related statistical techniques. They represent a set of variables by a smaller set of underlying factors. These factors may be constructed to be mutually orthogonal, in which case the variance of the complete model may be completely attributed to these underlying factors. These procedures cannot, however, generate a natural, unique set of factors, and the factors generated may be difficult to interpret in relation to the original variables in the model.
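The following minimal sketch (Python with numpy; synthetic data, illustrative variable names) shows both sides of this point: the orthogonal components together account for all of the variance of the standardized variables, but each component's loadings mix the original variables, which is what makes the attribution hard to interpret.

```python
# Minimal sketch: principal components of standardized variables.
# The components fully account for the variance, but each one is a mixture
# of the original variables. Synthetic, illustrative data.
import numpy as np

rng = np.random.default_rng(2)
n = 500
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(scale=0.5, size=n)   # correlated with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

Z = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize the variables
corr = np.corrcoef(Z, rowvar=False)             # correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)         # orthogonal components
order = np.argsort(eigvals)[::-1]               # sort by variance explained
explained = eigvals[order] / eigvals.sum()      # variance share per component

print(explained)            # sums to 1.0: variance fully attributed...
print(eigvecs[:, order])    # ...but loadings mix the original variables
```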
One accepted method to determine the explanatory power of independent variables in a multivariate statistical model is an assessment of their "statistical significance." An independent variable is statistically significant if a "significance test" concludes that its true value is different from zero. As is known in the art, a significance test has a "confidence level": if a variable is statistically significant at the 95% confidence level, the test rejects, at that level, the hypothesis that its true value is zero. An independent variable is not considered to have a "significant effect" on the dependent variable unless it is found to be statistically significant. Independent variables may be meaningfully ranked by their statistical significance, but this ranking will generally provide limited insight into their relative contributions to explained variance.
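For illustration, the sketch below (Python with numpy and scipy; synthetic data, illustrative names) computes the usual OLS t-statistics and two-sided p-values. A variable with a small p-value is statistically significant at the corresponding confidence level, but the ranking by p-value does not by itself quantify each variable's share of the explained variance.

```python
# Minimal sketch: OLS t-statistics and p-values for each coefficient.
# Synthetic data; names are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.1 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
df = n - X.shape[1]
sigma2 = resid @ resid / df                        # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)              # coefficient covariance
se = np.sqrt(np.diag(cov))                         # standard errors
t_stats = beta / se
p_values = 2 * stats.t.sf(np.abs(t_stats), df)     # two-sided test

for name, t, p in zip(["const", "x1", "x2"], t_stats, p_values):
    print(f"{name}: t = {t:.2f}, p = {p:.4f}")
```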
Cooperative game theory can be used to resolve statistical joint effects problems. As is known in the art, game theory is a mathematical approach to the study of strategic interaction among people. Participants in these games are called “players.” Cooperative game theory allows players to make contracts and has been used to solve problems of bargaining over the allocation of jo
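One standard allocation concept in cooperative game theory is the Shapley value. The sketch below (Python with numpy; synthetic data) is only an illustration of that general concept, not a statement of this patent's specific method: it assumes, purely for demonstration, that the "players" are a regression's independent variables and that a coalition's worth is the R² of the corresponding sub-model, and it shows that the resulting Shapley values attribute the full model's R² completely across the variables.

```python
# Minimal sketch: Shapley-value allocation where the players are independent
# variables and a coalition's worth is the R^2 of its sub-model (an assumption
# made here for illustration only). Synthetic, illustrative data.
from itertools import combinations
from math import factorial
import numpy as np

rng = np.random.default_rng(4)
n = 300
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)     # correlated regressors
x3 = rng.normal(size=n)
y = 1.0 + 1.2 * x1 + 0.9 * x2 + 0.3 * x3 + rng.normal(size=n)
columns = {"x1": x1, "x2": x2, "x3": x3}

def r_squared(names):
    """R^2 of the OLS sub-model using only the named variables."""
    X = np.column_stack([np.ones(n)] + [columns[c] for c in names])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

players = list(columns)
p = len(players)
shapley = {}
for i in players:
    others = [j for j in players if j != i]
    value = 0.0
    for k in range(len(others) + 1):
        for S in combinations(others, k):
            weight = factorial(k) * factorial(p - k - 1) / factorial(p)
            value += weight * (r_squared(list(S) + [i]) - r_squared(list(S)))
    shapley[i] = value

print(shapley)                                      # per-variable attribution
print(sum(shapley.values()), r_squared(players))    # these two values match
```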
