Device, method, and medium for predicting a probability of...

Data processing: measuring, calibrating, or testing – Measurement system – Measured signal processing

Reexamination Certificate

Details

Classification: C702S179000, C702S181000, C702S198000

Status: active

Patent number: 06766280

ABSTRACT:

BACKGROUND OF THE INVENTION
This invention relates to technology for statistical prediction and, in particular, to technology for prediction based on the Bayes procedure.
Conventionally, a wide variety of methods have been proposed for statistically predicting data on the basis of a sequence of data generated from an unknown source. Among these methods, the Bayes prediction procedure is widely known and has been described in various textbooks on statistics.
One problem to be solved by such statistical prediction is that of sequentially predicting, by use of an estimation result, the next data item that appears after the data sequence. For this problem, it has been proved that a specific Bayes procedure exhibits a very good minimax property when a particular prior distribution, referred to as the Jeffreys prior distribution, is used. Such a specific Bayes procedure will be called the Jeffreys procedure hereinafter. The proof was given by B. Clarke and A. R. Barron in the article entitled “Jeffreys prior is asymptotically least favorable under entropy risk”, published in Journal of Statistical Planning and Inference, 41:37-60, 1994. The procedure is guaranteed to be optimal whenever the probability distribution hypothesis class is assumed to be a general smooth model class, although some mathematical restrictions apply in a strict sense.
Herein, let the logarithmic regret be used as another index. In this case, too, it has been proved that the Jeffreys procedure has a minimax property, on the assumption that the probability distribution hypothesis class belongs to an exponential family. This proof was given by J. Takeuchi and A. R. Barron in the paper entitled “Asymptotically minimax regret for exponential families”, in Proceedings of the 20th Symposium on Information Theory and Its Applications, pp. 665-668, 1997.
Furthermore, the sequential prediction problem can be replaced by the problem of providing a joint (or simultaneous) probability distribution of the data sequence, obtained by cumulatively multiplying the prediction probability distributions.
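By way of a simple numerical illustration of this equivalence (not part of the claimed device; a minimal Python sketch), the joint probability of a sequence is just the cumulative product of the sequential prediction probabilities:

    from math import prod

    def joint_from_predictions(predictive_probs):
        # q(x^n) = product over t of q(x_t | x^(t-1))
        return prod(predictive_probs)

    # Example: an i.i.d. Bernoulli(0.7) source observed as 1, 1, 0, 1;
    # each predictive probability is 0.7 for a 1 and 0.3 for a 0.
    preds = [0.7, 0.7, 0.3, 0.7]
    print(joint_from_predictions(preds))  # 0.1029
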
These proofs suggest that the Jeffreys procedure can exhibit excellent performance whether or not the prediction problem is sequential, provided that the performance measure is the logarithmic loss.
Thus, it has been proved by Clarke and Barron and by Takeuchi and Barron that the Bayes procedure is effective when the Jeffreys prior distribution is used. However, when the performance measure is the logarithmic regret instead of the redundancy, the Bayes procedure is guaranteed to be effective only when the model class of probability distributions is restricted to the exponential family, which is a rather special class.
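For reference, the two performance measures contrasted above, the logarithmic regret and the redundancy, are commonly defined as follows for a prediction strategy q over sequences x^n drawn from a model class {p(·|u)}; this is the standard formulation from the literature cited above rather than a quotation from the present specification:

    % Logarithmic regret of q on a particular sequence x^n, measured
    % against the best parameter chosen in hindsight:
    \[
      R(q, x^n) \;=\; \log \frac{\sup_{u} p(x^n \mid u)}{q(x^n)}
    \]
    % Redundancy: the expectation of the corresponding log-ratio when
    % the data are generated by p(. | u), i.e. the Kullback-Leibler
    % divergence between p(. | u) and q:
    \[
      D_n(u, q) \;=\; \mathbb{E}_{p(\cdot \mid u)}
        \left[ \log \frac{p(X^n \mid u)}{q(X^n)} \right]
    \]
    % The minimax criterion takes the worst case (over x^n for the
    % regret, over u for the redundancy) and minimizes it over q.
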
Under these circumstances, assume that the probability distribution model class is a general smooth model class different from the exponential family. In this case, the Jeffreys procedure described in the above-mentioned document by B. Clarke and A. R. Barron is not guaranteed to have the minimax property. On the contrary, the present inventors have confirmed that, in this case, the Jeffreys procedure does not have the minimax property.
Furthermore, a similar reduction in performance often occurs for a general Bayes procedure other than the Jeffreys procedure when the logarithmic regret is used in place of the redundancy.
SUMMARY OF THE INVENTION
It is an object of this invention to provide a method which is capable of preventing such a reduction in performance.
It is a specific object of this invention to provide an improved Jeffreys procedure which can achieve the minimax property even when the logarithmic regret is used as a performance measure instead of the redundancy.
According to a first embodiment of the invention, there is provided a Bayes mixture density calculator operable in response to a sequence of vectors x^n = (x_1, x_2, . . . , x_n) selected from a vector value set X to produce a Bayes mixture density on occurrence of the x^n. The calculator comprises a probability density calculator, supplied with a sequence of data x^t and a vector-valued parameter u, for calculating a probability density p(x^t|u) for the x^t; a Bayes mixture calculator for calculating a first approximation value of a Bayes mixture density p_w(x^n) on the basis of a predetermined prior distribution w(u), in cooperation with the probability density calculator, to produce the first approximation value; an enlarged mixture calculator for calculating a second approximation value of a Bayes mixture m(x^n) on an exponential fiber bundle, in cooperation with the probability density calculator, to produce the second approximation value; and a whole mixture calculator for calculating (1−ε)p_w(x^n) + ε·m(x^n), namely, for mixing the first approximation value of the Bayes mixture density p_w(x^n) with a part of the second approximation value of the Bayes mixture m(x^n) at a ratio of (1−ε):ε, to produce a calculation result, where ε is a value smaller than unity.
According to a second embodiment of the invention, which is obtained by modifying the first embodiment, there is provided a Jeffreys mixture density calculator operable in response to a sequence of vectors x^n = (x_1, x_2, . . . , x_n) selected from a vector value set X to produce a Bayes mixture density on occurrence of the x^n. The calculator comprises a probability density calculator responsive to a sequence of data x^t and a vector-valued parameter u for calculating a probability density p(x^t|u) for the x^t; a Jeffreys mixture calculator for calculating a first approximation value of a Bayes mixture density p_J(x^n) based on a Jeffreys prior distribution w_J(u), in cooperation with the probability density calculator, to produce the first approximation value; an enlarged mixture calculator for calculating a second approximation value of a Bayes mixture m(x^n) on an exponential fiber bundle, in cooperation with the probability density calculator, to produce the second approximation value; and a whole mixture calculator for calculating (1−ε)p_J(x^n) + ε·m(x^n), namely, for mixing the first approximation value of the Bayes mixture density p_J(x^n) with a part of the second approximation value of the Bayes mixture m(x^n) at a ratio of (1−ε):ε, to produce a calculation result, where ε is a value smaller than unity.
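For the special case of the Bernoulli family, the Jeffreys prior is the Beta(1/2, 1/2) distribution and the Jeffreys mixture p_J(x^n) has a closed form, which allows a compact sketch (again an illustration only; the enlarged mixture m on the exponential fiber bundle is left as a caller-supplied function):

    from math import lgamma, exp

    def log_beta(a, b):
        # log of the Beta function B(a, b)
        return lgamma(a) + lgamma(b) - lgamma(a + b)

    def jeffreys_mixture_bernoulli(xs):
        # p_J(x^n) = B(k + 1/2, n - k + 1/2) / B(1/2, 1/2),
        # where k is the number of ones in x^n
        n, k = len(xs), sum(xs)
        return exp(log_beta(k + 0.5, n - k + 0.5) - log_beta(0.5, 0.5))

    def whole_mixture(xs, eps, enlarged_mixture):
        # (1 - eps) * p_J(x^n) + eps * m(x^n)
        return (1.0 - eps) * jeffreys_mixture_bernoulli(xs) + eps * enlarged_mixture(xs)

    xs = [1, 0, 1, 1]
    print(jeffreys_mixture_bernoulli(xs))
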
Also, when the hypothesis class is a curved exponential family, a third embodiment of the invention can be provided by modifying the first embodiment. According to the third embodiment of the invention, there is provided a Bayes mixture density calculator operable in response to a sequence of vectors x^n = (x_1, x_2, . . . , x_n) selected from a vector value set X to produce a Bayes mixture density on occurrence of the x^n. The calculator comprises a probability density calculator responsive to a sequence of data x^t and a vector-valued parameter u for outputting a probability density p(x^t|u) for the x^t on a curved exponential family; a Bayes mixture calculator for calculating a first approximation value of a Bayes mixture density p_w(x^n) on the basis of a predetermined prior distribution w(u), in cooperation with the probability density calculator, to produce the first approximation value; an enlarged mixture calculator for calculating a second approximation value of a Bayes mixture m(x^n) on an exponential family including the curved exponential family, in cooperation with the probability density calculator, to produce the second approximation value; and a whole mixture calculator for calculating (1−ε)p_w(x^n) + ε·m(x^n), namely, for mixing the first approximation value of the Bayes mixture density p_w(x^n) with a part of the second approximation value of the Bayes mixture m(x^n) at a ratio of (1−ε):ε, to produce a calculation result, where ε is a value smaller than unity.
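Whichever mixture density is produced, sequential predictions can be recovered from it as the ratio of consecutive joint densities, in line with the equivalence noted in the background; a minimal self-contained sketch (the i.i.d. density below is only a placeholder for an actual mixture calculator):

    def predictive_probability(xs, next_x, joint_density):
        # q(next_x | x^n) = q(x^n, next_x) / q(x^n) for any joint density q
        return joint_density(xs + [next_x]) / joint_density(xs)

    # Placeholder joint density: i.i.d. Bernoulli(0.5), so every prediction is 0.5.
    iid_half = lambda xs: 0.5 ** len(xs)
    print(predictive_probability([1, 0, 1], 1, iid_half))  # 0.5
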
