Feature extracting device for pattern recognition

Image analysis – Pattern recognition – Feature extraction

Reexamination Certificate


Details

U.S. Classification: C382S224000, C382S155000
Type: Reexamination Certificate
Status: active
Patent Number: 06778701

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a feature extracting device for calculating features in pattern recognition such as image recognition, and more particularly to a feature extracting device having a pattern learning function such as a neural network.
2. Description of the Related Art
As a method of deciding the features used for pattern recognition from a group of learning patterns, methods based on discriminant analysis are well known and widely used (for example, the method disclosed in Japanese Patent Publication Laid-Open (Kokai) No. Heisei 01-321591).
Discriminant analysis is a method of deciding the features to be extracted so as to obtain the greatest difference between classes (categories) while restraining the variation of features within a class (for example, refer to "Mathematical Study on Feature Extraction in Pattern Recognition" by Otsu, Electro-Technical Laboratory Report No. 818, 1981). Compared with other feature-deciding methods such as principal component analysis, it is characterized by high separation ability among classes.
A brief description of discriminant analysis follows. Assume that a group of learning patterns is given and that the classes these patterns belong to are known.
In discriminant analysis, the within-class covariance matrix S_w and the between-class covariance matrix S_b are computed from these learning patterns, and the characteristic equation

  S_w⁻¹ · S_b · f_i = λ_i · f_i

is solved.
A predetermined number M of characteristic vectors f_i is selected from the characteristic vectors thus obtained, in decreasing order of the characteristic values (eigenvalues) λ_i.
Feature extraction is performed by calculating, for an objective input pattern X, the inner products Z_i = (f_i, X) (i = 1 to M) with these characteristic vectors; the features Z_i are thus extracted.
As is well known, the above discriminant analysis achieves linear feature extraction in which the variation within a class is small and the difference between classes is large.
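As an illustration (not part of the patent), the discriminant analysis described above can be sketched in Python. The function names and the small regularizer added to keep S_w invertible are assumptions of this sketch:

```python
import numpy as np

def discriminant_features(patterns, labels, M):
    """Select M discriminant vectors f_i from labeled learning patterns.

    patterns: (N, d) array of learning patterns; labels: length-N class labels.
    Solves S_w^{-1} S_b f_i = lambda_i f_i and keeps the M characteristic
    vectors with the largest characteristic values.
    """
    patterns = np.asarray(patterns, dtype=float)
    labels = np.asarray(labels)
    mean_all = patterns.mean(axis=0)
    d = patterns.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in set(labels.tolist()):
        Xc = patterns[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                           # within-class scatter
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)  # between-class scatter
    # small regularizer keeps S_w invertible (an assumption of this sketch)
    eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw + 1e-6 * np.eye(d)) @ Sb)
    order = np.argsort(eigvals.real)[::-1][:M]
    return eigvecs.real[:, order]       # columns are f_1 .. f_M

def extract(F, X):
    """Features Z_i = (f_i, X): inner products with an input pattern X."""
    return F.T @ np.asarray(X, dtype=float)
```

Applied to two well-separated classes, the leading discriminant component assigns clearly different feature values Z_1 to patterns from different classes.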
On the other hand, as a method of learning the input/output relationship of a pattern using a group of learning patterns consisting of pairs of an input pattern and an output pattern, error back-propagation learning (back propagation) using a multi-layered perceptron neural network is well known and widely used (for example, refer to "Neuro-Computer", compiled under the supervision of Hajime Nakano, Gijutsu-Hyoron Co., Ltd., 1989, and "Parallel Distributed Processing" by D. E. Rumelhart et al., MIT Press, 1986).
FIG. 7 shows the structure of a three-layered perceptron neural network. In FIG. 7, an input pattern entered into the input layer is processed sequentially through an intermediate layer and an output layer, and the output pattern is calculated.
In error back-propagation learning, each parameter (connection weight) of each layer of the neural network is updated so as to make the output pattern conform as closely as possible to a desired output pattern given as a learning pattern.
The above point will be described in detail.
In FIG. 7, the output H_j of a unit j of the intermediate layer is calculated from an input pattern I_i, using connection weights W_ji and a threshold θ_j, by the following expressions:

  H_j = f(U_j),
  U_j = Σ_i W_ji · I_i + θ_j,
  f(x) = 1 / {1 + exp(−2x/u_0)}.
The symbol f(x) is a function called a sigmoid function.
The symbol u_0 is a predetermined parameter.
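As a small illustration (not from the patent), the sigmoid f(x) and the role of the parameter u_0 can be written as:

```python
import math

def f(x, u0=1.0):
    """Sigmoid f(x) = 1 / (1 + exp(-2x/u0)); u0 sets the steepness at x = 0."""
    return 1.0 / (1.0 + math.exp(-2.0 * x / u0))
```

For any u_0, f(0) = 0.5, and the output lies strictly between 0 and 1; as u_0 decreases, the transition from 0 to 1 becomes steeper, approaching a step function.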
The output O_k of a unit of the output layer is calculated from the outputs H_j of the intermediate-layer units thus obtained, by the following expressions:

  O_k = f(S_k),
  S_k = Σ_j V_kj · H_j + γ_k

(V_kj is the connection weight, and γ_k is the threshold.)
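Taken together, the two layers define the forward computation of the network in FIG. 7. A minimal Python sketch (variable and function names are illustrative, not from the patent):

```python
import numpy as np

def sigmoid(x, u0=1.0):
    """f(x) = 1 / (1 + exp(-2x/u0))."""
    return 1.0 / (1.0 + np.exp(-2.0 * x / u0))

def forward(I, W, theta, V, gamma, u0=1.0):
    """Forward computation of the three-layered perceptron:
    intermediate layer  H_j = f(sum_i W_ji I_i + theta_j),
    output layer        O_k = f(sum_j V_kj H_j + gamma_k)."""
    H = sigmoid(W @ I + theta, u0)
    O = sigmoid(V @ H + gamma, u0)
    return H, O
```

With all weights and thresholds zero, every unit receives input 0 and outputs f(0) = 0.5, which is a convenient sanity check.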
At this time, assuming that the desired output pattern is T_k, learning is performed by updating each parameter (such as a connection weight, generally represented as p) according to the gradient (−∂E/∂p) so as to reduce the error shown in the following expression:

  E = < Σ_k (T_k − O_k)² >

Here, the symbol <·> indicates the mean operation over the learning patterns. As a result, the output of the neural network approaches the desired one.
The features obtained by the above conventional discriminant analysis, however, are fragile to pattern variation, because they are linear features.
Although discriminant analysis is, of course, a feature-selecting method that reduces the variation of features within a class relative to the variation between classes, it naturally cannot absorb variations such as shift, rotation, or scaling of a pattern, because the obtained features are linear.
In contrast, a multi-layered perceptron neural network can learn a non-linear input/output relationship, so in principle it could be robust against the above pattern variations. In practice, however, making the network learn to absorb pattern variation and perform pattern recognition requires an enormous amount of learning, which is not practical.
Therefore, methods such as restraining the influence of pattern variation by pre-processing (for example, size normalization and alignment of the input pattern), or first extracting experimentally decided features and performing multi-layered perceptron learning with these features as new inputs, are adopted.
That is, in practice a multi-layered perceptron neural network also has the problem of being fragile to pattern variation.
SUMMARY OF THE INVENTION
A first object of the present invention is to provide a feature extracting device suitable for pattern recognition and robust against pattern variation, in order to solve the above conventional problems.
A second object of the present invention is to provide a feature extracting device robust against pattern variation, without the need for an enormous amount of learning.
According to the first aspect of the invention, a feature extracting device comprises
feature vector calculating means for projecting a learning pattern to be recognized onto a subspace group, so as to calculate the squares of the projection lengths on each subspace as feature vectors, and
subspace basis vector learning means including at least parameter updating means for updating the basis vectors of each subspace forming the subspace group, so as to increase the ratio of the variation between classes to the variation within a class for each component of the feature vectors.
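The feature calculation of the first aspect (squared projection lengths onto each subspace) can be sketched as follows; this is illustrative only, the patent does not prescribe this code, and orthonormal row bases are assumed:

```python
import numpy as np

def subspace_features(X, subspaces):
    """Feature vector z: z_m is the squared projection length of pattern X
    onto the m-th subspace, whose orthonormal basis vectors are the rows
    of subspaces[m].  For orthonormal rows b_k, the squared projection
    length is sum_k (b_k . X)^2."""
    X = np.asarray(X, dtype=float)
    return np.array([np.sum((B @ X) ** 2) for B in subspaces])
```

For example, with a one-dimensional subspace along the x-axis and a two-dimensional subspace spanning the y-z plane, the pattern (3, 4, 0) yields the feature vector (9, 16).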
In the preferred construction, the feature vector calculating means normalizes the learning pattern, then projects it onto the subspace group and calculates the squares of the projection lengths on each subspace, or quantities derived therefrom, as feature vectors.
In another preferred construction, the subspace basis vector learning means
includes calibrating means for calibrating the feature vectors by performing restraint processing among the features on the calculated feature vectors, based on a predetermined restraint parameter.
In another preferred construction, the feature vector calculating means
normalizes the learning pattern, then projects it onto the subspace group and calculates the squares of the projection lengths on each subspace, or quantities derived therefrom, as feature vectors, and
the subspace basis vector learning means
includes calibrating means for calibrating the feature vectors by performing restraint processing among the features on the calculated feature vectors, based on a predetermined restraint parameter.
In another preferred construction, the parameter updating means performs normalized orthogonalization on the basis vectors obtained by the update processing, according to the Gram-Schmidt orthogonalization.
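The normalized orthogonalization step can be sketched with the classical Gram-Schmidt procedure (an illustrative sketch, not the patent's implementation):

```python
import numpy as np

def gram_schmidt(vectors, eps=1e-12):
    """Orthonormalize the updated basis vectors (given as rows) by the
    Gram-Schmidt procedure: subtract the projections onto the already
    accepted vectors, then normalize to unit length."""
    basis = []
    for v in np.asarray(vectors, dtype=float):
        for b in basis:
            v = v - (v @ b) * b       # remove the component along b
        n = np.linalg.norm(v)
        if n > eps:                   # skip (near-)linearly-dependent vectors
            basis.append(v / n)
    return np.array(basis)
```

After this step the basis vectors of each subspace are mutually orthogonal unit vectors, so the squared projection lengths computed from them are well defined.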
In another preferred construction, the feature vector calculating means
normalizes the learning pattern, hence
