Neuron architecture having a dual structure and neural...

Data processing: artificial intelligence – Neural network – Structure


Details

Type: Reexamination Certificate
Status: active
Patent number: 06502083
U.S. Classes: C706S015000, C706S026000, C706S041000, C706S042000

ABSTRACT:

FIELD OF THE INVENTION
The present invention relates to neural network systems and more particularly to an improved neuron architecture having a dual structure designed to generate local/global signals. This improved neuron architecture allows the neuron to work either as a single neuron or as two independent neurons to construct very flexible artificial neural networks (ANNs). Moreover, it is well adapted for integration in VLSI semiconductor chips.
CO-PENDING PATENT APPLICATION
Improved neuron structure and artificial neural networks incorporating the same, attorney docket number FR9-98-081, filed on the same date herewith.
BACKGROUND OF THE INVENTION
Artificial neural networks (ANNs) are increasingly used in applications where no mathematical algorithm can describe the problem to be solved, and they are very successful at the classification or recognition of objects. ANNs give very good results because they learn by example and are able to generalize in order to respond to an input vector that was never presented before. So far, most ANNs have been implemented in software and only a few in hardware; however, the present trend is to implement ANNs in hardware, typically in semiconductor chips. In this case, hardware ANNs are generally based upon the Region Of Influence (ROI) algorithm. The ROI algorithm gives good results if the input vectors presented to the ANN can be separated into classes of objects well separated from each other. If an input vector has been recognized by neurons belonging to two different classes (or categories), the ANN will respond with an uncertainty. This uncertainty may be reduced to some extent by the implementation of the K Nearest Neighbor (KNN) algorithm. Modern neuron and artificial neural network architectures implemented in semiconductor chips are described in the following U.S. patents:
U.S. Pat. No. 5,621,863 “Neuron Circuit”
U.S. Pat. No. 5,701,397 “Circuit for Precharging a Free Neuron Circuit”
U.S. Pat. No. 5,710,869 “Daisy Chain Circuit for Serial Connection of Neuron Circuits”
U.S. Pat. No. 5,717,832 “Neural Semiconductor Chip and Neural Networks Incorporated Therein”
U.S. Pat. No. 5,740,326 “Circuit for Searching/Sorting Data in Neural Networks”
which are incorporated herein by reference. These patents are jointly owned by IBM Corp. and Guy Paillet. The chips are manufactured and commercialized by IBM France under the ZISC036 label. ZISC is a registered trademark of IBM Corp. The following description is made in the light of the U.S. patents recited above; the same vocabulary and circuit names will be kept whenever possible.
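For readers unfamiliar with these algorithms, the following minimal Python sketch models the ROI decision and the KNN fallback described above. It is an illustration only; the names (Prototype, classify_roi, classify_knn) and the choice of the L1 norm are assumptions and do not come from the patents.

```python
from dataclasses import dataclass

@dataclass
class Prototype:
    """One learned prototype: stored vector, its category and its actual influence field (AIF)."""
    vector: list
    category: int
    aif: float

def l1_distance(a, b):
    # L1 (Manhattan) norm, one of the norms supported by ZISC-style neurons
    return sum(abs(x - y) for x, y in zip(a, b))

def classify_roi(prototypes, input_vector):
    """Region Of Influence: a prototype 'fires' when the input falls inside its AIF.
    Returns (category, status) where status is 'identified', 'unknown' or 'uncertain'."""
    fired = [p for p in prototypes if l1_distance(input_vector, p.vector) < p.aif]
    if not fired:
        return None, "unknown"
    categories = {p.category for p in fired}
    if len(categories) == 1:
        return categories.pop(), "identified"
    return None, "uncertain"  # neurons of two (or more) different classes fired

def classify_knn(prototypes, input_vector, k=1):
    """K Nearest Neighbor fallback used to reduce the ROI uncertainty:
    vote among the k prototypes closest to the input."""
    ranked = sorted(prototypes, key=lambda p: l1_distance(input_vector, p.vector))
    votes = [p.category for p in ranked[:k]]
    return max(set(votes), key=votes.count)
```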
In U.S. Pat. No. 5,621,863 (see FIG. 5 and the related description), there is disclosed a neuron circuit architecture (11) according to the ZISC technology. The ZISC neuron circuit can easily be connected in parallel to build an ANN having the size required by the application, as defined by the user. This specific neuron circuit architecture is adapted to generate local result signals, e.g. of the fire type, and local output signals, e.g. of the distance or category type. The neuron circuit is connected to six input buses which transport input data (e.g. the input category), feedback signals and control signals. A typical neuron circuit includes the essential circuits briefly discussed below. A multi-norm distance evaluation circuit (200) calculates the distance D between an input vector and the prototype vector stored in a R/W (weight) memory circuit (250) placed in each neuron circuit, once it has been learned. A distance compare circuit (300) compares the distance D with either the actual influence field (AIF) of the prototype vector or its lower limit (MinIF), stored in an IF circuit (350), to generate first and second intermediate result signals (LT, LTE). An identification circuit (400) processes said intermediate result signals, the input category signal (CAT), the local category signal (C) and a feedback signal to generate a local/global result signal (F, UNC/FIRE.OK, . . . ) which represents the response of a neuron circuit to the presentation of an input vector. A minimum distance determination circuit (500) is adapted to determine the minimum distance Dmin among all the distances calculated by the neuron circuits of the ANN and to generate a global output signal (NOUT) of the distance type. The same processing applies to categories. The feedback signals present on the OR-BUS are global signals collectively generated by all the neuron circuits and result from ORing all the local output signals. A daisy chain circuit (600) is serially connected to the corresponding daisy chain circuits of the two adjacent neuron circuits to structure these neurons as a chain, thereby forming said ANN. Its role is to determine the neuron circuit state: free, first free in the chain, or engaged. Finally, a context circuitry (100/150) allows or prevents the neuron circuit from participating with the other neuron circuits of the ANN in the generation of said feedback signal.
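The signal flow just described can be summarized by a small behavioural model of the recognition phase. The sketch below is an illustration only, under simple assumptions (L1 norm, Boolean fire signal); the Python names are not taken from the patent, although the block numbers in the comments are.

```python
class NeuronModel:
    """Behavioural model of one ZISC-style neuron circuit (software illustration, not the hardware).
    Block numbers in the comments refer to U.S. Pat. No. 5,621,863."""

    def __init__(self, prototype, category, aif, min_if=0):
        self.prototype = prototype   # prototype vector held in the R/W weight memory (250)
        self.category = category     # local category signal C
        self.aif = aif               # actual influence field held in the IF circuit (350)
        self.min_if = min_if         # lower limit MinIF of the influence field

    def distance(self, input_vector):
        # multi-norm distance evaluation circuit (200); only the L1 norm is shown here
        return sum(abs(a - b) for a, b in zip(input_vector, self.prototype))

    def compare(self, d):
        # distance compare circuit (300): first and second intermediate result signals LT, LTE
        return d < self.aif, d <= self.min_if

    def fire(self, input_vector):
        # identification circuit (400): local result signal of the fire type
        lt, _ = self.compare(self.distance(input_vector))
        return lt


def global_response(neurons, input_vector):
    """Global signals of the ANN: the minimum distance Dmin (circuit 500) and the set of
    categories of the fired neurons. In hardware this reduction is performed collectively
    over the OR-BUS; here it is a plain software reduction."""
    dmin = min(n.distance(input_vector) for n in neurons)
    fired_categories = {n.category for n in neurons if n.fire(input_vector)}
    return dmin, fired_categories
```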
Unfortunately, the number of input vector components that is required is not the same for every application. Some applications may need a large number of components while others do not. If a chip is built with such a large number of components for a specific application, then for an application requiring only a small number of components a significant part of the memory space will remain unused.
Moreover, this neuron architecture is not optimized in terms of circuit density, because many functions are implemented locally within each neuron and are thus duplicated every time a neuron is added to the ANN. In each neuron circuit, the distance compare block (300) is used to compare the distance with the AIF value. A comparator performs this comparison at the end of the distance evaluation, but at that time the distance evaluation circuit (200), which also includes a comparator, is not busy, so that one of the two comparators is not really necessary. This duplication is a source of wasted silicon area. In normal ZISC operation, the contents of the local norm/context register (100) are seldom changed. As a consequence, the NS signal generated by the matching circuit (150) does not change either, and therefore the matching circuit is idle most of the time. In each neuron circuit, both the identification circuit (400) and the matching circuit (150) also include compare circuits, and likewise these circuits never need to operate at the same time. All these duplicated circuits require unnecessary additional logic gates.
In the ROI mode, it is often useful to know the minimum/maximum AIF among the AIF values determined by all the neuron circuits of the ANN, but because there is no connection between the IF circuit (350) and the Dmin determination circuit (500), this operation cannot be performed. Another missing functionality is the impossibility of determining the minimum/maximum value of the prototype components stored in the R/W memory and of the norm/context value stored in the register (100) of all the neurons. Finally, for the selected neurons (those which are at the minimum distance or have the minimum category), it is impossible to change the prototype components (weights) stored in the said R/W memory after the recognition phase.
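To make these missing operations concrete, the sketch below expresses them on the behavioural model introduced above; in the actual ZISC circuit they would have to be performed externally. The helper names and the weight-update rule are assumptions for illustration only.

```python
def min_max_aif(neurons):
    # global minimum/maximum AIF over all neurons; impossible internally in ZISC because
    # the IF circuit (350) is not connected to the Dmin determination circuit (500)
    aifs = [n.aif for n in neurons]
    return min(aifs), max(aifs)

def min_max_component(neurons, index):
    # global minimum/maximum of one prototype component across all neurons
    values = [n.prototype[index] for n in neurons]
    return min(values), max(values)

def update_selected(neurons, input_vector, delta):
    """Post-recognition update of the prototype components of the selected neurons
    (those at the minimum distance). The update rule shown is a placeholder; the text
    above only states that such an update is impossible in the original architecture."""
    dmin = min(n.distance(input_vector) for n in neurons)
    for n in neurons:
        if n.distance(input_vector) == dmin:
            n.prototype = [w + delta * (x - w) for w, x in zip(n.prototype, input_vector)]
```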
In the ZISC neuron architecture, there are four input data buses to feed each neuron, but only a few data items need to be applied at the same time to a given neuron circuit. A large number of buses implies a large number of wires and of drivers for electrical signal regeneration, which are a source of silicon area consumption in the chip.
SUMMARY OF THE INVENTION
Therefore, it is a primary object of the present invention to provide an improved neuron architecture having a dual structure that can operate, at the user's will, either as a single neuron (single mode) or as two independent neurons with a reduced number of components, referred to as the even and odd neurons (dual mode).
It is still another object of the present invention ...
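As a rough illustration of the single/dual mode idea stated above (an assumption-laden sketch, not the patented circuit): in single mode one neuron uses the whole weight memory, while in dual mode the same memory is shared between an even and an odd neuron, each holding a shorter prototype. How the memory is actually partitioned between the two neurons is not specified here; a simple split into two halves is shown.

```python
def make_neurons(weight_memory, categories, aifs, dual_mode):
    """Build either one full-length neuron (single mode) or two half-length
    even/odd neurons (dual mode) from the same physical weight memory.
    E.g. a 64-component memory gives one 64-component neuron, or two 32-component neurons."""
    if not dual_mode:
        return [NeuronModel(weight_memory, categories[0], aifs[0])]
    half = len(weight_memory) // 2
    even = NeuronModel(weight_memory[:half], categories[0], aifs[0])
    odd = NeuronModel(weight_memory[half:], categories[1], aifs[1])
    return [even, odd]
```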
