Current controlled sigmoid neural circuit
Reexamination Certificate
2002-11-13
2003-12-16
Nuton, My-Trang (Department: 2816)
Miscellaneous active electrical nonlinear devices, circuits, and systems
Signal converting, shaping, or generating
Converting input voltage to output current or vice versa
C706S033000
Reexamination Certificate
active
06664818
ABSTRACT:
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention generally relates to a neural circuit, and more particularly to a neural circuit used for approximating a sigmoid function.
2. Description of the Prior Art
Neural networks are mathematical models inspired by the connections and the functioning of neurons in biological systems. They have given rise to a branch of research called neural computing, which is being used or tried out in many disciplines. The basic model rests on two simple concepts: the topology of nodes and the connections between them, and the transfer functions that relate the input and output of each node. A node receives input data through its input connections, performs a very simple operation on these inputs (a weighted sum followed by some kind of thresholding function), and passes the result to its output connection(s), either as final output or for use by other nodes.
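The node operation described above can be sketched numerically. The following minimal example is an illustration only, not code from the patent; the function name node_output and its arguments are assumptions made for clarity. It forms a weighted sum of the inputs and passes the sum through a sigmoid-type thresholding function.

import math

def node_output(inputs, weights, bias=0.0):
    # Weighted sum over the input connections, then a sigmoid
    # thresholding function applied to that sum.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# Example: a node with two input connections.
print(node_output([0.5, -1.2], [0.8, 0.3]))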
The inherent simplicity of neural networks suggests that massive parallelism, and possibly special, very simple hardware such as semiconductors or optical elements, can be exploited in their implementation. More relevant than implementation questions, however, appears to be an understanding of the virtues and pitfalls of neural networks as algorithms. One of their important properties is that they can be trained: they can be given training samples of events of different classes and, by learning algorithms of varying complexity, can adjust the weights associated with all input connections until some overall function characterizing the quality of the decision mechanism is maximized. The optimization is often viewed in analogy with the minimization of a physical potential (Boltzmann machine); the function is then termed an “energy function”. Functions such as the thresholding function, the linearly separable function, and the sigmoid function can approximate the energy function. Among the three, the sigmoid function is the most widely used because it can be implemented with a simple multiplier and is well suited to back-propagation learning, which adjusts the weights to improve discrimination when the input data is incomplete or noisy.
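The reason the sigmoid pairs so naturally with back-propagation can be made concrete: for y = 1/(1 + e^(-x)), the derivative is dy/dx = y(1 - y), so the weight-update rule reduces to multiplications. The sketch below is illustrative only; it is not the patent's circuit, and the names sigmoid, backprop_step, and lr are assumptions. It performs gradient-descent updates for a single-weight sigmoid unit.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_step(w, x, target, lr=0.1):
    # One gradient-descent update for a single-weight sigmoid unit.
    # The derivative term y * (1 - y) needs only multiplications.
    y = sigmoid(w * x)
    grad = (y - target) * y * (1.0 - y) * x
    return w - lr * grad

w = 0.5
for _ in range(2000):
    w = backprop_step(w, x=1.0, target=0.9)
print(w, sigmoid(w))  # the unit's output approaches the target of 0.9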
One conventional neuron circuit for approximating the energy function uses a simple circuit that compares two electrical quantities: a nonlinear voltage stored in “off-chip digital RAM” and a reference source. Though a circuit of this structure can produce an energy function, it needs many external inputs, which increases circuit complexity and makes it unsuitable for implementation in VLSI technology. Another neuron circuit for approximating the energy function uses a dual-transistor amplifier with low gain. The transfer function of this circuit cannot generate a sigmoid-like function correctly, so the circuit is not usable with back-propagation learning. There is also a neuron circuit built from transistors. By exploiting the characteristics of the transistors, this circuit can approximate the sigmoid function closely, but it requires the complexity of BiCMOS technology and its gain cannot be adjusted.
In view of the above, a new and improved neural circuit that is simply constructed, current controlled, and gain adjustable with high precision is therefore needed, so as to approximate the sigmoid function with insignificant error.
SUMMARY OF THE INVENTION
In accordance with the present invention, a neural circuit is provided that substantially overcomes the above-mentioned drawbacks in approximating the sigmoid function.
Accordingly, it is one object of the present invention to provide a sigmoid neural circuit which is simply constructed and can be implemented in VLSI manufacturing.
It is another object of the present invention to provide a sigmoid neural circuit which can generate a sigmoid-like function with insignificant error, enabling the output of the circuit to be used by back-propagation learning.
It is still another object of the present invention to provide a sigmoid neural circuit which is gain adjustable without increasing the complexity of the circuit.
According to the foregoing objectives, the present invention provides a simple neuron circuit design that generates a sigmoid-like function with insignificant error, making it suitable for back-propagation learning. By using an adjustable threshold and gain factor, the circuit has a large range and high noise immunity.
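A purely numerical sketch of such a transfer function, with an adjustable threshold and gain factor, is given below. It illustrates the function shape only and is not the circuit of the invention; the parameter names gain and threshold are assumptions.

import math

def sigmoid_tg(x, gain=1.0, threshold=0.0):
    # Sigmoid-like transfer function: the gain factor sets the slope of
    # the transition and the threshold shifts its center.
    return 1.0 / (1.0 + math.exp(-gain * (x - threshold)))

# A larger gain steepens the transition around the threshold.
for g in (0.5, 1.0, 4.0):
    print(g, [round(sigmoid_tg(x, gain=g, threshold=0.2), 3) for x in (-1, 0, 1)])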
REFERENCES:
patent: 5309036 (1994-05-01), Yang et al.
patent: 5648926 (1997-07-01), Douglas et al.
patent: 5745655 (1998-04-01), Chung et al.
patent: 6429699 (2002-08-01), Shi et al.
Chen Lu
Lu Chun
Shi Bingxue
Nuton My-Trang
Rosenberg, Klein & Lee
Winbond Electronics Corporation