E-cell (equivalent cell) and the basic circuit modules of...

Data processing: artificial intelligence – Neural network – Structure

Reexamination Certificate


Details

C706S026000, C706S033000, C706S040000, C706S042000

Status: active

Patent number: 06397201

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of Invention
The invention pertains to the fields of pattern recognition, scene segmentation, machine vision and machine speech recognition, and neuromorphic engineering. The invention constitutes a basic building block of a new class of systems to implement perceptual and cognitive functions in various sensory and conceptual domains.
2. Discussion of Related Art
The history of neurologically inspired computing technology goes back at least several decades. The field lay relatively quiescent until the mid-1980s, when the popularization of the multi-layer “artificial neural net” (ANN), usually emulated in software, overcame a decade and a half of skepticism about the technology largely engendered by the influential book “Perceptrons” by Marvin Minsky and Seymour Papert. The multilayer ANN, particularly those trained by various algorithms of backward propagation of an error signal, proved capable of a wide range of classification tasks. Though the differences between ANNs and biological neural circuitry were evident, the lack of an alternative computational hypothesis for the biological circuitry at first attracted some in the neuroscience community to adopt the ANN in its various forms as a model of neural computation. The limitations of the ANN, however, became apparent both in technological application and as a computational model for the brain. The ANN has become an accepted part of the computational repertoire for solving certain classes of classification problems and problems involving the capture of complex functions from a large set of training data.
However, the ANN on its own has not provided the basis for solutions to the more complex problems of vision, speech recognition, and other sensory and cognitive computation. Problems of segmentation, recognition of signals under various transformations, learning from single or very limited presentations of data, regularity extraction, and many other real-world recognition problems have to date eluded solution by ANNs or any other neurally inspired technique.
Experimental neuroscience has revealed that cortical and other neural architecture is far more complicated than any ANN. Realistic simulations of large neurons such as pyramidal or Purkinje cells suggest that individual neurons are capable of significant computation on their own, which must be the basis of the computations performed by circuits containing myriads of such cells. Experiments have also determined that connectivity between neurons is highly specific and becomes more so during an organism's initial learning. Neurons themselves are specialized, not uniform. Strong evidence exists that synchronized oscillations across the cortices play a role in recognition. And perhaps most important is the fact that backward or descending signal paths are at least as numerous as the forward or ascending signal paths from sensory organs to progressively “higher” cortices.
BRIEF DESCRIPTION OF THE INVENTION
The invention (the e-cell and building block e-circuit modules composed of e-cells) constitutes the base components of a technology system inspired by the neuroanatomical and neurophysiological evidence mentioned above, which is ignored by traditional ANN technology, and gives rise to system organization and computational capabilities quite distinct from those of ANNs. When combined into basic e-circuit modules, the e-cell provides the basis of systems capable of scene segmentation and of recognition of signals under translation and, where applicable, under scaling and rotation, as well as under degradation by noise and occlusion. The systems are capable of learning novel percepts from a single presentation and of establishing association and context between percepts and percept classes. In addition, the systems are capable, without supervision or auxiliary “hints”, of extracting signal regularities from unsegmented signal fields so as to identify which segments of the fields constitute stable percepts such as words or objects.
The E-Cell
The e-cell is the sole computational element of which circuits are built. It can be configured by a combination of specification and learning to perform a multiplicity of computational tasks. Important configuration variants involve
(1) “dendritic” architecture and behavior
(2) rise time and decay rates
(3) learning protocols
(4) bias inputs
The e-cell consists of a cell body, an output terminal, and a set of tree-connected input regions, each of which has one or more input terminals and corresponding stored weights. Each input and its corresponding weight is termed a synapse. Each region computes the dot product of the vector of its current inputs and its current weights and may modify this with a bias term. The regions may be instructed to learn by altering their current weights as a function of the inputs currently applied to each region. The regions are joined to the cell body by a binary tree structure in which the outputs of the regions comprise the leaves of the tree and each node is a two-input junction which applies one of two classes of function to the node's pair of inputs, each of which may be either a region output or the output of another node.
The body of the cell has at least one input—the output of the root node of the tree—and may have a bias input and a learning control input. The body of the cell computes the new output value of the cell based on the output of the root node of the tree and the current output of the cell itself. If the cell has a bias input, the value of that bias input is distributed to a certain class of regions, which modify the dot product of their input and weight vectors either by adding the bias or by multiplying by 1.0 plus the bias. If the cell has a learning input, the value of that input is added to the output of the root node, and if a certain learning threshold is exceeded then all the regions capable of learning are signaled to apply their learning algorithm.
If the sum of the inputs to a region instructed to learn is zero or below a certain low threshold, the region follows one of two rules: (1) its weights are set to zero, or (2) its weights are left unchanged. If the sum of the inputs to a region instructed to learn is above the mentioned threshold, then the region alters its weights in one of two ways: (1) each weight is altered to be proportional to its corresponding input such that the dot product of the new weights and the current inputs is normalized to a standard output (e.g. 1.0), or (2) each weight is altered incrementally from its current value toward the weights that would satisfy condition (1). The first learning rule is termed episodic learning and is intended to capture the values of a set of inputs in one learning cycle. The second learning rule is termed incremental learning and is intended to capture a set of values representative of multiple but related sets of inputs.
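A minimal, illustrative sketch in Python of the e-cell just described is given below, solely to make the data flow concrete. The class names (Region, ECell), the two junction functions (summing and gating), the decay constant, and the threshold values are assumptions introduced for illustration and are not specified by the patent; the sketch also touches the four configuration variants listed above (dendritic architecture via the tree, rise and decay via the leaky output update, learning protocols, and bias inputs).

    # Illustrative sketch only; names and constants are assumptions, not the patented design.
    import numpy as np

    class Region:
        """One input region: a weight vector and a (possibly biased) dot product of the
        current inputs with the current weights (one input/weight pair = a synapse)."""
        def __init__(self, n_inputs, additive_bias=True, zero_on_silent_learn=True):
            self.w = np.zeros(n_inputs)
            self.additive_bias = additive_bias        # add bias vs. multiply by (1.0 + bias)
            self.zero_on_silent_learn = zero_on_silent_learn
            self.low_threshold = 1e-6                 # the "certain low threshold" (assumed value)

        def activate(self, x, bias=0.0):
            y = float(np.dot(x, self.w))
            return y + bias if self.additive_bias else y * (1.0 + bias)

        def learn(self, x, episodic=True, rate=0.1, target=1.0):
            s = float(np.sum(x))
            if s <= self.low_threshold:               # near-zero input: zero the weights or leave them
                if self.zero_on_silent_learn:
                    self.w[:] = 0.0
                return
            # weights proportional to the inputs, normalized so dot(w_new, x) == target (e.g. 1.0)
            w_star = target * x / float(np.dot(x, x))
            self.w = w_star if episodic else self.w + rate * (w_star - self.w)  # episodic vs. incremental

    def junction(a, b, kind="sum"):
        """Two-input tree node; the patent names two classes of junction function but does
        not define them here, so summing and gating (multiplicative) are assumed."""
        return a + b if kind == "sum" else a * b

    class ECell:
        def __init__(self, regions, tree, decay=0.5, learn_threshold=1.5):
            self.regions = regions
            self.tree = tree                          # nested tuples (left, right, kind); leaves are region indices
            self.decay = decay                        # assumed rise/decay behaviour of the cell body
            self.learn_threshold = learn_threshold
            self.output = 0.0

        def _eval(self, node, outs):
            if isinstance(node, int):                 # leaf: a region output
                return outs[node]
            left, right, kind = node
            return junction(self._eval(left, outs), self._eval(right, outs), kind)

        def step(self, inputs, bias=0.0, learn_signal=0.0):
            # the bias is distributed to the regions; the new output depends on the root
            # value of the tree and on the cell's own current output
            outs = [r.activate(x, bias) for r, x in zip(self.regions, inputs)]
            root = self._eval(self.tree, outs)
            self.output = (1.0 - self.decay) * self.output + self.decay * root
            # learning control: the learning input is added to the root output and compared
            # with a learning threshold; if exceeded, every region applies its learning rule
            if root + learn_signal > self.learn_threshold:
                for r, x in zip(self.regions, inputs):
                    r.learn(np.asarray(x, dtype=float))
            return self.output

A cell with two three-synapse regions joined by a summing junction would then be instantiated as ECell([Region(3), Region(3)], tree=(0, 1, "sum")) and is driven, and optionally taught, by repeated calls to step().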
Basic E-Circuit Modules
The invention includes a set of elementary e-circuit modules assembled from e-cells. The resulting system organization and computational capabilities are quite distinct from those of ANNs. The e-circuit modules constituting this invention can be composed into higher level e-circuits capable of scene segmentation and of recognition of signals under translation and, where applicable, under scaling and rotation, as well as under degradation by noise and occlusion. The resulting systems are capable of learning novel percepts from a single presentation and of establishing association and context between percepts and percept classes. These higher level e-circuits are the subject of an associated utility application.
The first component e-circuit of the invention is termed the basic e-cell pair. It actually contains three e-cells. The two major e-cells (the pair) are connected in a “complementary” fashion, thus providing segments of associated forward and backward signal paths. An inhibitory e-cell connects the output of the backward e-cell to an input of the forward e-cell and thus provides an inhibitory feedback path which is responsible for the oscillatory dynamics at the heart of the e-circuit's computation. The inhibitory e-cell has no connecti
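A comparably hedged sketch of the basic e-cell pair, reusing the Region and ECell classes above, is given below. The specific weights, the single-synapse relay cells, and the use of a delayed negative-feedback loop are assumptions made for illustration; the text above states only that the forward and backward e-cells are complementarily connected and that an inhibitory e-cell feeds the backward cell's output back to an input of the forward cell.

    # Illustrative wiring of the basic e-cell pair; weights and constants are assumptions.
    forward  = ECell([Region(2)], tree=0)                  # inputs: [external drive, inhibitory feedback]
    backward = ECell([Region(1)], tree=0)                  # backward (descending) path segment
    inhib    = ECell([Region(1)], tree=0)                  # inhibitory e-cell closing the loop

    # fixed (hand-set) weights, chosen only to make the feedback loop visible
    forward.regions[0].w  = np.array([1.0, -1.0])
    backward.regions[0].w = np.array([1.0])
    inhib.regions[0].w    = np.array([1.0])

    drive, inhibition = 1.0, 0.0
    for t in range(20):
        f = forward.step([np.array([drive, inhibition])])  # forward cell: drive minus inhibition
        b = backward.step([np.array([f])])                 # backward cell echoes the forward output
        inhibition = inhib.step([np.array([b])])           # inhibitory feedback onto the forward cell
        print(t, round(f, 3), round(b, 3), round(inhibition, 3))

Because the inhibition arrives one step delayed in this toy loop, the forward cell's output first rises, then dips and settles in a damped oscillation around a fixed point, a crude stand-in for the oscillatory dynamics described above.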
