MULTILAYER FEEDFORWARD NEURAL-LIKE NETWORK TUNED BY SIMILARITY MEASURES OF THE TRAINING-SAMPLE VECTORS
Andrey Evgenievich Krasnov, Evgeny Nikolaevich Nadezhdin, Dmitry Nikolaevich Nikolsky, Elena Germanovna Shmakova
Abstract
The architecture of a multilayer network built from several levels of active elements is considered. The input level forms signals that propagate to the connectors (synapses) of the first layer. All odd layers of the network consist of connectors (synapses), and all even layers of switches (neurons). The number of connectors and switches in each layer equals the number of reference signals, i.e., the training-sample vectors. The recurrent tuning of the network's synaptic connections and neuronal responses is governed both by the similarity measures of the training-sample vectors and by the similarity measures of those similarity measures. An experimental study of a six-layer network showed that the multilayer feedforward neural-like network is far easier to train than a recursive network trained by error backpropagation. At the same time, the proposed network is robust to significant interference when discriminating signals, owing to the additional connections it captures between the components of the reference signals. When analyzing signals against a noise background, provided the ratio of interference amplitude to signal amplitude is smaller than the average spread of the reference signals, this advantage can become decisive, since it enables almost error-free signal discrimination.
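The idea of tuning the network by similarity measures of the training vectors, and by similarity measures of those similarity measures, can be illustrated with a minimal sketch. The choice of cosine similarity, the function names, and the two-stage classification rule below are illustrative assumptions, not the authors' exact construction:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity, used here as an example similarity measure."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def build_similarity_layers(refs):
    """Hypothetical two-stage tuning: S holds pairwise similarities of
    the reference (training-sample) vectors; T holds similarities of
    the rows of S, i.e., similarity measures of similarity measures."""
    n = len(refs)
    S = np.array([[cosine_sim(refs[i], refs[j]) for j in range(n)]
                  for i in range(n)])
    T = np.array([[cosine_sim(S[i], S[j]) for j in range(n)]
                  for i in range(n)])
    return S, T

def classify(x, refs, S):
    """Feedforward pass: compare the input's similarity profile with the
    stored reference profiles and pick the best-matching reference."""
    s = np.array([cosine_sim(x, r) for r in refs])          # first-order layer
    t = np.array([cosine_sim(s, S[i]) for i in range(len(refs))])  # second-order layer
    return int(np.argmax(t))
```

Because the decision is based on the whole similarity profile rather than a single nearest-reference score, the scheme uses the relations between reference components, which is the mechanism the abstract credits for noise robustness.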