Abstract
In this paper we begin with an analysis of the three most common paradigms in neural networks: relaxation neural networks, studied by the methods of equilibrium statistical mechanics; oscillatory neural networks; and chaotic neural networks. We then propose a fourth neural architecture capable of displaying all three of the preceding behaviors. The core of our architecture is a principle of mutual redefinition between the activation dynamics and the weight dynamics of the net for each pattern to be recognized, designed to overcome some computability problems typical of pattern recognition. The main characteristics of such an architecture are: (1) its capability to automatically redefine the optimal correlation neighborhood of each neuron of the network in view of the correct recognition of a given input; (2) the constructive use of noise, which, added automatically to the net dynamics and to the input itself, allows the correct recognition of highly noisy patterns. To illustrate this point, we develop a systematic comparison between the performance of this new architecture and that of a backpropagation architecture in solving the classical “T-C” recognition problem. Finally, we show a successful application of our new architecture to real-time particle discrimination in a high-energy physics experiment (the “FENICE” experiment) carried out at the ADONE e+e− storage ring in Frascati (Italy).