Abstract
This paper proposes a new architecture for supervised incremental learning with neural networks. Its key feature is a special perceptron, called the monitor perceptron, which decides whether a new sample belongs to a new class or to one of the known (already learnt) classes. If the monitor perceptron decides that the sample belongs to a new class, the network is extended so that it learns the new class. The final network is a set of parallel neural networks (one per class) whose outputs feed into the monitor perceptron. A series of experiments on benchmark data sets yields results comparable with or better than those obtained with other state-of-the-art techniques. The number of neurons grows linearly with the number of classes.
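The abstract's architecture (parallel per-class sub-networks, with a monitor stage deciding known class vs. new class and extending the network on novelty) can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's actual method: each class is represented by a single linear sub-network, and a fixed score threshold stands in for the trained monitor perceptron; the class name `IncrementalClassifier`, the threshold value, and the update rule are all hypothetical.

```python
import numpy as np

class IncrementalClassifier:
    """Toy sketch of monitor-style incremental learning (assumed design):
    one linear sub-network per class; a threshold on the maximum class
    score plays the role of the monitor perceptron."""

    def __init__(self, dim, threshold=0.5):
        self.dim = dim
        self.threshold = threshold   # novelty threshold (hypothetical value)
        self.weights = []            # one weight vector per learnt class

    def scores(self, x):
        # Output of each parallel sub-network for sample x.
        return [float(w @ np.asarray(x, dtype=float)) for w in self.weights]

    def predict(self, x):
        # Monitor step: if no class responds strongly enough, flag novelty.
        s = self.scores(x)
        if not s or max(s) < self.threshold:
            return None              # monitor decides: new class
        return int(np.argmax(s))

    def learn(self, x):
        cls = self.predict(x)
        if cls is None:
            # Extend the network with a new sub-network for the new class.
            self.weights.append(np.asarray(x, dtype=float).copy())
            return len(self.weights) - 1
        # Otherwise refine the matching sub-network (toy update rule).
        self.weights[cls] += 0.1 * (np.asarray(x, dtype=float) - self.weights[cls])
        return cls

clf = IncrementalClassifier(dim=2)
clf.learn([1.0, 0.0])   # first sample: monitor flags novelty, class 0 created
clf.learn([0.0, 1.0])   # weak response from class 0, so class 1 created
clf.learn([0.9, 0.1])   # strong response from class 0, sub-network refined
```

Note that the network size here grows by one weight vector per new class, mirroring the linear neuron growth the abstract reports; in the paper each "sub-network" is a full neural network rather than a single linear unit.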
