Abstract
Probabilistic neural networks (PNNs) are artificial neural network algorithms widely used in pattern recognition and classification problems. In the traditional PNN algorithm, the probability density function (PDF) is approximated using the entire training dataset for each class. In some complex datasets, clusters belonging to the same class may lie far from one another, and these inter-cluster distances can reduce the correct class's posterior probability and lead to misclassification. This paper presents a novel PNN algorithm, the competitive probabilistic neural network (CPNN). In the CPNN, a competitive layer ranks the kernels of each class, and an optimum fraction of kernels is selected to estimate the class-conditional probability. Using a stratified, repeated, random subsampling cross-validation procedure and 9 benchmark classification datasets, the CPNN is compared to both the traditional PNN and the state of the art (e.g., the enhanced probabilistic neural network, EPNN). These datasets are examined with and without noise, and the algorithm is evaluated with several ratios of training to testing data. In all datasets (225 simulation categories), the performance percentages of both the CPNN and the EPNN are greater than or equivalent to those of the traditional PNN; in 73% of simulation categories, the CPNN analyses show modest improvement in performance over the state of the art.
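The mechanism described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes Gaussian (Parzen) kernels and models the "competitive layer" simply as ranking each class's kernel responses and averaging only the strongest fraction (the function names, `sigma`, and `top_fraction` are illustrative choices, not from the paper). With `top_fraction=1.0` it reduces to the traditional PNN, which averages over all training kernels of a class; with `top_fraction < 1.0` a distant same-class cluster no longer drags down the class-conditional density estimate.

```python
import numpy as np

def pnn_class_density(x, class_samples, sigma=0.5, top_fraction=1.0):
    """Estimate the class-conditional density at x with Gaussian kernels.

    top_fraction=1.0 averages all kernels (traditional PNN behavior);
    top_fraction<1.0 keeps only the strongest-responding kernels,
    mimicking the competitive selection described in the abstract.
    """
    diffs = class_samples - x
    # Gaussian kernel response of each training point (up to a constant factor)
    responses = np.exp(-np.sum(diffs**2, axis=1) / (2.0 * sigma**2))
    # Competitive step: rank kernels and keep only the top fraction
    k = max(1, int(np.ceil(top_fraction * len(responses))))
    top = np.sort(responses)[-k:]
    return top.mean()

def classify(x, classes, sigma=0.5, top_fraction=1.0):
    """Assign x to the class with the largest estimated density."""
    labels = list(classes)
    scores = [pnn_class_density(x, classes[c], sigma, top_fraction)
              for c in labels]
    return labels[int(np.argmax(scores))]
```

As a usage example, consider a class whose training samples form two far-apart clusters: a query point sitting on the near cluster can still be misclassified by the full-kernel average, because the distant cluster contributes near-zero responses that dilute the class density, whereas the top-fraction average is unaffected.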
