Abstract
Activation functions are essential in neural networks because their nonlinear transformations enable the network to capture abstract characteristics of the input. This paper presents a new activation function, the Hybrid Discontinuous Activation Function (HDAF), which is continuously differentiable except at zero, unbounded above, bounded below, and non-monotonic. Our results demonstrate that HDAF outperforms ReLU across several challenging datasets and neural network models. In the first set of experiments, neural networks were trained for classification on the Breast Cancer Wisconsin (Diagnostic) and Iris Flower benchmark datasets, where HDAF achieved 92.98% and 95.56% accuracy, respectively. In the second set, experiments were conducted with VGG16 on the MNIST and CIFAR-10 datasets, where HDAF obtained 99.21% and 84.95% accuracy, respectively. Statistical measurements show that HDAF attains the highest mean accuracy and the lowest standard deviation, variance, mean squared error, and root mean squared error. The study also indicates that HDAF converges faster than ReLU, a valuable property in deep learning. These findings suggest that HDAF is a promising replacement for ReLU, leading to better performance in neural network models.
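The abstract does not give HDAF's closed form. As a minimal sketch only, the snippet below uses a hypothetical ReLU/Swish hybrid (not the authors' definition) to illustrate an activation with the properties the abstract lists: unbounded above, bounded below, non-monotonic, and continuously differentiable everywhere except at zero.

```python
import numpy as np

def hdaf_like(x):
    """Illustrative stand-in, NOT the paper's HDAF formula.

    Piecewise: identity (ReLU branch) for x >= 0, Swish (x * sigmoid(x))
    for x < 0. The Swish branch dips below zero and recovers, giving
    non-monotonicity and a lower bound (about -0.278), while the identity
    branch is unbounded above. Both branches meet at 0, so the function
    is continuous, but the one-sided derivatives differ there
    (1 from the right, 0.5 from the left).
    """
    return np.where(x >= 0, x, x / (1.0 + np.exp(-x)))

# Quick check of the stated properties on a sample grid.
x = np.linspace(-6.0, 6.0, 13)
y = hdaf_like(x)
print(np.round(y, 3))    # negative branch dips below zero, then recovers
print(y.min() >= -0.5)   # bounded below (Swish minimum is about -0.278)
```

Because the left and right derivatives at zero are 0.5 and 1, this sketch is continuously differentiable except at zero, matching the smoothness property claimed for HDAF.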
