A Comparative Approach to Accelerate Backpropagation Neural Network Learning using different Activation Functions
Journal of the ACS Advances in Computer Science
Article 7, Volume 3, Issue 1, 2009, Pages 83-102
Document Type: Original Article
DOI: 10.21608/asc.2009.158226
Abstract
Slow convergence and long training times are still the disadvantages most often mentioned when neural networks are compared with competing techniques. One reason for slow convergence in backpropagation learning is the diminishing value of the derivative of the commonly used activation functions as node outputs approach the extreme values 0 or 1. In this paper, we propose eight activation functions that accelerate learning by reducing the number of iterations and increasing the convergence rate. Mathematical derivations of the errors for the output and hidden layers using these activation functions are presented. Statistical measures are also obtained for these different activation functions. Through simulation, these activation functions are analyzed, compared, and tested. The analytical approach indicates considerable improvement in training times and convergence performance.
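The saturation problem the abstract describes can be seen directly in the standard logistic sigmoid: its derivative, σ'(x) = σ(x)(1 − σ(x)), shrinks toward zero as the output approaches 0 or 1, so the backpropagated weight updates become vanishingly small. The eight proposed activation functions are defined in the full paper; the sketch below only illustrates the diminishing-derivative behavior of the conventional sigmoid that motivates them.

```python
import math

def sigmoid(x):
    """Standard logistic activation, output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """Derivative sigma'(x) = sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

# The derivative peaks at 0.25 when the output is 0.5, then decays
# rapidly as the node output saturates toward 0 or 1 -- the source of
# small error gradients and slow backpropagation convergence.
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x={x:5.1f}  output={sigmoid(x):.6f}  derivative={sigmoid_derivative(x):.6f}")
```

At x = 10 the output is already within 5×10⁻⁵ of 1, and the derivative is of the same tiny order, so the node contributes almost nothing to the weight update.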
Keywords
Neural Networks; Neural Network Learning; Backpropagation; Activation Functions; Convergence Speed