A New Multi-layer Perceptron Trainer based on Dragonfly Optimization Algorithm
MEJ - Mansoura Engineering Journal
Article 6, Volume 41, Issue 4, December 2016, Pages 11-19
Document Type: Research Studies
DOI: 10.21608/bfemu.2020.103962
Authors
Mohy Eldin A. Abo-Elsoud 1; Mohamed Morsy 2; Shaima Elnazer 3
1 Prof., Communications Department, Faculty of Engineering, Mansoura University
2 Dr., Communications Department, Faculty of Engineering, Mansoura University
3 Assistant Lecturer, Nile Academy, Mansoura University
Abstract
In this paper, the Dragonfly Optimizer (DO) was used to train a Multi-Layer Perceptron (MLP). DO was used to find the weights and biases of the MLP that achieve a minimum error and a high classification accuracy. Four standard classification datasets were used to benchmark the performance of the proposed method. In addition, the performance of the proposed method was compared with four well-known optimization algorithms, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and the Grey Wolf Optimizer (GWO), which were also used to train MLPs. The experimental results showed that the DO algorithm with the MLP was very competitive, as it avoided the local optima problem and achieved high accuracy.
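To illustrate the general idea of training an MLP with a population-based metaheuristic, the following is a minimal sketch in Python/NumPy. The dataset (XOR), network sizes, and the update rule are all assumptions for illustration: the search here is a plain random-perturbation swarm around the best agent, not the paper's actual Dragonfly Optimizer equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class dataset (XOR); the paper uses four standard benchmark datasets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_IN, N_HID = 2, 4
DIM = N_IN * N_HID + N_HID + N_HID + 1  # all weights and biases, flattened

def mlp_error(theta):
    """Mean squared error of the MLP encoded by the flat vector theta."""
    i = 0
    W1 = theta[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = theta[i:i + N_HID]; i += N_HID
    W2 = theta[i:i + N_HID]; i += N_HID
    b2 = theta[i]
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))    # sigmoid output
    return np.mean((out - y) ** 2)

# Each agent in the population is one candidate weight vector; the optimizer
# searches weight space directly instead of using gradient backpropagation.
pop = rng.normal(0.0, 1.0, (30, DIM))
best = min(pop, key=mlp_error)
for _ in range(500):
    # Illustrative update: resample agents around the current best solution.
    pop = best + rng.normal(0.0, 0.3, (30, DIM))
    cand = min(pop, key=mlp_error)
    if mlp_error(cand) < mlp_error(best):
        best = cand

print(round(mlp_error(best), 3))  # typically well below the 0.25 chance level
```

The key design point shared with the paper's approach is the encoding: the entire set of MLP weights and biases is flattened into one vector, so any continuous optimizer can minimize the classification error over that vector.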
Keywords
Dragonfly Optimizer; Neural Network; Multi-Layer Perceptron