AN ENHANCED DIFFERENTIAL EVOLUTION ALGORITHM WITH ADAPTIVE WEIGHT BOUNDS FOR EFFICIENT TRAINING OF NEURAL NETWORKS
Vol. 13 No. 1 (2023), pp. 4-13
Authors
Saithip Limtrakul, Jeerayut Wetweerapong
Abstract
Artificial neural networks are essential tools for a wide range of learning tasks. Training them is challenging because the characteristics of the data set, the large number of trainable weights, and the dependencies among those weights give rise to a complicated, high-dimensional error function to be minimized. Global optimization methods have therefore become an alternative approach to training. Many variants of differential evolution (DE) have been applied to approximate the weights of a neural network, but empirical studies show that they suffer from the fixed weight bounds that are generally used. In this research, we propose an enhanced differential evolution algorithm with adaptive weight-bound adjustment (DEAW) for the efficient training of neural networks. The DEAW algorithm starts with small initial weight bounds and adapts them during the mutation process, gradually extending the bounds whenever a component of a mutant vector reaches its limits. We also experiment with several scales of the activation function in combination with DEAW. We then apply the proposed method, with a suitable setting, to function approximation problems, where DEAW achieves satisfactory results compared with the exact solutions.
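
To make the bound-adjustment idea concrete, the following Python sketch shows one way a DE/rand/1 mutation step could start from small weight bounds and gradually extend them whenever a mutant component reaches the current limit, together with a scaled logistic activation. The function names, the choice of mutation scheme, and the expansion factor grow are illustrative assumptions based on the abstract, not the authors' published DEAW specification.

import numpy as np

def scaled_sigmoid(x, s=2.0):
    # Logistic activation with output scale s; the abstract reports
    # experiments with several such scales (s = 2.0 is only an example).
    return s / (1.0 + np.exp(-x))

def mutate_with_adaptive_bounds(pop, F=0.7, lo=-0.5, hi=0.5, grow=1.5):
    # DE/rand/1 mutation over a population of candidate weight vectors.
    # Instead of keeping the weight bounds fixed, the bounds are gradually
    # extended whenever a mutant component reaches the current limits
    # (the multiplicative rule `grow * bound` is an assumed placeholder).
    n, _ = pop.shape
    rng = np.random.default_rng()
    mutants = np.empty_like(pop)
    for i in range(n):
        # Pick three distinct individuals, all different from the target i.
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i],
                                size=3, replace=False)
        v = pop[r1] + F * (pop[r2] - pop[r3])
        if np.any(v <= lo) or np.any(v >= hi):
            lo, hi = grow * lo, grow * hi   # assumed gradual extension
        mutants[i] = np.clip(v, lo, hi)
    return mutants, lo, hi

# Usage: 30 candidate weight vectors for a small network, initialized
# inside deliberately small bounds [-0.5, 0.5].
pop = np.random.uniform(-0.5, 0.5, size=(30, 20))
mutants, lo, hi = mutate_with_adaptive_bounds(pop)

Starting from narrow bounds and widening them only on demand keeps early search concentrated near small weights, which the abstract suggests is what makes the fixed-bound variants underperform.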
License

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
