USING GA FOR EVOLVING WEIGHTS IN NEURAL NETWORKS

Wafaa Mustafa HAMEED

wafaa.mustafa@sulicihan.edu.krd
Assistant lecturer, Department of Computer Science, Cihan University – Slemani, Slemani (Iraq)

Asan Baker KANBAR


Assistant lecturer, Department of Computer Science, Cihan University – Slemani, Slemani (Iraq)

Abstract

This article studies the behavior of different types of crossover operators and their influence on the performance of the genetic algorithm (GA). We also study the effects of the control parameters: crossover probability (pc), mutation probability (pm), population size (pop-size) and number of generations (NG). The research gathers most of the common crossover operators and applies them to the problem of evolving the weights of a neural network. The role of crossover in GAs is investigated for this problem through a comparative study of the results obtained by varying the parameter values (crossover probability, mutation rate, population size and number of generations). From the experimental results, the best parameter values for the evolving-weights XOR neural network problem are NG = 1000, pop-size = 50, pm = 0.001 and pc = 0.5, and the best operator is line recombination crossover.
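
For illustration, the Python sketch below shows one way to evolve the weights of an XOR network with a GA under the reported settings (NG = 1000, pop-size = 50, pm = 0.001, pc = 0.5) and line recombination crossover. It is a minimal sketch, not the authors' implementation: the 2-2-1 sigmoid architecture, tournament selection, Gaussian mutation and the exploration margin d = 0.25 in the line recombination operator are assumptions made here for a runnable example.

# Minimal sketch: evolving the weights of a 2-2-1 sigmoid XOR network with a GA.
# Parameter values follow the abstract (NG = 1000, pop-size = 50, pm = 0.001, pc = 0.5);
# the selection scheme, mutation step and network architecture are assumptions.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 1, 1, 0], dtype=float)

N_WEIGHTS = 9                      # 2*2 hidden weights + 2 hidden biases + 2 output weights + 1 output bias
POP_SIZE, NG = 50, 1000
PC, PM = 0.5, 0.001

def forward(w, x):
    """Feed x through a 2-2-1 sigmoid network whose weights are the flat vector w."""
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = 1.0 / (1.0 + np.exp(-(x @ W1 + b1)))
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(w):
    """Negative sum of squared errors over the four XOR patterns (higher is better)."""
    return -np.sum((forward(w, X) - Y) ** 2)

def line_recombination(p1, p2, d=0.25):
    """Child lies on the line through the parents: c = p1 + alpha * (p2 - p1)."""
    alpha = rng.uniform(-d, 1 + d)
    return p1 + alpha * (p2 - p1)

def mutate(w):
    """Gaussian perturbation applied to each gene independently with probability PM."""
    mask = rng.random(w.size) < PM
    return w + mask * rng.normal(0.0, 0.5, w.size)

pop = rng.uniform(-1.0, 1.0, (POP_SIZE, N_WEIGHTS))
for _ in range(NG):
    fit = np.array([fitness(ind) for ind in pop])

    def select():                                   # binary tournament selection (assumed)
        i, j = rng.integers(0, POP_SIZE, 2)
        return pop[i] if fit[i] > fit[j] else pop[j]

    new_pop = [pop[np.argmax(fit)].copy()]          # elitism: keep the best individual
    while len(new_pop) < POP_SIZE:
        p1, p2 = select(), select()
        child = line_recombination(p1, p2) if rng.random() < PC else p1.copy()
        new_pop.append(mutate(child))
    pop = np.array(new_pop)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("XOR outputs:", np.round(forward(best, X), 3))

With these settings the evolved network typically approaches the target outputs (0, 1, 1, 0); exact convergence depends on the random seed and on the assumed selection and mutation choices.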


Keywords:

genetic algorithm, neural network, crossover, mutation




Published
2019-09-30

How to Cite

HAMEED, W. M., & KANBAR, A. B. (2019). USING GA FOR EVOLVING WEIGHTS IN NEURAL NETWORKS. Applied Computer Science, 15(3), 21–33. https://doi.org/10.23743/acs-2019-18




License


This work is licensed under a Creative Commons Attribution 4.0 International License.

All articles published in Applied Computer Science are open-access and distributed under the terms of the Creative Commons Attribution 4.0 International License.

