IMPROVING v-SUPPORT VECTOR REGRESSION PARAMETERS WITH SIMULTANEOUS FEATURE SELECTION USING A QUASI-OPPOSITIONAL LEARNING SCHEME AND THE HARRIS HAWKS OPTIMIZATION ALGORITHM

Omar Mohammed Ismael


Ministry of Education, Directorate of Education in Nineveh (Iraq)
https://orcid.org/0009-0005-6739-4790

Omar Saber Qasim

omar.saber@uomosul.edu.iq
University of Mosul, Department of Mathematics (Iraq)
https://orcid.org/0000-0003-3301-6271

Zakariya Yahya Algamal


University of Mosul, Department of Statistics and Informatics (Iraq)
https://orcid.org/0000-0002-0229-7958

Abstract

Support vector regression, and v-support vector regression (v-SVR) in particular, has been used to solve numerous real-world problems, but some of its parameters must still be tuned manually. Moreover, v-SVR does not perform feature selection. Nature-inspired techniques have been used to identify relevant features and to estimate the hyperparameters. In this study, a quasi-oppositional Harris hawks optimization method (QOBL-HHOA) is introduced to embed feature selection while simultaneously optimizing the v-SVR hyperparameters. Experiments were carried out on four datasets. They show that, in terms of prediction accuracy, number of selected features, and running time, the proposed algorithm performs better than cross-validation and grid search. Compared with other nature-inspired algorithms, the experimental results of QOBL-HHOA demonstrate its effectiveness in improving prediction accuracy and processing time. They also demonstrate the contribution of the quasi-oppositional scheme: while searching for optimal hyperparameter values, HHOA can locate the features that are most useful for the prediction task. As a result, the QOBL-HHOA algorithm may be better suited than other algorithms to identifying the relationship between the input features and the target variable. The numerical results also showed the superiority of this method over the aforementioned methods; for example, the mean squared error obtained by QOBL-HHOA on the influenza neuraminidase dataset (2.05E-07) was better than that of the others. This is highly useful for predicting other real-world situations.
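As an illustrative sketch only (not the authors' implementation), the quasi-oppositional rule used by QOBL can be combined with v-SVR (scikit-learn's `NuSVR`) hyperparameter evaluation and a binary feature mask as follows; the synthetic dataset, the bounds on `nu` and `C`, and all helper names are assumptions made for the example:

```python
# Sketch, under assumed bounds and data: quasi-oppositional candidate
# generation (Rahnamayan et al.) for nu-SVR hyperparameters with a
# binary feature-selection mask. Not the authors' QOBL-HHOA code.
import numpy as np
from sklearn.svm import NuSVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Small synthetic regression problem: only the first 3 of 8 features matter.
X = rng.normal(size=(200, 8))
y = X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2] + 0.05 * rng.normal(size=200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

BOUNDS = {"nu": (0.05, 0.95), "C": (0.1, 100.0)}  # assumed search ranges

def quasi_opposite(x, lo, hi):
    """Quasi-opposite point: uniform between the interval centre (lo+hi)/2
    and the opposite point lo + hi - x (the QOBL rule)."""
    centre, opposite = (lo + hi) / 2.0, lo + hi - x
    a, b = min(centre, opposite), max(centre, opposite)
    return rng.uniform(a, b)

def fitness(params, mask):
    """Test-set MSE of nu-SVR fitted on the selected feature subset."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:          # an empty feature subset is infeasible
        return np.inf
    model = NuSVR(nu=params["nu"], C=params["C"]).fit(X_tr[:, cols], y_tr)
    return mean_squared_error(y_te, model.predict(X_te[:, cols]))

# One QOBL step: a random solution plus its quasi-opposite, keep the better.
params = {k: rng.uniform(*BOUNDS[k]) for k in BOUNDS}
q_params = {k: quasi_opposite(params[k], *BOUNDS[k]) for k in BOUNDS}
mask = rng.random(8) < 0.5
if not mask.any():
    mask[0] = True              # guarantee a non-empty feature subset
best = min((params, q_params), key=lambda p: fitness(p, mask))
print(fitness(best, mask))
```

In the full algorithm this candidate-plus-quasi-opposite step would seed and update a whole population of Harris hawks; the sketch above only shows the QOBL comparison that gives the method its exploration boost.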


Keywords:

v-support vector regression, Harris hawks algorithm, hyperparameter selection, quasi-oppositional based learning



Published
2024-06-30


Ismael, O. M., Qasim, O. S., & Algamal, Z. Y. (2024). POPRAWA PARAMETRÓW REGRESJI WEKTORA NOŚNEGO V Z RÓWNOLEGŁYM WYBOREM CECHY POPRZEZ WYKORZYSTANIE ALGORYTMU QUASI-OPOZYCYJNEGO I ALGORYTMU OPTYMALIZACJI HARRIS HAWKS. Informatyka, Automatyka, Pomiary W Gospodarce I Ochronie Środowiska, 14(2), 113–118. https://doi.org/10.35784/iapgos.5729




License

This work is licensed under the Creative Commons Attribution 4.0 International License.