A NEW CONJUGATE GRADIENT METHOD BASED ON LOGISTIC MAPPING FOR UNCONSTRAINED OPTIMIZATION AND ITS APPLICATION IN REGRESSION ANALYSIS

Authors

  • Sarwar Ahmad Hamad, Department of Mathematics, College of Science, University of Zakho, Zakho, Kurdistan Region, Iraq
  • Dlovan Haji Omar, Department of Mathematics, College of Science, University of Zakho, Zakho, Kurdistan Region, Iraq
  • Diman Abdulqader Sulaiman, Department of Mathematics, College of Science, University of Zakho, Zakho, Kurdistan Region, Iraq
  • Alaa Luqman Ibrahim, Department of Mathematics, College of Science, University of Zakho, Zakho, Kurdistan Region, Iraq

DOI:

https://doi.org/10.25271/sjuoz.2024.12.4.1310

Keywords:

optimization, conjugate gradient, step size, regression analysis

Abstract

Unconstrained optimization problems demand efficient solution techniques, yet conventional methods often converge slowly. Despite advances in gradient-based techniques, there is still a need for algorithms that balance computational efficiency with robustness. This work introduces a novel conjugate gradient algorithm based on the logistic mapping formula. The methodology establishes the descent condition and thoroughly examines the proposed algorithm's global convergence properties. Empirical validation rests on comprehensive numerical experiments in which the new algorithm is compared with the Polak-Ribière-Polyak (PRP) algorithm. The results show that the proposed approach outperforms the PRP algorithm, requiring fewer function evaluations and iterations to reach convergence. Furthermore, the method's practical usefulness is demonstrated by an application to regression analysis, specifically the modeling of population estimates for the Kurdistan Region of Iraq, where it produces accurate predictions with low relative error rates compared with conventional least squares techniques. Overall, this study establishes the new conjugate gradient algorithm as an effective tool for challenging optimization problems in both theoretical and real-world contexts.
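The abstract benchmarks the new method against the classical PRP conjugate gradient algorithm and applies it to regression; the paper's logistic-mapping formula for the conjugacy parameter is not reproduced on this page. As an illustration of the general framework only, the sketch below implements the standard PRP method (with the common PRP+ safeguard and an Armijo backtracking step size) and uses it to fit a least-squares regression line, mirroring the paper's regression application. All function and variable names here are illustrative, not the authors' code.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Polak-Ribiere-Polyak (PRP+) nonlinear conjugate gradient
    with Armijo backtracking line search for the step size."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:           # safeguard: restart with steepest descent
            d = -g
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5         # Armijo backtracking
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP parameter: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2,
        # truncated at zero (the PRP+ variant).
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Regression via unconstrained optimization: fit y ~ a*t + b by
# minimizing the sum of squared residuals with the CG routine above.
t = np.linspace(0.0, 1.0, 50)
y = 2.0 * t + 1.0                       # synthetic noise-free data
X = np.column_stack([t, np.ones_like(t)])
f = lambda w: np.sum((X @ w - y) ** 2)
grad = lambda w: 2.0 * X.T @ (X @ w - y)
w = prp_cg(f, grad, np.zeros(2))        # recovers slope ~2, intercept ~1
```

The paper's contribution replaces the PRP parameter with one derived from the logistic map, reportedly reducing iteration and function-evaluation counts; the surrounding loop structure (search direction update plus step-size rule) is the common skeleton both methods share.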

References

Andrei, N. (2008). Another hybrid conjugate gradient algorithm for unconstrained optimization. Numerical Algorithms, 47(2), 143–156. https://doi.org/10.1007/s11075-007-9152-9

Christensen, R. (1996). Analysis of Variance, Design, and Regression: Applied Statistical Methods. CRC Press, Chapman and Hall.

Dai, Y. H., & Yuan, Y. (1999). A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property. SIAM Journal on Optimization, 10(1), 177–182. https://doi.org/10.1137/S1052623497318992

Dai, Y. H., & Yuan, Y. (2001). An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization. Annals of Operations Research, 103(1), 33–47. https://doi.org/10.1023/A:1012930416777

Fletcher, R. (1987). Practical Methods of Optimization, Unconstrained Optimization (Vol. 1). John Wiley and Sons.

Fletcher, R., & Reeves, C. M. (1964). Function minimization by conjugate gradients. The Computer Journal, 7(2), 149–154. https://doi.org/10.1093/comjnl/7.2.149

Gilbert, J. C., & Nocedal, J. (1992). Global Convergence Properties of Conjugate Gradient Methods for Optimization. SIAM Journal on Optimization, 2(1), 21–42. https://doi.org/10.1137/0802003

Hestenes, M. R., & Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49(6), 409. https://doi.org/10.6028/jres.049.044

Hu, Y. F., & Storey, C. (1991). Global convergence result for conjugate gradient methods. Journal of Optimization Theory and Applications, 71(2), 399–405. https://doi.org/10.1007/BF00939927

Ibrahim, A. L., & Mohammed, M. G. (2022). A new three-term conjugate gradient method for training neural networks with global convergence. Indonesian Journal of Electrical Engineering and Computer Science, 28(1), 551–558. https://doi.org/10.11591/ijeecs.v28.i1.pp551-558

Ibrahim, A. L., & Mohammed, M. G. (2024). A new conjugate gradient for unconstrained optimization problems and its applications in neural networks. Indonesian Journal of Electrical Engineering and Computer Science, 33(1), 93–100. https://doi.org/10.11591/ijeecs.v33.i1.pp93-100

Ibrahim, A. L., & Shareef, S. G. (2019). A new class of three-term conjugate Gradient methods for solving unconstrained minimization problems. General Letters in Mathematics, 7(2). https://doi.org/10.31559/glm2019.7.2.4

Jahwar, B. H., Ibrahim, A. L., Ajeel, S. M., & Shareef, S. G. (2024). Two new classes of conjugate gradient method based on logistic mapping. Telkomnika (Telecommunication Computing Electronics and Control), 22(1), 86–94. https://doi.org/10.12928/TELKOMNIKA.v22i1.25264

Kurdistan Region Statistics Office, Ministry of Planning, Kurdistan Regional Government. (2021). Population Statistics. https://krso.gov.krd/en/statistics/population

Liu, Y., & Storey, C. (1991). Efficient generalized conjugate gradient algorithms, part 1: Theory. Journal of Optimization Theory and Applications, 69(1), 129–137. https://doi.org/10.1007/BF00940464

Lu, H., Zhang, H., & Ma, L. (2006). New optimization algorithm based on chaos. Journal of Zhejiang University - Science A: Applied Physics & Engineering, 7, 539–542. https://doi.org/10.1631/jzus.2006.A0539

Polak, E., & Ribiere, G. (1969). Note sur la convergence de méthodes de directions conjuguées. Revue Française d’informatique et de Recherche Opérationnelle. Série Rouge, 3(R1), 35–43. http://www.numdam.org/item/M2AN_1969__3_1_35_0/

Polyak, B. T. (1969). The conjugate gradient method in extremal problems. Zh. Vychisl. Mat. Mat. Fiz., 9(4), 807–821.

Shareef, S. G., & Ibrahim, A. L. (2016). A new conjugate gradient for unconstrained optimization based on step size of Barzilai and Borwein. Science Journal of University of Zakho, 4(1), 104–114. https://sjuoz.uoz.edu.krd/index.php/sjuoz/article/view/311

Touati-Ahmed, D., & Storey, C. (1990). Efficient hybrid conjugate gradient techniques. Journal of Optimization Theory and Applications, 64(2), 379–397. https://doi.org/10.1007/BF00939455

Vandeginste, B. G. M. (1989). Nonlinear regression analysis: Its applications, by D. M. Bates and D. G. Watts, Wiley, New York, 1988 [Book review]. Journal of Chemometrics, 3(3), 544–545.

Zhang, L., Zhou, W., & Li, D.-H. (2006). A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA Journal of Numerical Analysis, 26(4), 629–640. https://doi.org/10.1093/imanum/drl016

Zoutendijk, G. (1970). Nonlinear programming, computational methods. In J. Abadie (Ed.), Integer and Nonlinear Programming (pp. 37–86). North-Holland.

Published

2024-11-13

How to Cite

Hamad, S. A., Omar, D. H., Sulaiman, D. A., & Ibrahim, A. L. (2024). A NEW CONJUGATE GRADIENT METHOD BASED ON LOGISTIC MAPPING FOR UNCONSTRAINED OPTIMIZATION AND ITS APPLICATION IN REGRESSION ANALYSIS. Science Journal of University of Zakho, 12(4), 484–489. https://doi.org/10.25271/sjuoz.2024.12.4.1310

Section

Science Journal of University of Zakho