A NEW RANK-ONE CLASS FOR SOLVING UNCONSTRAINED PROBLEMS

Ahmed Anwer Mustafa*

  Mathematics Dep., Faculty of Education, Zakho University, Kurdistan Region – Iraq

 ahmed.mustafa@uoz.edu.krd

Received: 31 Oct. 2022 / Accepted: 09 Jan. 2023 / Published: 25 Apr. 2023                   https://doi.org/10.25271/sjuoz.2023.11.2.1049

ABSTRACT:

The quasi-Newton approach is one of the most well-known iterative methods for solving unconstrained optimization problems. Quasi-Newton methods are widely recognized for their high precision and fast convergence. In this work, a new algorithm for the symmetric rank-one (SR1) method is derived.

Step lengths are selected according to the strong Wolfe line-search criteria. We also prove that the new update satisfies a quasi-Newton equation and preserves positive definiteness. Preliminary computational tests on a set of fourteen unconstrained optimization test functions lead to the conclusion that the new method is more effective and robust than the classical SR1 method in terms of the number of iterations and the number of function evaluations.

KEYWORDS: Quasi-Newton, Symmetric Rank One, Positive Definite Matrix, Unconstrained Optimization, Nonlinear Optimization.


INTRODUCTION

Several efforts have been made to improve the approximation of the Hessian matrix. Zhang and Xu proposed a modified quasi-Newton condition (Zhang and Xu, 2001). They used both gradient and function-value information to achieve higher-order accuracy in approximating the curvature of the objective function.

Based on the extended quasi-Newton condition modified by the authors in (Issam, Basim and Aadil, 2022), the symmetric rank-one update ensures that the approach preserves its symmetric and positive definite properties. It also guarantees global and superlinear convergence. The following unconstrained optimization problem is considered:

 

min f(x),  x ∈ ℝ^n,                                                 (*)

where x ∈ ℝ^n, n is the number of variables, and f : ℝ^n → ℝ is a continuously differentiable function. Several iterative approaches are available for solving problem (*); one of the most frequently used is the quasi-Newton (QN) method (Farzin, Malik and Wah, 2009).

 

The classic quasi-Newton equation is satisfied by some well-known updates of H_k, but very few of them have been able to compete numerically with the well-known Cullum and Brayton (SR1) formula (Cullum and Brayton, 1979; Issam, Basim and Aadil, 2022):

H_{k+1} = H_k + \frac{(s_k - H_k y_k)(s_k - H_k y_k)^T}{(s_k - H_k y_k)^T y_k}.

For further updates of H_k, see (Basheer, Khalil and Zeyad, 2016), (Dennis and Schnabel, 1982), (Fletcher, 1980), (Gill, Murray and Wright, 1981), (Mahmood and Farqad, 2017), (Philipp and Martin, 2013), (Saad and Jaafer, 2022), and (Wah and Malik, 2009). Quasi-Newton techniques iteratively determine a new approximation to the solution by

x_{k+1} = x_k + α_k d_k,

where x_k is the current iterate, α_k is a step length, and d_k is the search direction. This direction is calculated by

d_k = -H_k g_k,

At each iteration, the matrix H_k is updated to a new inverse-Hessian approximation H_{k+1}, for which the usual QN equation applies:

H_{k+1} y_k = s_k,

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
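For concreteness, the classical SR1 inverse-Hessian update can be sketched in a few lines of Python/NumPy (a minimal illustration, not the paper's FORTRAN 95 code; the tolerance safeguard on the denominator is a standard practical addition):

```python
import numpy as np

def sr1_inverse_update(H, s, y, tol=1e-8):
    """Classical SR1 update of the inverse-Hessian approximation H_k.

    H: current inverse-Hessian approximation (n x n, symmetric)
    s: step vector s_k = x_{k+1} - x_k
    y: gradient difference y_k = g_{k+1} - g_k
    """
    v = s - H @ y                 # s_k - H_k y_k
    denom = v @ y                 # (s_k - H_k y_k)^T y_k
    # Skip the update when the denominator is negligibly small,
    # since the SR1 formula is undefined (or unstable) there.
    if abs(denom) < tol * np.linalg.norm(v) * np.linalg.norm(y):
        return H
    return H + np.outer(v, v) / denom
```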

The strong Wolfe terms (SWT), which define the line search, are as follows:

f(x_k + α_k d_k) ≤ f(x_k) + δ α_k g_k^T d_k,

|g(x_k + α_k d_k)^T d_k| ≤ σ |g_k^T d_k|,

where 0 < δ < σ < 1.
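The strong Wolfe terms are easy to test programmatically. The sketch below checks whether a candidate step length satisfies both conditions (the values δ = 10⁻⁴ and σ = 0.9 are typical choices, not values specified by the paper):

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    """Check the strong Wolfe terms for step length alpha along direction d."""
    gtd = grad(x) @ d                       # g_k^T d_k (negative for a descent direction)
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + delta * alpha * gtd
    curvature = abs(grad(x_new) @ d) <= sigma * abs(gtd)
    return sufficient_decrease and curvature
```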

THE NEW METHOD'S DERIVATION AND ITS ALGORITHM

2.1 Derivation of the New Algorithm

To find a minimizer of a nonlinear function of many variables, the method's basic concept is to apply an approximation built on a modified quasi-Newton condition. The modified quasi-Newton condition is

H_{k+1} ŷ_k = s_k,

where ŷ_k is the modified gradient-difference vector of the extended quasi-Newton condition (Issam, Basim and Aadil, 2022), formed from y_k together with function-value information, such that s_k^T ŷ_k > 0.

The correction term in rank one is a z z^T, where a ∈ ℝ and z ∈ ℝ^n. Therefore, the updated equation is

H_{k+1} = H_k + a z z^T.                                            (1)

Observe that if H_k is symmetric, then so is H_{k+1}. Our goal now is to determine a and z,

given H_k, s_k, and ŷ_k, so that the required relationship

H_{k+1} ŷ_k = s_k                                                   (2)

is satisfied. In other words, given H_k, s_k, and ŷ_k, we wish to find a and z to ensure that

s_k = H_k ŷ_k + a z z^T ŷ_k.

First, note that z^T ŷ_k is a scalar. Thus

z = \frac{s_k - H_k ŷ_k}{a (z^T ŷ_k)},                              (3)

and hence

a z z^T = \frac{(s_k - H_k ŷ_k)(s_k - H_k ŷ_k)^T}{a (z^T ŷ_k)^2}.

We can now determine H_{k+1}. Hence,

H_{k+1} = H_k + \frac{(s_k - H_k ŷ_k)(s_k - H_k ŷ_k)^T}{a (z^T ŷ_k)^2}.   (4)

Now multiply (3) on the left by ŷ_k^T to obtain

ŷ_k^T z = \frac{ŷ_k^T (s_k - H_k ŷ_k)}{a (z^T ŷ_k)}.

Since z^T ŷ_k is a scalar and ŷ_k^T z = z^T ŷ_k, the above equation becomes

a (z^T ŷ_k)^2 = (s_k - H_k ŷ_k)^T ŷ_k.                              (5)

By putting equation (5) into equation (4), we have

H_{k+1} = H_k + \frac{(s_k - H_k ŷ_k)(s_k - H_k ŷ_k)^T}{(s_k - H_k ŷ_k)^T ŷ_k},   (6)

which is the new symmetric rank-one algorithm.
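Equation (6) has the same algebraic shape as the classical SR1 formula, with ŷ_k in place of y_k. A minimal sketch of the update in Python/NumPy (the construction of y_hat is assumed to follow the modified quasi-Newton condition above; the denominator guard is our practical addition):

```python
import numpy as np

def new_sr1_update(H, s, y_hat, tol=1e-8):
    """New rank-one update (6): H_{k+1} = H_k + v v^T / (v^T y_hat), v = s_k - H_k y_hat."""
    v = s - H @ y_hat
    denom = v @ y_hat             # (s_k - H_k y_hat)^T y_hat
    if abs(denom) < tol:          # guard against a vanishing denominator
        return H
    return H + np.outer(v, v) / denom
```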

2.2 Algorithm of the New Method

Step (1): Given an initial vector x_0, a symmetric and positive definite matrix H_0, and a termination scalar ε > 0; set k = 0 and compute g_0.

Step (2): Compute g_k.

Step (3): Compute d_k = -H_k g_k.

Step (4): Find α_k satisfying the strong Wolfe conditions.

Step (5): Calculate x_{k+1} = x_k + α_k d_k, g_{k+1}, s_k = x_{k+1} - x_k, and y_k = g_{k+1} - g_k.

Step (6): Compute the modified vector ŷ_k. If ||g_{k+1}|| ≤ ε, stop.

Step (7): Compute H_{k+1} = H_k + \frac{(s_k - H_k ŷ_k)(s_k - H_k ŷ_k)^T}{(s_k - H_k ŷ_k)^T ŷ_k}.

Step (8): If the restart criterion (Powell's condition) is satisfied, go to Step (3).

Step (9): Set k = k + 1 and go to Step (2).
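The steps above translate directly into a compact driver. The sketch below is an illustrative Python implementation, not the paper's FORTRAN 95 program: it uses SciPy's Wolfe line search in place of the cubic-interpolation routine, takes ŷ_k = y_k as a placeholder for the modified vector, and omits the Powell restart test of Step (8):

```python
import numpy as np
from scipy.optimize import line_search

def new_sr1_minimize(f, grad, x0, eps=1e-5, max_iter=500):
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                          # Step (1): H_0 = I, symmetric positive definite
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:            # Step (6): termination test
            break
        d = -H @ g                              # Step (3): search direction
        alpha = line_search(f, grad, x, d)[0]   # Step (4): Wolfe line search
        if alpha is None:                       # line search failed to find a step
            break
        x_new = x + alpha * d                   # Step (5)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        y_hat = y                               # placeholder for the modified vector
        v = s - H @ y_hat                       # Step (7): update H via equation (6)
        denom = v @ y_hat
        if abs(denom) > 1e-8:                   # skip unstable updates
            H = H + np.outer(v, v) / denom
        x, g = x_new, g_new                     # Step (9): next iteration
    return x
```

For example, `new_sr1_minimize(rosen, rosen_der, np.full(5, -1.0))`, with `from scipy.optimize import rosen, rosen_der`, minimizes a five-dimensional Rosenbrock function, one of the test problems of Table (3.I).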

Theorem I: For the new algorithm applied to quadratic functions with Hessian matrix G, we have H_{k+1} ŷ_k = s_k.

Proof: Multiplying both sides of (6) on the right by ŷ_k, we find

H_{k+1} ŷ_k = H_k ŷ_k + \frac{(s_k - H_k ŷ_k)(s_k - H_k ŷ_k)^T ŷ_k}{(s_k - H_k ŷ_k)^T ŷ_k}.

Since (s_k - H_k ŷ_k)^T ŷ_k and ŷ_k^T (s_k - H_k ŷ_k) are equal scalars, the fraction reduces to s_k - H_k ŷ_k. So we have

H_{k+1} ŷ_k = H_k ŷ_k + s_k - H_k ŷ_k = s_k. ∎
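Theorem I is easy to sanity-check numerically: after a single update (6), H_{k+1} ŷ_k must reproduce s_k to machine precision. A small sketch with hand-picked example data (chosen so the denominator equals 15, safely away from zero):

```python
import numpy as np

n = 6
H = np.eye(n)                               # H_k (symmetric positive definite)
s = np.arange(1.0, n + 1)                   # example step vector s_k
y_hat = np.ones(n)                          # example modified vector y_hat_k

v = s - H @ y_hat                           # v = s_k - H_k y_hat_k = (0, 1, ..., 5)
H_new = H + np.outer(v, v) / (v @ y_hat)    # update (6); denominator v^T y_hat = 15
print(np.allclose(H_new @ y_hat, s))        # True: H_{k+1} y_hat_k = s_k
```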

Theorem II: The matrix H_{k+1} produced by the new approach is positive definite if H_k is positive definite and (s_k - H_k ŷ_k)^T ŷ_k > 0.

Proof: Multiplying equation (6) on the right by an arbitrary nonzero vector q ∈ ℝ^n and on the left by q^T, we get

q^T H_{k+1} q = q^T H_k q + \frac{q^T (s_k - H_k ŷ_k)(s_k - H_k ŷ_k)^T q}{(s_k - H_k ŷ_k)^T ŷ_k}.

q^T (s_k - H_k ŷ_k) and (s_k - H_k ŷ_k)^T q are equal scalars, so we have

q^T H_{k+1} q = q^T H_k q + \frac{\big[(s_k - H_k ŷ_k)^T q\big]^2}{(s_k - H_k ŷ_k)^T ŷ_k}.

Since H_k is positive definite and q ≠ 0, we have q^T H_k q > 0.

Since the squared numerator is nonnegative and the denominator (s_k - H_k ŷ_k)^T ŷ_k is positive, q^T H_{k+1} q is also positive, which makes the proof complete. ∎
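Theorem II can be checked the same way: when the denominator (s_k - H_k ŷ_k)^T ŷ_k is positive, every eigenvalue of the updated matrix stays positive. A sketch reusing the example data above:

```python
import numpy as np

n = 6
H = np.eye(n)                               # H_k, positive definite
s = np.arange(1.0, n + 1)
y_hat = np.ones(n)

v = s - H @ y_hat
denom = v @ y_hat                           # = 15 > 0, so Theorem II applies
H_new = H + np.outer(v, v) / denom
# H_new is symmetric; all its eigenvalues should be strictly positive.
print(np.linalg.eigvalsh(H_new).min() > 0)  # True
```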

ILLUSTRATIONS AND TABLES

This section focuses on testing the implementation of the new method; the recent update of SR1 and the standard SR1 are evaluated. Well-known nonlinear problems (classical test functions) are used in the comparison for n = 5, 100, 500, and 2000. All programs are written in FORTRAN 95, the terminating condition ||g_{k+1}|| ≤ ε is applied in every case, and restarts are performed using Powell's condition. The line search routine was a cubic interpolation that uses function and gradient values.

The numbers of iterations (ISN) and of function evaluations (FSN) are reported in Tables (3.I) and (3.II). These experimental findings demonstrate that the new algorithm outperforms the traditional SR1 approach in terms of both the number of iterations (ISN) and the number of function evaluations (FSN).
Table (3.I): Comparison of the standard SR1 algorithm and the new algorithm
(ISN = number of iterations; FSN = number of function evaluations; FF = failure.)

No. | Test Function | N    | SR1 ISN | SR1 FSN | New ISN | New FSN
----|---------------|------|---------|---------|---------|--------
1   | Wood          | 5    | 40      | 194     | 21      | 61
    |               | 100  | 266     | 918     | 24      | 68
    |               | 500  | 815     | 3067    | 24      | 67
    |               | 2000 | 2223    | 6937    | 24      | 68
2   | Cubic         | 5    | 19      | 67      | 15      | 44
    |               | 100  | 49      | 144     | 16      | 46
    |               | 500  | 59      | 181     | 18      | 50
    |               | 2000 | 72      | 224     | 18      | 50
3   | Rosen         | 5    | 35      | 112     | 12      | 34
    |               | 100  | 328     | 1246    | 12      | 34
    |               | 500  | 991     | 4196    | 12      | 34
    |               | 2000 | 1699    | 51420   | 12      | 34
4   | Powell        | 5    | 19      | 64      | 15      | 41
    |               | 100  | 71      | 211     | 18      | 48
    |               | 500  | 52      | 168     | 19      | 49
    |               | 2000 | 41      | 120     | 28      | 75
5   | Sum           | 5    | 10      | 47      | 9       | 32
    |               | 100  | 439     | 1892    | 224     | 897
    |               | 500  | 1329    | 5323    | 541     | 2198
    |               | 2000 | FF      | FF      | FF      | FF
6   | Nondiagonal   | 5    | 22      | 74      | 18      | 53
    |               | 100  | 42      | 175     | 12      | 57
    |               | 500  | 42      | 283     | 22      | 61
    |               | 2000 | 50      | 236     | 31      | 94
7   | G-Helical     | 5    | 20      | 62      | 19      | 47
    |               | 100  | FF      | FF      | 19      | 47
    |               | 500  | FF      | FF      | 19      | 47
    |               | 2000 | FF      | FF      | 19      | 47
8   | G-Central     | 5    | 21      | 254     | 16      | 71
    |               | 100  | 22      | 260     | 16      | 71
    |               | 500  | 23      | 266     | 17      | 78
    |               | 2000 | 23      | 266     | 17      | 78
9   | Wolfe         | 5    | 8       | 19      | 9       | 19
    |               | 100  | 72      | 157     | 43      | 88
    |               | 500  | 82      | 178     | 46      | 95
    |               | 2000 | 182     | 772     | 60      | 138
10  | Beal          | 5    | 9       | 27      | 7       | 21
    |               | 100  | 10      | 27      | 7       | 21
    |               | 500  | 10      | 27      | 7       | 21
    |               | 2000 | 11      | 29      | 8       | 23
11  | Fred          | 5    | 7       | 24      | 6       | 19
    |               | 100  | 8       | 28      | 7       | 21
    |               | 500  | 10      | 29      | 7       | 21
    |               | 2000 | 10      | 29      | 7       | 21
12  | Resip         | 5    | 6       | 19      | 6       | 14
    |               | 100  | 6       | 19      | 6       | 14
    |               | 500  | 6       | 19      | 7       | 16
    |               | 2000 | 6       | 19      | 7       | 16
13  | Shallow       | 5    | 8       | 28      | 7       | 19
    |               | 100  | 8       | 28      | 7       | 19
    |               | 500  | 8       | 28      | 7       | 19
    |               | 2000 | 10      | 32      | 7       | 19
14  | Miele         | 5    | 29      | 97      | 24      | 65
    |               | 100  | 35      | 142     | 25      | 69
    |               | 500  | 35      | 122     | 25      | 69
    |               | 2000 | 46      | 150     | 35      | 112
Totals              |      | 4661    | 63452   | 1468    | 5084


Note: FF means failure, i.e., the function did not converge in the standard algorithm but did in the new algorithm; when calculating the rate of improvement, a penalty value was used for any function that failed in the standard algorithm.


Table (3.II): A comparison of the rates of improvement of the new method and the old algorithm (SR1)

Tools | SR1 Method | New Method
------|------------|-----------
ISN   | 100%       | 31.4954%
FSN   | 100%       | 8.0124%


Figure (3.I): Performance of the algorithms with respect to ISN and FSN


Table (3.I), Table (3.II), and Figure (3.I) compare the rate of improvement of the new algorithm against the traditional SR1 technique. The numerical results of the new algorithm are better than those of the standard algorithm: taking the ISN and FSN of the standard algorithm (SR1) as 100%, the new algorithm improves on the standard algorithm by 68.5046% in ISN and by 91.9876% in FSN. Overall, the new algorithm improves on the conventional SR1 algorithm by 80.2461%.
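These percentages follow directly from the totals in Table (3.I):

100 \times \frac{1468}{4661} \approx 31.4954\%, \qquad 100\% - 31.4954\% = 68.5046\% \ \text{(ISN)},

100 \times \frac{5084}{63452} \approx 8.0124\%, \qquad 100\% - 8.0124\% = 91.9876\% \ \text{(FSN)},

\tfrac{1}{2}\,(68.5046\% + 91.9876\%) = 80.2461\% \ \text{(overall)}.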

CONCLUSION


In this study, we introduced a new symmetric rank-one (SR1) approach with some distinctive characteristics: it satisfies the quasi-Newton equation and preserves the positive definite property. According to the numerical results, this new algorithm outperforms the conventional symmetric rank-one method.

 Future updates and adjustments to the DFP and BFGS nonlinear unconstrained optimization techniques can be proposed in the same manner.

ABBREVIATIONS: QN, quasi-Newton; SR1, symmetric rank one; Min, minimum; g = ∇f, gradient; SWT, strong Wolfe terms; ISN, number of iterations; FSN, number of function evaluations.

 

REFERENCES

Basheer M. S., Khalil K. A. and Zeyad M. A. (2016). Partial Davidon, Fletcher and Powell (DFP) of quasi-Newton method for unconstrained optimization. Tikrit Journal of Pure Science, 21(6), 180-186.

Cullum, J. and Brayton, R. K. (1979). Some remarks on the symmetric rank-one update. Journal of Optimization Theory and Applications, 27(4), 493-519.

Dennis, J. E. and Schnabel, R. B. (1982). Numerical Methods for Nonlinear Equations and Unconstrained Optimization. Prentice Hall, Englewood Cliffs.

Farzin M., Malik A. H. and Wah J. L. (2009). Memoryless modified symmetric rank-one method for large-scale unconstrained optimization. American Journal of Applied Sciences, 6(12), 2054-2059.

Fletcher, R. (1980). Practical Methods of Optimization. Wiley, New York.

Gill, P. E., Murray, W. and Wright, M. H. (1981). Practical Optimization. Academic, London.

Issam A. R. M., Basim A. H. and Aadil A. (2022). New self-scaling quasi-Newton methods for unconstrained optimization. International Journal of Mathematics and Computer Science, 17(3), 1061-1077.

Mahmood S. S. and Farqad H. (2017). On extended symmetric rank one update for unconstrained optimization. J. of Education, Special issue, 206-220.

Philipp H. and Martin K. (2013). Quasi-Newton methods: a new direction. Journal of Machine Learning Research, 14, 843-865.

Saad Sh. M. and Jaafer H. E. (2022). New class of rank 1 update for solving unconstrained optimization problem. Iraqi Journal of Science, 63(2), 683-689.

Wah J. L. and Malik A. H. (2009). A restarting approach for the symmetric rank one update for unconstrained optimization. Comput. Optim. Appl., 42, 327-334.

Zhang J. and Xu Ch. (2001). Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J. of Comp. and App. Math., 137, 269-278.