doi: 10.3934/jimo.2018122

Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization

Shummin Nakayama 1,*, Yasushi Narushima 2 and Hiroshi Yabe 3

1. Department of Applied Mathematics, Graduate School of Science, Tokyo University of Science, 1-3, Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan
2. Faculty of International Social Sciences, Yokohama National University, 79-4 Tokiwadai, Hodogaya-ku, Yokohama 240-8501, Japan
3. Department of Applied Mathematics, Tokyo University of Science, 1-3, Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan

* Corresponding author: Shummin Nakayama

Received: July 2017. Revised: February 2018. Published: August 2018.

Memoryless quasi-Newton methods are studied for solving large-scale unconstrained optimization problems. Recently, memoryless quasi-Newton methods based on several kinds of updating formulas have been proposed. Since these methods are closely related to the conjugate gradient method, they are promising for large-scale problems. In this paper, we propose a memoryless quasi-Newton method based on the Broyden family with the spectral-scaling secant condition. We focus on the convex and preconvex classes of the Broyden family, and we show that the proposed method satisfies the sufficient descent condition and converges globally. Finally, some numerical experiments are reported.
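To make the construction concrete: in a memoryless quasi-Newton method the approximate inverse Hessian is rebuilt at every iteration from a scaled identity and only the latest pair $s_{k-1} = x_k - x_{k-1}$, $y_{k-1} = g_k - g_{k-1}$, so the search direction $d_k = -H_k g_k$ needs just a few inner products and no stored matrix. The sketch below is a minimal illustration of this idea, assuming the textbook Broyden-family update of $H_0 = (1/\gamma)I$ with a fixed parameter $\theta$ ($\theta = 0$ gives DFP, $\theta = 1$ gives BFGS); it does not reproduce the paper's spectral-scaling secant condition or its particular choices of $\gamma$ and $\theta$, and the curvature safeguard is hypothetical.

```python
import numpy as np

def broyden_family_direction(g, s, y, theta=1.0, gamma=1.0):
    """Memoryless Broyden-family search direction d = -H g.

    H is the Broyden-family update, with parameter theta, of the scaled
    identity H0 = (1/gamma) * I, built from the latest pair
    s = x_k - x_{k-1}, y = g_k - g_{k-1}.  theta = 0 gives DFP,
    theta = 1 gives BFGS, and theta > 1 lies in the preconvex class.
    Only O(n) storage and a few inner products are needed.
    """
    sy = float(s @ y)
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        # Hypothetical safeguard: fall back to scaled steepest descent
        # when the curvature condition s^T y > 0 (nearly) fails.
        return -g / gamma
    H0y = y / gamma                      # H0 @ y, since H0 = (1/gamma) I
    yH0y = float(y @ H0y)
    w = s / sy - H0y / yH0y
    # H g = H0 g - H0y (H0y^T g)/yH0y + s (s^T g)/sy + theta*yH0y*w (w^T g)
    Hg = (g / gamma
          - H0y * (H0y @ g) / yH0y
          + s * (s @ g) / sy
          + theta * yH0y * w * (w @ g))
    return -Hg
```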

Citation: Shummin Nakayama, Yasushi Narushima, Hiroshi Yabe. Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization. Journal of Industrial & Management Optimization, doi: 10.3934/jimo.2018122
References:
[1]

M. Al-Baali, Y. Narushima and H. Yabe, A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization, Computational Optimization and Applications, 60 (2015), 89-110. doi: 10.1007/s10589-014-9662-z.

[2]

I. Bongartz, A. R. Conn, N. I. M. Gould and P. L. Toint, CUTE: constrained and unconstrained testing environment, ACM Transactions on Mathematical Software, 21 (1995), 123-160. doi: 10.1145/200979.201043.

[3]

Z. Chen and W. Cheng, Spectral-scaling quasi-Newton methods with updates from the one parameter of the Broyden family, Journal of Computational and Applied Mathematics, 248 (2013), 88-98. doi: 10.1016/j.cam.2013.01.012.

[4]

W. Y. Cheng and D. H. Li, Spectral scaling BFGS method, Journal of Optimization Theory and Applications, 146 (2010), 305-319. doi: 10.1007/s10957-010-9652-y.

[5]

Y. H. Dai and C. X. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM Journal on Optimization, 23 (2013), 296-320. doi: 10.1137/100813026.

[6]

Y. H. Dai and L. Z. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Applied Mathematics and Optimization, 43 (2001), 87-101. doi: 10.1007/s002450010019.

[7]

E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, 91 (2002), 201-213. doi: 10.1007/s101070100263.

[8]

J. A. Ford and I. A. Moghrabi, Multi-step quasi-Newton methods for optimization, Journal of Computational and Applied Mathematics, 50 (1994), 305-323. doi: 10.1016/0377-0427(94)90309-3.

[9]

J. C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM Journal on Optimization, 2 (1992), 21-42. doi: 10.1137/0802003.

[10]

N. I. M. Gould, D. Orban and P. L. Toint, CUTEr and SifDec: A constrained and unconstrained testing environment, revisited, ACM Transactions on Mathematical Software, 29 (2003), 373-394. doi: 10.1145/962437.962439.

[11]

W. W. Hager, Hager's web page: https://people.clas.ufl.edu/hager/.

[12]

W. W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192. doi: 10.1137/030601880.

[13]

W. W. Hager and H. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58.

[14]

W. W. Hager and H. Zhang, Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent, ACM Transactions on Mathematical Software, 32 (2006), 113-137. doi: 10.1145/1132973.1132979.

[15]

M. R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, Journal of Research of the National Bureau of Standards, 49 (1952), 409-436. doi: 10.6028/jres.049.044.

[16]

S. Hoshino, A formulation of variable metric methods, IMA Journal of Applied Mathematics, 10 (1972), 394-403. doi: 10.1093/imamat/10.3.394.

[17]

C. X. Kou and Y. H. Dai, A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization, Journal of Optimization Theory and Applications, 165 (2015), 209-224. doi: 10.1007/s10957-014-0528-4.

[18]

D. H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129 (2001), 15-35. doi: 10.1016/S0377-0427(00)00540-9.

[19]

F. Modarres, M. A. Hassan and W. J. Leong, Memoryless modified symmetric rank-one method for large-scale unconstrained optimization, American Journal of Applied Sciences, 6 (2009), 2054-2059.

[20]

A. U. Moyi and W. J. Leong, A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization, Optimization, 65 (2016), 121-143. doi: 10.1080/02331934.2014.994625.

[21]

S. Nakayama, Y. Narushima and H. Yabe, A memoryless symmetric rank-one method with sufficient descent property for unconstrained optimization, Journal of the Operations Research Society of Japan, 61 (2018), 53-70. doi: 10.15807/jorsj.61.53.

[22]

Y. Narushima and H. Yabe, A survey of sufficient descent conjugate gradient methods for unconstrained optimization, SUT Journal of Mathematics, 50 (2014), 167-203.

[23]

Y. Narushima, H. Yabe and J. A. Ford, A three-term conjugate gradient method with sufficient descent property for unconstrained optimization, SIAM Journal on Optimization, 21 (2011), 212-230. doi: 10.1137/080743573.

[24]

J. Nocedal and S. J. Wright, Numerical Optimization, 2nd edition, Springer, 2006.

[25]

S. S. Oren, Self-scaling variable metric (SSVM) algorithms, Part II: Implementation and experiments, Management Science, 20 (1974), 863-874. doi: 10.1287/mnsc.20.5.863.

[26]

S. S. Oren and D. G. Luenberger, Self-scaling variable metric (SSVM) algorithms, Part I: Criteria and sufficient conditions for scaling a class of algorithms, Management Science, 20 (1974), 845-862. doi: 10.1287/mnsc.20.5.845.

[27]

D. F. Shanno, Conjugate gradient methods with inexact searches, Mathematics of Operations Research, 3 (1978), 244-256. doi: 10.1287/moor.3.3.244.

[28]

K. Sugiki, Y. Narushima and H. Yabe, Globally convergent three-term conjugate gradient methods that use secant condition and generate descent search directions for unconstrained optimization, Journal of Optimization Theory and Applications, 153 (2012), 733-757. doi: 10.1007/s10957-011-9960-x.

[29]

L. Sun, An approach to scaling symmetric rank-one update, Pacific Journal of Optimization, 2 (2006), 105-118.

[30]

W. Sun and Y. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, 2006.

[31]

Z. Wei, G. Li and L. Qi, New quasi-Newton methods for unconstrained optimization problems, Applied Mathematics and Computation, 175 (2006), 1156-1188. doi: 10.1016/j.amc.2005.08.027.

[32]

J. Z. Zhang, N. Y. Deng and L. H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, Journal of Optimization Theory and Applications, 102 (1999), 147-167. doi: 10.1023/A:1021898630001.

[33]

L. Zhang, W. Zhou and D. H. Li, A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence, IMA Journal of Numerical Analysis, 26 (2006), 629-640. doi: 10.1093/imanum/drl016.

[34]

Y. Zhang and R. P. Tewarson, Quasi-Newton algorithms with updates from the preconvex part of Broyden's family, IMA Journal of Numerical Analysis, 8 (1988), 487-509. doi: 10.1093/imanum/8.4.487.

Figure 1.  Performance profiles of methods in Table 2
Figure 2.  Performance profiles of methods in Table 3
Figure 3.  Performance profiles of methods in Table 4
Figure 4.  Performance profiles of methods in Table 5
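Figures 1-4 use the performance profiles of Dolan and Moré [7]: for each solver, the profile $\rho(\tau)$ is the fraction of test problems on which that solver's cost (iterations, function evaluations, or CPU time) is within a factor $\tau$ of the best cost attained by any solver on the same problem. A minimal sketch, assuming costs are collected in a problems-by-solvers array with np.inf marking failures and every problem solved by at least one method:

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profiles (see [7]).

    T    : (n_problems, n_solvers) array of solver costs (e.g. CPU time
           or iteration counts); np.inf marks a failure.
    taus : 1-D array of ratio thresholds tau >= 1.
    Returns rho of shape (len(taus), n_solvers), where rho[i, s] is the
    fraction of problems that solver s solves within a factor taus[i]
    of the best solver on that problem.  Assumes each problem is solved
    by at least one solver, so the best cost per problem is finite.
    """
    best = T.min(axis=1, keepdims=True)   # best cost on each problem
    ratios = T / best                     # performance ratios r_{p,s}
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])
```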
Table 1.  Test problems (names and dimensions n) from the CUTEr library
name n name n name n name n
AKIVA 2 DIXMAANC 3000 HEART8LS 8 PENALTY1 1000
ALLINITU 4 DIXMAAND 3000 HELIX 3 PENALTY2 200
ARGLINA 200 DIXMAANE 3000 HIELOW 3 PENALTY3 200
ARGLINB 200 DIXMAANF 3000 HILBERTA 2 POWELLSG 5000
ARWHEAD 5000 DIXMAANG 3000 HILBERTB 10 POWER 10000
BARD 3 DIXMAANH 3000 HIMMELBB 2 QUARTC 5000
BDQRTIC 5000 DIXMAANI 3000 HIMMELBF 4 ROSENBR 2
BEALE 2 DIXMAANJ 3000 HIMMELBG 2 S308 2
BIGGS6 6 DIXMAANK 3000 HIMMELBH 2 SCHMVETT 5000
BOX3 3 DIXMAANL 3000 HUMPS 2 SENSORS 100
BOX 10000 DIXON3DQ 10000 JENSMP 2 SINEVAL 2
BRKMCC 2 DJTL 2 KOWOSB 4 SINQUAD 5000
BROWNAL 200 DQDRTIC 5000 LIARWHD 5000 SISSER 2
BROWNBS 2 DQRTIC 5000 LOGHAIRY 2 SNAIL 2
BROWNDEN 4 EDENSCH 2000 MANCINO 100 SPARSINE 5000
BROYDN7D 5000 EG2 1000 MARATOSB 2 SPARSQUR 10000
BRYBND 5000 ENGVAL1 5000 MEXHAT 2 SPMSRTLS 4999
CHAINWOO 4000 ENGVAL2 3 MOREBV 5000 SROSENBR 5000
CHNROSNB 50 ERRINROS 50 MSQRTALS 1024 STRATEC 10
CLIFF 2 EXPFIT 2 MSQRTBLS 1024 TESTQUAD 5000
COSINE 10000 EXTROSNB 1000 NONCVXU2 5000 TOINTGOR 50
CRAGGLVY 5000 FLETCBV2 5000 NONDIA 5000 TOINTGSS 5000
CUBE 2 FLETCHCR 1000 NONDQUAR 5000 TOINTPSP 50
CURLY10 10000 FMINSRF2 5625 OSBORNEA 5 TOINTQOR 50
CURLY20 10000 FMINSURF 5625 OSBORNEB 11 TQUARTIC 5000
CURLY30 10000 FREUROTH 5000 OSCIPATH 10 TRIDIA 5000
DECONVU 63 GENHUMPS 5000 PALMER1C 8 VARDIM 200
DENSCHNA 2 GENROSE 500 PALMER1D 7 VAREIGVL 50
DENSCHNB 2 GROWTHLS 3 PALMER2C 8 VIBRBEAM 8
DENSCHNC 2 GULF 3 PALMER3C 8 WATSON 12
DENSCHND 3 HAIRY 2 PALMER4C 8 WOODS 4000
DENSCHNE 3 HATFLDD 3 PALMER5C 6 YFITU 3
DENSCHNF 2 HATFLDE 3 PALMER6C 8 ZANGWIL2 2
DIXMAANA 3000 HATFLDFL 3 PALMER7C 8
DIXMAANB 3000 HEART6LS 6 PALMER8C 8
Table 2.  Tested methods (Methods 1-9)
Method number $ \theta_{k-1}$ Class Global convergence
1 (DFP) 0 convex not established
2 0.25 convex established
3 0.5 convex established
4 0.75 convex established
5 (BFGS) 1 convex established
6 1.25 preconvex established
7 1.5 preconvex established
8 1.75 preconvex established
9 2 preconvex not established
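Table 2 samples the Broyden parameter on an even grid: $\theta = 0$ (DFP) and $\theta = 1$ (BFGS) bracket the convex class $0 \le \theta_{k-1} \le 1$, while $1 < \theta_{k-1} \le 2$ lies in the preconvex class. As a hypothetical usage note (made-up data, not the paper's experiment), one direction computation for each grid value, reusing broyden_family_direction from the sketch after the abstract:

```python
import numpy as np

# Hypothetical one-step comparison over the Table 2 grid, with made-up
# data; reuses broyden_family_direction from the sketch after the abstract.
g_prev = np.array([1.0, -2.0])          # previous gradient g_{k-1}
g = np.array([0.5, -1.0])               # current gradient g_k
s = np.array([0.1, 0.2])                # last step s_{k-1} = x_k - x_{k-1}
y = g - g_prev                          # y_{k-1} = g_k - g_{k-1}

for theta in [0.0, 0.25, 0.5, 0.75, 1.0, 1.25, 1.5, 1.75, 2.0]:
    d = broyden_family_direction(g, s, y, theta=theta)
    print(f"theta={theta:4.2f}  d={d}  descent={bool(d @ g < 0)}")
```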
Table 3.  Tested methods (Methods 5 and 10-17)
Method number $ \theta_{k-1}$ Class Global convergence
5 (BFGS) 1 convex established
10 0.8 convex established
11 0.85 convex established
12 0.9 convex established
13 0.95 convex established
14 1.05 preconvex established
15 1.1 preconvex established
16 1.15 preconvex established
17 1.2 preconvex established
Table 4.  Tested methods (Methods 5 and 18-21)
Method number $\hat\theta_{k-1}$ Class Global convergence
5 (BFGS) 1 convex established
18 (Hoshino) $\dfrac{s_{k-1}^Ty_{k-1}}{(s_{k-1} + \hat\gamma_{k-1}^{-1}y_{k-1})^T y_{k-1}}$ convex established
19 $1 + \left|\dfrac{d_{k-1}^Tg_k}{d_{k-1}^Tg_{k-1}}\right|$ preconvex established
20 $1 + \dfrac{ d_{k-1}^Tg_k }{\|d_{k-1}\| \|g_{k}\|}$ unknown established
21 $1 + \dfrac{ |d_{k-1}^Tg_k| }{\|d_{k-1}\| \|g_{k}\|}$ preconvex established
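Methods 19-21 in Table 4 pick $\hat\theta_{k-1}$ adaptively from quantities already at hand: the previous search direction $d_{k-1}$ and the gradients $g_{k-1}$, $g_k$. A direct transcription of those three formulas as a sketch (Method 18 additionally needs the scaling parameter $\hat\gamma_{k-1}$, whose definition is not reproduced on this page, so it is omitted):

```python
import numpy as np

def theta_method19(d_prev, g_prev, g):
    # Method 19: 1 + |d_{k-1}^T g_k / d_{k-1}^T g_{k-1}|; note that
    # d_{k-1}^T g_{k-1} < 0 for a descent direction, so it is nonzero.
    return 1.0 + abs((d_prev @ g) / (d_prev @ g_prev))

def theta_method20(d_prev, g):
    # Method 20: 1 + d_{k-1}^T g_k / (||d_{k-1}|| ||g_k||); the signed
    # numerator means the value may drop below 1 (class marked unknown).
    return 1.0 + (d_prev @ g) / (np.linalg.norm(d_prev) * np.linalg.norm(g))

def theta_method21(d_prev, g):
    # Method 21: 1 + |d_{k-1}^T g_k| / (||d_{k-1}|| ||g_k||) >= 1 (preconvex).
    return 1.0 + abs(d_prev @ g) / (np.linalg.norm(d_prev) * np.linalg.norm(g))
```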
Table 5.  Tested methods (Methods 5, 19 and 21-25)
Method number $\hat\gamma_{k-1}$ $\hat\theta_{k-1}$ Global convergence
5 (BFGS) (54) 1 established
19 (54) $1 + \left|\dfrac{d_{k-1}^Tg_k}{d_{k-1}^Tg_{k-1}}\right| $ established
21 (54) $1 + \dfrac{ |d_{k-1}^Tg_k| }{\|d_{k-1}\| \|g_{k}\|}$ established
22 (BFGS) (56) 1 not established
23 (56) $1 + \left|\dfrac{d_{k-1}^Tg_k}{d_{k-1}^Tg_{k-1}}\right|$ not established
24 (56) $1 + \dfrac{ |d_{k-1}^Tg_k| }{\|d_{k-1}\| \|g_{k}\|}$ not established
25 (CG_DESCENT) [11,12,14] established