# American Institute of Mathematical Sciences

doi: 10.3934/jimo.2019013

## A class of accelerated conjugate-gradient-like methods based on a modified secant equation

Yigui Ou and Haichan Lin, Department of Applied Mathematics, Hainan University, Haikou 570228, China

* Corresponding author: Yigui Ou

Received: May 2018. Revised: September 2018. Published: March 2019.

Fund Project: This work is supported by the NNSF of China (Nos. 11261015, 11761025) and the NSF of Hainan Province (Nos. 111001, 2016CXTD004).

This paper proposes a new class of accelerated conjugate-gradient-like algorithms for solving large-scale unconstrained optimization problems. The methods combine the accelerated adaptive Perry conjugate gradient algorithms of Andrei (2017) with a modified secant condition and a nonmonotone line search technique. An attractive property of the proposed methods is that the search direction always satisfies the sufficient descent condition, independently of both the line search used and the convexity of the objective function. Under common assumptions, the proposed methods are proven to be globally convergent for nonconvex smooth functions and R-linearly convergent for uniformly convex functions. Numerical experiments show the efficiency of the proposed methods in practical computations.

Citation: Yigui Ou, Haichan Lin. A class of accelerated conjugate-gradient-like methods based on a modified secant equation. Journal of Industrial & Management Optimization, doi: 10.3934/jimo.2019013
##### References:
[1] N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Computational Optimization and Applications, 38 (2007), 401-416. doi: 10.1007/s10589-007-9055-7.
[2] N. Andrei, Acceleration of conjugate gradient algorithms for unconstrained optimization, Applied Mathematics and Computation, 213 (2009), 361-369. doi: 10.1016/j.amc.2009.03.020.
[3] N. Andrei, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, Journal of Computational and Applied Mathematics, 325 (2017), 149-164. doi: 10.1016/j.cam.2017.04.045.
[4] I. Bongartz, A. R. Conn, N. I. M. Gould and P. L. Toint, CUTE: Constrained and unconstrained testing environments, ACM Transactions on Mathematical Software, 21 (1995), 123-160. doi: 10.1145/200979.201043.
[5] Z. F. Dai, Two modified HS type conjugate gradient methods for unconstrained optimization problems, Nonlinear Analysis: Theory, Methods & Applications, 74 (2011), 927-936. doi: 10.1016/j.na.2010.09.046.
[6] Z. F. Dai, X. H. Chen and F. H. Wen, A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations, Applied Mathematics and Computation, 270 (2015), 378-386. doi: 10.1016/j.amc.2015.08.014.
[7] Z. F. Dai and F. H. Wen, Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, Numerical Algorithms, 59 (2012), 79-93. doi: 10.1007/s11075-011-9477-2.
[8] Y. H. Dai, On the nonmonotone line search, Journal of Optimization Theory and Applications, 112 (2002), 315-330. doi: 10.1023/A:1013653923062.
[9] Y. H. Dai and Y. X. Yuan, Nonlinear Conjugate Gradient Methods (in Chinese), Shanghai Scientific and Technical Publishers, Shanghai, 2000.
[10] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, Series A, 91 (2002), 201-213. doi: 10.1007/s101070100263.
[11] L. Grippo, F. Lampariello and S. Lucidi, A nonmonotone line search technique for Newton's method, SIAM Journal on Numerical Analysis, 23 (1986), 707-716. doi: 10.1137/0723046.
[12] N. Z. Gu and J. T. Mo, Incorporating nonmonotone strategies into the trust region method for unconstrained optimization, Computers and Mathematics with Applications, 55 (2008), 2158-2172. doi: 10.1016/j.camwa.2007.08.038.
[13] W. W. Hager and H. C. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192. doi: 10.1137/030601880.
[14] W. W. Hager and H. C. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58.
[15] D. H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129 (2001), 15-35. doi: 10.1016/S0377-0427(00)00540-9.
[16] I. E. Livieris and P. Pintelas, Globally convergent modified Perry's conjugate gradient method, Applied Mathematics and Computation, 218 (2012), 9197-9207. doi: 10.1016/j.amc.2012.02.076.
[17] J. Nocedal and S. J. Wright, Numerical Optimization, Springer, New York, 1999. doi: 10.1007/b98874.
[18] Y. G. Ou, A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods, Journal of Computational and Applied Mathematics, 332 (2018), 101-106. doi: 10.1016/j.cam.2017.10.024.
[19] D. F. Shanno and K. H. Phua, Algorithm 500: Minimization of unconstrained multivariate functions, ACM Transactions on Mathematical Software, 2 (1976), 87-94. doi: 10.1145/355666.355673.
[20] W. Y. Sun and Y. X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.
[21] S. W. Yao, D. L. He and L. H. Shi, An improved Perry conjugate gradient method with adaptive parameter choice, Numerical Algorithms, 78 (2018), 1255-1269. doi: 10.1007/s11075-017-0422-x.
[22] G. L. Yuan, Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Optimization Letters, 3 (2009), 11-21. doi: 10.1007/s11590-008-0086-5.
[23] G. L. Yuan, Z. H. Meng and Y. Li, A modified Hestenes and Stiefel conjugate gradient algorithm for large scale nonsmooth minimizations and nonlinear equations, Journal of Optimization Theory and Applications, 168 (2016), 129-152. doi: 10.1007/s10957-015-0781-1.
[24] J. Z. Zhang, N. Y. Deng and L. H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, Journal of Optimization Theory and Applications, 102 (1999), 147-167. doi: 10.1023/A:1021898630001.
[25] H. C. Zhang and W. W. Hager, A nonmonotone line search technique and its application to unconstrained optimization, SIAM Journal on Optimization, 14 (2004), 1043-1056. doi: 10.1137/S1052623403428208.

[Figures] Performance profiles based on the CPU time (left) and the number $N_f$ of function evaluations (right)
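The figures report performance profiles in the sense of Dolan and Moré [10]. As a brief sketch (the function name and toy data below are illustrative, not taken from the paper), the profile $\rho_s(\tau)$ — the fraction of problems on which solver $s$ is within a factor $\tau$ of the best solver — can be computed from a matrix of costs as:

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profiles.

    T    : (n_problems, n_solvers) array of costs (e.g. CPU time or N_f);
           np.inf marks a failure on that problem.
    taus : iterable of ratio thresholds tau >= 1.
    Returns rho of shape (len(taus), n_solvers): the fraction of problems
    each solver solves within a factor tau of the best solver.
    """
    best = T.min(axis=1, keepdims=True)      # best cost per problem
    ratios = T / best                        # performance ratios r_{p,s}
    rho = np.array([(ratios <= tau).mean(axis=0) for tau in taus])
    return rho
```

Plotting each column of `rho` against `taus` reproduces the kind of curves shown in the figures: a higher curve at $\tau = 1$ means the solver is fastest most often, while a higher curve for large $\tau$ indicates greater robustness.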
List of the test problems

| No. | Description of function | Dim. | No. | Description of function | Dim. |
|----:|-------------------------|-----:|----:|-------------------------|-----:|
| 1 | Ex. Freud. Roth | 10000 | 35 | Ex. Tri. Diag 2 | 10000 |
| 2 | Ex. Trigonometric | 10000 | 36 | FLETCBV3 (CUTE) | 1000 |
| 3 | Ex. Rosenbrock | 10000 | 37 | FLETCHCR (CUTE) | 5000 |
| 4 | Ex. White and Holst | 10000 | 38 | BDQRTIC (CUTE) | 10000 |
| 5 | Ex. Beale | 10000 | 39 | TRIDIA (CUTE) | 10000 |
| 6 | Ex. Penalty | 10000 | 40 | ARGLINB (CUTE) | 10000 |
| 7 | Perturbed Quad. | 10000 | 41 | ARWHEAD (CUTE) | 10000 |
| 8 | Raydan 1 | 10000 | 42 | NONDIA (CUTE) | 10000 |
| 9 | Raydan 2 | 10000 | 43 | NONDQUAR (CUTE) | 10000 |
| 10 | Diagonal 1 | 10000 | 44 | DQDRTIC (CUTE) | 10000 |
| 11 | Diagonal 2 | 10000 | 45 | EG2 (CUTE) | 10000 |
| 12 | Diagonal 3 | 10000 | 46 | CURLY20 (CUTE) | 1000 |
| 13 | Hager Function | 10000 | 47 | DIXMAANA (CUTE) | 9000 |
| 14 | Generalized Tri. Diag. 1 | 10000 | 48 | DIXMAANB (CUTE) | 9000 |
| 15 | Ex. Tri. Diag. 1 | 10000 | 49 | DIXMAANC (CUTE) | 9000 |
| 16 | Ex. Three Exp. | 10000 | 50 | DIXMAAND (CUTE) | 9000 |
| 17 | Generalized Tri. Diag. 2 | 10000 | 51 | DIXMAANE (CUTE) | 9000 |
| 18 | Diagonal 4 | 10000 | 52 | DIXMAANF (CUTE) | 9000 |
| 19 | Diagonal 5 | 10000 | 53 | DIXMAANG (CUTE) | 9000 |
| 20 | Ex. Himmelblau | 10000 | 54 | DIXMAANH (CUTE) | 9000 |
| 21 | Generalized PSC1 | 10000 | 55 | DIXMAANI (CUTE) | 9000 |
| 22 | Ex. PSC1 | 10000 | 56 | DIXMAANJ (CUTE) | 9000 |
| 23 | Ex. Powell | 10000 | 57 | DIXMAANK (CUTE) | 9000 |
| 24 | Ex. Block-Diag. BD1 | 10000 | 58 | DIXMAANL (CUTE) | 9000 |
| 25 | Ex. Maratos | 10000 | 59 | LIARWHD (CUTE) | 10000 |
| 26 | Ex. Cliff | 10000 | 60 | POWER (CUTE) | 5000 |
| 27 | Quad. Dia. Perturbed | 10000 | 61 | ENGVAL1 (CUTE) | 10000 |
| 28 | Ex. Wood | 10000 | 62 | EDENSCH (CUTE) | 10000 |
| 29 | Ex. Hiebert | 10000 | 63 | VARDIM (CUTE) | 10000 |
| 30 | Quadratic QF1 | 10000 | 64 | QUARTC (CUTE) | 10000 |
| 31 | Ex. Quad. Pena. QP1 | 10000 | 65 | SINQUAD (CUTE) | 5000 |
| 32 | Ex. Quad. Pena. QP2 | 5000 | 66 | DENSCHNB (CUTE) | 10000 |
| 33 | Quadratic QF2 | 10000 | 67 | DENSCHNF (CUTE) | 10000 |
| 34 | Ex. EP1 | 10000 | 68 | COSINE (CUTE) | 10000 |

