doi: 10.3934/jimo.2018142

Improved Cuckoo Search algorithm for numerical function optimization

1. School of Science, China University of Petroleum, Beijing 102249, China
2. School of Management, Guangzhou University, Guangzhou 510006, China
3. Australasian Joint Research Centre for Building Information Modelling, School of Built Environment, Curtin University, WA 6102, Australia

* Corresponding author: Changzhi Wu

Received: February 2016. Revised: January 2018. Published: September 2018.

Fund Project: This paper is partially supported by the National Natural Science Foundation of China (61473326) and the Australian Research Council.

Cuckoo Search (CS) is a recently proposed metaheuristic algorithm for solving optimization problems. To improve both its search efficiency and its convergence speed, we propose an improved Cuckoo Search algorithm based on a teaching-learning strategy (TLCS). To better balance intensification and diversification, a dynamic weight factor and an out-of-bound projection strategy are also introduced into TLCS. Numerical experiments demonstrate that the proposed TLCS performs better than the basic CS and two other improved CS methods reported in the literature.
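
To make the ingredients above concrete, the sketch below pairs the standard Cuckoo Search Lévy-flight step with a TLBO-style "teacher" move and a linearly decreasing weight. It is only an illustration under stated assumptions, not the paper's exact TLCS: the true dynamic weight factor, teaching-learning update and out-of-bound projection follow equations in the full text that are not reproduced on this page, and the parameter names and the simple `np.clip` boundary handling here are assumptions.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step (standard in CS)."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.randn(dim) * sigma
    v = np.random.randn(dim)
    return u / np.abs(v) ** (1 / beta)

def tlcs_like_iteration(nests, fitness, f, lb, ub, t, t_max, pa=0.25):
    """One illustrative iteration: Levy flights, a teacher phase, nest abandonment."""
    n, dim = nests.shape
    best = nests[np.argmin(fitness)].copy()
    w = 0.9 - 0.5 * t / t_max                      # assumed dynamic weight factor
    # 1) Levy-flight move around the current best (basic CS, Yang & Deb 2009).
    for i in range(n):
        cand = np.clip(nests[i] + 0.01 * levy_step(dim) * (nests[i] - best), lb, ub)
        fc = f(cand)
        if fc < fitness[i]:
            nests[i], fitness[i] = cand, fc
    # 2) TLBO-style teacher phase: move each solution toward the best ("teacher").
    mean = nests.mean(axis=0)
    tf = np.random.randint(1, 3)                   # teaching factor in {1, 2}
    for i in range(n):
        cand = np.clip(nests[i] + w * np.random.rand(dim) * (best - tf * mean), lb, ub)
        fc = f(cand)
        if fc < fitness[i]:
            nests[i], fitness[i] = cand, fc
    # 3) Abandon a fraction pa of nests and replace them with random ones (basic CS).
    for i in range(n):
        if np.random.rand() < pa:
            cand = lb + np.random.rand(dim) * (ub - lb)
            fc = f(cand)
            if fc < fitness[i]:
                nests[i], fitness[i] = cand, fc
    return nests, fitness
```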

Citation: Jianjun Liu, Min Zeng, Yifan Ge, Changzhi Wu, Xiangyu Wang. Improved Cuckoo Search algorithm for numerical function optimization. Journal of Industrial & Management Optimization, doi: 10.3934/jimo.2018142
References:
[1]

S. Das and P. Suganthan, Differential evolution: A survey of the state-of-the-art, IEEE Transactions on Evolutionary Computation, 15 (2011), 4-31.

[2]

M. Dorigo and C. Blum, Ant colony optimization theory: A survey, Theoretical Computer Science, 344 (2005), 243-278. doi: 10.1016/j.tcs.2005.05.020.

[3]

R. Eberhart and Y. Shi, Particle swarm optimization: Developments, applications and resources, Proceedings of the 2001 Congress on Evolutionary Computation, 1 (2001), 81-86. doi: 10.1109/CEC.2001.934374.

[4]

E. Valian, S. Mohanna and S. Tavakoli, Improved cuckoo search algorithm for global optimization, International Journal of Communications and Information Technology, 1 (2011), 31-44.

[5]

I. Fister, X. Yang and D. Fister, Cuckoo search: A brief literature review, Studies in Computational Intelligence, 516 (2014), 49-62. doi: 10.1007/978-3-319-02141-6_3.

[6]

D. Goldberg, Genetic algorithms and machine learning, Machine Learning, 3 (1988), 95-99.

[7]

D. Karaboga and B. Basturk, A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm, Journal of Global Optimization, 39 (2007), 459-471. doi: 10.1007/s10898-007-9149-x.

[8]

J. Liu, S. Dong, L. Zhang, Q. Ma and C. Wu, Estimation of Archie parameters by a novel hybrid optimization algorithm, Journal of Petroleum Science and Engineering, 135 (2015), 232-239. doi: 10.1016/j.petrol.2015.09.003.

[9]

J. Liu, K. Teo, X. Wang and C. Wu, An exact penalty function-based differential search algorithm for constrained global optimization, Soft Computing, 20 (2015), 1305-1313. doi: 10.1007/s00500-015-1588-6.

[10]

J. Liu, C. Wu, G. Wu and X. Wang, A novel differential search algorithm and applications for structure design, Applied Mathematics and Computation, 268 (2015), 246-269. doi: 10.1016/j.amc.2015.06.036.

[11]

J. Liu, S. Zhang, C. Wu, J. Liang, X. Wang and K. L. Teo, A hybrid approach to constrained global optimization, Applied Soft Computing, 47 (2016), 281-294. doi: 10.1016/j.asoc.2016.05.021.

[12]

J. Liu, H. Zhu, Q. Ma, L. Zhang and H. Xu, An artificial bee colony algorithm with guide of global & local optimal and asynchronous scaling factors for numerical optimization, Applied Soft Computing, 37 (2015), 608-618.

[13]

Q. Long and C. Wu, A hybrid method combining genetic algorithm and Hooke-Jeeves method for constrained global optimization, Journal of Industrial & Management Optimization, 10 (2014), 1279-1296. doi: 10.3934/jimo.2014.10.1279.

[14]

M. Defoin Platel, S. Schliebs and N. Kasabov, Quantum-inspired evolutionary algorithm: A multimodel EDA, IEEE Transactions on Evolutionary Computation, 13 (2009), 1218-1232. doi: 10.1109/TEVC.2008.2003010.

[15]

A. Qin, V. Huang and P. Suganthan, Differential evolution algorithm with strategy adaptation for global numerical optimization, IEEE Transactions on Evolutionary Computation, 13 (2009), 398-417. doi: 10.1109/TEVC.2008.927706.

[16]

R. Rao, V. Savsani and D. Vakharia, Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems, Computer-Aided Design, 43 (2011), 303-315. doi: 10.1016/j.cad.2010.12.015.

[17]

R. Rao, V. Savsani and D. Vakharia, Teaching-learning-based optimization: An optimization method for continuous non-linear large scale problems, Information Sciences, 183 (2012), 1-15. doi: 10.1016/j.ins.2011.08.006.

[18]

S. Tao, C. Wu, Z. Sheng and X. Wang, Space-time repetitive project scheduling considering location and congestion, Journal of Computing in Civil Engineering, 32 (2018), Article ID 04018017. doi: 10.1061/(ASCE)CP.1943-5487.0000745.

[19]

S. Tao, C. Wu, Z. Sheng and X. Wang, Stochastic project scheduling with hierarchical alternatives, Applied Mathematical Modelling, 58 (2018), 181-202. doi: 10.1016/j.apm.2017.09.015.

[20]

M. Tavazoei and M. Haeri, Comparison of different one-dimensional maps as chaotic search pattern in chaos optimization algorithms, Applied Mathematics and Computation, 187 (2007), 1076-1085. doi: 10.1016/j.amc.2006.09.087.

[21]

Z. Wang, D. Gao and J. Liu, Multi-objective sidetracking horizontal well trajectory optimization in cluster wells based on DS algorithm, Journal of Petroleum Science and Engineering, 147 (2016), 771-778. doi: 10.1016/j.petrol.2016.09.046.

[22]

X. Yang and S. Deb, Cuckoo search: Recent advances and applications, Neural Computing and Applications, 24 (2014), 169-174. doi: 10.1007/s00521-013-1367-1.

[23]

X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2010.

[24]

X. S. Yang and S. Deb, Cuckoo search via Lévy flights, in World Congress on Nature & Biologically Inspired Computing, (2009), 210-214.

[25]

Y. Zhang, L. Wang and Q. Wu, Modified adaptive cuckoo search (MACS) algorithm and formal description for global optimisation, International Journal of Computer Applications in Technology, 44 (2012), 73-79. doi: 10.1504/IJCAT.2012.048675.

[26]

C. Zhao, C. Wu, J. Chai, X. Wang, X. Yang, J. Lee and M. Kim, Decomposition-based multi-objective firefly algorithm for RFID network planning with uncertainty, Applied Soft Computing, 55 (2017), 549-564. doi: 10.1016/j.asoc.2017.02.009.

[27]

H. Zheng and Y. Zhou, A novel cuckoo search optimization algorithm based on Gauss distribution, Journal of Computational Information Systems, 8 (2012), 4193-4200.

[28]

H. Zheng and Y. Zhou, Self-adaptive step cuckoo search algorithm, Computer Engineering & Applications, 49 (2013), 68-71.


Figure 1.  Projection plot. $x_1, x_2, x_3$ are transformed to $y_1, y_2, y_3$, respectively, by (11), while $x_4$ is first transformed to $y_4$ by (11) and then to $y_4'$ by (12).
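
Equations (11) and (12) are not reproduced on this page, so the snippet below is only one plausible reading of such a two-stage out-of-bound handling: a violating component is first reflected across the violated bound and, if it is still infeasible, redrawn uniformly inside the box. It is a hypothetical illustration, not the paper's exact rule.

```python
import numpy as np

def project_into_bounds(x, lb, ub):
    """x, lb, ub: 1-D arrays of equal length; returns a feasible copy of x."""
    y = x.copy()
    # Stage 1 (analogue of a rule like (11)): reflect across the violated bound.
    low, high = y < lb, y > ub
    y[low] = 2 * lb[low] - y[low]
    y[high] = 2 * ub[high] - y[high]
    # Stage 2 (analogue of a rule like (12)): if the reflection overshoots the
    # opposite bound, fall back to a uniform redraw inside the box.
    still_out = (y < lb) | (y > ub)
    y[still_out] = lb[still_out] + np.random.rand(still_out.sum()) * (ub - lb)[still_out]
    return y
```
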
Figure 2.  Convergence performance of CS, ASCS, WCS and TLCS on eight 5-dimensional test functions. The horizontal and vertical axes represent the number of iterations and the logarithm of the function value, respectively.
Table 1.  18 benchmark problems of $n(\geq 2)$ dimensions used in experiments. $x = (x_1, x_2, \cdots, x_n)$, C: characteristic, Opt.: optimal function value, U.: unimodal, M.: multimodal, S.: separable, N.: non-separable.
Function Name C Range Formula Opt.
Sphere S.U. [-100,100] $f_1(x)= \sum\limits_{i=1}^{n}x_{i}^2$ 0
Ackley S.M. [-32, 32] $ f_2(x)=20-20 e^{-0.2\sqrt{\frac{1}{n}\sum\limits_{i=1}^n x_i^2}} -e^{\frac{1}{n}\sum\limits_{i=1}^n \cos (2\pi x_i)} +e$ 0
Sum-Squares S.U. $[-10, 10]$ $f_{3}(x)=\sum\limits_{i=1}^n ix_i^2$ 0
Rastrigin S.M. [-5.12, 5.12] $f_4(x) = 10n + \sum\limits_{i=1}^n \big[x_i^2 -10\cos(2\pi x_i)\big]$ 0
Rosenbrock N.U. [-10, 10] $f_5(x)= \sum\limits_{i=1}^{n-1}\big[100(x_{i}^2 - x_{i+1})^2 + (x_i -1)^2\big]$ 0
Griewank N.M. [-600,600] $f_{6}(x) = 1 + \frac{1}{4000} \sum\limits_{i=1}^n x^2_i - \prod\limits_{i=1}^n \cos(\frac{x_i}{\sqrt{i}})$ 0
Levy N.M. [-10, 10] $f_{7}(x)={\sin(\pi w_1)}^2 +\sum\limits_{i=1}^{n-1}\big \{{(w_i-1)}^2\big[1+10$
${\sin(\pi w_i+1)}^2\big]\big\}+{(w_n-1)}^2[1+{\sin(2\pi w_n)]}^2$, 0
where $w_i=1+\frac{x_i-1}{4}$, for all $i=1, 2, \cdots, n$
Powell N.M [-4, 5] $f_{8}(x)=\sum\limits_{i=1}^\frac{n}{4} \Big[{(x_{4i-3}+10x_{4i-2})}^2+ 5(x_{4i-1}-x_{4i-2})^2$ 0
$ + (x_{4i-2}-2x_{4i-1})^4+ 10(x_{4i-3}-x_{4i})^4\Big]$
Zakharov N.- [-5, 10] $f_9(x)=\sum\limits_{i=1}^{n}x_{i}^2 + \Big(\frac{1}{2}\sum\limits_{i=1}^{n}ix_{i}\Big)^2+\Big( \frac{1}{2}\sum\limits_{i=1}^{n}ix_{i}\Big)^4$ 0
Weierstrass N.M. $[-0.5, 0.5]$ $f_{10}(x)=\sum\limits_{i=1}^n\big(\sum_{k=0}^{k_{max}}[a^k\cos(2\pi b^k(x_i+0.5))]\big)$ 0
$~~-n\sum_{k=0}^{k_{max}}[a^k\cos(2\pi b^k\cdot0.5)], $
$a=0.5, b=3, k_{max}=20$
Schwefel S.M. [-500,500] $f_{11}(x) = 418.982887 \cdot n-\sum\limits_{i=1}^n x_i \sin\big(\sqrt{|x_i|}\big)$ 0
Whitley N.M. [-10.2, 10.2] $f_{12}(x) = \sum\limits_{i=1}^n \sum\limits_{j=1}^n \Big[\frac{(100(x_i^2-x_j)^2 + (1-x_j)^2)^2}{4000} $ 0
$-\cos\big(100(x_i^2-x_j)^2 + (1-x_j)^2\big)+1\Big]$
Trid N.- $[-n^2, n^2]$ $f_{13}(x)=\sum\limits_{i=1}^{n}(x_i-1)^2- \sum\limits_{i=2}^{n}x_ix_{i-1}$ $-30$ ($n=5$), $-200$ ($n=10$)
Styblinski-Tang S.M. $[-5, 5]$ $f_{14}(x)=\frac{1}{n}\sum\limits_{i=1}^n (x_i^4-16x_i^2+5x_i), a=39.16599$ $-an$
Alpine N.M. $[-10, 10]$ $f_{15}(x)=\sum\limits_{i=1}^n |x_i\cdot \sin(x_i)+0.1\cdot x_i|$ 0
Easom S.U. $[-100,100]$ $f_{16}(x)=-\cos(x_1)\cos(x_2)e^{-(x_1-\pi)^2-(x_2-\pi)^2}$ -1
Treccani S.M. $[-3, 3]$ $f_{17}(x)=(x_1)^4+4(x_1)^3+4(x_1)^2+(x_2)^2$ 0
Perm$(n, \beta)$ N.U $[-n, n]$ $f_{18}(x)=\sum\limits_{k=1}^n {\Big[\sum\limits_{j=1}^n (j^k+\beta) \left({x_j^ k-\frac{1}{j^k}}\right)\Big]}^2, \beta=0.5$ 0
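
For reference, three of the benchmark functions in Table 1 (Sphere $f_1$, Ackley $f_2$ and Rastrigin $f_4$) can be transcribed directly from the formulas above; the NumPy sketches below assume `x` is a one-dimensional array.

```python
import numpy as np

def sphere(x):      # f1: sum of squares, optimum 0 at the origin
    return np.sum(x ** 2)

def ackley(x):      # f2: 20 - 20*exp(-0.2*sqrt(mean(x^2))) - exp(mean(cos(2*pi*x))) + e
    n = x.size
    return (20.0 - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + np.e)

def rastrigin(x):   # f4: 10*n + sum(x^2 - 10*cos(2*pi*x)), optimum 0 at the origin
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))
```
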
Table 2.  Statistical results obtained by CS [24], ASCS [25], WCS [28] and TLCS on the 5-D test functions over 30 independent runs
Fun. Indicator CS ASCS WCS TLCS
$f_{1}$ Best 4.5213e-14 2.4536e-18 0.0000e+00 0.0000e+00
Mean 2.6910e-12 1.7970e-16 0.0000e+00 0.0000e+00
Std. 4.2657e-12 2.0513e-16 0.0000e+00 0.0000e+00
$f_{2}$ Best 5.3705e-05 9.3072e-05 8.8818e-16 8.8818e-16
Mean 3.7553e-04 6.6183e-04 8.8818e-16 8.8818e-16
Std. 4.3639e-04 6.6652e-04 0.0000e+00 0.0000e+00
$f_{3}$ Best 9.9825e-16 5.7419e-19 0.0000e+00 0.0000e+00
Mean 3.5974e-14 1.8282e-17 0.0000e+00 0.0000e+00
Std. 4.4513e-14 2.9784e-17 0.0000e+00 0.0000e+00
$f_{4}$ Best 2.1760e-02 6.0844e-02 0.0000e+00 0.0000e+00
Mean 4.1580e-01 6.0469e-01 0.0000e+00 0.0000e+00
Std. 4.1771e-01 4.3998e-01 0.0000e+00 0.0000e+00
$f_{5}$ Best 5.4423e-04 3.8454e-03 9.0786e-05 3.2173e-06
Mean 4.9548e-02 8.8289e-02 4.9515e-03 9.0026e-05
Std. 8.2985e-02 1.2769e-01 4.4433e-03 2.4935e-04
$f_{6}$ Best 2.4318e-02 1.3526e-02 0.0000e+00 0.0000e+00
Mean 4.0590e-02 4.9268e-02 1.6139e-02 0.0000e+00
Std. 1.1239e-02 2.1748e-02 2.6449e-02 0.0000e+00
$f_{7}$ Best 7.4053e-12 4.1549e-11 1.4224e-13 6.0523e-17
Mean 4.5672e-10 5.7734e-10 4.9219e-11 1.3148e-15
Std. 1.2514e-09 5.2112e-10 4.1785e-11 2.0139e-15
$f_{8}$ Best 7.5663e-19 1.9396e-20 0.0000e+00 5.2592e-51
Mean 3.2565e-17 1.1242e-17 0.0000e+00 1.0280e-44
Std. 4.6518e-17 1.5804e-17 0.0000e+00 2.1851e-44
$f_{9}$ Best 2.7592e-13 1.2952e-16 0.0000e+00 0.0000e+00
Mean 2.6449e-12 1.6090e-15 0.0000e+00 0.0000e+00
Std. 2.9165e-12 1.4999e-15 0.0000e+00 0.0000e+00
$f_{10}$ Best 5.5995e-03 6.6042e-03 0.0000e+00 0.0000e+00
Mean 1.1684e-02 1.0426e-02 0.0000e+00 0.0000e+00
Std. 6.3211e-03 3.5660e-03 0.0000e+00 0.0000e+00
$f_{11}$ Best 2.0752e+03 2.0752e+03 2.0752e+03 2.0225e+03
Mean 2.0752e+03 2.0752e+03 2.0752e+03 2.0616e+03
Std. 4.7941e-11 1.7728e-10 1.5453e-09 2.2077e+01
$f_{12}$ Best 1.1195e-03 1.3950e-03 1.1187e-02 9.8810e-13
Mean 7.2247e-01 5.6108e-01 1.5719e+00 5.6092e+00
Std. 9.9970e-01 8.9282e-01 1.2832e+00 4.2152e+00
$f_{13}$ Best -3.0000e+01 -3.0000e+01 -3.0000e+01 -3.0000e+01
Mean -3.0000e+01 -3.0000e+01 -3.0000e+01 -3.0000e+01
Std. 2.3597e-13 0.0000e+00 4.2507e-12 6.6486e-14
$f_{14}$ Best -7.8332e+01 -7.8332e+01 -7.8332e+01 -7.8332e+01
Mean -7.8332e+01 -7.8332e+01 -7.8332e+01 -7.7201e+01
Std. 5.0220e-07 8.9447e-07 1.2933e-06 2.3413e+00
$f_{15}$ Best 1.7330e-03 3.1350e-03 1.1926e-181 0.0000e+00
Mean 8.1827e-03 1.4765e-02 3.9923e-177 0.0000e+00
Std. 4.3095e-03 1.2199e-02 0.0000e+00 0.0000e+00
$f_{16}$ Best -1.0000e+00 -1.0000e+00 -1.0000e+00 -1.0000e+00
Mean -1.0000e+00 -1.0000e+00 -1.0000e+00 -1.0000e+00
Std. 1.0624e-13 1.3743e-13 0.0000e+00 0.0000e+00
$f_{17}$ Best -3.5527e-15 -3.5527e-15 0.0000e+00 0.0000e+00
Mean -3.3892e-15 -3.1925e-15 0.0000e+00 2.0292e-67
Std. 6.3290e-16 9.9925e-16 0.0000e+00 7.8473e-67
$f_{18}$ Best 2.5826e-05 5.0852e-04 3.9689e-07 1.0547e-13
Mean 6.0659e-02 6.3264e-02 1.7993e-05 4.1372e-12
Std. 1.2079e-01 1.0323e-01 1.8082e-05 4.4431e-12
Table 3.  Statistical results obtained by CS [24], ASCS [25], WCS [28] and TLCS on the 10-D test functions over 30 independent runs
Fun. Indicator CS ASCS WCS TLCS
$f_{1}$ Best 3.2389e-06 1.1207e-08 0.0000e+00 0.0000e+00
Mean 8.0057e-06 5.3375e-08 0.0000e+00 0.0000e+00
Std. 4.6479e-06 2.9549e-08 0.0000e+00 0.0000e+00
$f_{2}$ Best 3.7125e-02 8.7783e-02 8.8818e-16 8.8818e-16
Mean 6.0272e-01 4.0848e-01 8.8818e-16 8.8818e-16
Std. 5.8084e-01 2.2043e-01 0.0000e+00 0.0000e+00
$f_{3}$ Best 1.8590e-07 5.7623e-10 0.0000e+00 0.0000e+00
Mean 5.2596e-07 2.7587e-09 0.0000e+00 0.0000e+00
Std. 2.9285e-07 2.3201e-09 0.0000e+00 0.0000e+00
$f_{4}$ Best 6.0868e+00 9.9616e+00 0.0000e+00 0.0000e+00
Mean 1.1568e+01 1.5915e+01 0.0000e+00 0.0000e+00
Std. 2.4106e+00 4.5746e+00 0.0000e+00 0.0000e+00
$f_{5}$ Best 5.5106e-01 1.1646e+00 2.7452e+00 2.2219e+00
Mean 4.3244e+00 3.2423e+00 3.6089e+00 2.7140e+00
Std. 1.4901e+00 1.5008e+00 3.7702e-01 3.3872e-01
$f_{6}$ Best 5.4516e-02 4.9131e-02 0.0000e+00 0.0000e+00
Mean 7.6544e-02 9.0650e-02 0.0000e+00 0.0000e+00
Std. 1.4500e-02 2.1017e-02 0.0000e+00 0.0000e+00
$f_{7}$ Best 1.6338e-04 3.3752e-03 4.5958e-06 9.2416e-10
Mean 1.0474e-02 2.5951e-02 1.2202e-04 1.7004e-08
Std. 1.0911e-02 1.8264e-02 1.1195e-04 1.4461e-08
$f_{8}$ Best 6.3186e-09 1.5939e-09 0.0000e+00 2.1348e-36
Mean 4.4004e-08 9.4253e-09 0.0000e+00 7.5924e-26
Std. 4.4362e-08 7.1640e-09 0.0000e+00 2.3933e-25
$f_{9}$ Best 8.7140e-04 3.7275e-05 0.0000e+00 0.0000e+00
Mean 4.8839e-03 8.7891e-05 0.0000e+00 0.0000e+00
Std. 3.2252e-03 4.7437e-05 0.0000e+00 0.0000e+00
$f_{10}$ Best 4.0273e-01 4.6694e-01 0.0000e+00 0.0000e+00
Mean 7.8045e-01 8.4527e-01 0.0000e+00 0.0000e+00
Std. 2.4665e-01 2.5778e-01 0.0000e+00 0.0000e+00
$f_{11}$ Best 4.1504e+03 4.1504e+03 4.1504e+03 4.0458e+03
Mean 4.1504e+03 4.1504e+03 4.1504e+03 4.1152e+03
Std. 1.1924e-04 6.4787e-04 2.7421e-04 3.9471e+01
$f_{12}$ Best 2.7106e+01 3.3064e+01 2.4671e+01 4.0486e+01
Mean 4.2321e+01 4.4884e+01 3.8730e+01 4.4356e+01
Std. 6.7034e+00 6.0260e+00 4.9097e+00 1.7008e+00
$f_{13}$ Best -1.2467e+02 -1.2467e+02 -1.2467e+02 -1.7996e+02
Mean -1.2467e+02 -1.2467e+02 -1.2467e+02 -1.6700e+02
Std. 6.0730e-09 5.1284e-10 4.6884e-08 6.7526e+00
$f_{14}$ Best -7.8281e+01 -7.8225e+01 -7.8051e+01 -7.8332e+01
Mean -7.7819e+01 -7.6712e+01 -7.6642e+01 -7.6256e+01
Std. 3.5470e-01 1.4800e+00 1.1564e+00 2.9181e+00
$f_{15}$ Best 2.2444e-01 5.7034e-01 2.5015e-181 0.0000e+00
Mean 3.8266e-01 1.1298e+00 4.8905e-179 0.0000e+00
Std. 1.0949e-01 4.0051e-01 0.0000e+00 0.0000e+00
$f_{16}$ Best -1.0000e+00 -1.0000e+00 -1.0000e+00 -1.0000e+00
Mean -1.0000e+00 -1.0000e+00 -1.0000e+00 -1.0000e+00
Std. 9.4746e-13 6.1618e-13 0.0000e+00 0.0000e+00
$f_{17}$ Best -3.5527e-15 -3.5527e-15 0.0000e+00 0.0000e+00
Mean -3.3158e-15 -3.3945e-15 0.0000e+00 2.7979e-35
Std. 9.1735e-16 5.0210e-16 0.0000e+00 1.0836e-34
$f_{18}$ Best 1.0000e+10 1.0000e+10 3.2642e+01 2.4003e-05
Mean 1.0000e+10 1.0000e+10 6.0712e+01 5.3966e-04
Std. 0.0000e+00 0.0000e+00 2.0484e+01 4.9058e-04
Table 4.  Wilcoxon rank-sum results on the best, mean and standard deviation values obtained by TLCS against the CS, ASCS and WCS algorithms on the 18 benchmark functions
Algorithm Dim. Criterion Rank sum $p$ value $h$ value
TLCS to CS 5 Best 421.5000 0.0051 1
TLCS to CS 5 Mean 411.0000 0.0139 1
TLCS to CS 5 Std 422.0000 0.0048 1
TLCS to ASCS 5 Best 415.0000 0.0095 1
TLCS to ASCS 5 Mean 406.0000 0.0213 1
TLCS to ASCS 5 Std 408.5000 0.0165 1
TLCS to WCS 5 Best 357.0000 0.4363 0
TLCS to WCS 5 Mean 356.5000 0.4532 0
TLCS to WCS 5 Std 333.0000 1.0000 0
TLCS to CS 10 Best 410.5000 0.0143 1
TLCS to CS 10 Mean 407.0000 0.0196 1
TLCS to CS 10 Std 405.5000 0.0213 1
TLCS to ASCS 10 Best 408.5000 0.0170 1
TLCS to ASCS 10 Mean 406.0000 0.0213 1
TLCS to ASCS 10 Std 407.5000 0.0180 1
TLCS to WCS 10 Best 353.0000 0.5183 0
TLCS to WCS 10 Mean 348.0000 0.6339 0
TLCS to WCS 10 Std 324.5000 0.7810 0
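
Statistics of this kind can be reproduced with a standard Wilcoxon rank-sum test over the 18 per-function values of two algorithms. The sketch below (SciPy, with assumed array layout and function names) returns the rank sum of the first sample, the two-sided $p$ value and the decision $h$ at the 5% significance level.

```python
import numpy as np
from scipy.stats import rankdata, ranksums

def wilcoxon_rank_sum(a, b, alpha=0.05):
    """a, b: arrays of length 18 (one value per benchmark, e.g. the best values)."""
    ranks = rankdata(np.concatenate([a, b]))   # joint ranking of both samples
    w = ranks[:len(a)].sum()                   # rank sum of the first sample (as in Table 4)
    _, p = ranksums(a, b)                      # two-sided p value of the rank-sum test
    h = int(p < alpha)                         # h = 1: significant difference at the 5% level
    return w, p, h
```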