September 2019, 1(3): 293-306. doi: 10.3934/fods.2019013

Modelling dynamic network evolution as a Pitman-Yor process

Francesco Sanna Passino and Nicholas A. Heard

Department of Mathematics, Imperial College London, 180 Queen's Gate, London SW7 2AZ, United Kingdom

Published August 2019

Dynamic interaction networks frequently arise in biology, communications technology and the social sciences, representing, for example, neuronal connectivity in the brain, internet connections between computers and human interactions within social networks. The evolution and strengthening of the links in such networks can be observed through sequences of connection events occurring between network nodes over time. In some of these applications, the identities of the nodes and the size of the network may be unknown a priori and may change over time. In this article, a model for the evolution of dynamic networks based on the Pitman-Yor process is proposed. This model explicitly admits power-laws in the number of connections on each edge, which are often present in real-world networks, and, for careful choices of the parameters, power-laws for the degree distribution of the nodes. A novel empirical method for estimating the hyperparameters of the Pitman-Yor process is proposed, and some necessary corrections for uniform discrete base distributions are carefully addressed. The methodology is tested on synthetic data and in an anomaly detection study on the enterprise computer network of the Los Alamos National Laboratory, where it successfully detects connections from a red-team penetration test.

Citation: Francesco Sanna Passino, Nicholas A. Heard. Modelling dynamic network evolution as a Pitman-Yor process. Foundations of Data Science, 2019, 1 (3) : 293-306. doi: 10.3934/fods.2019013
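
As background for the figures and tables below (standard two-parameter Chinese restaurant theory, not a reproduction of the paper's own derivations), the sequential form of the Pitman-Yor process with discount $ d\in[0,1) $ and concentration $ \alpha>-d $ can be written, after $ n $ customers have been seated at $ K_n $ tables, as

$ \mathbb{P}(C_{n+1}\to T_j) = \dfrac{c_j - d}{\alpha + n}, \qquad \mathbb{P}(C_{n+1}\to T_{K_n+1}) = \dfrac{\alpha + d\,K_n}{\alpha + n}, $

where $ c_j $ is the number of customers already seated at table $ T_j $, and a newly opened table is served a dish drawn from the base distribution, here a discrete distribution over the network nodes. For $ d>0 $ the number of occupied tables grows at rate $ n^d $, which is the source of the power-law behaviour referred to in the abstract.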

Figure 1.  Cartoon example of the Chinese Restaurant metaphor for the Pitman-Yor process, where $ C_j $ represents the $ j $th customer and $ x_j^\star $ is the unique dish served at table $ T_j $. The vector $ \boldsymbol x_n = (x_1, \dots, x_n) $ denotes the observed sequence of dishes eaten by each customer. Customer $ C_i $ being seated at table $ T_j $ is denoted $ C_i\to T_j $. Then, for example, the tenth customer will sit at $ T_1 $ with probability $ (3-d)/(\alpha+9) $
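
To make the seating rule in the caption of Figure 1 concrete, the following sketch (not code from the paper; the table occupancies, $ \alpha = 10 $ and $ d = 0.25 $ are purely illustrative) computes the two-parameter Chinese restaurant seating probabilities for a given configuration. With nine customers seated, three of them at $ T_1 $, the probability returned for $ T_1 $ is exactly $ (3-d)/(\alpha+9) $.

```python
import numpy as np

def seating_probabilities(table_counts, alpha, d):
    """Two-parameter Chinese restaurant rule: probabilities that the next
    customer joins each existing table or opens a new one."""
    counts = np.asarray(table_counts, dtype=float)
    n = counts.sum()                     # customers seated so far
    k = counts.size                      # occupied tables
    p_existing = (counts - d) / (alpha + n)
    p_new = (alpha + d * k) / (alpha + n)
    return p_existing, p_new

# Hypothetical configuration consistent with the caption: nine customers are
# already seated and three of them sit at T1 (the split of the remaining six
# customers, and the values of alpha and d, are purely illustrative).
alpha, d = 10.0, 0.25
p_tables, p_new = seating_probabilities([3, 4, 2], alpha, d)
print(p_tables[0])              # equals (3 - d) / (alpha + 9)
print(p_tables.sum() + p_new)   # the probabilities sum to 1
```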
Figure 2.  Kernel density estimates of the parameter estimates from 10000 simulations, $ |V| = {1000} $
Figure 3.  Kernel density estimates of the parameter estimates from 10000 simulations, $ |V| = {16230} $
Figure 4.  Uniform Q-Q plot for the Pitman-Yor process fitted to six destination nodes
Figure 5.  Plots of the corrected $ K_n $ (a), $ H_{1n} $ (b) and their ratio $ H_{1n}/K_n $ (c) as functions of $ n $ for the connections to the destination computer $ \mathtt{C1438} $, together with sample paths averaged over $ 100 $ simulations of a Pitman-Yor process obtained using different estimates of the parameters. The grey lines correspond to 10 realised trajectories of simulated Pitman-Yor processes obtained using the corrected estimate (5)
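
Figure 5 compares observed trajectories with averaged sample paths of simulated Pitman-Yor processes. A minimal sketch of that simulation step is given below; it is illustrative only: the parameter values are arbitrary, $ K_n $ is taken as the number of occupied tables and $ H_{1n} $ as the number of tables seating exactly one customer, and the paper's corrected estimates in equation (5) are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_py_path(n, alpha, d):
    """Simulate one two-parameter Chinese restaurant path of length n, recording
    after each arrival the number of occupied tables (K) and the number of
    tables seating exactly one customer (H1)."""
    counts = []                          # customers per occupied table
    K_path, H1_path = [], []
    for i in range(n):
        k = len(counts)
        weights = np.array([c - d for c in counts] + [alpha + d * k])
        probs = weights / weights.sum()  # denominator equals alpha + i
        j = rng.choice(k + 1, p=probs)
        if j == k:
            counts.append(1)             # open a new table
        else:
            counts[j] += 1               # join existing table j
        K_path.append(len(counts))
        H1_path.append(sum(c == 1 for c in counts))
    return np.array(K_path), np.array(H1_path)

# Average the sample paths over replications, as in the averaged curves of
# Figure 5 (alpha, d, n and the number of replications are illustrative).
paths = [simulate_py_path(n=2000, alpha=50.0, d=0.2) for _ in range(10)]
K_mean = np.mean([K for K, _ in paths], axis=0)
H1_mean = np.mean([H1 for _, H1 in paths], axis=0)
print(K_mean[-1], H1_mean[-1], H1_mean[-1] / K_mean[-1])
```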
Table 1.  Estimated Pitman-Yor parameters for 6 destination nodes
Destination          $ N $      $ \tilde K_n $   $ \hat K_n $   $ \tilde H_{1n} $   $ \hat H_{1n} $   $ \hat\alpha $   $ \hat d $
$ \mathtt{C5716} $   113987     3401             3816.418       144                 182.164           651.519          0.047
$ \mathtt{U7} $      138286     2700             2952.989       350                 419.819           302.093          0.142
$ \mathtt{C2525} $   204532     1555             1634.571       97                  107.272           183.319          0.066
$ \mathtt{C1877} $   342766     5095             6114.758       226                 329.390           866.520          0.054
$ \mathtt{C395} $    518058     5957             7422.437       442                 698.259           841.357          0.094
$ \mathtt{C423} $    2426512    2705             2958.988       166                 199.188           230.040          0.067
Table 2.  Anomaly rankings for the four red-team source computers
In each case the edge level combiner is Tippett; the columns give the node level combiner (Fisher, Pearson, Stouffer).

Events $ x_{n+1}\vert y_{n+1} $ only; standard $ p $-values $ p_{n+1} $:
Source computer       Fisher    Pearson   Stouffer
$ \mathtt{C17693} $   5         2         4
$ \mathtt{C18025} $   138       75        78
$ \mathtt{C19932} $   3831      8870      8877
$ \mathtt{C22409} $   3767      15773     15764

Events $ x_{n+1}\vert y_{n+1} $ only; mid-$ p $-values $ q_{n+1} $:
Source computer       Fisher    Pearson   Stouffer
$ \mathtt{C17693} $   2         1         5
$ \mathtt{C18025} $   151       74        105
$ \mathtt{C19932} $   3571      2754      3151
$ \mathtt{C22409} $   3450      6984      3756

Events $ y_{n+1} $ only; mid-$ p $-values $ q_{n+1} $:
Source computer       Fisher    Pearson   Stouffer
$ \mathtt{C17693} $   6         5         5
$ \mathtt{C18025} $   2806      1536      1674
$ \mathtt{C19932} $   5407      8882      8914
$ \mathtt{C22409} $   12126     15808     15878

Event level combiner: Tippett; mid-$ p $-values $ q_{n+1} $:
Source computer       Fisher    Pearson   Stouffer
$ \mathtt{C17693} $   6         5         5
$ \mathtt{C18025} $   142       96        107
$ \mathtt{C19932} $   3813      2264      3232
$ \mathtt{C22409} $   3803      6516      4196

Event level combiner: Fisher; standard $ p $-values $ p_{n+1} $:
Source computer       Fisher    Pearson   Stouffer
$ \mathtt{C17693} $   3         5         5
$ \mathtt{C18025} $   151       88        101
$ \mathtt{C19932} $   6339      3818      4879
$ \mathtt{C22409} $   6120      14799     5379

Event level combiner: Fisher; mid-$ p $-values $ q_{n+1} $:
Source computer       Fisher    Pearson   Stouffer
$ \mathtt{C17693} $   5         2         5
$ \mathtt{C18025} $   155       90        106
$ \mathtt{C19932} $   4937      3017      3996
$ \mathtt{C22409} $   4451      6695      5236
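
Table 2 ranks the source computers by combining p-values with Tippett's, Fisher's, Pearson's and Stouffer's methods at different levels of the hierarchy. The sketch below implements the four combiners in their textbook form; it does not reproduce the paper's event/edge/node hierarchy or its mid-p-value adjustments, and the example p-values are invented.

```python
import numpy as np
from scipy.stats import chi2, norm

def tippett(p):
    """Tippett: smallest p-value; combined p = 1 - (1 - min p)^k."""
    p = np.asarray(p); k = p.size
    return 1.0 - (1.0 - p.min()) ** k

def fisher(p):
    """Fisher: -2 * sum(log p) compared to the chi-squared(2k) upper tail."""
    p = np.asarray(p); k = p.size
    return chi2.sf(-2.0 * np.log(p).sum(), 2 * k)

def pearson(p):
    """Pearson (one common convention): -2 * sum(log(1 - p)), with small
    values of the statistic giving evidence against the null."""
    p = np.asarray(p); k = p.size
    return chi2.cdf(-2.0 * np.log1p(-p).sum(), 2 * k)

def stouffer(p):
    """Stouffer: sum of probit-transformed p-values; small (very negative)
    sums give evidence against the null."""
    p = np.asarray(p); k = p.size
    return norm.cdf(norm.ppf(p).sum() / np.sqrt(k))

p_values = [0.01, 0.20, 0.03, 0.50]   # invented p-values for one unit
print(tippett(p_values), fisher(p_values), pearson(p_values), stouffer(p_values))
```

Each function maps a vector of p-values for one unit (for example, the edges incident to a node) to a single combined p-value, and the units are then ranked by these combined values.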