April  2017, 14(2): 511-527. doi: 10.3934/mbe.2017031

Machine learning of swimming data via wisdom of crowd and regression analysis

1. School of Computer Engineering and Science, Shanghai University, 99 Shangda Road, Shanghai 200444, China
2. University High School, 4771 Campus Drive, Irvine, CA 92612, USA
3. Department of Mathematics, Center for Mathematical and Computational Biology, University of California, Irvine, CA 92697, USA

* Corresponding author: qnie@math.uci.edu

Received: July 2016. Accepted: August 05, 2016. Published: October 2016.

Fund Project: This work was partially supported by the Major Research Plan of NSFC [No. 91330116], the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, and National Science Foundation grants DMS1161621 and DMS1562176.

Every performance by a registered USA swimmer in an officially sanctioned meet is recorded in an online database, with times dating back to 1980. For the first time, statistical analysis and machine learning methods are systematically applied to 4,022,631 of these swim records. In this study, we investigate performance features for all strokes as a function of age and gender. We study the variance in performance of males and females across ages and strokes, and estimate the correlation of performances at different ages using the Pearson correlation. Regression analysis shows the performance trends for both males and females at different ages and suggests critical ages for peak training. Moreover, we assess twelve popular machine learning methods for predicting or classifying swimmer performance. Each method exhibits different strengths and weaknesses in different cases, indicating that no single method predicts well for all strokes. To address this problem, we propose a new method that combines multiple inference methods to derive a Wisdom of Crowd Classifier (WoCC). Our simulation experiments demonstrate that the WoCC is consistent and has better overall prediction accuracy. Our study reveals several new age-dependent trends in swimming and provides an accurate method for classifying and predicting swimming times.

Citation: Jiang Xie, Junfu Xu, Celine Nie, Qing Nie. Machine learning of swimming data via wisdom of crowd and regression analysis. Mathematical Biosciences & Engineering, 2017, 14 (2) : 511-527. doi: 10.3934/mbe.2017031
References:

[1] M. Bächlin and G. Tröster, Swimming performance and technique evaluation with wearable acceleration sensors, Pervasive and Mobile Computing, 8 (2012), 68-81.
[2] R. C. Barros, M. P. Basgalupp, A. C. De Carvalho and A. Freitas, A survey of evolutionary algorithms for decision-tree induction, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 42 (2012), 291-312. doi: 10.1109/TSMCC.2011.2157494.
[3] D. Basak, S. Pal and D. C. Patranabis, Support vector regression, Neural Information Processing - Letters and Reviews, 11 (2007), 203-224.
[4] C. Cai, G. Wang, Y. Wen, J. Pei, X. Zhu and W. Zhuang, Superconducting transition temperature Tc estimation for superconductors of the doped MgB2 system using topological index via support vector regression, Journal of Superconductivity and Novel Magnetism, 23 (2010), 745-748. doi: 10.1007/s10948-010-0727-7.
[5] D. Cai, X. He and J. Han, Semi-supervised discriminant analysis, in Proceedings of the IEEE 11th International Conference on Computer Vision (ICCV 2007), IEEE, 2007, 1-7. doi: 10.1109/ICCV.2007.4408856.
[6] J. Cao, S. Kwong and R. Wang, A noise-detection based AdaBoost algorithm for mislabeled data, Pattern Recognition, 45 (2012), 4451-4465. doi: 10.1016/j.patcog.2012.05.002.
[7] J. J. Cheh, R. S. Weinberg and K. C. Yook, An application of an artificial neural network investment system to predict takeover targets, Journal of Applied Business Research (JABR), 15 (2013), 33-46. doi: 10.19030/jabr.v15i4.8151.
[8] J. L. Dye and V. A. Nicely, A general purpose curve fitting program for class and research use, Journal of Chemical Education, 48 (1971), 443. doi: 10.1021/ed048p443.
[9] M. A. Friedl and C. E. Brodley, Decision tree classification of land cover from remotely sensed data, Remote Sensing of Environment, 61 (1997), 399-409. doi: 10.1016/S0034-4257(97)00049-7.
[10] K. Fukunaga and P. M. Narendra, A branch and bound algorithm for computing k-nearest neighbors, IEEE Transactions on Computers, 100 (1975), 750-753. doi: 10.1109/T-C.1975.224297.
[11] A. Garg and K. Tai, Comparison of regression analysis, artificial neural network and genetic programming in handling the multicollinearity problem, in Proceedings of the International Conference on Modelling, Identification & Control (ICMIC), IEEE, 2012, 353-358.
[12] Z. Guo, W. Zhao, H. Lu and J. Wang, Multi-step forecasting for wind speed using a modified EMD-based artificial neural network model, Renewable Energy, 37 (2012), 241-249. doi: 10.1016/j.renene.2011.06.023.
[13] I. Hmeidi, B. Hawashin and E. El-Qawasmeh, Performance of KNN and SVM classifiers on full word Arabic articles, Advanced Engineering Informatics, 22 (2008), 106-111. doi: 10.1016/j.aei.2007.12.001.
[14] Y. Jiang, J. Lin, B. Cukic and T. Menzies, Variance analysis in software fault prediction models, in Proceedings of the 20th International Symposium on Software Reliability Engineering (ISSRE '09), IEEE, 2009, 99-108. doi: 10.1109/ISSRE.2009.13.
[15] A. Liaw and M. Wiener, Classification and regression by randomForest, R News, 2 (2002), 18-22.
[16] B. Liu and G. Qiu, Illuminant classification based on random forest, in Proceedings of the 14th IAPR International Conference on Machine Vision Applications (MVA), IEEE, 2015, 106-109. doi: 10.1109/MVA.2015.7153144.
[17] D. Marbach, R. J. Prill, T. Schaffter, C. Mattiussi, D. Floreano and G. Stolovitzky, Revealing strengths and weaknesses of methods for gene network inference, Proceedings of the National Academy of Sciences, 107 (2010), 6286-6291. doi: 10.1073/pnas.0913357107.
[18] F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot and E. Duchesnay, Scikit-learn: Machine learning in Python, Journal of Machine Learning Research, 12 (2011), 2825-2830.
[19] M.-T. Puth, M. Neuhäuser and G. D. Ruxton, Effective use of Pearson's product-moment correlation coefficient, Animal Behaviour, 93 (2014), 183-189. doi: 10.1016/j.anbehav.2014.05.003.
[20] G. Rätsch, T. Onoda and K.-R. Müller, Soft margins for AdaBoost, Machine Learning, 42 (2001), 287-320.
[21] J. F. Reis, F. B. Alves, P. M. Bruno, V. Vleck and G. P. Millet, Oxygen uptake kinetics and middle distance swimming performance, Journal of Science and Medicine in Sport, 15 (2012), 58-63. doi: 10.1016/j.jsams.2011.05.012.
[22] B. Schölkopf and K.-R. Müller, Fisher discriminant analysis with kernels, Neural Networks for Signal Processing IX, 1 (1999), 1.
[23] C. Schüldt, I. Laptev and B. Caputo, Recognizing human actions: A local SVM approach, in Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), IEEE, 3 (2004), 32-36.
[24] A. J. Smola and B. Schölkopf, A tutorial on support vector regression, Statistics and Computing, 14 (2004), 199-222. doi: 10.1023/B:STCO.0000035301.49549.88.
[25] M. Vaso, B. Knechtle, C. A. Rüst, T. Rosemann and R. Lepers, Age of peak swim speed and sex difference in performance in medley and freestyle swimming: A comparison between 200 m and 400 m in Swiss elite swimmers, Journal of Human Sport and Exercise, 8 (2013), 954-965. doi: 10.4100/jhse.2013.84.06.
[26] Q. Wang, G. M. Garrity, J. M. Tiedje and J. R. Cole, Naive Bayesian classifier for rapid assignment of rRNA sequences into the new bacterial taxonomy, Applied and Environmental Microbiology, 73 (2007), 5261-5267. doi: 10.1128/AEM.00062-07.
[27] S.-C. Wang, Artificial neural network, in Interdisciplinary Computing in Java Programming, Springer, 2003, 81-100.
[28] C.-H. Wu, J.-M. Ho and D.-T. Lee, Travel-time prediction with support vector regression, IEEE Transactions on Intelligent Transportation Systems, 5 (2004), 276-281. doi: 10.1109/TITS.2004.837813.
[29] J. Wu, Z. Cai, S. Zeng and X. Zhu, Artificial immune system for attribute weighted naive Bayes classification, in Proceedings of the 2013 International Joint Conference on Neural Networks (IJCNN), IEEE, 2013, 1-8. doi: 10.1109/IJCNN.2013.6706818.

Figure 1.  The average variance and CV as a function of age for different strokes in the LCM (long course meters)
Figure 2.  The average variance and CV as a function of age for different strokes in the SCY (short course yards)
Figure 3.  The average variance and CV in time for different distances (LCM)
Figure 4.  The average variance and CV in time for different distances (SCY)
Figure 5.  Pearson correlation coefficient between younger ages and age 18
Figure 6.  100M freestyle performance regression analysis
Figure 7.  100Y freestyle performance regression analysis
Figure 8.  Classification model
Figure 9.  Predictions of swimming times
Figure 10.  Illustration of a Wisdom of Crowd Classifier (WoCC)
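Figure 5 reports Pearson correlation coefficients between performances at younger ages and at age 18. A minimal sketch of that computation with scipy, assuming times for the same swimmers are aligned age by age (the dict layout and function name are ours, not the paper's):

```python
# Illustrative sketch (not the authors' code) of the Figure 5 computation:
# Pearson correlation between times at a younger age and times at age 18
# for the same swimmers.
from scipy.stats import pearsonr

def age_correlations(times_by_age, ref_age=18):
    """times_by_age: dict age -> list of times; returns dict age -> r."""
    ref = times_by_age[ref_age]
    corr = {}
    for age, times in times_by_age.items():
        if age == ref_age:
            continue
        r, _p = pearsonr(times, ref)  # requires equal-length, aligned samples
        corr[age] = r
    return corr
```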
Table 1.  A sample of the USA Swimming data set
Stroke   | Course | Age | Time (sec.) | Power points
100Y_FR  | SCY    | 21  | 41.12       | 1053
100M_FL  | LCM    | 24  | 53.83       | 926
100M_FR  | LCM    | 25  | 50.01       | 930
200Y_FR  | SCY    | 20  | 96.52       | 897
400M_IM  | LCM    | 18  | 273.69      | 834
800M_FR  | LCM    | 16  | 520.64      | 750
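Starting from records like the Table 1 sample, the per-age variance and CV (coefficient of variation, std/mean) behind Figures 1-4 reduce to a groupby. A sketch with illustrative column names; the dataset's actual schema is not shown here:

```python
# Sketch of the variance/CV computations behind Figures 1-4, using
# stand-in rows in the spirit of Table 1 (column names are assumptions).
import pandas as pd

df = pd.DataFrame({
    "stroke": ["100Y_FR", "100Y_FR", "100M_FL", "100M_FL"],
    "age":    [21, 21, 24, 24],
    "time":   [41.12, 42.50, 53.83, 55.10],
})

stats = df.groupby(["stroke", "age"])["time"].agg(["mean", "std", "var"])
stats["cv"] = stats["std"] / stats["mean"]  # coefficient of variation
print(stats)
```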
Table 2.  The coefficients of the prediction equation (LCM and SCY)
Group          | LCM: a, b, c        | SCY: a, b, c
Female Group 1 | 0.28, -9.31, 137.4  | 0.31, -10.55, 140.4
Female Group 2 | 0.31, -10.44, 146.4 | 0.34, -11.22, 145.2
Female Group 3 | 0.33, -10.86, 148.8 | 0.35, -11.48, 146.6
Female Group 4 | 0.33, -10.85, 148.2 | 0.35, -11.37, 145.2
Male Group 1   | 0.33, -12.18, 165.4 | 0.20, -7.99, 126.5
Male Group 2   | 0.32, -11.87, 163.1 | 0.24, -9.36, 136.6
Male Group 3   | 0.32, -11.69, 161.3 | 0.27, -10.24, 143.2
Male Group 4   | 0.34, -12.35, 166.1 | 0.29, -10.78, 147.3
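Table 2 lists three coefficients per group and course, consistent with a quadratic fit of time against age (the QPR method of Tables 5-6). Under that assumed form, time = a*age^2 + b*age + c, a hedged sketch of how the equation would be evaluated (function names are ours):

```python
# Assumption: Table 2's (a, b, c) parameterize time = a*age**2 + b*age + c.

def predict_time(age, a, b, c):
    return a * age**2 + b * age + c

def peak_age(a, b):
    # Vertex of the parabola: the age at which the predicted time is fastest.
    return -b / (2 * a)

# First row of Table 2 (Female Group 1, LCM):
a, b, c = 0.28, -9.31, 137.4
print(predict_time(18, a, b, c))  # ~60.5 s
print(peak_age(a, b))             # ~16.6 years
```

Note that a > 0 and b < 0 in every row, so each fitted curve has a minimum (fastest predicted time) in the mid-teens, matching the "critical ages" reading in the abstract.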
Table 3.  Definition of male swimming time standard levels in the 100M freestyle
Time Standards/Cuts | Mean time in 18-year-old male (sec.)
AAAA Min            | $time \leq 54.09$
AAA Min             | $54.09 < time \leq 56.59$
Slower than AAA Min | $time > 56.59$
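The Table 3 cuts translate directly into a tiny classifier for labeling an 18-year-old male's 100M freestyle time (thresholds from the table; the function itself is just an illustration):

```python
# Thresholds from Table 3; function name and structure are ours.

def time_standard(time_sec):
    if time_sec <= 54.09:
        return "AAAA Min"
    if time_sec <= 56.59:
        return "AAA Min"
    return "Slower than AAA Min"

print(time_standard(53.80))  # AAAA Min
print(time_standard(55.00))  # AAA Min
```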
Table 4.  Parameters of each method
Method     | Parameters
KNN        | $k$ = 5
Linear SVM | kernel = linear
RBF SVM    | kernel = rbf
DT         | max_depth = 10
RF         | max_depth = 10, n_estimators = 10, max_features = 1
AdaBoost   | default parameters
NB         | default parameters
LDA        | default parameters
QDA        | default parameters
ANN        | 2 layers with 24 inputs, 10 neurons in the hidden layer
SVR        | kernel = linear, C = 1.0
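The paper uses scikit-learn [18], so the Table 4 configurations plausibly map onto the following estimators. The exact classes are an assumption (in particular, MLPClassifier stands in for the 2-layer ANN), and unlisted parameters stay at scikit-learn's defaults:

```python
# Plausible scikit-learn [18] reconstruction of Table 4 (an assumption,
# not the authors' published code).
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC, SVR
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.neural_network import MLPClassifier  # stand-in for the ANN

classifiers = {
    "KNN":        KNeighborsClassifier(n_neighbors=5),
    "Linear SVM": SVC(kernel="linear"),
    "RBF SVM":    SVC(kernel="rbf"),
    "DT":         DecisionTreeClassifier(max_depth=10),
    "RF":         RandomForestClassifier(max_depth=10, n_estimators=10,
                                         max_features=1),
    "AdaBoost":   AdaBoostClassifier(),
    "NB":         GaussianNB(),
    "LDA":        LinearDiscriminantAnalysis(),
    "QDA":        QuadraticDiscriminantAnalysis(),
    "ANN":        MLPClassifier(hidden_layer_sizes=(10,)),  # 10 hidden neurons
}
regressor = SVR(kernel="linear", C=1.0)  # for time prediction (Tables 5-6)
```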
Table 5.  MAD of swimming time predictions for breaststroke
Method | male 100M (103 records) | male 100Y (179 records) | female 100M (143 records) | female 100Y (210 records)
QPR    | 8.00 s | 7.70 s | 8.98 s | 8.42 s
ANN    | 1.20 s | 2.44 s | 2.97 s | 2.65 s
SVR    | 1.01 s | 1.90 s | 2.43 s | 1.67 s
Table 6.  MAD of swimming time predictions for freestyle
Method | male 100M (340 records) | male 100Y (572 records) | female 100M (548 records) | female 100Y (743 records)
QPR    | 6.70 s | 8.00 s | 8.21 s | 6.05 s
ANN    | 1.50 s | 1.24 s | 1.50 s | 1.17 s
SVR    | 1.18 s | 1.02 s | 1.28 s | 0.95 s
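MAD in Tables 5-6 is read here as the mean absolute deviation between predicted and observed times; under that assumption, the metric is a one-liner:

```python
# Mean absolute deviation between predicted and observed times
# (our reading of "MAD" in Tables 5-6).
import numpy as np

def mad(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(np.abs(y_true - y_pred)))

print(mad([60.5, 58.2], [61.6, 57.3]))  # 1.0 (illustrative times, in seconds)
```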
Table 7.  Prediction accuracy for freestyle
Method     | male 100M (340 records) | male 100Y (572 records) | female 100M (548 records) | female 100Y (743 records)
KNN        | 0.55 | 0.60 | 0.59 | 0.58
Linear SVM | 0.60 | 0.62 | 0.63 | 0.64
RBF SVM    | 0.58 | 0.61 | 0.66 | 0.63
DT         | 0.48 | 0.54 | 0.62 | 0.58
RF         | 0.59 | 0.60 | 0.67 | 0.64
AdaBoost   | 0.53 | 0.59 | 0.59 | 0.60
NB         | 0.53 | 0.49 | 0.56 | 0.55
LDA        | 0.60 | 0.60 | 0.64 | 0.63
QDA        | 0.52 | 0.56 | 0.58 | 0.53
WoCC       | 0.61 | 0.64 | 0.67 | 0.65
Table 8.  Prediction accuracy for breaststroke
Method     | male 100M (103 records) | male 100Y (179 records) | female 100M (143 records) | female 100Y (210 records)
KNN        | 0.50 | 0.46 | 0.75 | 0.64
Linear SVM | 0.47 | 0.46 | 0.70 | 0.66
RBF SVM    | 0.37 | 0.36 | 0.66 | 0.54
DT         | 0.44 | 0.50 | 0.69 | 0.64
RF         | 0.46 | 0.49 | 0.75 | 0.63
AdaBoost   | 0.47 | 0.45 | 0.57 | 0.56
NB         | 0.53 | 0.53 | 0.66 | 0.65
LDA        | 0.45 | 0.45 | 0.67 | 0.64
QDA        | 0.41 | 0.41 | 0.63 | 0.56
WoCC       | 0.61 | 0.49 | 0.75 | 0.66
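In Tables 7 and 8, WoCC matches or beats the best individual method in most columns. The paper derives WoCC by combining the individual classifiers' inferences; the precise combination rule is given in the full text, so the majority-vote sketch below is only one plausible reading (it reuses the `classifiers` dict from the Table 4 sketch):

```python
# One plausible WoCC reading (an assumption, not the paper's exact rule):
# majority vote over the predictions of the fitted Table 4 classifiers.
from collections import Counter

def wocc_predict(fitted_classifiers, X):
    """fitted_classifiers: dict name -> fitted sklearn classifier."""
    all_votes = [clf.predict(X) for clf in fitted_classifiers.values()]
    # Transpose to per-sample votes, then take the most common label.
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*all_votes)]
```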