Anagnostopoulos, T., Anagnostopoulos, C., & Hadjiefthymiades, S. (2011). An adaptive machine learning algorithm for location prediction. International Journal of Wireless Information Networks, 18(2), 88–99.
Azodi, A., Gawron, M., Sapegin, A., Cheng, F., & Meinel, C. (2015). Leveraging event structure for adaptive machine learning on big data landscapes. In S. Boumerdassi, S. Bouzefrane, & É. Renault (Eds.), Proceedings of the international conference on mobile, secure and programmable networking (pp. 28–40). Switzerland: Springer.
Bhatt, S., Patwa, F., & Sandhu, R. (2017). Access control model for AWS internet of things. In Z. Yan, R. Molva, W. Mazurczyk, & R. Kantola (Eds.), Proceedings of the international conference on network and system security (pp. 721–736). Cham: Springer.
Blum, A. (1998). On‐line algorithms in machine learning. In A. Fiat & G. J. Woeginger (Eds.), Online algorithms. Lecture notes in computer science (Vol. 1442, pp. 306–325). Berlin, Heidelberg: Springer‐Verlag.
Cao, L. (2015). Actionable knowledge discovery and delivery. In L. Cao (Ed.), Metasynthetic computing and engineering of complex systems (pp. 287–312). London: Springer.
Cybenko, G. (2017). Parallel computing for machine learning in social network analysis. In Proceedings of the international parallel and distributed processing symposium workshops (pp. 1464–1471). Lake Buena Vista, FL: IEEE.
Dean, J., & Ghemawat, S. (2010). MapReduce: A flexible data processing tool. Communications of the ACM, 53(1), 72–77.
Feurer, M., Klein, A., Eggensperger, K., Springenberg, J., Blum, M., & Hutter, F. (2015). Efficient and robust automated machine learning. In C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, & R. Garnett (Eds.), Proceedings of the advances in neural information processing systems (pp. 2962–2970). Curran Associates, Inc.
Hurwitz, J., Kaufman, M., & Bowles, A. (2015). Cognitive computing and big data analytics. Indianapolis, USA: John Wiley & Sons, Inc.
Ilievski, I., Akhtar, T., Feng, J., & Shoemaker, C. A. (2017). Efficient hyperparameter optimization for deep learning algorithms using deterministic RBF surrogates. In Proceedings of the AAAI (pp. 822–829). Retrieved from https://aaai.org/ocs/index.php/AAAI/AAAI17/paper/view/14312
Khazaee, A., Ebrahimzadeh, A., & Babajani‐Feremi, A. (2016). Application of advanced machine learning methods on resting‐state fMRI network for identification of mild cognitive impairment and Alzheimer's disease. Brain Imaging and Behavior, 10(3), 799–817.
Klein, A., Bartels, S., Falkner, S., Hennig, P., & Hutter, F. (2015). Towards efficient Bayesian optimization for big data. In Proceedings of the NIPS 2015 workshop on Bayesian optimization.
Komarek, A., Pavlik, J., & Sobeslav, V. (2017). Performance analysis of cloud computing infrastructure. In M. Younas, I. Awan, & I. Holubova (Eds.), Proceedings of the international conference on mobile web and information systems (pp. 303–313). Cham: Springer.
Littlestone, N., & Warmuth, M. K. (1994). The weighted majority algorithm. Information and Computation, 108(2), 212–261.
Luketina, J., Berglund, M., Greff, K., & Raiko, T. (2016). Scalable gradient‐based tuning of continuous regularization hyperparameters. In Proceedings of the international conference on machine learning (pp. 2952–2960). Retrieved from JMLR.org
Luo, G. (2016). A review of automatic selection methods for machine learning algorithms and hyper‐parameter values. Network Modeling Analysis in Health Informatics and Bioinformatics, 5(1), 18.
Modha, D. S., Ananthanarayanan, R., Esser, S. K., Ndirango, A., Sherbondy, A. J., & Singh, R. (2011). Cognitive computing. Communications of the ACM, 54(8), 62–71.
Nguyen, D. T., Gupta, S., Rana, S., & Venkatesh, S. (2017). Stable Bayesian optimization. In J. Kim, K. Shim, L. Cao, J. G. Lee, X. Lin, & Y. S. Moon (Eds.), Proceedings of the Pacific‐Asia conference on knowledge discovery and data mining (pp. 578–591). Cham: Springer.
Pääkkönen, P., & Pakkala, D. (2015). Reference architecture and classification of technologies, products and services for big data systems. Big Data Research, 2(4), 166–186.
Polson, N. G., & Sokolov, V. (2017). Deep learning: A Bayesian perspective. Bayesian Analysis, 12(4), 1275–1304.
Samanpour, A. R., Ruegenberg, A., & Ahlers, R. (2018). The future of machine learning and predictive analytics. In C. Linnhoff‐Popien, R. Schneider, & M. Zaddach (Eds.), Digital marketplaces unleashed (pp. 297–309). Berlin, Heidelberg: Springer.
Shahriari, B., Swersky, K., Wang, Z., Adams, R. P., & De Freitas, N. (2016). Taking the human out of the loop: A review of Bayesian optimization. Proceedings of the IEEE, 104(1), 148–175.
Snoek, J., Larochelle, H., & Adams, R. P. (2012). Practical Bayesian optimization of machine learning algorithms. In F. Pereira, C. J. C. Burges, L. Bottou, & K. Q. Weinberger (Eds.), Proceedings of the 25th international conference on neural information processing systems (pp. 2951–2959). USA: Curran Associates Inc.
Sparks, E. R., Talwalkar, A., Haas, D., Franklin, M. J., Jordan, M. I., & Kraska, T. (2015). Automating model search for large scale machine learning. In Proceedings of the 6th ACM symposium on cloud computing (pp. 368–380). New York, NY, USA: ACM.
Stephens, C. R., Sánchez‐Cordero, V., & González Salazar, C. (2017). Bayesian inference of ecological interactions from spatial data. Entropy, 19(12), 547.
Subasi, O., Di, S., Balaprakash, P., Unsal, O., Labarta, J., Cristal, A., … Cappello, F. (2017). MACORD: Online adaptive machine learning framework for silent error detection. In Proceedings of the international conference on cluster computing (pp. 717–724). Honolulu, HI: IEEE.
Sundaravarathan, K., Martin, P., Rope, D., McRoberts, M., & Statchuk, C. (2016). MEWSE: Multi‐engine workflow submission and execution on Apache YARN. In B. Jones (Ed.), Proceedings of the 26th annual international conference on computer science and software engineering (pp. 194–200). Riverton, NJ, USA: IBM Corp.
Suthaharan, S. (2015). Machine learning models and algorithms for big data classification: Thinking with examples for effective learning. New York, USA: Springer.
Suthaharan, S. (2016). A cognitive random forest: An intra‐ and intercognitive computing for big data classification under cune condition. In V. N. Gudivada, V. V. Raghavan, V. Govindaraju, & C. R. Rao (Eds.), Handbook of statistics (Vol. 35, pp. 207–227). Elsevier.
Tafaj, E., Kasneci, G., Rosenstiel, W., & Bogdan, M. (2012). Bayesian online clustering of eye movement data. In S. N. Spencer (Ed.), Proceedings of the symposium on eye tracking research and applications (pp. 285–288). New York, NY, USA: ACM.
Thornton, C., Hutter, F., Hoos, H. H., & Leyton‐Brown, K. (2013). Auto‐WEKA: Combined selection and hyperparameter optimization of classification algorithms. In R. Ghani, T. E. Senator, P. Bradley, R. Parekh, & J. He (Eds.), Proceedings of the 19th ACM SIGKDD international conference on knowledge discovery and data mining (pp. 847–855). New York, NY, USA: ACM.
Torabi, K., Sayad, S., & Balke, S. T. (2005). On‐line adaptive Bayesian classification for inline particle image monitoring in polymer film manufacturing. Computers & Chemical Engineering, 30(1), 18–27.
Wainer, J., & Cawley, G. (2017). Empirical evaluation of resampling procedures for optimising SVM hyperparameters. Journal of Machine Learning Research, 18(15), 1–35.
Wang, Y., Howard, N., Kacprzyk, J., Frieder, O., Sheu, P., Fiorini, R. A., … Widrow, B. (2018). Cognitive informatics: Towards cognitive machine learning and autonomous knowledge manipulation. International Journal of Cognitive Informatics and Natural Intelligence, 12(1), 1–13.
White, T. (2012). Hadoop: The definitive guide. California, USA: O'Reilly Media, Inc.