Michalski, R. On the quasi‐minimal solution of the general covering problem. Proceedings of the Fifth International Symposium on Information Processing (FCIP 69), Bled, Yugoslavia, vol. A3; October 3–11, 1969, 125–128.
Michalski, R. A theory and methodology of inductive learning. Mach Learn 1983, 1: 83–134.
Michalski, R. AQVAL/1 computer implementation of a variable‐valued logic system VL1 and examples of its application to pattern recognition. Proceedings of the First International Joint Conference on Pattern Recognition, Washington, D.C.; 1973, 3–17.
Chilausky, R, Jacobsen, B, Michalski, R. An application of variable‐valued logic to inductive learning of plant disease diagnostic rules. Proceedings of the Sixth International Symposium on Multiple‐Valued Logic, Logan, UT. Los Alamitos, CA: IEEE Computer Society Press; 1976, 233–240.
Steinberg, D, Colla, P. CART: Tree‐structured Non‐parametric Data Analysis. San Diego, CA: Salford Systems; 1995.
Quinlan, J. C4.5: Programs for Machine Learning. San Mateo, CA: Morgan Kaufmann; 1993.
Breiman, L, Friedman, J, Stone, CJ, Olshen, RA. Classification and Regression Trees. Wadsworth International Group; 1984.
Michalski, R, Mozetic, I, Hong, J, Lavrac, N. The multi‐purpose incremental learning system AQ15 and its testing application to three medical domains. Proceedings of the 1986 AAAI Conference, Philadelphia, PA, vol. 104; August 11–15, 1986, 1041–1045.
Kaufman, K, Michalski, R. The AQ18 Machine Learning and Data Mining System: An Implementation and User's Guide. MLI Report. Fairfax, VA: Machine Learning and Inference Laboratory, George Mason University; 1999.
Mitchell, T. Machine Learning. New York: McGraw‐Hill; 1997.
Cervone, G, Panait, L, Michalski, R. The development of the AQ20 learning system and initial experiments. Proceedings of the Fifth International Symposium on Intelligent Information Systems, Zakopane, Poland, June 18–22, 2001. Physica‐Verlag; 2001, 13.
Keesee, APK. How Sequential‐Cover Data Mining Programs Learn. College of Science. Fairfax, VA: George Mason University; 2006.
Austern, M. Generic Programming and the STL: Using and Extending the C++ Standard Template Library. Reading, MA: Addison‐Wesley; 1998.
Gamma, E, Helm, R, Johnson, R, Vlissides, J. Design Patterns: Elements of Reusable Object‐Oriented Software. Reading, MA: Addison‐Wesley; 1995.
Bloedorn, E, Wnek, J, Michalski, R. Multistrategy constructive induction: AQ17‐MCI. Rep Mach Learn Infer Lab 1993, 1051: 93–4.
Wnek, J, Michalski, R. Hypothesis‐driven constructive induction in AQ17‐HCI: a method and experiments. Mach Learn 1994, 14: 139–168.
Cervone, G, Franzese, P, Ezber, Y, Boybeyi, Z. Risk assessment of atmospheric emissions using machine learning. Nat Hazards Earth Syst Sci 2008, 8: 991–1000.
Holland, J. Adaptation in Natural and Artificial Systems. Cambridge, MA: The MIT Press; 1975.
Goldberg, DE. Genetic Algorithms in Search, Optimization, and Machine Learning. Reading, MA: Addison‐Wesley; 1989.
Bäck, T. Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, and Genetic Algorithms. Oxford, NY: Oxford University Press; 1996.
Michalewicz, Z. Genetic Algorithms + Data Structures = Evolution Programs. 3rd ed. Berlin: Springer‐Verlag; 1996.
Fogel, L. Intelligence Through Simulated Evolution: Forty Years of Evolutionary Programming. Wiley Series on Intelligent Systems. New York: John Wiley & Sons, Inc.; 1999.
De Jong, K. Evolutionary computation: a unified approach. Proceedings of the 2008 GECCO Conference on Genetic and Evolutionary Computation. New York: ACM; 2008, 2245–2258.
Darwin, C. On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life. London: Oxford University Press; 1859.
Ashlock, D. Evolutionary Computation for Modeling and Optimization. Berlin Heidelberg: Springer‐Verlag; 2006.
Grefenstette, J. Incorporating problem specific knowledge into genetic algorithms. Genetic Alg Simul Annealing 1987, 4: 42–60.
Grefenstette, J. Lamarckian learning in multi‐agent environments. Proceedings of the Fourth International Conference on Genetic Algorithms. San Francisco, CA: Morgan Kaufmann Publishers Inc.; 1991.
Sebag, M, Schoenauer, M. Controlling crossover through inductive learning. Lecture Notes in Computer Science. London: Springer‐Verlag; 1994, 209.
Sebag, M, Schoenauer, M, Ravise, C. Inductive learning of mutation step‐size in evolutionary parameter optimization. Lecture Notes in Computer Science. London: Springer‐Verlag; 1997, 247–261.
Reynolds, R. Cultural Algorithms: Theory and Applications. McGraw‐Hill's Advanced Topics in Computer Science Series. Maidenhead, England: McGraw‐Hill Ltd.; 1999, 367–378.
Hamda, H, Jouve, F, Lutton, E, Schoenauer, M, Sebag, M. Compact unstructured representations for evolutionary design. Appl Intell 2002, 16: 139–155.
Lozano, J. Towards a New Evolutionary Computation: Advances in the Estimation of Distribution Algorithms. Springer; 2006.
Michalski, R. Learnable evolution: combining symbolic and evolutionary learning. Proceedings of the Fourth International Workshop on Multistrategy Learning (MSL'98); 1999, 14–20.
Cervone, G, Michalski, R, Kaufman, K, Panait, L. Combining machine learning with evolutionary computation: recent results on LEM. Proceedings of the Fifth International Workshop on Multistrategy Learning (MSL‐2000), Guimaraes, Portugal; 2000, 41–58.
Cervone, G, Kaufman, K, Michalski, R. Experimental validations of the learnable evolution model. Proceedings of the 2000 Congress on Evolutionary Computation, La Jolla, CA, vol. 2; July 16–19, 2000.
Pudykiewicz, J. Application of adjoint tracer transport equations for evaluating source parameters. Atmos Environ 1998, 32: 3039–3050.
Hourdin, F, Issartel, JP. Sub‐surface nuclear tests monitoring through the CTBT xenon network. Geophys Res Lett 2000, 27: 2245–2248.
Enting, I. Inverse Problems in Atmospheric Constituent Transport. Cambridge, NY: Cambridge University Press; 2002, 392 pp.
Gelman, A, Carlin, J, Stern, H, Rubin, D. Bayesian Data Analysis. Chapman & Hall/CRC; 2003, 668 pp.
Chow, F, Kosović, B, Chan, T. Source inversion for contaminant plume dispersion in urban environments using building‐resolving simulations. Proceedings of the 86th American Meteorological Society Annual Meeting, Atlanta, GA, January 2006, 12–22.
Senocak, I, Hengartner, N, Short, M, Daniel, W. Stochastic event reconstruction of atmospheric contaminant dispersion using Bayesian inference. Atmos Environ 2008, 42: 7718–7727.
Haupt, SE. A demonstration of coupled receptor/dispersion modeling with a genetic algorithm. Atmos Environ 2005, 39: 7181–7189.
Haupt, SE, Young, GS, Allen, CT. A genetic algorithm method to assimilate sensor data for a toxic contaminant release. J Comput 2007, 2: 85–93.
Allen, CT, Young, GS, Haupt, SE. Improving pollutant source characterization by better estimating wind direction with a genetic algorithm. Atmos Environ 2007, 41: 2283–2289.
Delle Monache, L, Lundquist, J, Kosović, B, Johannesson, G, Dyer, K, et al. Bayesian inference and Markov chain Monte Carlo sampling to reconstruct a contaminant source on a continental scale. J Appl Meteor Climatol 2008, 47: 2600–2613.
Pasquill, F. The estimation of the dispersion of windborne material. Meteorol Mag 1961, 90: 33–49.
Pasquill, F, Smith, F. Atmospheric Diffusion. Chichester, UK: Ellis Horwood; 1983.
Arya, PS. Air Pollution Meteorology and Dispersion. Oxford, NY: Oxford University Press; 1999.
Barad, M, Haugen, D. Project Prairie Grass, A Field Program in Diffusion. Cambridge, MA: United States Air Force, Air Research and Development Command, Air Force Cambridge Research Center; 1958.