---
abstract: "Although computational intelligence (CI) covers a vast variety of methods, it still lacks an integrative theory. Several proposals for CI foundations are discussed: computing and cognition as compression, meta-learning as search in the space of data models, (dis)similarity-based methods providing a framework for such meta-learning, and a more general approach based on chains of transformations. Many useful transformations that extract information from features are discussed. Heterogeneous adaptive systems are presented as a particular example of transformation-based systems, and the goal of learning is redefined to facilitate the creation of simpler data models. The need to understand data structures leads to techniques for logical and prototype-based rule extraction and to the generation of multiple alternative models, while the need to increase the predictive power of adaptive models leads to committees of competent models. Learning from partial observations is a natural extension towards reasoning based on perceptions, and an approach to the intuitive solving of such problems is presented. Throughout the paper neurocognitive inspirations are frequently used; they are especially important in the modeling of higher cognitive functions. Promising directions such as liquid and laminar computing are identified, and many open problems are presented."
altloc:
  - http://www.fizyka.umk.pl/publications/kmk/06-CI-foundations.html
chapter: ~
commentary: ~
commref: ~
confdates: ~
conference: ~
confloc: ~
contact_email: ~
creators_id:
  - wduch@is.umk.pl
creators_name:
  - family: Duch
    given: Wlodzislaw
    honourific: Prof
    lineage: ''
date: 2007
date_type: published
datestamp: 2008-01-08 00:28:59
department: ~
dir: disk0/00/00/58/91
edit_lock_since: ~
edit_lock_until: ~
edit_lock_user: ~
editors_id: []
editors_name:
  - family: Duch
    given: W
    honourific: Prof
    lineage: ''
  - family: Mandziuk
    given: J
    honourific: Prof
    lineage: ''
eprint_status: archive
eprintid: 5891
fileinfo: /style/images/fileicons/application_pdf.png;/5891/1/06%2DCI%2Dfoundations.pdf
full_text_status: public
importid: ~
institution: ~
isbn: ~
ispublished: pub
issn: ~
item_issues_comment: []
item_issues_count: 0
item_issues_description: []
item_issues_id: []
item_issues_reported_by: []
item_issues_resolved_by: []
item_issues_status: []
item_issues_timestamp: []
item_issues_type: []
keywords: 'Computational intelligence, neurocognitive informatics, machine learning, neural networks, artificial intelligence, neurocognitive linguistics'
lastmod: 2011-03-11 08:57:02
latitude: ~
longitude: ~
metadata_visibility: show
note: ~
number: ~
pagerange: ~
pubdom: TRUE
publication: 'Springer "Studies in Computational Intelligence" Series, Vol. 63, 261-316, 2007'
publisher: Springer
refereed: TRUE
referencetext: ~
relation_type: []
relation_uri: []
reportno: ~
rev_number: 28
series: ~
source: ~
status_changed: 2008-01-08 00:28:59
subjects:
  - comp-sci-lang
  - comp-sci-mach-learn
  - comp-sci-art-intel
  - comp-sci-neural-nets
succeeds: ~
suggestions: ~
sword_depositor: ~
sword_slug: ~
thesistype: ~
title: Towards comprehensive foundations of computational intelligence
type: bookchapter
userid: 568
volume: ~