Cogprints

Towards comprehensive foundations of computational intelligence

Duch, Prof Wlodzislaw (2007) Towards comprehensive foundations of computational intelligence. [Book Chapter]

Full text available as: PDF (437Kb)

Abstract

Although computational intelligence (CI) covers a vast variety of methods, it still lacks an integrative theory. Several proposals for CI foundations are discussed: computing and cognition as compression, meta-learning as search in the space of data models, (dis)similarity-based methods providing a framework for such meta-learning, and a more general approach based on chains of transformations. Many useful transformations that extract information from features are discussed. Heterogeneous adaptive systems are presented as a particular example of transformation-based systems, and the goal of learning is redefined to facilitate the creation of simpler data models. The need to understand data structures leads to techniques for logical and prototype-based rule extraction and to the generation of multiple alternative models, while the need to increase the predictive power of adaptive models leads to committees of competent models. Learning from partial observations is a natural extension towards reasoning based on perceptions, and an approach to the intuitive solving of such problems is presented. Throughout the paper neurocognitive inspirations are frequently used; they are especially important in modeling higher cognitive functions. Promising directions such as liquid and laminar computing are identified, and many open problems are presented.
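Since the abstract's central proposal is to view a data model as a chain of transformations ending in a decision, a minimal sketch may help fix the idea. The sketch below is purely illustrative and not taken from the chapter; all names (Transformation, Standardize, SimilarityToPrototypes, Chain) are hypothetical, and the prototype-similarity step is only a crude stand-in for the (dis)similarity-based framework the abstract mentions.

```python
# Illustrative sketch only -- not code from the chapter. It expresses one
# reading of the "chain of transformations" view: a data model is a
# composition T_k(...T_2(T_1(X))) whose final representation feeds a decision.
import numpy as np

class Transformation:
    """Base step: maps one representation of the data into another."""
    def fit(self, X, y=None):
        return self
    def transform(self, X):
        raise NotImplementedError

class Standardize(Transformation):
    """Feature-wise rescaling: a simple information-extracting transformation."""
    def fit(self, X, y=None):
        self.mu = X.mean(axis=0)
        self.sigma = X.std(axis=0) + 1e-12  # guard against zero variance
        return self
    def transform(self, X):
        return (X - self.mu) / self.sigma

class SimilarityToPrototypes(Transformation):
    """Maps each input to its Gaussian similarity with one prototype per class
    (here simply the class mean); a crude stand-in for similarity-based methods."""
    def fit(self, X, y):
        self.prototypes = np.array([X[y == c].mean(axis=0) for c in np.unique(y)])
        return self
    def transform(self, X):
        # Squared Euclidean distance from every sample to every prototype.
        d2 = ((X[:, None, :] - self.prototypes[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2)

class Chain:
    """A data model as a chain of transformations; prediction is a trivial
    decision (argmax) taken on the final representation."""
    def __init__(self, steps):
        self.steps = steps
    def fit(self, X, y):
        for step in self.steps:
            step.fit(X, y)
            X = step.transform(X)
        return self
    def predict(self, X):
        for step in self.steps:
            X = step.transform(X)
        return X.argmax(axis=1)  # index of the most similar class prototype
```

Under these assumptions, a model such as Chain([Standardize(), SimilarityToPrototypes()]).fit(X, y) composes a rescaling transformation with a similarity transformation, and prediction reduces to picking the most similar prototype; adding, removing, or swapping steps changes the model without leaving the framework.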

Item Type: Book Chapter
Keywords: Computational intelligence, neurocognitive informatics, machine learning, neural networks, artificial intelligence, neurocognitive linguistics
Subjects: Computer Science > Language
Computer Science > Machine Learning
Computer Science > Artificial Intelligence
Computer Science > Neural Nets
ID Code: 5891
Deposited By: Duch, Prof Wlodzislaw
Deposited On: 08 Jan 2008 00:28
Last Modified: 11 Mar 2011 08:57

