Cogprints

Representational information: a new general notion and measure of information

Vigo, Ronaldo (2011) Representational information: a new general notion and measure of information. [Journal (Paginated)]

Full text available as:

PDF (Vigo (2011b)), 613 KB

Abstract

In what follows, we introduce the notion of representational information (information conveyed by sets of dimensionally defined objects about their superset of origin), as well as an original deterministic mathematical framework for its analysis and measurement. The framework, based in part on categorical invariance theory [Vigo, 2009], unifies three key constructs of universal science – invariance, complexity, and information. From this unification we define the amount of information that a well-defined set of objects R carries about its finite superset of origin S as the rate of change in the structural complexity of S (as determined by its degree of categorical invariance) whenever the objects in R are removed from the set S. The measure captures deterministically the significant role that context and category structure play in determining the relative quantity and quality of subjective information conveyed by particular objects in multi-object stimuli.
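As a rough illustration of the quantity described in the abstract (not the article's exact formulation), the sketch below treats structural complexity as a user-supplied function and computes the relative change in that complexity when the objects in R are removed from S. The names `representational_information`, `structural_complexity`, and the toy complexity function are hypothetical placeholders; the article derives complexity from categorical invariance, whereas this stub only makes the example runnable.

```python
from typing import Callable, FrozenSet, Tuple

# An object is modeled here as a tuple of binary dimension values,
# e.g. (shape, color, size) coded as 0/1 -- an assumption for illustration.
Obj = Tuple[int, ...]


def representational_information(
    S: FrozenSet[Obj],
    R: FrozenSet[Obj],
    structural_complexity: Callable[[FrozenSet[Obj]], float],
) -> float:
    """Relative change in the structural complexity of S when R is removed.

    This mirrors the verbal definition in the abstract: the information that
    R carries about its superset of origin S is taken to be the change in
    S's structural complexity caused by deleting the objects in R, expressed
    here as a proportion of the original complexity (an illustrative choice,
    not the paper's exact measure).
    """
    if not R <= S:
        raise ValueError("R must be a subset of S")
    c_full = structural_complexity(S)
    c_reduced = structural_complexity(S - R)
    if c_full == 0:
        return 0.0
    return (c_full - c_reduced) / c_full


def toy_complexity(category: FrozenSet[Obj]) -> float:
    # Demonstration stub only: uses set size as a stand-in for the
    # invariance-based structural complexity developed in the article.
    return float(len(category))


if __name__ == "__main__":
    S = frozenset({(0, 0), (0, 1), (1, 0)})
    R = frozenset({(0, 1)})
    # With the toy complexity, removing one of three objects yields ~0.333.
    print(representational_information(S, R, toy_complexity))
```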

Item Type: Journal (Paginated)
Keywords: Representational Information, Concepts, Invariance, Complexity, Information measure, Subjective information
Subjects: Psychology > Applied Cognitive Psychology
Computer Science > Artificial Intelligence
Computer Science > Complexity Theory
Computer Science > Robotics
Psychology > Perceptual Cognitive Psychology
Psychology > Psychophysics
ID Code: 7961
Deposited By: Zeigler, Derek
Deposited On: 09 Nov 2012 17:47
Last Modified: 09 Nov 2012 17:47

References in Article

[1] H.H. Aiken, The Staff of the Computation Laboratory at Harvard University, Synthesis of Electronic Computing and Control Circuits, Harvard University Press, Cambridge, 1951.

[2] David Applebaum, Probability and Information, Cambridge University Press, 1996.

[3] C.M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.

[4] L.E. Bourne, Human Conceptual Behavior, Allyn and Bacon, Boston, 1966.

[5] N. Cowan, The magical number 4 in short-term memory: a reconsideration of mental storage capacity, Behavioral and Brain Sciences 24 (2001) 87–185.

[6] K. Devlin, Claude Shannon, 1916–2001, Focus: The Newsletter of the Mathematical Association of America 21 (2001) 20–21.

[7] K. Devlin, Logic and Information, Cambridge University Press, 1991.

[8] W.K. Estes, Classification and cognition, Oxford Psychology Series, vol. 22, Oxford University Press, Oxford, 1994.

[9] J. Feldman, A catalog of Boolean concepts, Journal of Mathematical Psychology 47 (1) (2003) 98–112.

[10] D. Fisch, B. Kühbeck, B. Sick, S.J. Ovaska, So near and yet so far: new insight into properties of some well-known classifier paradigms, Information Sciences 180 (2010) 3381–3401.

[11] W.R. Garner, The Processing of Information and Structure, Wiley, New York, 1974.

[12] S. Guadarrama, A. Ruiz-Mayor, Approximate robotic mapping from sonar data by modeling perceptions with antonyms, Information Sciences 180 (2010) 4164–4188.

[13] R.V.L. Hartley, Transmission of information, Bell System Technical Journal (1928) 535–563.

[14] J.P. Hayes, Introduction to Digital Logic Design, Addison-Wesley, 1993.

[15] R.A. Higonnet, R.A. Grea, Logical Design of Electrical Circuits, McGraw-Hill, New York, 1958.

[16] E. Horowitz, S. Sahni, Computing partitions with applications to the Knapsack problem, JACM 21 (2) (1974) 277–292.

[17] J.K. Kruschke, ALCOVE: an exemplar-based connectionist model of category learning, Psychological Review 99 (1992) 22–44.

[18] W. Lenski, Information: a conceptual investigation, Information 1 (2) (2010) 74–118.

[19] R.D. Luce, Whatever happened to information theory in psychology?, Review of General Psychology 7 (2) (2003) 183–188.

[20] G.L. Murphy, The Big Book of Concepts, MIT Press, 2002.

[21] R.M. Nosofsky, M.A. Gluck, T.J. Palmeri, S.C. McKinley, P.G. Glauthier, Comparing models of rule-based classification learning: a replication and extension of Shepard, Hovland, and Jenkins (1961), Memory and Cognition 22 (3) (1994) 352–369.

[22] R. Seising, Cybernetics, system(s) theory, information theory and fuzzy sets and systems in the 1950s and 1960s, Information Sciences 180 (2010) 4459–4476.

[23] C.E. Shannon, A mathematical theory of communication, Bell System Technical Journal 27 (1948) 379–423, 623–656.

[24] C.E. Shannon, W. Weaver, The mathematical theory of communication, University of Illinois Press, Urbana, 1949.

[25] R.N. Shepard, C.L. Hovland, H.M. Jenkins, Learning and memorization of classifications, Psychological Monographs: General and Applied 75 (13) (1961) 1–42.

[26] B. Skyrms, The flow of information in signaling games, Philosophical Studies 147 (2010) 155–165.

[27] F.S. Tseng, Y. Kuo, Y. Huang, Toward boosting distributed association rule mining by data de-clustering, Information Sciences 180 (2010) 4459–4476.

[28] R. Vigo, A note on the complexity of Boolean concepts, Journal of Mathematical Psychology 50 (5) (2006) 501–510.

[29] R. Vigo, Modal similarity, Journal of Experimental & Theoretical Artificial Intelligence 21 (3) (2009) 181–196.

[30] R. Vigo, Categorical invariance and structural complexity in human concept learning, Journal of Mathematical Psychology 53 (4) (2009) 203–221.

[31] R. Vigo, A dialogue on concepts, Think 9 (24) (2010) 109–120.

[32] R. Vigo, Towards a law of invariance in human conceptual behavior, in: L. Carlson, C. Hölscher, T. Shipley (Eds.), Proceedings of the 33rd Annual Conference of the Cognitive Science Society, Cognitive Science Society, Austin, TX (in press).
