Cogprints

Computation on Information, Meaning and Representations. An Evolutionary Approach

Menant, Mr Christophe (2011) Computation on Information, Meaning and Representations. An Evolutionary Approach. [Book Chapter]

Full text available as:

HTML - Published Version (25Kb)

Abstract

Understanding computation as “a process of the dynamic change of information” leads us to look at the different types of computation and information. Computation of information does not exist by itself; it is to be considered as part of a system that uses it for some given purpose. Information can be meaningless, like a thunderstorm noise; it can be meaningful, like an alert signal or the representation of a desired food. A thunderstorm noise contributes to the generation of meaningful information about coming rain. An alert signal has a meaning in that it allows a safety constraint to be satisfied. The representation of a desired food contributes to the satisfaction of some metabolic constraints of the organism. Computations on information and representations differ in nature and in complexity because the systems that link them have different constraints to satisfy. Animals have survival constraints to satisfy. Humans have many specific constraints in addition. And computers compute what their designers and programmers ask for. We propose to analyze the different relations between information, meaning and representation by taking an evolutionary approach to the systems that link them. Such a bottom-up approach allows us to start with simple organisms and avoids an implicit focus on humans, the most complex and difficult case. To provide a common background usable for the many different cases, we use a systemic tool that defines the generation of meaningful information by and for a system submitted to a constraint [Menant, 2003]. This systemic tool allows us to position information, meaning and representations for systems relative to environmental entities in an evolutionary perspective. We begin by positioning the notions of information, meaning and representation, and recall the characteristics of the Meaning Generator System (MGS) that links a system submitted to a constraint to its environment.
We then use the MGS for animals and highlight the network nature of the interrelated meanings about an entity of the environment. This brings us to define the representation of an item for an agent as the network of meanings relative to the item for the agent. Such meaningful representations embed agents in their environments and are far from Good Old-Fashioned Artificial Intelligence (GOFAI) representations. The MGS approach is then applied to humans, with a limitation resulting from the unknown nature of human consciousness. Applying the MGS to artificial systems leads us to look for compatibilities with different approaches to Artificial Intelligence (AI), such as embodied-situated AI, the Guidance Theory of Representations, and enactive AI. Concerns relative to different types of autonomy and to organic versus artificial constraints are highlighted. We finish by summarizing the points addressed and by proposing some continuations.
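The core mechanism summarized in the abstract — a system submitted to a constraint generates meaningful information from received information, and that meaning connects to an action satisfying the constraint — can be sketched in code. The sketch below is only an illustration of that description, not an implementation from the chapter; the names (`Constraint`, `Meaning`, `MeaningGeneratorSystem`, `generate_meaning`) and the example signals are hypothetical.

```python
# Illustrative sketch of the Meaning Generator System (MGS) idea:
# a system submitted to a constraint receives information and generates
# a meaning only when that information is connected to the satisfaction
# of the constraint. All names here are hypothetical.

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Constraint:
    """A constraint the system must satisfy (e.g. 'stay alive')."""
    name: str
    # True when a piece of incident information is connected to
    # the satisfaction of this constraint.
    is_relevant: Callable[[str], bool]


@dataclass
class Meaning:
    """Meaningful information: the link between received information,
    the constraint, and the action it calls for."""
    information: str
    constraint: str
    action: str


class MeaningGeneratorSystem:
    def __init__(self, constraint: Constraint,
                 action_for: Callable[[str], str]):
        self.constraint = constraint
        self.action_for = action_for

    def generate_meaning(self, information: str) -> Optional[Meaning]:
        # Information unrelated to the constraint stays meaningless
        # for this system (like thunderstorm noise for a stone).
        if not self.constraint.is_relevant(information):
            return None
        return Meaning(information, self.constraint.name,
                       self.action_for(information))


# Usage: a simple organism with a 'stay alive' constraint.
stay_alive = Constraint(
    "stay alive",
    is_relevant=lambda info: info in {"acid detected", "food detected"},
)
organism = MeaningGeneratorSystem(
    stay_alive,
    action_for=lambda info: "move away" if "acid" in info else "move toward",
)

print(organism.generate_meaning("thunderstorm noise"))        # None: meaningless here
print(organism.generate_meaning("acid detected").action)      # move away
```

The point the sketch makes is the one the abstract makes: the same incident information is meaningless for one system and meaningful for another, depending entirely on the constraint the system has to satisfy.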

Item Type: Book Chapter
Additional Information: Paragraphs A.1 Information and meaning. Meaning generation. A.1.1 Information. Meaning of information and quantity of information. A.1.2 Meaningful information and constraint satisfaction. A systemic approach. A.2 Information, meaning and representations. An evolutionary approach. A.2.1 Stay alive constraint and meaning generation for organisms. A.2.2 The Meaning Generator System (MGS). A systemic and evolutionary approach. A.2.3 Meaning transmission. A.2.4 Individual and species constraints. Group life constraints. Networks of meanings. A.2.5 From meaningful information to meaningful representations. A.3 Meaningful information and representations in humans. A.4 Meaningful information and representations in artificial systems. A.4.1 Meaningful information and representations from traditional AI to Nouvelle AI. Embodied-situated AI. A.4.2 Meaningful representations versus the Guidance Theory of Representation. A.4.3 Meaningful information and representations versus the enactive approach. A.5 Conclusion and continuation. A.5.1 Conclusion. A.5.2 Continuation
Keywords: Information, meaning, constraint, representation, evolution, Peirce, enaction
Subjects: Biology > Evolution
Computer Science > Artificial Intelligence
Philosophy > Philosophy of Mind
ID Code: 7676
Deposited By: Menant, Mr Christophe
Deposited On: 27 Oct 2011 01:33
Last Modified: 27 Oct 2011 01:33

References in Article


Anderson, M. (2005). Representation, evolution and embodiment; Institute for Advanced Computer Studies. University of Maryland. http://cogprints.org/3947/

Anderson, M. and Rosenberg, G. (2008). Content and Action: The Guidance Theory of Representation. The Institute of Mind and Behavior, Inc. The Journal of Mind and Behavior Winter and Spring 2008, Volume 29, Numbers 1 and 2. Pages 55–86. ISSN 0271–0137

Block, N. (2002). Some Concepts of Consciousness. In Philosophy of Mind: Classical and Contemporary Readings, David Chalmers (ed.) Oxford University Press.

Brooks, R. (1991a). Intelligence without representation. Artificial Intelligence, 47, 139-159 (received in 1987).

Brooks, R. (1991b). New Approaches to Robotics. Science, 13 September 1991, Vol. 253, No. 5025, pp. 1227-1232.

Brooks, R. (2001). The relationship between matter and life. Nature, Vol. 409, 18 January 2001.

Depraz, N. (2007). Phenomenology and Enaction. Summer school: Cognitive sciences and Enaction. Fréjus, 5-12 September 2007.

Di Paolo, E. (2003), Organismically-inspired robotics: homeostatic adaptation and teleology beyond the closed sensori-motor loop. In: K. Murase & T. Asakura (eds.), Dynamical Systems Approach to Embodiment and Sociality, Adelaide, Australia: Advanced Knowledge International, pp. 19-42

Di Paolo, E. (2005), Autopoiesis, adaptivity, teleology, agency, Phenomenology and the Cognitive Sciences, 4(4), pp. 429-452

Di Paolo, E., Rohde, M. and De Jaegher, H. (2007). Horizons for the Enactive Mind: Values, Social Interaction, and Play. CSRP 587, April 2007, ISSN 1350-3162. Cognitive Science Research Papers.

Dreyfus H. (2007). Why Heideggerian AI Failed and how Fixing it would Require making it more Heideggerian. Philosophical Psychology, 20(2), pp. 247-268.

Floridi, L. (2003). From data to semantic information. Entropy, 2003, 5, 125-145. http://www.mdpi.org/entropy/papers/e5020125.pdf

Froese, T. (2007), “On the role of AI in the ongoing paradigm shift within the cognitive sciences”; In: M. Lungarella et al. (eds.), Proc. of the 50th Anniversary Summit of Artificial Intelligence, Berlin, Germany: Springer Verlag, in press

Froese, T., Virgo, N. and Izquierdo, E. (2007). Autonomy: a review and reappraisal. University of Sussex research paper. ISSN 1350-3162.

Froese, T. and Ziemke, T. (2009). Enactive artificial intelligence: Investigating the systemic organization of life and mind. Artificial Intelligence, Volume 173 , Issue 3-4 (March 2009), 466-500.

Harnad, S. (1990). The Symbol Grounding Problem. Physica D, 42, 335-346.

Haugeland, J. (1989) Artificial Intelligence, the very idea, 7th Ed. (MIT Press, USA)

Lieberman, P. (2006) Toward an evolutionary biology of language. Harvard University Press, 2006. ISBN 0674021843, 9780674021846

Menant, C. (2003) Information and meaning. Entropy, 2003, 5, 193-204. http://www.mdpi.org/entropy/papers/e5020193.pdf

Menant, C. (2005). Information and Meaning in Life, Humans and Robots. Foundations of Information Sciences presentation, Paris, 2005. http://www.mdpi.org/fis2005/F.45.paper.pdf

Menant, C. (2006a). Evolution of Representations. From Basic Life to Self-representation and Self-consciousness. TSC 2006 poster. http://cogprints.org/4843/

Menant, C. (2006b). Evolution of Representations and Intersubjectivity as sources of the Self. An Introduction to the Nature of Self-Consciousness. ASSC 10 poster. http://cogprints.org/4957/

Menant, C. (2008). Evolution as connecting first-person and third-person perspectives of consciousness. ASSC 12 poster. http://cogprints.org/6120/

Newell, A. and Simon, H. (1976). Computer Science as Empirical Inquiry: Symbols and Search. Communications of the ACM, 19.

Queiroz, J. and El-Hani, C. (2006). Semiosis as an Emergent Process. Transactions of the Charles S. Peirce Society, Vol. 42, No. 1.

Searle, J. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3(3), 417-457.

Shannon, C. (1948). A mathematical theory of communication. Bell System Technical Journal, vol. 27.

Sharov, A. (1998). What is Biosemiotics? http://home.comcast.net/~sharov/biosem/geninfo.html#summary

Torrance, S. (2005). In search of the enactive: Introduction to special issue on Enactive Experience. Phenomenology and the Cognitive Sciences, 4(4), December 2005, pp. 357-368.

Varela, F., Thompson, E. and Rosch, E. (1991). The Embodied Mind: Cognitive Science and Human Experience. Cambridge, MA: MIT Press.

Vernon, D. and Furlong, D. (2007). Philosophical Foundations of AI. In M. Lungarella et al. (Eds.): 50 Years of AI, Festschrift, LNAI 4850, pp. 53-62. Springer-Verlag, Berlin Heidelberg, 2007. http://www.robotcub.org/misc/papers/07_Vernon_Furlong_AI50.pdf

Ziemke, T. and Sharkey, N. (2001). A stroll through the worlds of robots and animals: Applying Jakob von Uexküll’s theory of meaning to adaptive robots and artificial life. Semiotica, 134(1-4), 701-746.
