Cogprints: no conditions; results ordered by descending date, then title. Retrieved 2018-01-17.

Sentence syntax trees should be made from morphemes. Semantically ordered trees
Some critique of the use of sentence parse trees in modern linguistics. Two propositions on constructing trees, as stated in the title. Introduction of an English-to-Tatar translator program being developed by the author. Precedence by specificity.
Dinar Qurbanov <qdinar@gmail.com>. http://cogprints.org/id/eprint/9827 (deposited 2015-02-24).

Universal Grammar Is a Universal Grammar
Is Universal Grammar a universal grammar? From Chomsky's hierarchy we deduce that for each grammar there is a Turing machine, and conversely. Following this equivalence, it is immediate to conclude that a universal Turing machine is equivalent to a universal grammar. Meanwhile, in linguistics, Universal Grammar is the human brain circuitry that implements the faculty of language. So the definitive answer is achieved only when we show that the human brain is Turing complete, and that language uses this capability. So yes: Universal Grammar is a universal grammar, because the human brain circuitry that implements the faculty of language is Turing complete.
Ramón Casares <papa@ramoncasares.com>. http://cogprints.org/id/eprint/9229 (deposited 2014-05-10).

Resolution Machinery
The value of syntax is controversial:
some see syntax as defining us as a species, while for others it just facilitates communication. To assess syntax we investigate its relation to problem resolving. First we define a problem theory from first principles, and then we translate the theory's concepts into mathematics, obtaining the requirements that every resolution machine has to implement. Such a resolution machine will be able to execute any possible resolution, that is, any possible way of taking a problem expression and computing the problem's solutions. Two main requirements are found: 1) syntax is needed to express problems, that is, separate words are not enough; and 2) the resolution machine has to be as powerful as the lambda calculus, that is, it has to be Turing complete. Noting that every device that can generate any possible syntax, that is, any possible syntactically correct sentence of any possible language, natural or artificial, has to be Turing complete, we conclude that syntax and problem resolving can use the same components, for example sentences, functions, and conditionals. The implication for human evolution is that syntax and problem resolving should have co-evolved in humans towards Turing completeness.
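The claim that a resolution machine must be as powerful as the lambda calculus can be made concrete with a minimal evaluator. The sketch below is illustrative only; the term encoding and function names are my own, not the paper's. It implements normal-order beta reduction for the untyped lambda calculus and uses it to compute Church-numeral successors:

```python
# A minimal untyped lambda-calculus evaluator (normal-order beta reduction).
# Terms are tuples: ('var', x) | ('lam', x, body) | ('app', f, a)

import itertools

_fresh = itertools.count()  # counter for generating fresh variable names

def free_vars(t):
    tag = t[0]
    if tag == 'var':
        return {t[1]}
    if tag == 'lam':
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def subst(t, x, s):
    """Substitute s for free occurrences of x in t, avoiding variable capture."""
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'app':
        return ('app', subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:
        return t                      # x is shadowed by the binder; stop here
    if y in free_vars(s):             # rename the bound variable to avoid capture
        z = f'{y}_{next(_fresh)}'
        body = subst(body, y, ('var', z))
        y = z
    return ('lam', y, subst(body, x, s))

def step(t):
    """One leftmost-outermost reduction step, or None if t is in normal form."""
    tag = t[0]
    if tag == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':             # beta-redex: (\x. body) a  ->  body[x := a]
            return subst(f[2], f[1], a)
        f2 = step(f)
        if f2 is not None:
            return ('app', f2, a)
        a2 = step(a)
        if a2 is not None:
            return ('app', f, a2)
        return None
    if tag == 'lam':
        b2 = step(t[2])
        return None if b2 is None else ('lam', t[1], b2)
    return None

def reduce_normal(t, limit=1000):
    """Reduce t to normal form (bounded, since reduction need not terminate)."""
    for _ in range(limit):
        t2 = step(t)
        if t2 is None:
            return t
        t = t2
    raise RuntimeError('no normal form within limit')

# Church numerals: n = \f.\x. f^n x;  SUCC = \n.\f.\x. f (n f x)
ZERO = ('lam', 'f', ('lam', 'x', ('var', 'x')))
SUCC = ('lam', 'n', ('lam', 'f', ('lam', 'x',
        ('app', ('var', 'f'),
         ('app', ('app', ('var', 'n'), ('var', 'f')), ('var', 'x'))))))

ONE = reduce_normal(('app', SUCC, ZERO))
TWO = reduce_normal(('app', SUCC, ONE))
```

Since Church numerals, pairing, and fixed-point combinators are all expressible as plain terms, an interpreter of this size already meets the Turing-completeness requirement the abstract describes.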
Ramón Casares <papa@ramoncasares.com>. http://cogprints.org/id/eprint/9209 (deposited 2014-03-11).

Can Internalism and Externalism be Reconciled in a Biological Epistemology of Language?
This paper is an attempt at exploring the possibility of reconciling the two interpretations of biolinguistics recently projected by Koster (Biolinguistics 3(1):61–92, 2009). The two interpretations—trivial and nontrivial—can be roughly construed as non-internalist and internalist conceptions of biolinguistics, respectively. The internalist approach boils down to a conception of language in which language, as a mental grammar in the form of I-language, grows and functions like a biological organ. On the other hand, under a construal consistent with Koster's, the non-internalist version does not necessarily have to be externalist in nature; rather, it is a matter of mutual reinforcement of biology and culture under the rubric of a co-evolutionary dynamics. Here it will be argued that the apparent dichotomy between these two conceptions of biolinguistics can perhaps be resolved if we have a richer synthesis that accounts for both internalism and non-internalism.
Prakash Mondal <mndlprksh@yahoo.co.in>. http://cogprints.org/id/eprint/7708 (deposited 2011-12-16).

From Domains Towards a Logic of Universals: A Small Calculus for the Continuous Determination of Worlds
At the end of the 19th century, 'logic' moved from the discipline of philosophy to that of mathematics. One hundred years later, we have a plethora of formal logics. Looking at the situation from informatics, the mathematical discipline proved only a temporary shelter for 'logic'.
For there is Domain Theory, a constructive mathematical theory which extends the notion of computability into the continuum and spans the field of all possible deductive systems. Domain Theory describes the space of data types which computers can ideally compute, and computation in terms of these types. Domain Theory is constructive but only potentially operational. Here one particular operational model is derived from Domain Theory which consists of 'universals', that is, model-independent operands and operators. With these universals, Domains (logical models) can be approximated and continuously determined. The universal data types and rules derived from Domain Theory relate strongly to the first formal logic conceived on philosophical grounds, Aristotelian (categorical) logic. This is no accident. For Aristotle, deduction was type-dependent, and he too thought in terms of type-independent universal 'essences'. This paper initiates the next 'logical' step 'beyond' Domain Theory by reconnecting 'formal logic' with its origin.
Dr. Claus Brillowski <brillowski@logike.info>. http://cogprints.org/id/eprint/6948 (deposited 2010-09-13).

Teacher's Book for Body Image
This is the teacher's book that accompanies the Body Image unit.
It contains the answers to the activities, as well as a rationale and suggestions for classroom activities.
Miss Victoria L Clark <victoria.clark@gmail.com>. http://cogprints.org/id/eprint/6930 (deposited 2010-08-11).

Teaching Body Image to EFL Teenagers
An extract of an Upper-Intermediate EFL coursebook for teenage learners, designed in partial fulfilment of the requirements for an MA in Applied Linguistics & English Language Teaching.
The material is centred on the topic of 'Body Image' and includes a focus on learner training; infinitives and gerunds; skimming and scanning reading tasks; intensive listening practice; giving opinions and speculating; and rhyming words.
Miss Victoria L Clark <victoria.clark@gmail.com>. http://cogprints.org/id/eprint/6926 (deposited 2010-08-11).

Why and How the Problem of the Evolution of Universal Grammar (UG) is Hard
Universal Grammar (UG) is a complicated set of grammatical rules that underlies our grammatical capacity. We all follow the rules of UG, but we were never taught them, and we could not have learned them from trial-and-error experience either (not enough data, or time). So UG must be inborn. But for similar reasons, it seems implausible that UG was "learned" by trial-and-error evolution either: What was the variation and competition? And what were UG's adaptive advantages? So this leaves the hard problem of explaining where our brain's UG capacity came from. Christiansen & Chater (C&C) suggest an answer: Language is an organism, like us, and our brains were not selected for UG capacity; rather, languages were selected for learnability with minimal trial-and-error experience by our brains. This explanation is circular: Where did our brains' selective capacity to learn all and only UG-compliant languages come from? Chomsky suggests it might be a combination of optimality and logical necessity.
Stevan Harnad <nielsseh@hotmail.com>. http://cogprints.org/id/eprint/6008 (deposited 2008-04-07).

Cognitive Linguistics and the Evolution of Body and Soul
in the Western World: from Ancient Hebrew to Modern English
A philological and comparative analysis of the lexical items concerning personhood in Ancient Hebrew, Ancient Greek and Modern English reveals semantic shifts in the corresponding lexical concepts. Ancient Hebrew presents an essentially holistic idea of personhood, whereas, via Biblical translations and Greek philosophical influences, the Western World has conceptualized humans as being dualistic in nature. I analyze the polysemy and semantic shifts in the lexicon used for "body" and "soul" in Ancient Hebrew and Ancient Greek, the two linguistic systems known by St. Paul of Tarsus, then confront them with Paul's usage context, and finally with Modern English, hypothesizing a possible case of linguistic relativity.
Vito Evola <evola@unipa.it>. http://cogprints.org/id/eprint/6116 (deposited 2008-07-15).

Type logic served by co-Merge, Merge and Move: an account for sluicing and questions of 'common European' and Japanese types
We explore the power of type-logical grammar as a linguistic theory, specifically of a new tentative development inside the framework—a "symmetricized" Lambek Calculus due to [Moortgat 2005]. The basis for our discussion is an account we give for constructions involving questions and, in particular, sluicing; it seeks to solve the puzzles these constructions have set for linguistic theory. Two things in the organization of grammar are of interest here: first, a uniform system joining structures from the surface (syntactic) side and structures from the "mind side" (discourse)—we call MERGE and co-MERGE the relations by which the former and the latter structures are arranged; second, a view on the circumstances of performing MOVE (by Syntax) from the type-logical perspective. As is usual for type-logical grammars, the theory is conscious of semantics.
We refer to examples from Japanese, on one side, and from English and Russian, on the other.
Ivan Zakharyaschev <imz@altlinux.org>. http://cogprints.org/id/eprint/8713 (deposited 2012-11-09).

Regimes in Babel are Confirmed: Report on Findings in Several Indonesian Ethnic Biblical Texts
The paper reports the presence of three statistical regimes in the Zipfian analysis of texts in quantitative linguistics: the Mandelbrot, original Zipf, and Cancho-Solé-Montemurro regimes. The work is carried out over nine semantically equivalent texts: the Bible in several Indonesian ethnic languages and in the national language, with the English version of the Bible analyzed for reference. The existence of the three regimes is confirmed, and the length of the texts emerges as an important issue. We outline further work on the parameterization used to analyze the three regimes, and on explaining them more broadly, especially the microstructure of language in human decision-making or linguistic effort, from which their robustness emerges.
Hokky Situngkir. http://cogprints.org/id/eprint/5482 (deposited 2007-04-04).

An Observational Framework to the Zipfian Analysis among Different Languages: Studies to Indonesian Ethnic Biblical Texts
The paper introduces the use of Zipfian statistics to observe human languages using corpora with the same meaning but different grammatical and structural utterances. We use biblical texts, since they constitute corpora that have been most widely and carefully translated into many languages. The idea is to reduce the possibility of noise arising from differences in meaning between texts in distinct languages. The result is that the robustness of Zipf's law is observable, and some statistical differences are discovered between English, the widely used national language, and several ethnic languages of Indonesia. The paper ends by modestly proposing a further possible framework for interdisciplinary approaches to human language evolution.
Hokky Situngkir. http://cogprints.org/id/eprint/5481 (deposited 2007-04-04).

St. Paul's Error: The Semantic Changes of BODY and SOUL in the Western World
Historically Christianity owes much to Judaism. St. Paul's Christianity, however, changed the way of thinking of many of the first Jews because of a new way of reasoning about selfhood, the human body, and human cognition. Without wanting to treat certain theological concepts, I want to underline how modern science's view of the person is closer to traditional Judaism than it is to Christianity, and how Paul's "error" was diffused throughout the Western world, by analyzing the semantics of linguistic references to the body, the soul, and emotions.
What was St. Paul's error? The question is meant to be both allusive and provocative. He was born Saul in the city of Tarsus, in modern Turkey, at the height of its splendour as a Greco-Roman city. Paul grew up as a "free man", that is, as a Roman citizen in a cosmopolitan environment. He is considered the most influential and prolific witness to Christian thought throughout Asia Minor and Western Europe. His epistles circulated during his lifetime and continue to influence millions of followers, who often interpret his thoughts in contrasting ways but nonetheless attest to his authority.
An erudite Greco-Roman and initially a persecutor of the first Christians, Paul battled to spread the story of Jesus of Nazareth. His ideology, indeed, is a blend of Greco-Roman thought and of what he learned from the first Christians. The Hellenic characteristics of his faith created a divergence from traditional Judaic thought within what was to become, through his influence, the Christian creed. As a matter of fact, Christianity came to have a more coherent structure because of Paul, and Christian belief in a way owes more to Paul's thought than to Jesus'.
Jewish teaching about selfhood was quite holistic. The Hebrew word nephesh is often translated as "soul" but also means "body", whereas Paul clearly distinguishes the two, talking about a co-existence, "concupiscence", and the necessity of dominating the body to exalt the spirit. I will examine the semantic changes in words dealing with body and soul, and how Paul's authority eventually influenced the Western world's way of reasoning about these concepts.
Vito Evola <evola@unipa.it>. http://cogprints.org/id/eprint/6118 (deposited 2008-07-15).

Polisemia e slittamenti semantici nei concetti ANIMA e CORPO nel mondo occidentale, ovvero l'errore di San Paolo [Polysemy and semantic shifts in the concepts SOUL and BODY in the Western world, or St. Paul's error]
Historically, Christianity owes much to Judaism. St. Paul's Christianity, however, changed the way of reasoning about concepts such as the self, the body, and human cognition. Without wanting to treat certain theological concepts, I aim to underline how the viewpoint of modern science is closer to traditional Judaism than to Christianity, and to explain the diffusion of Paul's "error" in the Western world, by analyzing the semantics of the linguistic references (and in particular the metaphors and metonymies) to the concepts of soul and body and of their relation to the conception of the self.
Raised a "free man", that is, a Roman citizen in a cosmopolitan environment, Paul is considered the most influential and prolific witness to Christian thought in Asia Minor and Western Europe. His epistles circulated during his lifetime and continue to influence billions of followers, who often interpret his ideas in contrasting ways, while nonetheless attesting to a specific authority.
An erudite Greco-Roman, initially a persecutor of the first Christians, Paul battled to spread the story of Jesus of Nazareth. His ideology, indeed, has been seen by many as an amalgam of Greco-Roman thought and what he himself learned from the first Christians. These Hellenic characteristics of the religious system, more or less real, introduced a significant difference within traditional Judaic thought, out of which, through the influence of his writings, the Christian creed would develop. In fact, Christianity acquired a more coherent structure thanks to Paul, almost to the point of inferring that the Christian faith owes more to Paul than to Jesus.
What was St. Paul's error? The question is meant to be both allusive and provocative. Judaic teaching about the concept of the self was rather holistic. For example, the Hebrew word nephesh is often translated as "soul" but, metaphorically, also means "body", whereas, according to his interpreters, Paul clearly draws dualistic distinctions and, speaking of "concupiscence", preaches the necessity of dominating the flesh to exalt the spirit. With the tools of cognitive linguistics, I propose an analysis of the polysemy and semantic shifts in the concepts of SOUL and BODY, and of how the authority attributed to Paul ultimately influenced Western thinking about these mental representations.
Vito Evola <evola@unipa.it>. http://cogprints.org/id/eprint/5269 (deposited 2006-12-08).

Complex sentence as a structure for representing knowledge
Structural variations involving both morphological and syntactic features of the complex sentence of the type "When S, S", and their relevance for the interpretation of sentence meaning, are analyzed. It is hypothesized that the constraints on certain sequences intuitively felt by native speakers are due to semantic contradictions that arise between the indexical content of verbal tense and aspect and the syntactic structure of the sentence, which iconically reflects the cognitive processing of perceptual data. The cognitive value of different syntactically acceptable sequences is assessed from the point of view of the relationship between the morphosyntactic categories of tense and aspect and sentence iconicity.
Prof. A.V. Kravchenko and Dr. J.B. Zelberg. http://cogprints.org/id/eprint/4004 (deposited 2004-12-28).

P-model Alternative to the T-model
Standard linguistic analysis of syntax uses the T-model. This model
requires the ordering D-structure > S-structure > LF, where D-structure is the deep structure, S-structure is the surface structure, and LF is logical form. Between each of these representations there is movement which alters the order of the constituent words; movement is achieved using the principles and parameters of syntactic theory. Psychological analysis of sentence production is usually either serial or connectionist. Psychological serial models do not accommodate the T-model immediately, so here a new model called the P-model is introduced. The P-model is different from previous linguistic and psychological models. Here it is argued that the LF representation should be replaced by a variant of Frege's three qualities (sense, reference, and force), called the Frege representation or F-representation. In the F-representation the order of elements is not necessarily the same as that in LF, and it is suggested that the correct ordering is F-representation > D-structure > S-structure. This ordering appears to lead to a more natural view of sentence production and processing. Within this framework, movement originates as the outcome of emphasis applied to the sentence. The requirement that the F-representation precedes the D-structure needs a picture of the particular principles and parameters which pertain to movement of words between representations. In general this would imply that there is a preferred or optimal ordering of the symbolic string in the F-representation. The standard ordering is retained because the general way of producing such an optimal ordering is unclear. In this case it is possible to produce an analysis of movement between LF and D-structure similar to the usual analysis of movement between S-structure and LF. It is suggested that a maximal amount of information about a language's grammar and lexicon is stored, because of the necessity of analyzing corrupted data.
Mark D. Roberts. http://cogprints.org/id/eprint/5432 (deposited 2007-03-07).

A Bi-Polar Theory of Nominal and Clause Structure and Function
It is taken as axiomatic that grammar encodes meaning. Two key dimensions of meaning that get grammatically encoded are referential meaning and relational meaning. The key claim is that, in English, these two dimensions of meaning are typically encoded in distinct grammatical poles—a referential pole and a relational pole—with a specifier functioning as the locus of the referential pole and a head functioning as the locus of the relational pole. Specifiers and heads combine to form referring expressions corresponding to the syntactic notion of a maximal projection. Lexical items and expressions functioning as modifiers are preferentially attracted to one pole or the other. If the head of an expression describes a relation, one or more complements may be associated with the head. The four grammatical functions specifier, head, modifier and complement are generally adequate to represent much of the basic structure and function of nominals and clauses. These terms are borrowed from X-Bar Theory, but they are motivated on semantic grounds having to do with their grammatical function of encoding referential and relational meaning.
Dr. Jerry T. Ball. http://cogprints.org/id/eprint/4351 (deposited 2005-05-14).

Frequency Value Grammar and Information Theory
I previously laid the groundwork for Frequency Value Grammar (FVG) in papers submitted to the proceedings of the 4th International Conference on Cognitive Science (2003), Sydney, Australia, and the Corpus Linguistics Conference (2003), Lancaster, UK. FVG is a formal syntax theoretically based in large part on Information Theory principles.
FVG relies on dynamic physical principles external to the corpus which shape and mould the corpus, whereas generative grammar and other formal syntactic theories are based exclusively on patterns (fractals) found occurring within the well-formed portion of the corpus. However, FVG should not be confused with Probability Syntax (PS), as described by Manning (2003). PS is a corpus-based approach that will yield the probability distribution of possible syntax constructions over a fixed corpus. PS makes no distinction between well-formed and ill-formed sentence constructions and assumes everything found in the corpus is well formed. In contrast, FVG's primary objective is to distinguish between well-formed and ill-formed sentence constructions and, in so doing, relies on corpus-based parameters which determine sentence competency. In PS, a syntax of high probability will not necessarily yield a well-formed sentence. However, in FVG, a syntax or sentence construction of high 'frequency value' will yield a well-formed sentence at least 95% of the time, satisfying most empirical standards. Moreover, in FVG, a sentence construction of high 'frequency value' could very well be represented by an underlying syntactic construction of low probability as determined by PS. The characteristic 'frequency values' calculated in FVG are not measures of probability but rather are fundamentally determined values derived from exogenous principles which impact and determine corpus-based parameters serving as an index of sentence competency. The theoretical framework of FVG has broad applications beyond formal syntax and NLP. In this paper, I will demonstrate how FVG can be used as a model for improving the upper-bound calculation of the entropy of written English.
Generally speaking, when a function word precedes an open class word, the backward n-gram analysis will be homomorphic with the information source and will result in frequency values more representative of co-occurrences in the information source.
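The entropy-bound discussion above builds on the standard n-gram estimate of the entropy of written text. The sketch below is a generic illustration of that baseline, not Stepak's FVG calculation; the function name is my own. It computes the empirical conditional entropy of a character given its n-1 predecessors, the usual upper bound that such models aim to tighten:

```python
# Character n-gram cross-entropy estimate (bits per character): the empirical
# conditional entropy H(X_n | X_1..X_{n-1}) computed from n-gram counts.

import math
from collections import Counter

def ngram_entropy_bound(text, n=3):
    """Upper-bound entropy estimate of `text` from an order-(n-1) model."""
    if len(text) < n:
        raise ValueError('text shorter than n')
    # Count every n-gram and every (n-1)-character context it extends.
    ngrams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    contexts = Counter(text[i:i + n - 1] for i in range(len(text) - n + 1))
    total = sum(ngrams.values())
    h = 0.0
    for g, c in ngrams.items():
        p_joint = c / total               # P(context, next char)
        p_cond = c / contexts[g[:-1]]     # P(next char | context)
        h -= p_joint * math.log2(p_cond)
    return h

sample = "the quick brown fox jumps over the lazy dog " * 20
print(round(ngram_entropy_bound(sample, n=3), 3))
```

A fully deterministic text (e.g. "ababab...") gives a bound of 0 bits per character; real English text gives a positive value that shrinks as n grows, which is the quantity an improved model would drive down further.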
Asa M Stepak. http://cogprints.org/id/eprint/3657 (deposited 2004-06-05).

Short remarks on the hierarchical structure recently found in all languages by Guglielmo Cinque
It is argued that structural properties shared by a large number of studied languages cannot be a mere result of coincidence but must have been acquired ("learnt") and proved useful in the course of evolution. Bodily and cognitive boundary conditions and limitations, as well as efficiency requirements, are claimed to lie at the basis of the observed regularities.
Dr. Knud Thomsen. http://cogprints.org/id/eprint/2482 (deposited 2002-10-10).

Simple principles for a complex output: An experiment in early syntactic development
A set of iterative mechanisms, the Three-Step Algorithm, is proposed to account for the burst in the syntactic capacities of children over age two. These mechanisms are based on the children's perception, memory, elementary rule-like behavior and cognitive capacities, and do not require any specific innate grammatical capacities. The relevance of the Three-Step Algorithm is tested using the large Manchester corpus in the CHILDES database. The results show that 80% of the utterances can be exactly reconstructed and that, when incomplete reconstructions are taken into account, 94% of all utterances are reconstructed.
The Three-Step Algorithm should be followed by the progressive acquisition of syntactic categories and the use of slot-and-frame structures, which lead to a greater and more complex linguistic mastery.
Christophe Parisse. http://cogprints.org/id/eprint/2086 (deposited 2002-02-25).

The immune system and other cognitive systems
In the following pages we propose a theory of cognitive systems and the common strategies of perception which are at the basis of their function. We demonstrate that these strategies are easily seen to be in place in known cognitive systems such as vision and language. Furthermore, we show that taking these strategies into consideration implies a new outlook on immune function, calling for a new appraisal of the immune system as a cognitive system.
Uri Hershberg and Sol Efroni. http://cogprints.org/id/eprint/2276 (deposited 2002-06-16).

Review of A. Cormack's "Definitions"
Review of Cormack, Annabel. 1998. Definitions: Implications for Syntax, Semantics, and the Language of Thought. New York: Garland.
Richard Horsey. http://cogprints.org/id/eprint/3256 (deposited 2003-10-29).

ERP analysis of cognitive sequencing: a left-anterior negativity related to structural transformation processing
A major objective of cognitive neuroscience is to identify those neurocomputational processes that may be shared by multiple cognitive functions versus those that are highly specific. This problem of identifying general versus specialized functions is of particular interest in the domain of language processing.
Within this domain, event-related brain potential (ERP) studies have demonstrated a left anterior negativity (LAN) in a range of 300 to 700 ms, associated with syntactic processing and often linked to grammatical function words. These words have little or no semantic content, but rather play a role in encoding the syntactic structure required for parsing. In the current study we test the hypothesis that the LAN reflects the operation of a more general sequence-processing capability in which special symbols encode structural information that, when combined with past elements in the sequence, allows the prediction of successor elements. We recorded ERPs during a non-linguistic sequencing task that required subjects (n = 10) to process special symbols possessing the functional property defined above. When compared to ERPs in a control condition, function-symbol processing elicits a left anterior negative shift with temporal and spatial characteristics quite similar to the LAN described during function-word processing in language, supporting our hypothesis. These results are discussed in the context of related studies of syntactic and cognitive sequence processing.
Michel Hoen and Peter Ford Dominey. http://cogprints.org/id/eprint/2177 (deposited 2002-04-12).

Constructional Tools as the Origin of Cognitive Capacities
It is argued that cognitive capacities can be understood as the outcome of the collective action of a set of agents created by tools that explore possible behaviours and train the agents to behave in such appropriate ways as may be discovered. The coherence of the whole system is assured by a combination of vetting the performance of new agents and dealing appropriately with any faults that the whole system may develop. This picture is shown to account for a range of cognitive capacities, including language.
Brian D. Josephson. http://cogprints.org/id/eprint/150 (deposited 2000-07-04).

Semantico-Phonetic Form: A Unitarianist Grammar
Ahmad R. Lotfi
Azad University at Esfahan (IRAN)
ABSTRACT
Semantico-Phonetic Form is a unitarianist theory of language in two different but inter-related senses: first, it assumes that the Conceptual-Intentional and Articulatory-Perceptual systems (responsible for semantic and phonetic interpretations respectively) access the data at one and the same level of interpretation; hence a single interface level--Semantico-Phonetic Form, SPF. Second, it is unitarianist in that (although it is still a formalist theory of language) it potentially permits the incorporation of both formalist and functionalist explanations in its formulation of the architecture of language.
Within the framework of Semantico-Phonetic Form, and as an alternative proposal to Chomsky's minimalist thesis of movement, the Pooled Features Hypothesis proposes that "movement" is the consequence of the way in which the language faculty is organised (rather than a simple "imperfection" of language). The computational system CHL for human language is considered to be economical in its selection of formal features from the lexicon so that if two LIs (to be introduced in the same derivation) happen to have some identical formal feature in common, the feature is selected only once but shared by the syntactic objects in the derivation. It follows that the objects in question must be as local in their relations as possible. The locality of relations as such, which is due to economy considerations, results in some kind of (bare) phrase structure with pooled features labelling the structural tree nodes that dominate the syntactic objects. Pooled features, in a sense, are structurally interpreted. Other features, i.e. those not pooled, will be interpreted at SPF.
KEY WORDS:
bare phrase structure, economy, faculty of language,
feature checking, feature sharing, formal features,
imperfections, lexicon, logical forms, minimalist syntax,
Semantico-Phonetic Form, strength, unitarianist theory
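The feature-pooling mechanism summarised in the abstract above can be sketched as set intersection. This is an illustrative sketch only: the lexical items, feature names, and the function below are hypothetical, not Lotfi's actual notation.

```python
# Illustrative sketch (hypothetical names, not Lotfi's formalism):
# if two lexical items entering the same derivation carry an identical
# formal feature, that feature is selected from the lexicon once and
# shared ("pooled"); the pooled feature labels the node dominating both
# syntactic objects, while non-pooled features are interpreted at SPF.

def pool_features(li_a, li_b):
    """Split two feature sets into (pooled, rest_a, rest_b)."""
    pooled = li_a & li_b              # identical formal features, selected once
    return pooled, li_a - pooled, li_b - pooled

# Hypothetical lexical items sharing a phi-feature.
verb = {"V", "phi:3sg", "tense:past"}
noun = {"N", "phi:3sg", "case:nom"}

pooled, verb_rest, noun_rest = pool_features(verb, noun)
print(sorted(pooled))                        # -> ['phi:3sg']
print(sorted(verb_rest), sorted(noun_rest))  # non-pooled features
```

On this toy reading, economy of selection is just the fact that the shared feature appears once in the pooled set rather than once per lexical item.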
Ahmad Reza Lotfi2003-02-12Z2011-03-11T08:55:09Zhttp://cogprints.org/id/eprint/2766This item is in the repository with the URL: http://cogprints.org/id/eprint/27662003-02-12ZWh-movement vs. scrambling: The brain makes a difference(no abstract)Angela D. FriedericiMatthias SchlesewskyChristian J. Fiebach1999-08-20Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/219This item is in the repository with the URL: http://cogprints.org/id/eprint/2191999-08-20ZBook Review--Ronald Cole (editor-in-chief), Joseph Mariani, Hans Uszkoreit, Annie Zaenen, and Victor Zue, eds., Survey of the State of the Art in Human Language TechnologyThis is a review of Survey of the State of the Art in Human Language Technology, edited by Ronald Cole (editor-in-chief), Joseph Mariani, Hans Uszkoreit, Annie Zaenen, and Victor Zue, published by Cambridge University Press in 1997.Varol Akman2003-08-16Z2011-03-11T08:55:20Zhttp://cogprints.org/id/eprint/3112This item is in the repository with the URL: http://cogprints.org/id/eprint/31122003-08-16ZLogical connectives, relationships and relevanceThis paper was written in the context of a workshop (held in 1989) on logical connectives in discourse. It addresses difficulties that arise from attempts at analysing logical connectives by assigning logical relations to them, taken from a typological listing. As the workshop itself showed, such attempts typically face the problem of on the one hand needing to be broad enough to cover all the instances of relations found in texts and on the other hand needing to be specific enough to differentiate types of relationships from each other.
Applying the relevance-theoretic framework proposed by Sperber and Wilson, this paper argues that the root of the problem is that (logical) connectivity in discourse is not created by a fixed set of interpropositional or rhetorical relations, but by the search for relevance, through the inferential interaction between a given utterance and its context. This explains the virtually unlimited variety of relations that can and do arise in texts, without reliance on any defined set of relations.
After a brief introduction to the relevance-theoretic framework, the paper applies it to the analysis of the connective –m in Silt’i, an Ethio-Semitic language. It attempts to show how the variety of relations associated with this connective can be explained in terms of the interaction of a simple semantic property with different pieces of contextual information, accessed in the search for relevance.
The paper then proposes and illustrates three methods of testing the validity of such relevance-theoretic analyses of connectivity, two of which can be carried out experimentally.
Dr Ernst-August Gutt1999-12-17Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/222This item is in the repository with the URL: http://cogprints.org/id/eprint/2221999-12-17ZStrawson on Intended Meaning and ContextStrawson proposed in the early seventies an attractive threefold distinction regarding how context bears on the meaning of `what is said' when a sentence is uttered. The proposed scheme is somewhat crude and, being aware of this aspect, Strawson himself raised various points to make it more adequate. In this paper, we review the scheme of Strawson, note his concerns, and add some of our own. However, our main point is to defend the essence of Strawson's approach and to recommend it as a starting point for research into intended meaning and context.Varol AkmanFerda N. Alpaslan1999-08-20Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/220This item is in the repository with the URL: http://cogprints.org/id/eprint/2201999-08-20ZBook Review--Jaap van der Does and Jan van Eijk, eds., Quantifiers, Logic, and LanguageThis is a review of Quantifiers, Logic, and Language, edited by Jaap van der Does and Jan van Eijk, published by CSLI (Center for the Study of Language and Information) Publications in 1996.Varol Akman2001-06-03Z2011-03-11T08:54:40Zhttp://cogprints.org/id/eprint/1538This item is in the repository with the URL: http://cogprints.org/id/eprint/15382001-06-03ZNull Arguments in English Registers: A Minimalist AccountThe syntax of null arguments in the diary and instructional registers of English is investigated in a Minimalist framework. The first unified analysis of null arguments in the two registers is given.
Following Haegeman (1996, 1997) and Rizzi (1997) the null argument in these registers is analysed as an antecedentless nonvariable DP (ec) which is licensed only in the leftmost position of the clause. In clauses with such null arguments, a TopP (topic phrase) is posited as the highest projection. The head of this projection is taken to have a [D-] feature. The licensing requirement of ec ensures that it must raise to check the [D-] feature of the topic head, enabling ec to be identified with the discourse topic; if there is any closer [D] feature, then ec will not raise and it will fail to be licensed, causing the derivation to crash. It is shown that the distribution of ec in diaries and instructions can be captured on these assumptions. In each case where ec is ungrammatical, it is shown that some element with a [D] feature intervenes between ec and TopP, preventing ec from raising to a position where it can be licensed.
Telegraphese, note-taking and headlinese, other registers of English which also exhibit null arguments, are then investigated to see if the analysis also extends to these cases. It is argued that the analysis cannot fully account for null arguments in these registers. However, subject drop in colloquial speech is demonstrated to be an instance of the same phenomenon, suggesting that null arguments, and in particular null subjects, are a general possibility in English rather than a marked phenomenon.
Richard Horsey1998-11-10Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/214This item is in the repository with the URL: http://cogprints.org/id/eprint/2141998-11-10ZAn Analysis of English Punctuation: The Special Case of CommaPunctuation has usually been ignored by researchers in computational linguistics over the years. Recently, it has been realized that a true understanding of written language will be impossible if punctuation marks are not taken into account. This paper contains the details of a computer-aided exercise to investigate English punctuation practice for the special case of comma (the most significant punctuation mark) in a parsed corpus. The study classifies the various ``structural'' uses of the comma according to the syntax-patterns in which a comma occurs. The corpus (Penn Treebank) consists of syntactically annotated sentences with no part-of-speech tag information about individual words.Murat BayraktarBilge SayVarol Akman1998-03-24Z2011-03-11T08:53:42Zhttp://cogprints.org/id/eprint/164This item is in the repository with the URL: http://cogprints.org/id/eprint/1641998-03-24ZCo-Evolution of Language-Size and the Critical PeriodSpecies evolve, very slowly, through selection of genes which give rise to phenotypes well adapted to their environments. The cultures, including the languages, of human communities evolve, much faster, maintaining at least a minimum level of adaptedness to the external, non- cultural environment. In the phylogenetic evolution of species, the transmission of information across generations is via copying of molecules, and innovation is by mutation and sexual recombination. In cultural evolution, the transmission of information across generations is by learning, and innovation is by sporadic invention or borrowing from other cultures. This much is the foundational bedrock of evolutionary theory. But things get more complicated; there can be gene-culture co-evolution. 
Prior to the rise of culture, the physical environment is the only force shaping biological evolution from outside the organism, and cultures themselves are clearly constrained by the evolved biological characteristics of their members. But cultures become part of the external environment, and influence the course of biological evolution. For example, altruistic cultures with developed medical knowledge reduce the cost to the individual of carrying genes disposing to certain pathologies (such as diabetes); and such genes become more widespread in the populations maintaining such cultures. Assortative mating can affect biological evolution, and particular cultures may influence the factors which are sorted for in mating. (For a careful discussion of the effects of cultural evolution on natural selection, see Cavalli-Sforza and Bodmer, 1971:774- 804). This paper examines mechanisms involved in the co-evolution of a biological trait, the critical period for language acquisition, and a property of human cultures, the size of their languages. A gene/culture interaction will be shown that can be described as a kind of symbiosis, but perhaps more aptly as an `arms race'. In this introduction, we will sketch the basic mechanics of the interaction in very broad terms; the rest of the paper will explain and justify the details. The implications of our model for second language acquisition are given toward the end of the paper.James R HurfordSimon Kirby2006-08-06Z2011-03-11T08:56:33Zhttp://cogprints.org/id/eprint/5045This item is in the repository with the URL: http://cogprints.org/id/eprint/50452006-08-06ZDescription Theory, LTAGs and Underspecified SemanticsAn attractive way to model
the relation between an underspecified syntactic representation and
its completions is to let the underspecified representation correspond
to a logical description and the completions to the
models of that description. This approach, which underlies the
Description Theory of (Marcus et al. 1983) has been integrated
in (Vijay-Shanker 1992) with a pure unification approach to
Lexicalized Tree-Adjoining
Grammars (Joshi et al.\ 1975, Schabes 1990). We generalize
Description Theory by integrating semantic
information, that is, we propose to tackle both syntactic and
semantic underspecification using descriptions.Reinhard MuskensEmiel Krahmer1998-03-24Z2011-03-11T08:54:07Zhttp://cogprints.org/id/eprint/617This item is in the repository with the URL: http://cogprints.org/id/eprint/6171998-03-24ZThe evolution of language and languagesHuman languages, such as French, Cantonese or American Sign Language, are socio- cultural entities. Knowledge of them (`competence') is acquired by exposure to the ap- propriate environment. Languages are maintained and transmitted by acts of speaking and writing; and this is also the means by which languages evolve. The utterances of one generation are processed by their children to form mental grammars, which in some sense summarize, or generalize over, the children's linguistic experiences. These grammars are the basis for the production of a new avalanche of utterances to which the next generation in its turn is subjected. (This picture is simplified, of course, as generations overlap.) Languages inhabit two distinct and separate modes of existence, which have been called (by Chomsky, 1986) `E-Language' and `I-Language'. E-language is the external observable behaviour --- utterances and inscriptions and manifestations of their meanings. E-language is regarded by some as so chaotic and subject to the vicissitudes of everyday human life as to be a poor candidate for systematic study. (E-Language corresponds to what Chomsky, in earlier terminology, called `performance'.) Out of this blooming buzzing confusion the individual child distils an order internal to the mind; the child constructs a coherent systematic set of rules mapping meanings onto forms. This set of rules is the child's I-Language (where `I' is for `internal'). No two individuals' I-Languages have to be the same, although those of people living in the same community will overlap very significantly. 
But there will usually be at least some slight difference between the I-language features prevalent in one generation and those prevalent in the next. This is the stuff of language evolution, in the sense of the historical development of individual languages, such as Swedish, Navaho or Zulu.James R Hurford1998-04-27Z2011-03-11T08:53:43Zhttp://cogprints.org/id/eprint/194This item is in the repository with the URL: http://cogprints.org/id/eprint/1941998-04-27ZFunctional Innateness: explaining the critical period for language acquisitionIn recent years, several explanations have been offered for the critical period in language acquisition, itself, a priori a somewhat surprising phenomenon. Two such explanations are considered here. Both studies use computer simulations, but the factors they model are very different. Hurford (1991) simulates the phylogenetic evolution over hundreds of generations of a species in which the timing of life history traits is under genetic control. The period when an individual is most proficient at language acquisition is just such a life history trait, and is capable of adaptive evolution. Evolutionary simulations lead to a concentration of language acquisition proficiency in the period up to puberty, with a subsequent tailing off. Elman (1993) demonstrates `the advantages of starting small' in neural networks learning mini-languages with many of the complex interacting grammatical factors found in real languages. A neural network which starts mature, with a full adult `working memory' cannot acquire such complex grammatical competence, whereas a net whose attention span is initially limited and then grows with maturation can acquire the appropriate grammar. This explains, in adaptive terms, the existence of a period in which an organism's characteristics, relevant to the language learning task, change, increasing a certain capacity (`working memory') from an immature to an adult value. 
These accounts are complementary and mutually compatible. An evolutionary account is proposed, in which genetically controlled `working memory' size in relation to life history is the variable operated on by natural selection. This account promises to produce a more detailed explanation of the critical period, which can be related to a wider range of data, including the coincidence with puberty and the involvement of sentence processing in language acquisition The relationships between Elman's `working memory' and the distinct psychological concept of working memory are also explored.Jim Hurford1998-11-12Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/216This item is in the repository with the URL: http://cogprints.org/id/eprint/2161998-11-12ZAn Information-Based Treatment of Punctuation in Discourse Representation TheoryPunctuation has so far attracted attention within the linguistics community mostly from a syntactic perspective. In this paper, we give a preliminary account of the information-based aspects of punctuation, drawing our points from assorted, naturally occurring sentences. We present our formal models of these sentences and the semantic contributions of punctuation marks. Our formalism is a simplified analogue of an extension--due to Nicholas Asher--of Discourse Representation Theory.Bilge SayVarol Akman1998-03-24Z2011-03-11T08:53:43Zhttp://cogprints.org/id/eprint/189This item is in the repository with the URL: http://cogprints.org/id/eprint/1891998-03-24ZThe interaction between numerals and nounsThis paper is a descriptive survey of the principal phenomena surrounding cardinal numerals in attribution to nouns, with some concentration on European languages, but within a world-wide perspective. The paper is focussed on describing the syntagmatic distribution and the internal structure of numerals. 
By contrast, the important topic of the paradigmatic context of numerals, that is how their structure and behavior relate to those of quantifiers, determiners, adjectives, and nouns, does not receive systematic discussion here, although many relevant comments are made in passing. A further necessary limitation in scope is the exclusion of forms which are only marginally cardinal numerals, if at all, such as English both, dozen, fourscore, pair, triple and their counterparts in other languages.Jim Hurford1998-04-27Z2011-03-11T08:53:43Zhttp://cogprints.org/id/eprint/193This item is in the repository with the URL: http://cogprints.org/id/eprint/1931998-04-27ZThe interaction between numerals and nounsThis paper is a descriptive survey of the principal phenomena surrounding cardinal numerals in attribution to nouns, with some concentration on European languages, but within a world-wide perspective. The paper is focussed on describing the syntagmatic distribution and the internal structure of numerals. By contrast, the important topic of the paradigmatic context of numerals, that is how their structure and behaviour relates to those of quantifiers, determiners, adjectives, and nouns, does not receive systematic discussion here, although many relevant comments are made in passing. A further necessary limitation in scope is the exclusion of forms which are only marginally cardinal numerals, if at all, such as English both, dozen, fourscore, pair, triple and their counterparts in other languages.Jim Hurford1998-10-15Z2011-03-11T08:54:15Zhttp://cogprints.org/id/eprint/745This item is in the repository with the URL: http://cogprints.org/id/eprint/7451998-10-15ZReflexive Thoughts about a Medieval Russian EpicThe paper applies Lefebvre's theory of moral cognition to the analysis of a medieval Russian epic poem. The text is viewed as a study of the protagonists' inner world rather than a story of a failed military expedition.Konstantin K. 
Bogatyrev1998-03-31Z2011-03-11T08:54:07Zhttp://cogprints.org/id/eprint/623This item is in the repository with the URL: http://cogprints.org/id/eprint/6231998-03-31ZVerbal Working Memory and Sentence ComprehensionThis target article discusses the verbal working memory system used in sentence comprehension. We review the idea of working memory as a short duration system in which small amounts of information are simultaneously stored and manipulated in the service of a task and that syntactic processing in sentence comprehension requires such a storage and computational system. We inquire whether the working memory system used in syntactic processing is the same as that used in verbally mediated tasks involving conscious, controlled processing. Various forms of evidence are considered: the relationship between individual differences in working memory and individual differences in the efficiency of syntactic processing; the effect of concurrent verbal memory load on syntactic processing; and syntactic processing in patients with poor short term memory, poor working memory, or aphasia. The experimental results suggest that the verbal working memory system specialized for assigning the syntactic structure of a sentence and for using that structure in determining sentence meaning is distinct from the working memory system that underlies the use of sentence meaning to accomplish further functions. We present a theory of the components of the verbal working memory system and suggestions as to its neural basis.David CaplanGloria Waters1998-06-16Z2011-03-11T08:53:43Zhttp://cogprints.org/id/eprint/198This item is in the repository with the URL: http://cogprints.org/id/eprint/1981998-06-16ZCurrent Approaches to Punctuation in Computational LinguisticsSome recent studies in computational linguistics have aimed to take advantage of various cues presented by punctuation marks. 
This short survey is intended to summarise these research efforts and, additionally, to outline a current perspective on the usage and functions of punctuation marks. We conclude by presenting an information-based framework for punctuation, influenced by treatments of several related phenomena in computational linguistics.Bilge SayVarol Akman2002-02-21Z2011-03-11T08:54:53Zhttp://cogprints.org/id/eprint/2084This item is in the repository with the URL: http://cogprints.org/id/eprint/20842002-02-21ZEvent-related potentials elicited by spoken relative clausesSentence-length event-related potential (ERP) waveforms were obtained from 23 scalp sites as 24 subjects listened to normally spoken sentences of various syntactic structures. The critical materials consisted of 36 sentences each containing one of 2 types of relative clauses that differ in processing difficulty, namely Subject Object (SO) and Subject Subject (SS) relative clauses. Sentence-length ERPs showed several differences in the slow scalp potentials elicited by SO and SS sentences that were similar in their temporal dynamics to those elicited by the same stimuli in a word-by-word reading experiment, although the effects in the two modalities have non-identical distributions. Just as for written sentences, there was a large, fronto-central negativity beginning at the linguistically defined "gap" in the SO sentences; this effect was largest for listeners with above-median comprehension rates, and is hypothesized to index changes in on-line processing demands during comprehension.Horst M. MuellerJonathan W. 
KingMarta Kutas1998-07-19Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/212This item is in the repository with the URL: http://cogprints.org/id/eprint/2121998-07-19ZUNIVERSAL SEMANTIC CONTINUA AND FUNCTIONAL GRAMMATICAL INTEGRALSOne of the logical effects of the principle of asymmetric dualism of the linguistic sign is that languages are incomparable, at least as regards separate grammatical forms, even if these forms are of one and the same type. A certain correlation between languages can only be found at the universal level, in the form of a potential of formal means and grammatical integrals, which unite elementary meanings (atomic senses). In some sense, one can find, in any language of the world, and in the shape of underformed, non-grammatized, potential oppositions, atomic (elementary) meanings, which appear as grammatized in some other language. The majority of oppositions between forms get defragmented in language contrasts. More universal units for comparing languages are found either at a lower (atomic senses) or at a higher (grammatical-contextual complex) level. If 'full grammar' is considered, it is plausible to suppose that all elementary senses of a universal grammatical integral find their representation in any language, within the grammatical-contextual complex. Thus, various languages complement each other within the framework of the universal human language. Separate grammatical forms of particular languages also complement each other within the framework of universal grammatical concepts. Grammatical integrals taken as wholes, as well as types of grammatical-contextual complexes, are not just chaotic sets of occasional senses. In interlingual contrasts, if one of the languages is 'perfect-devoid' ('article-devoid', etc.), the relevant functional types retreat to the potential, covert domain. 
Formally, a semantic zone, or the functional potential of a universal grammatical integral, finds its representation in formal means that belong to different levels of language structure, but get united in a complex.Vyatcheslav B. Kashkin2001-11-16Z2011-03-11T08:54:49Zhttp://cogprints.org/id/eprint/1887This item is in the repository with the URL: http://cogprints.org/id/eprint/18872001-11-16ZAuditory Implicit Learning, and Its Transfer to and from Visual Implicit LearningReber and others have shown that the passive learning of synthetic grammars ("implicit learning") is a robust phenomenon when visual stimulus materials are employed. It was the main aim of this study to discover if the same effects occur in the auditory modality, and then to determine if such learning can be transferred from the visual to the auditory mode, and vice versa. In the present study, first, the standard effect was replicated with visual material (Experiment I). Second the effect was also shown to occur when the same material was presented to the auditory modality (Experiment II). It was then shown that implicitly learned material can be transferred from the visual to the auditory modality (Experiment III) and from the auditory to the visual modality (Experiment IV). The implications of the results are discussed with respect to the debate about the "abstractness" or "concreteness" of the mental representation of the material learned.Christopher D. GreenPhilip R. Groff1998-06-22Z2011-03-11T08:54:12Zhttp://cogprints.org/id/eprint/704This item is in the repository with the URL: http://cogprints.org/id/eprint/7041998-06-22ZBeyond Turing EquivalenceWhat is the relation between intelligence and computation? Although the difficulty of defining `intelligence' is widely recognized, many are unaware that it is hard to give a satisfactory definition of `computational' if computation is supposed to provide a non-circular explanation for intelligent abilities. 
The only well-defined notion of `computation' is what can be generated by a Turing machine or a formally equivalent mechanism. This is not adequate for the key role in explaining the nature of mental processes, because it is too general, as many computations involve nothing mental, nor even processes: they are simply abstract structures. We need to combine the notion of `computation' with that of `machine'. This may still be too restrictive, if some non-computational mechanisms prove to be useful for intelligence. We need a theory-based taxonomy of {\em architectures} and {\em mechanisms} and corresponding process types. Computational machines may turn out to be a sub-class of the machines available for implementing intelligent agents. The more general analysis starts with the notion of a system with independently variable, causally interacting sub-states that have different causal roles, including both `belief-like' and `desire-like' sub-states, and many others. There are many significantly different such architectures. For certain architectures (including simple computers), some sub-states have a semantic interpretation for the system. The relevant concept of semantics is defined partly in terms of a kind of Tarski-like structural correspondence (not to be confused with isomorphism). This always leaves some semantic indeterminacy, which can be reduced by causal loops involving the environment. But the causal links are complex, can share causal pathways, and always leave mental states to some extent semantically indeterminate.Aaron Sloman2000-08-14Z2011-03-11T08:54:22Zhttp://cogprints.org/id/eprint/933This item is in the repository with the URL: http://cogprints.org/id/eprint/9332000-08-14ZP-model Alternative to the T-model.Standard linguistic analysis of syntax uses the T-model. This model
requires the ordering: D-structure $>$ S-structure $>$ LF,
where D-structure is the deep structure,
S-structure is the surface structure, and LF is logical form.
Between each of these representations there is movement which alters
the order of the constituent words; movement is achieved using the principles
and parameters of syntactic theory. Psychological analysis of sentence
production is usually either serial or connectionist. Psychological serial
models do not accommodate the T-model immediately so that here a new model
called the P-model is introduced. The P-model is different from previous
linguistic and psychological models. Here it is argued that the LF
representation should be replaced by a variant
of Frege's three qualities (sense, reference, and force),
called the Frege representation or F-representation.
In the F-representation the order of elements is not necessarily the same as
that in LF and it is suggested that the correct ordering is:
F-representation $>$ D-structure $>$ S-structure.
This ordering appears to lead to a more natural
view of sentence production and processing. Within this framework movement
originates as the outcome of emphasis applied to the sentence. The
requirement that the F-representation precedes the D-structure needs a picture
of the particular principles and parameters which pertain to movement of words
between representations. In general this would imply that there is a
preferred or optimal ordering of the symbolic string in the F-representation.
The standard ordering is retained because the general way of producing
such an optimal ordering is unclear. In this case it is possible to produce
an analysis of movement between LF and D-structure similar to the usual
analysis of movement between S-structure and LF.
It is suggested that a maximal amount of information about
a language's grammar and lexicon is stored,
because of the necessity of analyzing corrupted data.Mark D. Roberts2001-11-18Z2011-03-11T08:54:49Zhttp://cogprints.org/id/eprint/1891This item is in the repository with the URL: http://cogprints.org/id/eprint/18912001-11-18ZImplicit Learning of Spatial SequencesImplicit learning for verbal strings generated by a finite state machine (FSM) has been demonstrated repeatedly. No one,
however, has investigated whether such learning can take place when the information to be learned is spatial in nature. In this
study, subjects learned sequences of FSM-generated spatial information displayed on a 3×3 grid. They were able to learn
these sequences faster than subjects similarly set to learn random spatial sequences. As is typical in implicit learning studies,
although the subjects in the experimental group were unable to articulate the rules governing the sequences, they were
nevertheless able to distinguish new grammatical strings from random ones at a rate far above chance.Christopher D. GreenEllen Munro2006-07-16Z2011-03-11T08:56:29Zhttp://cogprints.org/id/eprint/4972This item is in the repository with the URL: http://cogprints.org/id/eprint/49722006-07-16ZMOTOR THEORY OF LANGUAGE IN RELATION TO SYNTAXThe semantic, syntactic and phonetic structures of language develop from a complex preexisting system, more specifically the preexisting motor system. Language thus emerged as an external physical expression of the neural basis for movement control. Features which made a wide range of skilled action possible - a set of elementary motor subprograms together with rules expressed in neural organization for combining subprograms into extended action sequences - were transferred to form a parallel set of programs and rules for speech and language. The already established integration of motor control with perceptual organization led directly to a systematic relation between language and the externally perceived world.Robin Allott1998-06-22Z2011-03-11T08:54:12Zhttp://cogprints.org/id/eprint/706This item is in the repository with the URL: http://cogprints.org/id/eprint/7061998-06-22ZMusings on the roles of logical and non-logical representations in intelligenceThis paper offers a short and biased overview of the history of discussion and controversy about the role of different forms of representation in intelligent agents. It repeats and extends some of the criticisms of the `logicist' approach to AI that I first made in 1971, while also defending logic for its power and generality. It identifies some common confusions regarding the role of visual or diagrammatic reasoning including confusions based on the fact that different forms of representation may be used at different levels in an implementation hierarchy. This is contrasted with the way in the use of one form of representation (e.g. pictures) can be {\em controlled} using another (e.g. 
logic, or programs). Finally some questions are asked about the role of metrical information in biological visual systems.Aaron Sloman2004-05-06Z2011-03-11T08:55:33Zhttp://cogprints.org/id/eprint/3619This item is in the repository with the URL: http://cogprints.org/id/eprint/36192004-05-06ZParler, rédiger: présentation d'un outil d'analyse syntaxique et de quelques résultatsThe psychologist can analyse language production by studying three types of phenomena and their interrelations: 1) the contextual conditions in which the production emerges, 2) the processes and knowledge brought into play to carry out the language task, 3) the characteristics of the language product. In the experimental psychology work presented below, the syntactic organization of spoken and written corpora was characterized and quantified in order to identify the situational determinants (addressee, topic) and the functional constraints (production rate, possibility of self-correction) that give rise to variations in syntactic structuring. If a "production system" is defined as an articulated set of functional constraints conditioning the nature and course of the activity of composing a text as it is adjusted to a given communication situation, the results obtained lead to the conclusion that the two production modalities (speaking, writing) rely on two different systems. Their functional commonality takes nothing away from their differences, which it is not realistic to neglect or to reduce to a few peripheral constraints of a phonological or calligraphic kind. 
Moreover, with new methodologies that make it possible to study, in real time, the very course of the activity (particularly of the writing activity, Piolat, 1990), the evidence for the functional differences between written production and spoken production is even more striking.A Piolat2000-07-04Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/225This item is in the repository with the URL: http://cogprints.org/id/eprint/2252000-07-04ZUltrametric Distance in SyntaxPhrase structure trees have a hierarchical structure. In many subjects, most notably in Taxonomy, such tree structures have been studied using ultrametrics. Here syntactical hierarchical phrase trees are subject to a similar analysis, which is much simpler as the branching structure is more readily discernible and switched. The occurrence of hierarchical structure elsewhere in linguistics is mentioned. The phrase tree can be represented by a matrix and the elements of the matrix can be represented by triangles. The height at which branching occurs is not prescribed in previous syntactic models, but it is by using the ultrametric matrix. In other words the ultrametric approach gives a complete description of phrase trees, unlike previous approaches. The ambiguity of which branching height to choose is resolved by postulating that branching occurs at the lowest height available. An ultrametric produces a measure of the complexity of sentences: presumably the complexity of sentences increases as a language is acquired, so that this can be tested. All ultrametric triangles are equilateral or isosceles; here it is shown that \={X} structure implies that there are no equilateral triangles. Restricting attention to simple syntax, a minimum ultrametric distance between lexical categories is calculated. This ultrametric distance is shown to be different from the matrix obtained from features. 
It is shown that the definition of {\sc c-command} can be replaced by an equivalent ultrametric definition. The new definition invokes a minimum distance between nodes and this is more aesthetically satisfying than previous varieties of definitions. From the new definition of {\sc c-command} follows a new definition of {\sc government}.Mark D. Roberts1998-07-18Z2011-03-11T08:54:13Zhttp://cogprints.org/id/eprint/721This item is in the repository with the URL: http://cogprints.org/id/eprint/7211998-07-18ZSemantics in an intelligent control systemMuch research on intelligent systems has concentrated on low level mechanisms or sub-systems of restricted functionality. We need to understand how to put all the pieces together in an \ul{architecture} for a complete agent with its own mind, driven by its own desires. A mind is a self-modifying control system, with a hierarchy of levels of control, and a different hierarchy of levels of implementation. AI needs to explore alternative control architectures and their implications for human, animal, and artificial minds. Only within the framework of a theory of actual and possible architectures can we solve old problems about the concept of mind and causal roles of desires, beliefs, intentions, etc. The high level ``virtual machine'' architecture is more useful for this than detailed mechanisms. E.g. the difference between connectionist and symbolic implementations is of relatively minor importance. A good theory provides both explanations and a framework for systematically generating concepts of possible states and processes. Lacking this, philosophers cannot provide good analyses of concepts, psychologists and biologists cannot specify what they are trying to explain or explain it, and psychotherapists and educationalists are left groping with ill-understood problems. 
The paper sketches some requirements for such architectures, and analyses an idea shared between engineers and philosophers: the concept of ``semantic information''.A. Sloman2001-06-26Z2011-03-11T08:54:44Zhttp://cogprints.org/id/eprint/1651This item is in the repository with the URL: http://cogprints.org/id/eprint/16512001-06-26ZSymbols and Nets: Cooperation vs. CompetitionCritique of two critiques of connectionism.Stevan Harnad1999-03-16Z2011-03-11T08:53:44Zhttp://cogprints.org/id/eprint/217This item is in the repository with the URL: http://cogprints.org/id/eprint/2171999-03-16ZA Holistic Approach to LanguageThe following progress report views language acquisition as primarily the attempt to create processes that connect together in a fruitful way linguistic input and other activity. The representations made of linguistic input are thus those that are optimally effective in mediating such interconnections. An effective Language Acquisition Device should contain mechanisms specific to the task of creating the desired interconnection processes in the linguistic environment in which the language learner finds himself or herself. Analysis of this requirement gives clear indications as to what these mechanisms may be.Brian D. JosephsonDavid G. Blair2000-12-12Z2011-03-11T08:54:27Zhttp://cogprints.org/id/eprint/1148This item is in the repository with the URL: http://cogprints.org/id/eprint/11482000-12-12ZA Review of B. F. Skinner's Verbal Behavior I had intended this review not specifically as a criticism of Skinner's speculations regarding language, but rather as a more general critique of behaviorist (I would now prefer to say "empiricist") speculation as to the nature of higher mental processes. My reason for discussing Skinner's book in such detail was that it was the most careful and thoroughgoing presentation of such speculations, an evaluation that I feel is still accurate. 
Therefore, if the conclusions I attempted to substantiate in the review are correct, as I believe they are, then Skinner's work can be regarded as, in effect, a reductio ad absurdum of behaviorist assumptions. My personal view is that it is a definite merit, not a defect, of Skinner's work that it can be used for this purpose, and it was for this reason that I tried to deal with it fairly exhaustively. I do not see how his proposals can be improved upon, aside from occasional details and oversights, within the framework of the general assumptions that he accepts. I do not, in other words, see any way in which his proposals can be substantially improved within the general framework of behaviorist or neobehaviorist, or, more generally, empiricist ideas that has dominated much of modern linguistics, psychology, and philosophy. The conclusion that I hoped to establish in the review, by discussing these speculations in their most explicit and detailed form, was that the general point of view was largely mythology, and that its widespread acceptance is not the result of empirical support, persuasive reasoning, or the absence of a plausible alternative.Noam Chomsky