Introduction, Important Definitions and Related Concepts:
A rewrite rule (also called a phrase-structure rule or production) in generative grammar is a rule of the form A → X, where A is a syntactic category label, such as noun phrase or sentence, and X is a sequence of such labels and/or morphemes. The rule expresses the fact that A can be replaced by X in generating the constituent structure of a sentence. In computer science, a production is a rewrite rule specifying a symbol substitution that can be recursively performed to generate new symbol sequences. A set of productions specifies a formal grammar, specifically a generative grammar. Sequence construction begins with a special "start symbol" and proceeds by repeatedly rewriting "non-terminal symbols" (which are available for further substitution) until only "terminal symbols" (which cannot themselves be the target of substitution) remain. The complete set of terminal-only strings is the language generated by the grammar. In theoretical linguistics, generative grammar refers to a particular approach to the study of syntax. A generative grammar of a language attempts to give a set of rules that will correctly predict which combinations of words form grammatical sentences. In most approaches to generative grammar, the rules also predict the morphology of a sentence.
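The derivation process described above can be sketched in a few lines of Python. The toy grammar below (S, NP, VP, and so on) is invented purely for illustration; it is not a grammar from the text:

```python
import random

# A toy generative grammar as a set of rewrite rules (productions).
# Uppercase keys are non-terminals; any symbol not in RULES is a terminal.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["goat"], ["neighbour"]],
    "V":  [["bought"], ["saw"]],
}

def generate(symbol="S"):
    """Recursively expand a symbol until only terminal symbols remain."""
    if symbol not in RULES:                    # terminal: cannot be rewritten
        return [symbol]
    expansion = random.choice(RULES[symbol])   # pick one production A -> X
    result = []
    for part in expansion:
        result.extend(generate(part))
    return result

sentence = " ".join(generate())
print(sentence)   # e.g. "the neighbour bought the goat"
```

Every string this sketch produces has the shape "the N V the N", which is exactly the sense in which the set of terminal-only strings constitutes the language of the grammar.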
Generative grammar originates in the work of Noam Chomsky, beginning in the late 1950s. (Early versions of Chomsky's theory were called Transformational Grammar.) A number of competing versions of generative grammar are currently practiced within linguistics. Chomsky's current theory is known as the Minimalist Program. Other prominent theories include or have included Head-driven phrase structure grammar, Lexical functional grammar, Categorial grammar, Relational grammar, and Tree-adjoining grammar. Chomsky has argued that many of the properties of a generative grammar arise from an "innate" Universal grammar, which is common to all languages. Proponents of generative grammar have argued that most grammar is not the result of communicative function and is not simply learned from the environment. In this respect, generative grammar takes a point of view different from functional and behaviourist theories. Most versions of generative grammar characterize sentences as either grammatically correct (also known as well-formed) or not. The rules of a generative grammar typically function as an algorithm to predict grammaticality as a discrete (yes-or-no) result. In this respect, it differs from stochastic grammar, which treats grammaticality as a probabilistic variable. However, some work in generative grammar (e.g. recent work by Joan Bresnan) uses stochastic versions of Optimality theory.

A noun is a word used to name a person, animal, place, thing, or abstract idea. Nouns are usually among the first words that small children learn. The nouns in the following sentence are year, neighbours, and goat: Late last year our neighbours bought a goat.
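The discrete, yes-or-no notion of grammaticality can be sketched as a membership test: a string is grammatical exactly when the grammar can derive it. The sketch below uses a toy context-free grammar invented here for illustration; because this particular grammar generates only finitely many sentences, we can simply enumerate its language:

```python
from itertools import product

# Toy grammar, invented for illustration. Uppercase keys are non-terminals.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["goat"], ["neighbour"]],
    "V":  [["bought"], ["saw"]],
}

def language(symbol="S"):
    """All terminal strings derivable from `symbol` (finite for this grammar)."""
    if symbol not in RULES:                       # terminal symbol
        return {(symbol,)}
    strings = set()
    for expansion in RULES[symbol]:
        parts = [language(p) for p in expansion]  # language of each symbol in X
        for combo in product(*parts):             # every way to combine them
            strings.add(tuple(w for seq in combo for w in seq))
    return strings

def grammatical(sentence):
    """Discrete grammaticality: is the sentence in the grammar's language?"""
    return tuple(sentence.split()) in language()

print(grammatical("the goat saw the neighbour"))   # True
print(grammatical("goat the saw neighbour"))       # False
```

A stochastic grammar would instead attach probabilities to the productions and return a likelihood rather than a boolean.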
A noun can function in a sentence as a subject, a direct object, an indirect object, a subject complement, an object complement, an appositive, an adjective, or an adverb. In morpheme-based morphology, a morpheme is the smallest linguistic unit that carries semantic meaning. In spoken language, morphemes are composed of phonemes (the smallest linguistically distinctive units of sound), and in written language morphemes are composed of graphemes (the smallest units of written language). The concept of a morpheme differs from that of a word, as many morphemes cannot stand as words on their own. A morpheme is free if it can stand alone, or bound if it occurs only alongside a free morpheme.
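The free/bound distinction can be illustrated with a simple lookup. The tiny lexicon below is hand-made for this sketch; real morphological analysis requires far richer resources:

```python
# Hand-made lexicon, for illustration only.
FREE_MORPHEMES = {"happy", "do", "goat", "kind"}      # can stand alone as words
BOUND_MORPHEMES = {"un", "ness", "re", "s"}           # prefixes and suffixes

def classify(morpheme):
    """Classify a morpheme as free, bound, or unknown to this lexicon."""
    if morpheme in FREE_MORPHEMES:
        return "free"
    if morpheme in BOUND_MORPHEMES:
        return "bound"
    return "unknown"

# "unhappiness" = un- + happy + -ness (with a y -> i spelling change)
for m in ["un", "happy", "ness"]:
    print(m, classify(m))
```

Here "happy" is a free morpheme, while "un" and "ness" carry meaning yet cannot stand alone, which is exactly what makes them bound.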