Syntax, a subfield of linguistics, is the study of the rules, or "patterned relations," that govern the way the words of a sentence are arranged. It concerns how words, categorized into parts of speech such as nouns, adjectives, and verbs (a classification that goes back to Dionysius Thrax), are combined into clauses, which in turn combine into sentences.
In the framework of transformational-generative grammar, the structure of a sentence is represented by a phrase structure tree. Such a tree provides three types of information about the sentence it represents: the linear order of the words, the syntactic category of each word and phrase, and the hierarchical grouping of words and phrases into larger constituents.
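As an illustration (a sketch not taken from the source, using a conventional textbook-style analysis), a phrase structure tree for the sentence "The dog chased the cat" can be encoded as nested (label, children...) tuples:

```python
# Phrase structure tree for "The dog chased the cat".
# Each node is a tuple: (category label, child, child, ...).
# Leaves are the words themselves; labels follow common
# linguistic abbreviations (S = sentence, NP = noun phrase,
# VP = verb phrase, Det = determiner, N = noun, V = verb).
tree = ("S",
        ("NP", ("Det", "the"), ("N", "dog")),
        ("VP", ("V", "chased"),
               ("NP", ("Det", "the"), ("N", "cat"))))

# Reading the tuple left to right recovers the linear order of
# the words; the labels give each constituent's category; the
# nesting gives the hierarchical grouping.
print(tree)
```

Flattening the leaves of this structure from left to right yields the original word order, which is how the tree simultaneously records all three kinds of information.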
The analysis of programming language syntax usually entails the transformation of a linear sequence of tokens (a token is akin to an individual word or punctuation mark in a natural language) into a hierarchical syntax tree (abstract syntax trees are one convenient form of syntax tree). This process, called parsing, is in some respects analogous to syntactic analysis in linguistics; indeed, certain concepts, such as the Chomsky hierarchy and context-free grammars, are common to the study of syntax in both linguistics and computer science. However, these concepts are applied quite differently in the two fields, and the practical resemblance is small.
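The two stages described above can be sketched with a minimal tokenizer and recursive-descent parser for arithmetic expressions (a hypothetical illustration, not drawn from any particular compiler; the grammar, token names, and node classes are all assumptions made for this example):

```python
import re
from dataclasses import dataclass

# --- Tokenizing: turn a linear character stream into tokens ---
TOKEN_RE = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(text):
    tokens = []
    for number, op in TOKEN_RE.findall(text):
        if number:
            tokens.append(("NUM", int(number)))
        else:
            tokens.append(("OP", op))
    return tokens

# --- Abstract syntax tree nodes ---
@dataclass
class Num:
    value: int

@dataclass
class BinOp:
    op: str
    left: object
    right: object

# --- Parsing: build a tree from the token sequence, following a
# small context-free grammar in which '*' and '/' bind tighter
# than '+' and '-':
#   expr   -> term (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUM | '(' expr ')'
class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else (None, None)

    def advance(self):
        tok = self.tokens[self.pos]
        self.pos += 1
        return tok

    def parse_expr(self):
        node = self.parse_term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            _, op = self.advance()
            node = BinOp(op, node, self.parse_term())
        return node

    def parse_term(self):
        node = self.parse_factor()
        while self.peek() in (("OP", "*"), ("OP", "/")):
            _, op = self.advance()
            node = BinOp(op, node, self.parse_factor())
        return node

    def parse_factor(self):
        kind, value = self.advance()
        if kind == "NUM":
            return Num(value)
        if value == "(":
            node = self.parse_expr()
            self.advance()  # consume the closing ')'
            return node
        raise SyntaxError(f"unexpected token {value!r}")

# The flat token sequence becomes a hierarchy in which the '*'
# subtree nests inside the '+' node, reflecting precedence.
tree = Parser(tokenize("1 + 2 * 3")).parse_expr()
print(tree)
```

Here the grammar's hierarchy, not the left-to-right order of the tokens, determines the shape of the resulting tree, which is precisely the sense in which parsing recovers structure from a linear sequence.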