Recursive grammar

From Wikipedia, the free encyclopedia

In computer science, a grammar is informally called a recursive grammar if it contains production rules that are recursive, meaning that expanding a non-terminal according to these rules can eventually lead to a string that includes that same non-terminal again;[citation needed] otherwise it is called a non-recursive grammar.[1]
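
As a rough illustration (not drawn from the cited sources), the following Python sketch encodes a context-free grammar as a dictionary mapping each non-terminal to its productions and tests whether some non-terminal can derive a string containing itself; the example grammars, the dictionary encoding and the function name are all hypothetical.

    # Hypothetical encoding: each non-terminal maps to a list of productions,
    # and each production is a list of symbols. Symbols that appear as keys
    # of the dictionary are non-terminals; all other symbols are terminals.
    RECURSIVE = {
        "S": [["a", "S", "b"], ["c"]],   # S -> aSb | c : S can reproduce itself
    }
    NON_RECURSIVE = {
        "S": [["A", "b"]],               # S -> Ab
        "A": [["a"]],                    # A -> a : no non-terminal reappears
    }

    def is_recursive(grammar):
        """Return True if some non-terminal can derive a string containing itself."""
        # Relation "A directly mentions B": B occurs in some production of A.
        reach = {a: {s for prod in prods for s in prod if s in grammar}
                 for a, prods in grammar.items()}
        # Transitive closure of that relation.
        changed = True
        while changed:
            changed = False
            for a in reach:
                new = set().union(*(reach[b] for b in reach[a])) - reach[a]
                if new:
                    reach[a] |= new
                    changed = True
        # The grammar is recursive iff some non-terminal can reach itself.
        return any(a in reach[a] for a in reach)

    print(is_recursive(RECURSIVE))      # True
    print(is_recursive(NON_RECURSIVE))  # False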

For example, a grammar for a context-free language is (left-)recursive if there exists a non-terminal symbol A that can be put through the production rules to produce a string with A as the leftmost symbol.[2][3] All types of grammars in the Chomsky hierarchy can be recursive, and it is recursion that allows the production of infinite sets of words.[1]
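
Continuing the hypothetical dictionary encoding above, direct left recursion can be spotted by checking whether the first symbol of a production is the very non-terminal being defined; detecting indirect left recursion, where the non-terminal reappears leftmost only after several substitutions, would additionally require the closure computation sketched earlier.

    def has_direct_left_recursion(grammar):
        """Return True if some production starts with the non-terminal it defines."""
        return any(prod and prod[0] == a
                   for a, prods in grammar.items()
                   for prod in prods)

    # Expr -> Expr + Term | Term ,  Term -> int   (directly left-recursive)
    EXPR = {
        "Expr": [["Expr", "+", "Term"], ["Term"]],
        "Term": [["int"]],
    }
    print(has_direct_left_recursion(EXPR))   # True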

Properties

A non-recursive grammar can produce only a finite language, and every finite language can be produced by a non-recursive grammar.[1] For example, a straight-line grammar produces just a single word.
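
A sketch of why the finiteness property holds, again using the hypothetical dictionary encoding from above: since no non-terminal can ever reappear in its own expansions, exhaustive substitution terminates and the whole (finite) language can be enumerated; applied to a straight-line grammar, it yields exactly one word.

    def language(grammar, start="S"):
        """Enumerate every word of a non-recursive context-free grammar.

        Exhaustive expansion terminates only because the grammar is
        non-recursive, so the substitution depth is bounded.
        """
        def expand(symbols):
            if not symbols:
                return {()}
            head, rest = symbols[0], tuple(symbols[1:])
            tails = expand(rest)
            if head not in grammar:                     # terminal symbol
                return {(head,) + t for t in tails}
            return {h + t                               # non-terminal: try each rule
                    for prod in grammar[head]
                    for h in expand(tuple(prod))
                    for t in tails}
        return {"".join(word) for word in expand((start,))}

    # A straight-line grammar: one production per non-terminal, no recursion,
    # so it generates exactly one word.
    STRAIGHT_LINE = {
        "S": [["A", "B"]],
        "A": [["a", "b"]],
        "B": [["A", "c"]],
    }
    print(language(STRAIGHT_LINE))   # {'ababc'}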

A recursive grammar that contains no useless rules necessarily produces an infinite language. For example, the recursive grammar with the rules S → aS and S → a contains no useless rules and produces the infinite language {a, aa, aaa, …}.

References

  1. ^ a b c Nederhof, Mark-Jan; Satta, Giorgio (2002), "Parsing Non-recursive Context-free Grammars", Proceedings of the 40th Annual Meeting on Association for Computational Linguistics (ACL '02), Stroudsburg, PA, USA: Association for Computational Linguistics, pp. 112–119, doi:10.3115/1073083.1073104.
  2. ^ Power, James, Notes on Formal Language Theory and Parsing, Department of Computer Science, National University of Ireland, Maynooth, Co. Kildare, Ireland.
  3. ^ Moore, Robert C. (2000), "Removing Left Recursion from Context-free Grammars", Proceedings of the 1st North American Chapter of the Association for Computational Linguistics Conference (NAACL 2000), Stroudsburg, PA, USA: Association for Computational Linguistics, pp. 249–255.