
Recursive grammar


In computer science, a grammar is informally called a recursive grammar if it contains production rules that are recursive, meaning that expanding a non-terminal according to these rules can eventually lead to a string that includes the same non-terminal again;[citation needed] otherwise it is called a non-recursive grammar.[citation needed]

For example,[improper synthesis?] a grammar for a context-free language is (left-)recursive if there exists a non-terminal symbol A that can be put through the production rules to produce a string with A (as the leftmost symbol).[1] All types of grammars in the Chomsky hierarchy can be recursive, and it is recursion that allows the production of infinite sets of words.
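
To make this concrete (the grammar below is a made-up example, not taken from the cited notes), consider a context-free grammar with the single non-terminal A and the rules A → A a and A → a. The first rule is left-recursive, since A reappears as the leftmost symbol of its own expansion, and it is this recursion that yields the infinitely many words a, aa, aaa, … A minimal Python sketch that enumerates the words derivable within a bounded number of expansions:

    # Minimal sketch (hypothetical example grammar, not from the cited notes):
    # enumerate the words of the left-recursive grammar  A -> A a | a  up to a
    # bounded derivation depth.  A reappears as the leftmost symbol of its own
    # expansion, which is what makes the grammar (left-)recursive.
    RULES = {
        "A": [["A", "a"], ["a"]],
    }

    def words(symbol, depth):
        """All terminal strings derivable from `symbol` within `depth` expansions."""
        if symbol not in RULES:          # terminal symbols stand for themselves
            return {symbol}
        if depth == 0:                   # cut off the recursion
            return set()
        result = set()
        for rhs in RULES[symbol]:
            partial = {""}
            for sym in rhs:
                partial = {p + w for p in partial for w in words(sym, depth - 1)}
            result |= partial
        return result

    print(sorted(words("A", 4)))   # ['a', 'aa', 'aaa', 'aaaa']

Raising the depth bound yields ever longer words, reflecting the infinite language generated by the recursive rule.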

Properties

A non-recursive grammar can produce only a finite language; and each finite language can be produced by a non-recursive grammar. For example, a straight-line grammar produces just a single word.
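
For instance (a made-up straight-line grammar, not from the article's sources), the rules S → A B, A → a b, B → c assign exactly one rule to each non-terminal and contain no recursion, so expanding the start symbol always terminates in the single word abc. A short sketch of that expansion:

    # Sketch (hypothetical straight-line grammar): one rule per non-terminal and
    # no recursion, so expansion always terminates and yields exactly one word.
    STRAIGHT_LINE = {
        "S": ["A", "B"],
        "A": ["a", "b"],
        "B": ["c"],
    }

    def expand(symbol):
        """Fully expand a symbol of a non-recursive grammar into its unique word."""
        if symbol not in STRAIGHT_LINE:   # terminal: already a word
            return symbol
        return "".join(expand(s) for s in STRAIGHT_LINE[symbol])

    print(expand("S"))   # abc -- the single word produced by this straight-line grammar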

A recursive grammar that contains no useless rules necessarily produces an infinite language.
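
The qualification about useless rules matters: a recursive rule whose non-terminal is unreachable from the start symbol, or can never derive a terminal-only string, contributes nothing to the language. A small sketch, again with a made-up grammar, that checks reachability:

    # Sketch (hypothetical grammar): the rule B -> B b is recursive, but B is
    # useless -- unreachable from the start symbol S and never deriving a
    # terminal-only string -- so the generated language is still finite: {'a'}.
    RULES = {
        "S": [["a"]],
        "B": [["B", "b"]],
    }

    def reachable(rules, start):
        """Non-terminals reachable from the start symbol."""
        seen, todo = set(), [start]
        while todo:
            nt = todo.pop()
            if nt in seen or nt not in rules:
                continue
            seen.add(nt)
            for rhs in rules[nt]:
                todo.extend(sym for sym in rhs if sym in rules)
        return seen

    print(reachable(RULES, "S"))   # {'S'}: B takes no part in any derivation from S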

See also

References

  1. ^ James Power, Notes on Formal Language Theory and Parsing, Department of Computer Science, National University of Ireland, Maynooth, Co. Kildare, Ireland. JPR02.