48 relations: Addison-Wesley, Ambiguity, Ambiguous grammar, Association for Computing Machinery, Backtracking, Bellman–Ford algorithm, Boolean grammar, Circular definition, Commutative property, Comparison of parser generators, Compiler Description Language, Computer science, Constructed language, Context-free grammar, Context-free language, Cut (logic programming), CYK algorithm, Dangling else, Earley parser, Expression (mathematics), Floyd–Warshall algorithm, Formal grammar, Formal language, Function (mathematics), GLR parser, Greedy algorithm, Left recursion, List of algorithms, LL parser, Logic programming, Lojban, LR parser, Memoization, Mutual recursion, Natural language, OMeta, Parse tree, Parser combinator, Parsing, PDF, Recursion, Recursive descent parser, Regular expression, String (computer science), Syntactic predicate, Terminal and nonterminal symbols, Time complexity, Top-down parsing language.
Addison-Wesley is a publisher of textbooks and computer literature.
Ambiguity is a type of meaning in which several interpretations are plausible.
In computer science, an ambiguous grammar is a context-free grammar for which there exists a string that can have more than one leftmost derivation or parse tree, while an unambiguous grammar is a context-free grammar for which every valid string has a unique leftmost derivation or parse tree.
The Association for Computing Machinery (ACM) is an international learned society for computing.
Backtracking is a general algorithm for finding all (or some) solutions to computational problems, notably constraint satisfaction problems. It incrementally builds candidates to the solutions, and abandons a candidate ("backtracks") as soon as it determines that the candidate cannot possibly be completed to a valid solution.
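As a minimal sketch of this idea (subset sum is just one illustrative problem; the names here are invented for the example), a backtracking search extends a partial candidate one element at a time and prunes it the moment it can no longer lead to a solution:

```python
def subset_sum(nums, target, partial=None):
    """Backtracking search for a subset of nums (assumed non-negative)
    summing to target. Extends a candidate one element at a time and
    abandons it as soon as its sum exceeds the target."""
    if partial is None:
        partial = []
    s = sum(partial)
    if s == target:
        return partial            # candidate completed to a solution
    if s > target:
        return None               # prune: cannot possibly be completed
    for i, n in enumerate(nums):
        found = subset_sum(nums[i + 1:], target, partial + [n])
        if found is not None:
            return found          # stop at the first solution
    return None
```

The pruning step (`s > target`) relies on the non-negativity assumption; without it, a partial sum above the target could still be completed.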
The Bellman–Ford algorithm is an algorithm that computes shortest paths from a single source vertex to all of the other vertices in a weighted digraph.
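A compact sketch of the relaxation scheme, using an edge-list representation chosen purely for illustration:

```python
def bellman_ford(vertices, edges, source):
    """Single-source shortest paths. edges is a list of (u, v, weight)
    triples. Every edge is relaxed |V|-1 times; if a further pass still
    improves a distance, a negative-weight cycle is reachable."""
    dist = {v: float("inf") for v in vertices}
    dist[source] = 0
    for _ in range(len(vertices) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:                     # detection pass
        if dist[u] + w < dist[v]:
            raise ValueError("graph contains a negative-weight cycle")
    return dist
```

Unlike Dijkstra's algorithm, this tolerates negative edge weights, at the cost of O(|V|·|E|) time.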
Boolean grammars, introduced by Okhotin, are a class of formal grammars studied in formal language theory.
A circular definition is one that uses the term(s) being defined as a part of the definition or assumes a prior understanding of the term being defined.
In mathematics, a binary operation is commutative if changing the order of the operands does not change the result.
This is a list of notable lexer generators and parser generators for various language classes.
Compiler Description Language, or CDL, is a programming language based on affix grammars.
Computer science deals with the theoretical foundations of information and computation, together with practical techniques for the implementation and application of these foundations.
A constructed language (sometimes called a conlang) is a language whose phonology, grammar, and vocabulary have been consciously devised for human or human-like communication, instead of having developed naturally.
In formal language theory, a context-free grammar (CFG) is a certain type of formal grammar: a set of production rules that describe all possible strings in a given formal language.
In formal language theory, a context-free language (CFL) is a language generated by a context-free grammar (CFG).
The cut, in Prolog, is a goal, written as !, which always succeeds, but cannot be backtracked past.
In computer science, the Cocke–Younger–Kasami algorithm (alternatively called CYK, or CKY) is a parsing algorithm for context-free grammars, named after its inventors, John Cocke, Daniel Younger and Tadao Kasami.
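A minimal recognizer sketch of the CYK dynamic program, assuming (as the algorithm requires) a grammar in Chomsky normal form; the dictionary encoding and the example grammar are illustrative choices, not part of the algorithm itself:

```python
def cyk(word, grammar, start="S"):
    """CYK recognizer for a CNF grammar: grammar maps each nonterminal
    to productions that are either a terminal character or a pair of
    nonterminals."""
    n = len(word)
    # T[i][j] holds the nonterminals that derive word[i:j+1]
    T = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        for lhs, rules in grammar.items():
            if ch in rules:
                T[i][i].add(lhs)
    for length in range(2, n + 1):            # substring length
        for i in range(n - length + 1):
            j = i + length - 1
            for k in range(i, j):             # split point
                for lhs, rules in grammar.items():
                    for rule in rules:
                        if (isinstance(rule, tuple)
                                and rule[0] in T[i][k]
                                and rule[1] in T[k + 1][j]):
                            T[i][j].add(lhs)
    return n > 0 and start in T[0][n - 1]

# Example CNF grammar for the language { a^n b^n : n >= 1 }
GRAMMAR = {"S": [("A", "B"), ("A", "X")],
           "X": [("S", "B")],
           "A": ["a"],
           "B": ["b"]}
```

The three nested loops over length, start position, and split point give the algorithm its O(n³) running time.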
The dangling else is a problem in computer programming in which an optional else clause in an if–then(–else) statement results in nested conditionals being ambiguous.
In computer science, the Earley parser is an algorithm for parsing strings that belong to a given context-free language, though (depending on the variant) it may suffer problems with certain nullable grammars.
In mathematics, an expression or mathematical expression is a finite combination of symbols that is well-formed according to rules that depend on the context.
In computer science, the Floyd–Warshall algorithm is an algorithm for finding shortest paths in a weighted graph with positive or negative edge weights (but with no negative cycles).
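The algorithm itself is three nested loops over an adjacency matrix; a minimal sketch, using an in-place matrix representation chosen for brevity:

```python
def floyd_warshall(dist):
    """All-pairs shortest paths. dist is an n x n matrix of edge weights
    (float('inf') where there is no edge, 0 on the diagonal). Mutates
    and returns the matrix."""
    n = len(dist)
    for k in range(n):            # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```

After the pass for a given k, dist[i][j] is the shortest path from i to j using only intermediate vertices numbered at most k.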
In formal language theory, a grammar (when the context is not given, often called a formal grammar for clarity) is a set of production rules for strings in a formal language.
In mathematics, computer science, and linguistics, a formal language is a set of strings of symbols together with a set of rules that are specific to it.
In mathematics, a function is a relation that associates each element of one set with exactly one element of another set; originally, it was the idealization of how a varying quantity depends on another quantity.
A GLR parser (GLR standing for "generalized LR", where L stands for "left-to-right" and R stands for "rightmost (derivation)") is an extension of an LR parser algorithm to handle nondeterministic and ambiguous grammars.
A greedy algorithm is an algorithmic paradigm that follows the problem solving heuristic of making the locally optimal choice at each stage with the intent of finding a global optimum.
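Coin change is the textbook illustration (chosen here for the example, not mandated by the definition): the greedy choice is always the largest coin that fits, which is optimal for canonical coin systems but not in general:

```python
def greedy_change(amount, coins):
    """Greedy coin change: repeatedly take the largest coin that still
    fits. Locally optimal at each step; globally optimal for canonical
    systems such as [25, 10, 5, 1], but not in general (for coins
    [4, 3, 1] and amount 6, greedy yields 4+1+1 while 3+3 is optimal)."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result
```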
In the formal language theory of computer science, left recursion is a special case of recursion where a string is recognized as part of a language by the fact that it decomposes into a string from that same language (on the left) and a suffix (on the right).
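Left recursion is what makes naive top-down parsing loop forever, since the procedure for the rule calls itself before consuming any input; a common cure is to rewrite the rule as iteration. A minimal sketch (the grammar and token encoding are invented for the example):

```python
# Left-recursive rule:   expr -> expr '-' NUM | NUM
# A naive recursive-descent procedure for it would begin
#     def expr(tokens): left = expr(tokens); ...
# i.e. an immediate self-call with no input consumed: infinite recursion.
#
# The standard rewrite turns the rule into  NUM ('-' NUM)*  and loops:
def expr(tokens):
    """Parse and evaluate NUM ('-' NUM)* left-associatively;
    tokens is a flat list of strings."""
    pos = 0
    value = int(tokens[pos]); pos += 1
    while pos < len(tokens) and tokens[pos] == "-":
        pos += 1
        value -= int(tokens[pos]); pos += 1
    return value
```

The loop preserves left associativity: "10 - 4 - 3" evaluates as (10 - 4) - 3.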
The following is a list of algorithms along with one-line descriptions for each.
In computer science, an LL parser is a top-down parser for a subset of context-free languages.
Logic programming is a type of programming paradigm which is largely based on formal logic.
Lojban is a constructed, syntactically unambiguous human language, succeeding the Loglan project.
In computer science, LR parsers are a type of bottom-up parser that efficiently reads deterministic context-free languages in guaranteed linear time.
In computing, memoization or memoisation is an optimization technique used primarily to speed up computer programs by storing the results of expensive function calls and returning the cached result when the same inputs occur again.
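A minimal sketch of the technique as a decorator (positional, hashable arguments assumed for simplicity):

```python
from functools import wraps

def memoize(fn):
    """Cache fn's results keyed by its arguments, so a repeated call
    with the same inputs returns the stored value instead of
    recomputing it."""
    cache = {}
    @wraps(fn)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    # Naively exponential; memoization makes it linear in n.
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

The Python standard library ships this pattern as functools.lru_cache; packrat parsers apply the same idea to parsing functions to get linear-time backtracking.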
In mathematics and computer science, mutual recursion is a form of recursion where two mathematical or computational objects, such as functions or data types, are defined in terms of each other.
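The standard toy illustration is a pair of functions, neither of which can be understood without the other:

```python
def is_even(n):
    """A non-negative integer n is even iff n == 0 or n-1 is odd."""
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    """A non-negative integer n is odd iff n != 0 and n-1 is even."""
    return False if n == 0 else is_even(n - 1)
```

Recursive-descent parsers are a practical instance of the same pattern: the procedures for mutually referring grammar rules call each other.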
In neuropsychology, linguistics, and the philosophy of language, a natural language or ordinary language is any language that has evolved naturally in humans through use and repetition without conscious planning or premeditation.
OMeta is a specialized object-oriented programming language for pattern matching, developed by Alessandro Warth and Ian Piumarta in 2007 under the Viewpoints Research Institute.
A parse tree (also called a parsing tree, derivation tree, or concrete syntax tree) is an ordered, rooted tree that represents the syntactic structure of a string according to some context-free grammar.
In computer programming, a parser combinator is a higher-order function that accepts several parsers as input and returns a new parser as its output.
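A minimal sketch, under one illustrative convention (many others exist): a parser is a function taking a string and a position and returning either a (value, new_position) pair or None on failure, and combinators build new such functions from old ones:

```python
def char(c):
    """Primitive parser for a single literal character."""
    def parse(s, pos):
        if pos < len(s) and s[pos] == c:
            return c, pos + 1
        return None
    return parse

def seq(p, q):
    """Combinator: run p, then q from where p stopped."""
    def parse(s, pos):
        r1 = p(s, pos)
        if r1 is None:
            return None
        v1, pos = r1
        r2 = q(s, pos)
        if r2 is None:
            return None
        v2, pos = r2
        return (v1, v2), pos
    return parse

def alt(p, q):
    """Combinator: try p; on failure, try q at the same position."""
    def parse(s, pos):
        return p(s, pos) or q(s, pos)
    return parse

# "a" followed by "b" or "c":
ab_or_ac = seq(char("a"), alt(char("b"), char("c")))
```

Because combinators mirror grammar rules one-for-one, the resulting parsers read much like the grammars they implement.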
Parsing, syntax analysis or syntactic analysis is the process of analysing a string of symbols, either in natural language, computer languages or data structures, conforming to the rules of a formal grammar.
The Portable Document Format (PDF) is a file format developed in the 1990s to present documents, including text formatting and images, in a manner independent of application software, hardware, and operating systems.
Recursion occurs when a thing is defined in terms of itself or of its type.
In computer science, a recursive descent parser is a kind of top-down parser built from a set of mutually recursive procedures (or a non-recursive equivalent) where each such procedure usually implements one of the productions of the grammar.
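A minimal sketch for a small arithmetic grammar (the grammar itself is invented for the example), with one procedure per production as the definition describes:

```python
class Parser:
    """Recursive-descent evaluator for the grammar
         expr   -> term ('+' term)*
         term   -> factor ('*' factor)*
         factor -> DIGIT | '(' expr ')'
    with one mutually recursive method per production."""
    def __init__(self, text):
        self.text, self.pos = text, 0

    def peek(self):
        return self.text[self.pos] if self.pos < len(self.text) else None

    def expr(self):
        value = self.term()
        while self.peek() == "+":
            self.pos += 1
            value += self.term()
        return value

    def term(self):
        value = self.factor()
        while self.peek() == "*":
            self.pos += 1
            value *= self.factor()
        return value

    def factor(self):
        if self.peek() == "(":
            self.pos += 1
            value = self.expr()
            assert self.peek() == ")", "expected closing parenthesis"
            self.pos += 1
            return value
        value = int(self.peek())
        self.pos += 1
        return value
```

The call structure directly mirrors the grammar: expr calls term, term calls factor, and factor calls back into expr for parenthesized subexpressions, which is the mutual recursion the definition refers to.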
A regular expression, regex or regexp (sometimes called a rational expression) is, in theoretical computer science and formal language theory, a sequence of characters that defines a search pattern.
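For instance, using Python's standard re module (the pattern here, matching a simple two-part version number, is just an illustration):

```python
import re

# digits, a literal dot, digits -- each group captured separately
version = re.compile(r"(\d+)\.(\d+)")

m = version.search("release 3.11 is out")
```

Here search scans the string for the first place the pattern matches, and the parenthesized groups expose the matched substrings.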
In computer programming, a string is traditionally a sequence of characters, either as a literal constant or as some kind of variable.
A syntactic predicate specifies the syntactic validity of applying a production in a formal grammar and is analogous to a semantic predicate that specifies the semantic validity of applying a production.
In computer science, terminal and nonterminal symbols are the lexical elements used in specifying the production rules constituting a formal grammar.
In computer science, the time complexity is the computational complexity that describes the amount of time it takes to run an algorithm.
Top-Down Parsing Language (TDPL) is a type of analytic formal grammar developed by Alexander Birman in the early 1970s in order to study formally the behavior of a common class of practical top-down parsers that support a limited form of backtracking.