164 relations: AC (complexity), Adjacency list, Adjacency matrix, Age of the universe, Alan Turing, Algorithm, ALL (complexity), Alphabet (formal languages), Alternating Turing machine, Analysis of algorithms, Arthur–Merlin protocol, Axiom, ♯P, Best, worst and average case, Big O notation, Binary number, Biology, Bit array, Blum axioms, Blum's speedup theorem, Boolean circuit, Boolean satisfiability problem, Boris Trakhtenbrot, BPP (complexity), BQP, Cambridge University Press, Cellular automaton, Cengage, Church–Turing thesis, Circuit complexity, Clay Mathematics Institute, Co-NP, Cobham's thesis, Combinatorics, Communication complexity, Complement (complexity), Complete (complexity), Complexity, Complexity class, Computability theory, Computational complexity, Computational problem, Computer, Connectivity (graph theory), Context of computational complexity, Conway's Game of Life, Cook–Levin theorem, Counting problem (complexity), Decision problem, Decision tree model, ..., Descriptive complexity theory, Deterministic algorithm, Discrete logarithm, DSPACE, DTIME, Euclidean algorithm, Eugene M. Luks, EXPSPACE, EXPTIME, Feasible region, Formal language, FP (complexity), Function problem, Gabriel Lamé, Game complexity, General number field sieve, Graph (discrete mathematics), Graph isomorphism, Graph isomorphism problem, Graph theory, Hamiltonian path problem, Hisao Yamada, Integer, Integer factorization, Integer programming, Interactive proof system, Intractability, Introduction to Automata Theory, Languages, and Computation, Introduction to the Theory of Computation, IP (complexity), Jack Edmonds, John Myhill, John Wiley & Sons, Juris Hartmanis, Knapsack problem, L (complexity), Lance Fortnow, László Babai, Leonid Levin, Linear bounded automaton, List of complexity classes, List of computability and complexity topics, List of important publications in theoretical computer science, List of unsolved problems in computer science, Log-space reduction, Logic gate, Logistics, Manuel Blum, Mathematical optimization, Milan, Millennium Prize Problems, Minimax, MIT Press, Model of computation, NC (complexity), NEXPTIME, NL (complexity), Non-deterministic Turing machine, Nondeterministic algorithm, NP (complexity), NP-completeness, NP-hardness, NP-intermediate, NSPACE, NTIME, Operations research, Optimization problem, P (complexity), Parallel computing, Parameterized complexity, Partial function, Polynomial hierarchy, Polynomial-time reduction, PP (complexity), Presburger arithmetic, Primality test, Probabilistic Turing machine, Probability distribution, Promise problem, Proof complexity, Protein structure prediction, PSPACE, Pure mathematics, QMA, Quantum algorithm, Quantum complexity theory, Quantum Turing machine, Quicksort, Random-access machine, Randomized algorithm, Raymond Smullyan, Richard E. Stearns, Richard M. Karp, RP (complexity), RSA (cryptosystem), Savitch's theorem, Shor's algorithm, Space hierarchy theorem, Springer Science+Business Media, Stephen Cook, String (computer science), Structural complexity theory, Switching circuit theory, Symmetric Turing machine, Symposium on Theoretical Aspects of Computer Science, Theoretical computer science, Theory of computation, Time complexity, Time hierarchy theorem, Transcomputational problem, Travelling salesman problem, Turing machine, Vertex cover, ZPP (complexity).
In circuit complexity, AC is a complexity class hierarchy.
In graph theory and computer science, an adjacency list is a collection of unordered lists used to represent a finite graph.
In graph theory and computer science, an adjacency matrix is a square matrix used to represent a finite graph.
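The two representations above can be contrasted with a minimal Python sketch; the 4-vertex graph and its edge list here are invented for illustration:

```python
# Hypothetical undirected graph on 4 vertices with edges (0,1), (0,2), (1,2), (2,3).
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Adjacency list: one collection of neighbours per vertex.
adj_list = {v: [] for v in range(n)}
for u, v in edges:
    adj_list[u].append(v)
    adj_list[v].append(u)

# Adjacency matrix: an n-by-n grid with a 1 wherever two vertices are adjacent.
adj_matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    adj_matrix[u][v] = 1
    adj_matrix[v][u] = 1

print(adj_list[2])    # neighbours of vertex 2 → [0, 1, 3]
print(adj_matrix[0])  # row 0 of the matrix → [0, 1, 1, 0]
```

The list uses space proportional to the number of edges, while the matrix always uses n² entries but answers "is (u, v) an edge?" in constant time.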
In physical cosmology, the age of the universe is the time elapsed since the Big Bang.
Alan Mathison Turing (23 June 1912 – 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst, philosopher, and theoretical biologist.
In mathematics and computer science, an algorithm is an unambiguous specification of how to solve a class of problems.
In computability and complexity theory, ALL is the class of all decision problems.
In formal language theory, a string is defined as a finite sequence of members of an underlying base set; this set is called the alphabet of a string or collection of strings.
In computational complexity theory, an alternating Turing machine (ATM) is a non-deterministic Turing machine (NTM) with a rule for accepting computations that generalizes the rules used in the definition of the complexity classes NP and co-NP.
In computer science, the analysis of algorithms is the determination of the computational complexity of algorithms, that is, the amount of time, storage, and/or other resources necessary to execute them.
In computational complexity theory, an Arthur–Merlin protocol is an interactive proof system in which the verifier's coin tosses are constrained to be public (i.e. known to the prover too).
An axiom or postulate is a statement that is taken to be true, to serve as a premise or starting point for further reasoning and arguments.
In computational complexity theory, the complexity class ♯P (pronounced "number P" or, sometimes "sharp P" or "hash P") is the set of the counting problems associated with the decision problems in the set NP.
In computer science, best, worst, and average cases of a given algorithm express what the resource usage is at least, at most and on average, respectively.
Big O notation is a mathematical notation that describes the limiting behaviour of a function when the argument tends towards a particular value or infinity.
In mathematics and digital electronics, a binary number is a number expressed in the base-2 numeral system or binary numeral system, which uses only two symbols: typically 0 (zero) and 1 (one).
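A quick Python sketch of base-2 conversion by repeated division by 2, checked against the built-in conversion (the example value 45 is arbitrary):

```python
n = 45
bits = bin(n)[2:]  # built-in conversion; strips the '0b' prefix

# Repeated division by 2 yields the binary digits, least significant first.
digits = []
m = n
while m:
    digits.append(m % 2)
    m //= 2
manual = ''.join(str(d) for d in reversed(digits))

print(bits)  # → '101101' (32 + 8 + 4 + 1 = 45)
```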
Biology is the natural science that studies life and living organisms, including their physical structure, chemical composition, function, development and evolution.
A bit array (also known as bit map, bit set, bit string, or bit vector) is an array data structure that compactly stores bits.
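A minimal bit-array sketch in Python backed by a bytearray; the class name BitArray and its method names are our own, not a standard library API:

```python
class BitArray:
    """Compactly stores `size` bits, 8 per byte."""

    def __init__(self, size):
        self.size = size
        self.data = bytearray((size + 7) // 8)  # round up to whole bytes

    def set(self, i):
        self.data[i // 8] |= 1 << (i % 8)

    def clear(self, i):
        self.data[i // 8] &= ~(1 << (i % 8)) & 0xFF

    def get(self, i):
        return (self.data[i // 8] >> (i % 8)) & 1
```

One million flags fit in roughly 125 KB this way, versus several megabytes for a list of Python booleans.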
In computational complexity theory the Blum axioms or Blum complexity axioms are axioms that specify desirable properties of complexity measures on the set of computable functions.
In computational complexity theory, Blum's speedup theorem, first stated by Manuel Blum in 1967, is a fundamental theorem about the complexity of computable functions.
In computational complexity theory and circuit complexity, a Boolean circuit is a mathematical model for digital logic circuits.
In computer science, the Boolean satisfiability problem (sometimes called propositional satisfiability problem and abbreviated as SATISFIABILITY or SAT) is the problem of determining if there exists an interpretation that satisfies a given Boolean formula.
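Since SAT is the canonical NP-complete problem, a brute-force decision procedure makes the exponential search space concrete. This sketch uses a DIMACS-style encoding (literal k means variable k is true, -k means false); the helper name is our own:

```python
from itertools import product

def brute_force_sat(num_vars, clauses):
    """Return a satisfying assignment (tuple of bools) for a CNF formula, or None.

    Tries all 2**num_vars assignments, so it is only feasible for tiny inputs.
    """
    for assignment in product([False, True], repeat=num_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment
    return None

# (x1 OR NOT x2) AND (x2 OR x3)
print(brute_force_sat(3, [[1, -2], [2, 3]]))  # → (False, False, True)
```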
Boris (Boaz) Avraamovich Trakhtenbrot (Борис Авраамович Трахтенброт; 19 February 1921 – 19 September 2016), or Boaz (Boris) Trakhtenbrot (בועז טרכטנברוט) was an Israeli and Russian mathematician in mathematical logic, algorithms, theory of computation, and cybernetics.
In computational complexity theory, BPP, which stands for bounded-error probabilistic polynomial time, is the class of decision problems solvable by a probabilistic Turing machine in polynomial time with an error probability bounded away from 1/2 for all instances.
In computational complexity theory, BQP (bounded-error quantum polynomial time) is the class of decision problems solvable by a quantum computer in polynomial time, with an error probability of at most 1/3 for all instances.
Cambridge University Press (CUP) is the publishing business of the University of Cambridge.
A cellular automaton (pl. cellular automata, abbrev. CA) is a discrete model studied in computer science, mathematics, physics, complexity science, theoretical biology and microstructure modeling.
Cengage is an educational content, technology, and services company for the higher education, K-12, professional, and library markets worldwide.
In computability theory, the Church–Turing thesis (also known as computability thesis, the Turing–Church thesis, the Church–Turing conjecture, Church's thesis, Church's conjecture, and Turing's thesis) is a hypothesis about the nature of computable functions.
In theoretical computer science, circuit complexity is a branch of computational complexity theory in which Boolean functions are classified according to the size or depth of Boolean circuits that compute them.
The Clay Mathematics Institute (CMI) is a private, non-profit foundation, based in Peterborough, New Hampshire, United States.
In computational complexity theory, co-NP is a complexity class.
Cobham's thesis, also known as Cobham–Edmonds thesis (named after Alan Cobham and Jack Edmonds), asserts that computational problems can be feasibly computed on some computational device only if they can be computed in polynomial time; that is, if they lie in the complexity class P (PTIME).
Combinatorics is an area of mathematics primarily concerned with counting, both as a means and an end in obtaining results, and certain properties of finite structures.
In theoretical computer science, communication complexity studies the amount of communication required to solve a problem when the input to the problem is distributed among two or more parties.
In computational complexity theory, the complement of a decision problem is the decision problem resulting from reversing the yes and no answers.
In computational complexity theory, a computational problem is complete for a complexity class if it is, in a technical sense, among the "hardest" (or "most expressive") problems in the complexity class.
Complexity characterises the behaviour of a system or model whose components interact in multiple ways and follow local rules, meaning there is no reasonable higher instruction to define the various possible interactions.
In computational complexity theory, a complexity class is a set of problems of related resource-based complexity.
Computability theory, also known as recursion theory, is a branch of mathematical logic, of computer science, and of the theory of computation that originated in the 1930s with the study of computable functions and Turing degrees.
In computer science, the computational complexity, or simply complexity, of an algorithm is the amount of resources required for running it.
In theoretical computer science, a computational problem is a mathematical object representing a collection of questions that computers might be able to solve.
A computer is a device that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming.
In mathematics and computer science, connectivity is one of the basic concepts of graph theory: it asks for the minimum number of elements (nodes or edges) that need to be removed to disconnect the remaining nodes from each other.
In computational complexity theory and analysis of algorithms, a number of metrics are defined describing the resources, such as time or space, that a machine needs to solve a particular problem.
The Game of Life, also known simply as Life, is a cellular automaton devised by the British mathematician John Horton Conway in 1970.
In computational complexity theory, the Cook–Levin theorem, also known as Cook's theorem, states that the Boolean satisfiability problem is NP-complete.
In computational complexity theory and computability theory, a counting problem is a type of computational problem.
In computability theory and computational complexity theory, a decision problem is a problem that can be posed as a yes-no question of the input values.
In computational complexity and communication complexity theories the decision tree model is the model of computation or communication in which an algorithm or communication process is considered to be basically a decision tree, i.e., a sequence of branching operations based on comparisons of some quantities, the comparisons being assigned the unit computational cost.
Descriptive complexity is a branch of computational complexity theory and of finite model theory that characterizes complexity classes by the type of logic needed to express the languages in them.
In computer science, a deterministic algorithm is an algorithm which, given a particular input, will always produce the same output, with the underlying machine always passing through the same sequence of states.
In the mathematics of the real numbers, the logarithm log_b a is a number x such that b^x = a, for given numbers a and b. Analogously, in any group G, powers b^k can be defined for all integers k, and the discrete logarithm log_b a is an integer k such that b^k = a.
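A brute-force computation of the discrete logarithm makes the definition concrete (and hints at why the problem is believed hard: no substantially better general method than clever search is known for classical computers). The function name is our own:

```python
def discrete_log(b, a, p):
    """Smallest k >= 0 with b**k ≡ a (mod p), by exhaustive search; None if none exists."""
    x = 1
    for k in range(p):
        if x == a % p:
            return k
        x = (x * b) % p
    return None

print(discrete_log(3, 13, 17))  # → 4, since 3**4 = 81 ≡ 13 (mod 17)
```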
In computational complexity theory, DSPACE or SPACE is the computational resource describing the resource of memory space for a deterministic Turing machine.
In computational complexity theory, DTIME (or TIME) is the computational resource of computation time for a deterministic Turing machine.
In mathematics, the Euclidean algorithm, or Euclid's algorithm, is an efficient method for computing the greatest common divisor (GCD) of two integers, the largest number that divides them both without a remainder.
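As a sketch, the Euclidean algorithm in its repeated-remainder form; each step replaces the pair (a, b) by (b, a mod b) until the remainder is zero:

```python
def gcd(a, b):
    """Greatest common divisor by the Euclidean algorithm."""
    while b:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # → 21
```

The running time is logarithmic in the smaller argument, a bound related to Lamé's analysis via Fibonacci numbers.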
Eugene Michael Luks (born circa 1940) is an American mathematician and computer scientist, a professor emeritus of computer and information science at the University of Oregon.
In complexity theory, EXPSPACE is the set of all decision problems solvable by a deterministic Turing machine in O(2^p(n)) space, where p(n) is a polynomial function of n. (Some authors restrict p(n) to be a linear function, but most authors instead call the resulting class ESPACE.) If we use a nondeterministic machine instead, we get the class NEXPSPACE, which is equal to EXPSPACE by Savitch's theorem.
In computational complexity theory, the complexity class EXPTIME (sometimes called EXP or DEXPTIME) is the set of all decision problems that have exponential runtime, i.e., that are solvable by a deterministic Turing machine in O(2^p(n)) time, where p(n) is a polynomial function of n. In terms of DTIME, EXPTIME is the union over all k of DTIME(2^(n^k)). We know P ⊆ NP ⊆ PSPACE ⊆ EXPTIME ⊆ NEXPTIME ⊆ EXPSPACE, and also, by the time hierarchy theorem and the space hierarchy theorem, that P ≠ EXPTIME and PSPACE ≠ EXPSPACE, so at least one of the first three inclusions and at least one of the last three inclusions must be proper, but it is not known which ones are.
In mathematical optimization, a feasible region, feasible set, search space, or solution space is the set of all possible points (sets of values of the choice variables) of an optimization problem that satisfy the problem's constraints, potentially including inequalities, equalities, and integer constraints.
In mathematics, computer science, and linguistics, a formal language is a set of strings of symbols together with a set of rules that are specific to it.
In computational complexity theory, the complexity class FP is the set of function problems which can be solved by a deterministic Turing machine in polynomial time; it is the function problem version of the decision problem class P. Roughly speaking, it is the class of functions that can be efficiently computed on classical computers without randomization.
In computational complexity theory, a function problem is a computational problem where a single output (of a total function) is expected for every input, but the output is more complex than that of a decision problem.
Gabriel Lamé (22 July 1795 – 1 May 1870) was a French mathematician who contributed to the theory of partial differential equations by the use of curvilinear coordinates, and the mathematical theory of elasticity.
Combinatorial game theory has several ways of measuring game complexity.
In number theory, the general number field sieve (GNFS) is the most efficient classical algorithm known for factoring integers larger than 10^100.
In mathematics, and more specifically in graph theory, a graph is a structure amounting to a set of objects in which some pairs of the objects are in some sense "related".
In graph theory, an isomorphism of graphs G and H is a bijection f between the vertex sets of G and H such that any two vertices u and v of G are adjacent in G if and only if f(u) and f(v) are adjacent in H. This kind of bijection is commonly described as an "edge-preserving bijection", in accordance with the general notion of isomorphism being a structure-preserving bijection.
The graph isomorphism problem is the computational problem of determining whether two finite graphs are isomorphic.
In mathematics, graph theory is the study of graphs, which are mathematical structures used to model pairwise relations between objects.
In the mathematical field of graph theory the Hamiltonian path problem and the Hamiltonian cycle problem are problems of determining whether a Hamiltonian path (a path in an undirected or directed graph that visits each vertex exactly once) or a Hamiltonian cycle exists in a given graph (whether directed or undirected).
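Both problems are NP-complete; a brute-force search over all vertex orderings, as a minimal sketch (the function name and the two example graphs are ours), shows why the naive approach takes factorial time:

```python
from itertools import permutations

def has_hamiltonian_path(n, edges):
    """True if the undirected graph on vertices 0..n-1 has a Hamiltonian path.

    Checks every ordering of the vertices, so runs in O(n!) time.
    """
    adj = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
    return any(all((p[i], p[i + 1]) in adj for i in range(n - 1))
               for p in permutations(range(n)))

print(has_hamiltonian_path(4, [(0, 1), (1, 2), (2, 3)]))  # path graph → True
print(has_hamiltonian_path(4, [(0, 1), (0, 2), (0, 3)]))  # star graph → False
```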
Hisao Yamada was a Japanese computer scientist, known for his influential contributions to theoretical computer science, as well as for the development of Japanese keyboard layouts, a challenging practical problem.
An integer (from the Latin integer meaning "whole"; its first literal meaning in Latin is "untouched", from in ("not") plus tangere ("to touch")) is a number that can be written without a fractional component.
In number theory, integer factorization is the decomposition of a composite number into a product of smaller integers.
An integer programming problem is a mathematical optimization or feasibility program in which some or all of the variables are restricted to be integers.
In computational complexity theory, an interactive proof system is an abstract machine that models computation as the exchange of messages between two parties.
Intractable, in computational complexity, describes a problem that can be solved in principle but admits no known efficient (polynomial-time) algorithm, so that large instances are infeasible to solve in practice.
Introduction to Automata Theory, Languages, and Computation is an influential computer science textbook by John Hopcroft and Jeffrey Ullman on formal languages and the theory of computation.
Introduction to the Theory of Computation is a standard textbook in theoretical computer science, written by Michael Sipser and first published by PWS Publishing in 1997.
In computational complexity theory, the class IP (which stands for Interactive Polynomial time) is the class of problems solvable by an interactive proof system.
Jack R. Edmonds (born April 5, 1934) is an American computer scientist, regarded as one of the most important contributors to the field of combinatorial optimization.
John R. Myhill Sr. (11 August 1923 – 15 February 1987) was a British mathematician.
John Wiley & Sons, Inc., also referred to as Wiley, is a global publishing company that specializes in academic publishing.
Juris Hartmanis (born July 5, 1928) is a prominent computer scientist and computational theorist who, with Richard E. Stearns, received the 1993 ACM Turing Award "in recognition of their seminal paper which established the foundations for the field of computational complexity theory".
The knapsack problem or rucksack problem is a problem in combinatorial optimization: Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible.
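The 0/1 variant of the problem has a classic dynamic-programming solution in time proportional to the number of items times the capacity (pseudo-polynomial, which is consistent with the problem being NP-hard). A minimal sketch with invented example data:

```python
def knapsack(weights, values, capacity):
    """Maximum total value of a subset of items with total weight <= capacity (0/1 knapsack)."""
    # dp[c] = best value achievable with weight budget c, over items seen so far.
    dp = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        # Iterate budgets downward so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

print(knapsack([3, 4, 5], [4, 5, 6], 7))  # → 9, taking the items of weight 3 and 4
```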
In computational complexity theory, L (also known as LSPACE or DLOGSPACE) is the complexity class containing decision problems that can be solved by a deterministic Turing machine using a logarithmic amount of writable memory space.
Lance Jeremy Fortnow (born August 15, 1963) is a computer scientist known for major results in computational complexity and interactive proof systems.
László "Laci" Babai (born July 20, 1950 in Budapest) is a Hungarian professor of computer science and mathematics at the University of Chicago, known for his work in computational complexity theory, algorithms, combinatorics, and finite groups.
Leonid Anatolievich Levin (Леони́д Анато́льевич Ле́вин; Леоні́д Анато́лійович Ле́він; born November 2, 1948) is a Soviet-American computer scientist.
In computer science, a linear bounded automaton (plural linear bounded automata, abbreviated LBA) is a restricted form of Turing machine.
This is a list of complexity classes in computational complexity theory.
This is a list of computability and complexity topics, by Wikipedia page.
This is a list of important publications in theoretical computer science, organized by field.
This article is a list of unsolved problems in computer science.
In computational complexity theory, a log-space reduction is a reduction computable by a deterministic Turing machine using logarithmic space.
In electronics, a logic gate is an idealized or physical device implementing a Boolean function; that is, it performs a logical operation on one or more binary inputs and produces a single binary output.
Logistics is generally the detailed organization and implementation of a complex operation.
Manuel Blum (Caracas, 26 April 1938) is a Venezuelan computer scientist who received the Turing Award in 1995 "In recognition of his contributions to the foundations of computational complexity theory and its application to cryptography and program checking".
In mathematics, computer science and operations research, mathematical optimization or mathematical programming, alternatively spelled optimisation, is the selection of a best element (with regard to some criterion) from some set of available alternatives.
Milan (Italian: Milano) is a city in northern Italy, capital of Lombardy, and the second-most populous city in Italy after Rome, with the city proper having a population of 1,380,873 while its province-level municipality has a population of 3,235,000.
The Millennium Prize Problems are seven problems in mathematics that were stated by the Clay Mathematics Institute in 2000.
Minimax (sometimes MinMax or MM) is a decision rule used in decision theory, game theory, statistics and philosophy for minimizing the possible loss for a worst case (maximum loss) scenario.
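A minimal recursive sketch of the minimax rule over a toy game tree; the nested-list encoding (leaves are payoffs, internal nodes are lists of children) is our own convention:

```python
def minimax(node, maximizing):
    """Value of a game tree under optimal play by both players.

    `node` is either a number (leaf payoff to the maximizer)
    or a list of child nodes (a position with moves to choose from).
    """
    if isinstance(node, (int, float)):
        return node
    child_values = (minimax(child, not maximizing) for child in node)
    return max(child_values) if maximizing else min(child_values)

# Maximizer picks a move; the minimizer then picks the worst reply for them.
tree = [[3, 12], [2, 9]]
print(minimax(tree, True))  # → 3: the first move guarantees min(3, 12) = 3
```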
The MIT Press is a university press affiliated with the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts (United States).
In computer science, and more specifically in computability theory and computational complexity theory, a model of computation is a model which describes how a set of outputs are computed given a set of inputs.
In complexity theory, the class NC (for "Nick's Class") is the set of decision problems decidable in polylogarithmic time on a parallel computer with a polynomial number of processors.
In computational complexity theory, the complexity class NEXPTIME (sometimes called NEXP) is the set of decision problems that can be solved by a non-deterministic Turing machine using time 2^(n^O(1)).
In computational complexity theory, NL (Nondeterministic Logarithmic-space) is the complexity class containing decision problems which can be solved by a nondeterministic Turing machine using a logarithmic amount of memory space.
In theoretical computer science, a non-deterministic Turing machine (NTM) is a theoretical model of computation whose transition rules may prescribe more than one possible action for a given situation, used in thought experiments to examine the abilities and limitations of computers.
In computer science, a nondeterministic algorithm is an algorithm that, even for the same input, can exhibit different behaviors on different runs, as opposed to a deterministic algorithm.
In computational complexity theory, NP (for nondeterministic polynomial time) is a complexity class used to describe certain types of decision problems.
In computational complexity theory, an NP-complete decision problem is one belonging to both the NP and the NP-hard complexity classes.
NP-hardness (non-deterministic polynomial-time hardness), in computational complexity theory, is the defining property of a class of problems that are, informally, "at least as hard as the hardest problems in NP".
In computational complexity, problems that are in the complexity class NP but are neither in the class P nor NP-complete are called NP-intermediate, and the class of such problems is called NPI.
In computational complexity theory, non-deterministic space or NSPACE is the computational resource describing the memory space for a non-deterministic Turing machine.
In computational complexity theory, the complexity class NTIME(f(n)) is the set of decision problems that can be solved by a non-deterministic Turing machine which runs in time O(f(n)).
Operations research, or operational research in British usage, is a discipline that deals with the application of advanced analytical methods to help make better decisions.
In mathematics and computer science, an optimization problem is the problem of finding the best solution from all feasible solutions.
In computational complexity theory, P, also known as PTIME or DTIME(n^O(1)), is a fundamental complexity class.
Parallel computing is a type of computation in which many calculations or the execution of processes are carried out concurrently.
In computer science, parameterized complexity is a branch of computational complexity theory that focuses on classifying computational problems according to their inherent difficulty with respect to multiple parameters of the input or output.
In mathematics, a partial function from X to Y (written as f: X ⇀ Y) is a function f: X′ → Y, for some subset X′ of X.
In computational complexity theory, the polynomial hierarchy (sometimes called the polynomial-time hierarchy) is a hierarchy of complexity classes that generalize the classes P, NP and co-NP to oracle machines.
In computational complexity theory, a polynomial-time reduction is a method of solving one problem by means of a hypothetical subroutine for solving a different problem (that is, a reduction), that uses polynomial time excluding the time within the subroutine.
In complexity theory, PP is the class of decision problems solvable by a probabilistic Turing machine in polynomial time, with an error probability of less than 1/2 for all instances.
Presburger arithmetic is the first-order theory of the natural numbers with addition, named in honor of Mojżesz Presburger, who introduced it in 1929.
A primality test is an algorithm for determining whether an input number is prime.
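The simplest primality test is trial division by candidates up to the square root of the input; a minimal sketch (practical tests such as Miller–Rabin are far faster for large numbers):

```python
def is_prime(n):
    """Trial-division primality test; O(sqrt(n)) divisions."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:  # a composite n must have a factor <= sqrt(n)
        if n % d == 0:
            return False
        d += 2  # even candidates other than 2 were already excluded
    return True

print(is_prime(97), is_prime(91))  # → True False (91 = 7 × 13)
```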
In computability theory, a probabilistic Turing machine is a non-deterministic Turing machine which chooses between the available transitions at each point according to some probability distribution.
In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.
In computational complexity theory, a promise problem is a generalization of a decision problem where the input is promised to belong to a particular subset of all possible inputs.
In theoretical computer science, and specifically computational complexity theory, proof complexity is the field aiming to understand and analyse the computational resources that are required to prove or refute statements.
Protein structure prediction is the inference of the three-dimensional structure of a protein from its amino acid sequence—that is, the prediction of its folding and its secondary and tertiary structure from its primary structure.
In computational complexity theory, PSPACE is the set of all decision problems that can be solved by a Turing machine using a polynomial amount of space.
Broadly speaking, pure mathematics is mathematics that studies entirely abstract concepts.
In computational complexity theory, QMA, which stands for Quantum Merlin Arthur, is the quantum analog of the nonprobabilistic complexity class NP or the probabilistic complexity class MA.
In quantum computing, a quantum algorithm is an algorithm which runs on a realistic model of quantum computation, the most commonly used model being the quantum circuit model of computation.
Quantum complexity theory is a part of computational complexity theory in theoretical computer science.
A quantum Turing machine (QTM), also a universal quantum computer, is an abstract machine used to model the effect of a quantum computer.
Quicksort (sometimes called partition-exchange sort) is an efficient sorting algorithm, serving as a systematic method for placing the elements of an array in order.
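The classical algorithm partitions an array in place around a pivot; the functional sketch below trades that in-place partitioning for clarity, building the partitions as new lists:

```python
def quicksort(xs):
    """Functional quicksort: partition around a pivot, recurse on each side."""
    if len(xs) <= 1:
        return xs
    pivot = xs[len(xs) // 2]
    return (quicksort([x for x in xs if x < pivot])
            + [x for x in xs if x == pivot]
            + quicksort([x for x in xs if x > pivot]))

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # → [1, 1, 2, 3, 4, 5, 6, 9]
```

Average-case running time is O(n log n); a consistently bad pivot choice degrades this to O(n²), which is why production implementations pick pivots carefully or randomize.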
In computer science, random-access machine (RAM) is an abstract machine in the general class of register machines.
A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic.
Raymond Merrill Smullyan (May 25, 1919 – February 6, 2017) was an American mathematician, magician, concert pianist, logician, Taoist, and philosopher.
Richard Edwin Stearns (born July 5, 1936) is a prominent computer scientist who, with Juris Hartmanis, received the 1993 ACM Turing Award "in recognition of their seminal paper which established the foundations for the field of computational complexity theory" (Hartmanis and Stearns, 1965).
Richard Manning Karp (born January 3, 1935) is an American computer scientist and computational theorist at the University of California, Berkeley.
In computational complexity theory, randomized polynomial time (RP) is the complexity class of problems for which a probabilistic Turing machine exists with these properties: it always runs in polynomial time in the input size; if the correct answer is NO, it always returns NO; and if the correct answer is YES, it returns YES with probability at least 1/2 (otherwise, it returns NO).
RSA (Rivest–Shamir–Adleman) is one of the first public-key cryptosystems and is widely used for secure data transmission.
In computational complexity theory, Savitch's theorem, proved by Walter Savitch in 1970, gives a relationship between deterministic and non-deterministic space complexity.
Shor's algorithm, named after mathematician Peter Shor, is a quantum algorithm (an algorithm that runs on a quantum computer) for integer factorization formulated in 1994.
In computational complexity theory, the space hierarchy theorems are separation results that show that both deterministic and nondeterministic machines can solve more problems in (asymptotically) more space, subject to certain conditions.
Springer Science+Business Media or Springer, part of Springer Nature since 2015, is a global publishing company that publishes books, e-books and peer-reviewed journals in science, humanities, technical and medical (STM) publishing.
Stephen Arthur Cook, (born December 14, 1939) is an American-Canadian computer scientist and mathematician who has made major contributions to the fields of complexity theory and proof complexity.
In computer programming, a string is traditionally a sequence of characters, either as a literal constant or as some kind of variable.
In computational complexity theory of computer science, the structural complexity theory or simply structural complexity is the study of complexity classes, rather than computational complexity of individual problems and algorithms.
Switching circuit theory is the mathematical study of the properties of networks of idealized switches.
A symmetric Turing machine is a Turing machine which has a configuration graph that is undirected (that is, configuration i yields configuration j if and only if j yields i).
Symposium on Theoretical Aspects of Computer Science (STACS) is an academic conference in the field of computer science.
Theoretical computer science, or TCS, is a subset of general computer science and mathematics that focuses on more mathematical topics of computing and includes the theory of computation.
In theoretical computer science and mathematics, the theory of computation is the branch that deals with how efficiently problems can be solved on a model of computation, using an algorithm.
In computer science, the time complexity is the computational complexity that describes the amount of time it takes to run an algorithm.
In computational complexity theory, the time hierarchy theorems are important statements about time-bounded computation on Turing machines.
In computational complexity theory, a transcomputational problem is a problem that requires processing of more than 10^93 bits of information.
The travelling salesman problem (TSP) asks the following question: "Given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city and returns to the origin city?" It is an NP-hard problem in combinatorial optimization, important in operations research and theoretical computer science.
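A brute-force solver that tries every tour illustrates the factorial blow-up that makes TSP a benchmark for intractability; the distance matrix below is invented for illustration:

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Length of the shortest tour starting and ending at city 0.

    dist is a symmetric distance matrix; all (n-1)! orderings of the
    remaining cities are tried, so this is only feasible for tiny n.
    """
    n = len(dist)
    best = None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        if best is None or length < best:
            best = length
    return best

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(tsp_brute_force(dist))  # → 18, e.g. the tour 0-1-3-2-0 (2 + 4 + 3 + 9)
```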
A Turing machine is a mathematical model of computation that defines an abstract machine, which manipulates symbols on a strip of tape according to a table of rules.
In the mathematical discipline of graph theory, a vertex cover (sometimes node cover) of a graph is a set of vertices such that each edge of the graph is incident to at least one vertex of the set.
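Finding a minimum vertex cover is NP-hard; an exact search over subsets in increasing size order, as a minimal sketch (the function name is ours), makes the definition and the exponential cost explicit:

```python
from itertools import combinations

def min_vertex_cover_size(n, edges):
    """Size of a smallest vertex cover of the graph on vertices 0..n-1.

    Tries all vertex subsets from smallest to largest: exponential time.
    """
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            chosen = set(subset)
            # A cover must touch every edge at one endpoint or the other.
            if all(u in chosen or v in chosen for u, v in edges):
                return k
    return n

print(min_vertex_cover_size(3, [(0, 1), (1, 2), (0, 2)]))  # triangle → 2
```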
In complexity theory, ZPP (zero-error probabilistic polynomial time) is the complexity class of problems for which a probabilistic Turing machine exists with these properties: it always returns the correct YES or NO answer, and its running time is polynomial in expectation for every input.