Vapnik–Chervonenkis theory

Vapnik–Chervonenkis theory (also known as VC theory) was developed during 1960–1990 by Vladimir Vapnik and Alexey Chervonenkis. [1]

26 relations: Alexey Chervonenkis, Central limit theorem, Computational learning theory, Consistency (statistics), Covering number, Dirac measure, Dudley's theorem, Empirical process, Empirical risk minimization, Hoeffding's inequality, Hypograph (mathematics), Jensen's inequality, John Wiley & Sons, Law of large numbers, Loss function, Machine learning, Rademacher complexity, Richard M. Dudley, Sauer–Shelah lemma, Shattered set, Slutsky's theorem, Springer Science+Business Media, Stability (learning theory), Statistical learning theory, VC dimension, Vladimir Vapnik.

Alexey Chervonenkis

Alexey Yakovlevich Chervonenkis (Алексей Яковлевич Червоненкис; 7 September 1938 – 22 September 2014) was a Soviet and Russian mathematician who, with Vladimir Vapnik, was one of the main developers of the Vapnik–Chervonenkis theory, also known as the "fundamental theory of learning", an important part of computational learning theory.

Central limit theorem

In probability theory, the central limit theorem (CLT) establishes that, in some situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution (informally a "bell curve") even if the original variables themselves are not normally distributed.
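
A minimal Python sketch (not part of the original article) illustrating the statement: sums of i.i.d. Uniform(0, 1) variables, shifted by their mean and scaled by their standard deviation, have mean near 0 and standard deviation near 1, as the normal limit predicts. All parameter values here are illustrative.

```python
import random
import statistics

random.seed(0)
n = 200        # summands per trial
trials = 1000  # number of normalized sums

# Mean and standard deviation of a single Uniform(0, 1) variable.
mu, sigma = 0.5, (1 / 12) ** 0.5

# Properly normalized sums: (S_n - n*mu) / (sigma * sqrt(n)).
normalized = [
    (sum(random.random() for _ in range(n)) - n * mu) / (sigma * n ** 0.5)
    for _ in range(trials)
]

print(statistics.mean(normalized), statistics.stdev(normalized))
```

The two printed values should be close to 0 and 1 respectively, even though the summands themselves are uniform, not normal.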

Computational learning theory

In computer science, computational learning theory (or just learning theory) is a subfield of artificial intelligence devoted to studying the design and analysis of machine learning algorithms.

Consistency (statistics)

In statistics, consistency of procedures, such as computing confidence intervals or conducting hypothesis tests, is a desired property of their behaviour as the number of items in the data set to which they are applied increases indefinitely.

Covering number

In mathematics, a covering number is the number of spherical balls of a given size needed to completely cover a given space, with possible overlaps.
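
A small illustrative computation (the function name is hypothetical, not from the article): in one dimension, a "ball" of radius ε is an interval of length 2ε, so covering the unit interval [0, 1] takes ⌈1/(2ε)⌉ balls.

```python
def covering_number_unit_interval(eps: float) -> int:
    """Number of intervals of radius eps needed to cover [0, 1],
    counted by sweeping left to right (a greedy cover)."""
    count, covered = 0, 0.0
    while covered < 1.0:
        covered += 2 * eps  # each ball covers a stretch of length 2*eps
        count += 1
    return count

print(covering_number_unit_interval(0.1))   # 5 balls of radius 0.1
print(covering_number_unit_interval(0.25))  # 2 balls of radius 0.25
```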

Dirac measure

In mathematics, a Dirac measure assigns a size to a set based solely on whether it contains a fixed element x or not.
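
The definition translates directly into a two-line sketch (function name illustrative): the measure of a set A is 1 exactly when A contains the fixed element x.

```python
def dirac_measure(x, A) -> int:
    """delta_x(A): 1 if the set A contains the fixed element x, else 0."""
    return 1 if x in A else 0

print(dirac_measure(3, {1, 2, 3}))  # 1: the set contains x = 3
print(dirac_measure(3, {1, 2}))     # 0: it does not
```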

Dudley's theorem

In probability theory, Dudley’s theorem is a result relating the expected upper bound and regularity properties of a Gaussian process to its entropy and covariance structure.

Empirical process

In probability theory, an empirical process is a stochastic process that describes the proportion of objects in a system in a given state.

Empirical risk minimization

Empirical risk minimization (ERM) is a principle in statistical learning theory which defines a family of learning algorithms and is used to give theoretical bounds on their performance.
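
A minimal sketch of the principle, assuming a toy finite hypothesis class of threshold classifiers h_t(x) = 1 if x ≥ t (all names and data here are illustrative): ERM picks the hypothesis with the smallest average error on the sample.

```python
def empirical_risk(h, sample):
    """Fraction of sample points the hypothesis h misclassifies."""
    return sum(h(x) != y for x, y in sample) / len(sample)

def erm(thresholds, sample):
    """Return the threshold classifier minimizing empirical risk."""
    hypotheses = [lambda x, t=t: int(x >= t) for t in thresholds]
    return min(hypotheses, key=lambda h: empirical_risk(h, sample))

sample = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]
best = erm([0.0, 0.25, 0.5, 0.75, 1.0], sample)
print(empirical_risk(best, sample))  # 0.0: the threshold at 0.5 fits the sample
```

VC theory supplies the theoretical bounds mentioned above: it relates the empirical risk minimized here to the true risk, in terms of the capacity of the hypothesis class.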

Hoeffding's inequality

In probability theory, Hoeffding's inequality provides an upper bound on the probability that the sum of bounded independent random variables deviates from its expected value by more than a certain amount.
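
A numerical sanity check (parameters illustrative): for n i.i.d. variables in [0, 1], Hoeffding's inequality gives P(|S_n/n − E| ≥ t) ≤ 2·exp(−2·n·t²), and the simulated deviation frequency should respect this bound.

```python
import math
import random

random.seed(1)
n, t, trials = 100, 0.1, 2000

# Count how often the sample mean of n Uniform(0,1) draws deviates
# from its expectation 0.5 by at least t.
deviations = 0
for _ in range(trials):
    mean = sum(random.random() for _ in range(n)) / n
    if abs(mean - 0.5) >= t:
        deviations += 1

empirical = deviations / trials
bound = 2 * math.exp(-2 * n * t ** 2)
print(empirical, "<=", bound)
```

The bound here is about 0.27, while the observed frequency is far smaller; Hoeffding's inequality is valid for any bounded distribution, so it is often loose in specific cases.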

Hypograph (mathematics)

In mathematics, the hypograph or subgraph of a function f: ℝⁿ → ℝ is the set of points lying on or below its graph, {(x, μ) ∈ ℝⁿ × ℝ : μ ≤ f(x)}, and the strict hypograph is the set of points lying strictly below it, {(x, μ) ∈ ℝⁿ × ℝ : μ < f(x)}; the strict hypograph is empty if f ≡ −∞.

Jensen's inequality

In mathematics, Jensen's inequality, named after the Danish mathematician Johan Jensen, relates the value of a convex function of an integral to the integral of the convex function.
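
A numerical illustration with the convex function f(x) = x²: the mean of the squares dominates the square of the mean, E[f(X)] ≥ f(E[X]), and the gap is exactly the variance of X.

```python
import random

random.seed(0)
xs = [random.random() for _ in range(10000)]

mean_of_square = sum(x ** 2 for x in xs) / len(xs)   # E[f(X)] with f(x) = x^2
square_of_mean = (sum(xs) / len(xs)) ** 2            # f(E[X])

print(mean_of_square >= square_of_mean)  # True; the gap equals Var(X)
```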

John Wiley & Sons

John Wiley & Sons, Inc., also referred to as Wiley, is a global publishing company that specializes in academic publishing.

Law of large numbers

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times.
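
A quick sketch of the theorem in action (sample size illustrative): the average of many fair die rolls approaches the expected value 3.5.

```python
import random

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100000)]

# With this many rolls, the sample mean is very close to E[X] = 3.5.
sample_mean = sum(rolls) / len(rolls)
print(abs(sample_mean - 3.5) < 0.05)
```

Uniform (over a whole class of functions) versions of this law are exactly what VC theory studies.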

Loss function

In mathematical optimization, statistics, econometrics, decision theory, machine learning and computational neuroscience, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event.

Machine learning

Machine learning is a subset of artificial intelligence in the field of computer science that often uses statistical techniques to give computers the ability to "learn" (i.e., progressively improve performance on a specific task) with data, without being explicitly programmed.

Rademacher complexity

In computational learning theory (machine learning and theory of computation), Rademacher complexity, named after Hans Rademacher, measures richness of a class of real-valued functions with respect to a probability distribution.
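
A Monte Carlo sketch (all names and the tiny function class are illustrative) of the empirical Rademacher complexity on a fixed sample, E_σ[ sup_{f∈F} (1/n) Σ_i σ_i f(x_i) ], where the σ_i are independent random signs.

```python
import random

random.seed(0)
sample = [0.2, 0.5, 0.8]
# A toy finite class F of real-valued functions on the sample.
F = [lambda x: 0.0, lambda x: 1.0, lambda x: x]

n, trials, total = len(sample), 2000, 0.0
for _ in range(trials):
    sigma = [random.choice([-1, 1]) for _ in range(n)]  # Rademacher signs
    # Supremum over F of the sign-weighted average on the sample.
    total += max(sum(s * f(x) for s, x in zip(sigma, sample)) / n for f in F)

estimate = total / trials
print(estimate)
```

A richer class would correlate better with random signs and so score higher; that is the sense in which Rademacher complexity measures richness.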

Richard M. Dudley

Richard Mansfield Dudley (born 1938) is Professor of Mathematics at the Massachusetts Institute of Technology.

Sauer–Shelah lemma

In combinatorial mathematics and extremal set theory, the Sauer–Shelah lemma states that every family of sets with small VC dimension consists of a small number of sets.
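
A numerical check of the bound (setup illustrative): threshold classifiers on the line have VC dimension 1, so by the lemma they realize at most Σ_{i≤1} C(n, i) = n + 1 distinct labelings on any n points.

```python
from math import comb

points = [0.1, 0.3, 0.5, 0.7, 0.9]
n, d = len(points), 1  # VC dimension of threshold classifiers is 1

# All distinct labelings realized by h_t(x) = 1 if x >= t over a sweep of t.
labelings = {
    tuple(int(x >= t) for x in points)
    for t in [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
}

bound = sum(comb(n, i) for i in range(d + 1))
print(len(labelings), "<=", bound)  # 6 <= 6: the bound is tight here
```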

Shattered set

The concept of shattered sets plays an important role in Vapnik–Chervonenkis theory, also known as VC-theory.
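
A minimal sketch of the definition (names illustrative): a class shatters a set S if it realizes every possible 0/1 labeling of S. Threshold classifiers h_t(x) = 1 if x ≥ t shatter any single point but no pair, since the labeling (1, 0) for an ordered pair is unrealizable.

```python
from itertools import product

def shatters(thresholds, S):
    """True if the threshold classifiers realize every labeling of S."""
    realized = {tuple(int(x >= t) for x in S) for t in thresholds}
    return all(lab in realized for lab in product([0, 1], repeat=len(S)))

thresholds = [i / 10 for i in range(11)]
print(shatters(thresholds, [0.5]))       # True: one point is shattered
print(shatters(thresholds, [0.3, 0.7]))  # False: (1, 0) never occurs
```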

Slutsky's theorem

In probability theory, Slutsky’s theorem extends some properties of algebraic operations on convergent sequences of real numbers to sequences of random variables.

Springer Science+Business Media

Springer Science+Business Media or Springer, part of Springer Nature since 2015, is a global publishing company that publishes books, e-books and peer-reviewed journals in science, humanities, technical and medical (STM) publishing.

Stability (learning theory)

Stability, also known as algorithmic stability, is a notion in computational learning theory of how a machine learning algorithm is perturbed by small changes to its inputs.

Statistical learning theory

Statistical learning theory is a framework for machine learning drawing from the fields of statistics and functional analysis.

VC dimension

In Vapnik–Chervonenkis theory, the VC dimension (for Vapnik–Chervonenkis dimension) is a measure of the capacity (complexity, expressive power, richness, or flexibility) of a space of functions that can be learned by a statistical classification algorithm.
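
A brute-force sketch (grids and names illustrative) of the definition: the VC dimension is the size of the largest set the class shatters. For threshold classifiers on the line the answer is 1, since every single point is shattered but no pair is.

```python
from itertools import combinations

def shatters(thresholds, S):
    """True if threshold classifiers realize all 2^|S| labelings of S."""
    realized = {tuple(int(x >= t) for x in S) for t in thresholds}
    return len(realized) == 2 ** len(S)

thresholds = [i / 20 for i in range(21)]
grid = [i / 10 for i in range(1, 10)]  # candidate points to shatter

# Largest k such that some k-point subset of the grid is shattered.
vc = max(
    k for k in range(1, 4)
    if any(shatters(thresholds, S) for S in combinations(grid, k))
)
print(vc)  # 1
```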

Vladimir Vapnik

Vladimir Naumovich Vapnik (Владимир Наумович Вапник; born 6 December 1936) is one of the main developers of the Vapnik–Chervonenkis theory of statistical learning, and the co-inventor of the support vector machine method and the support vector clustering algorithm.

Redirects here:

VC theory, Vapnik Chervonenkis theory, Vapnik-Chervonenkis theory.

References

[1] https://en.wikipedia.org/wiki/Vapnik–Chervonenkis_theory
