76 relations: Abraham de Moivre, Asymptotic distribution, Bayesian inference, Bernoulli distribution, Bernoulli process, Bernoulli trial, Beta distribution, Beta function, Beta-binomial distribution, Binomial coefficient, Binomial sum variance inequality, Binomial test, Binomial theorem, Bit, Blaise Pascal, Boolean-valued function, Central limit theorem, Chernoff bound, Continuity correction, Covariance, Cumulative distribution function, De Moivre–Laplace theorem, Expected value, Experiment (probability theory), Failure, Fair coin, False (logic), Floor and ceiling functions, Hoeffding's inequality, Hypergeometric distribution, Independence (probability theory), Jacob Bernoulli, Kullback–Leibler divergence, Law of total probability, Logistic regression, Marginal distribution, Measure (mathematics), Median, Mode (statistics), Multifractal system, Multinomial distribution, Nat (unit), National Institute of Standards and Technology, Natural number, Negative binomial distribution, Normal distribution, Outcome (probability), Poisson binomial distribution, Poisson distribution, Poisson limit theorem, Prior probability, Probability, Probability distribution, Probability mass function, Probability theory, Probit, Pseudorandom number generator, Quantile, Random number generation, Random variable, Rounding, Rule of thumb, SEMATECH, Shannon (unit), Statistical hypothesis testing, Statistical mechanics, Statistical significance, Statistics, The Doctrine of Chances, The Mathematical Gazette, Truth, Variance, Yes and no, Yes–no question, 0, 1.
Abraham de Moivre (26 May 1667 – 27 November 1754) was a French mathematician known for de Moivre's formula, a formula that links complex numbers and trigonometry, and for his work on the normal distribution and probability theory.
In mathematics and statistics, an asymptotic distribution is a probability distribution that is in a sense the "limiting" distribution of a sequence of distributions.
Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.
In probability theory and statistics, the Bernoulli distribution, named after Swiss mathematician Jacob Bernoulli, is the discrete probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability q.
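As a minimal illustration (a Python sketch using only the standard library; the function names are chosen here for clarity, not taken from any particular package), the Bernoulli pmf and a sampler can be written as:

```python
import random

def bernoulli_pmf(k, p):
    """P(X = k) for X ~ Bernoulli(p): p at k = 1, 1 - p at k = 0, else 0."""
    if k == 1:
        return p
    if k == 0:
        return 1 - p
    return 0.0

def bernoulli_sample(p):
    """Draw one Bernoulli(p) value: 1 with probability p, otherwise 0."""
    return 1 if random.random() < p else 0
```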
In probability and statistics, a Bernoulli process is a finite or infinite sequence of binary random variables, so it is a discrete-time stochastic process that takes only two values, canonically 0 and 1.
In the theory of probability and statistics, a Bernoulli trial (or binomial trial) is a random experiment with exactly two possible outcomes, "success" and "failure", in which the probability of success is the same every time the experiment is conducted.
In probability theory and statistics, the beta distribution is a family of continuous probability distributions defined on the interval [0, 1], parametrized by two positive shape parameters, denoted by α and β, that appear as exponents of the random variable and control the shape of the distribution.
In mathematics, the beta function, also called the Euler integral of the first kind, is a special function defined by B(x, y) = ∫₀¹ t^(x−1) (1 − t)^(y−1) dt for Re(x) > 0 and Re(y) > 0.
In probability theory and statistics, the beta-binomial distribution is a family of discrete probability distributions on a finite support of non-negative integers arising when the probability of success in each of a fixed or known number of Bernoulli trials is either unknown or random.
In mathematics, any of the positive integers that occurs as a coefficient in the binomial theorem is a binomial coefficient.
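Two standard properties of binomial coefficients, Pascal's rule and the row sum, are easy to check with Python's built-in `math.comb` (a quick sketch, not specific to this article):

```python
import math

# math.comb(n, k) is the binomial coefficient "n choose k".
n, k = 10, 4

# Pascal's rule: C(n, k) = C(n - 1, k - 1) + C(n - 1, k).
assert math.comb(n, k) == math.comb(n - 1, k - 1) + math.comb(n - 1, k)

# C(n, k) is the coefficient of x^k in (1 + x)^n, so row n sums to 2^n.
assert sum(math.comb(n, i) for i in range(n + 1)) == 2 ** n
```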
The binomial sum variance inequality states that the variance of the sum of binomially distributed random variables will always be less than or equal to the variance of a binomial variable with the same ''n'' and ''p'' parameters.
In statistics, the binomial test is an exact test of the statistical significance of deviations from a theoretically expected distribution of observations into two categories.
In elementary algebra, the binomial theorem (or binomial expansion) describes the algebraic expansion of powers of a binomial.
The bit (a portmanteau of binary digit) is a basic unit of information used in computing and digital communications.
Blaise Pascal (19 June 1623 – 19 August 1662) was a French mathematician, physicist, inventor, writer and Catholic theologian.
A Boolean-valued function (sometimes called a predicate or a proposition) is a function of the type f: X → B, where X is an arbitrary set and B is a Boolean domain, i.e. a generic two-element set, for example B = {0, 1}.
In probability theory, the central limit theorem (CLT) establishes that, in some situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution (informally a "bell curve") even if the original variables themselves are not normally distributed.
In probability theory, the Chernoff bound, named after Herman Chernoff but due to Herman Rubin, gives exponentially decreasing bounds on tail distributions of sums of independent random variables.
In probability theory, a continuity correction is an adjustment that is made when a discrete distribution is approximated by a continuous distribution.
In probability theory and statistics, covariance is a measure of the joint variability of two random variables.
In probability theory and statistics, the cumulative distribution function (CDF, also cumulative density function) of a real-valued random variable X, or just distribution function of X, evaluated at x, is the probability that X will take a value less than or equal to x. In the case of a continuous distribution, it gives the area under the probability density function from minus infinity to x. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.
In probability theory, the de Moivre–Laplace theorem, which is a special case of the central limit theorem, states that the normal distribution may be used as an approximation to the binomial distribution under certain conditions.
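The de Moivre–Laplace approximation, combined with the continuity correction defined above, can be illustrated with a self-contained Python sketch (standard library only; the function names are this sketch's own):

```python
import math

def binom_cdf(k, n, p):
    """Exact P(X <= k) for X ~ Binomial(n, p), by summing the pmf."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k + 1))

def normal_cdf(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n, p, k = 100, 0.5, 55
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
exact = binom_cdf(k, n, p)
# The +0.5 is the continuity correction for approximating a discrete CDF.
approx = normal_cdf((k + 0.5 - mu) / sigma)
```

For these parameters the exact and approximate probabilities agree to about three decimal places, which is typical when n·p·(1 − p) is reasonably large.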
In probability theory, the expected value of a random variable, intuitively, is the long-run average value of repetitions of the experiment it represents.
In probability theory, an experiment or trial (see below) is any procedure that can be infinitely repeated and has a well-defined set of possible outcomes, known as the sample space.
Failure is the state or condition of not meeting a desirable or intended objective, and may be viewed as the opposite of success.
In probability theory and statistics, a sequence of independent Bernoulli trials with probability 1/2 of success on each trial is metaphorically called a fair coin.
In logic, false or untrue is the state of possessing negative truth value or a nullary logical connective.
In mathematics and computer science, the floor function is the function that takes as input a real number x and gives as output the greatest integer less than or equal to x, denoted ⌊x⌋ or floor(x); the companion ceiling function maps x to the least integer greater than or equal to x, denoted ⌈x⌉.
In probability theory, Hoeffding's inequality provides an upper bound on the probability that the sum of bounded independent random variables deviates from its expected value by more than a certain amount.
In probability theory and statistics, the hypergeometric distribution is a discrete probability distribution that describes the probability of k successes (random draws for which the object drawn has a specified feature) in n draws, without replacement, from a finite population of size N that contains exactly K objects with that feature, wherein each draw is either a success or a failure.
In probability theory, two events are independent, statistically independent, or stochastically independent if the occurrence of one does not affect the probability of occurrence of the other.
Jacob Bernoulli (also known as James or Jacques; 6 January 1655 – 16 August 1705) was one of the many prominent mathematicians in the Bernoulli family.
In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy) is a measure of how one probability distribution diverges from a second, expected probability distribution.
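For two Bernoulli distributions the divergence has a simple closed form; a Python sketch (assuming 0 &lt; p, q &lt; 1 so the logarithms are defined, and measuring in nats):

```python
import math

def kl_bernoulli(p, q):
    """D_KL(Bernoulli(p) || Bernoulli(q)) in nats, for 0 < p, q < 1."""
    return (p * math.log(p / q)
            + (1 - p) * math.log((1 - p) / (1 - q)))
```

The divergence is zero exactly when the two distributions coincide, and strictly positive otherwise; note that it is not symmetric in its arguments.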
In probability theory, the law (or formula) of total probability is a fundamental rule relating marginal probabilities to conditional probabilities.
In statistics, the logistic model (or logit model) is a statistical model that is usually taken to apply to a binary dependent variable.
In probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset.
In mathematical analysis, a measure on a set is a systematic way to assign a number to each suitable subset of that set, intuitively interpreted as its size.
The median is the value separating the higher half of a data sample, a population, or a probability distribution, from the lower half.
The mode of a set of data values is the value that appears most often.
A multifractal system is a generalization of a fractal system in which a single exponent (the fractal dimension) is not enough to describe its dynamics; instead, a continuous spectrum of exponents (the so-called singularity spectrum) is needed.
In probability theory, the multinomial distribution is a generalization of the binomial distribution.
The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or entropy, based on natural logarithms and powers of ''e'', rather than the powers of 2 and base 2 logarithms, which define the bit.
The National Institute of Standards and Technology (NIST) is one of the oldest physical science laboratories in the United States.
In mathematics, the natural numbers are those used for counting (as in "there are six coins on the table") and ordering (as in "this is the third largest city in the country").
In probability theory and statistics, the negative binomial distribution is a discrete probability distribution of the number of successes in a sequence of independent and identically distributed Bernoulli trials before a specified (non-random) number of failures (denoted r) occurs.
In probability theory, the normal (or Gaussian or Gauss or Laplace–Gauss) distribution is a very common continuous probability distribution.
In probability theory, an outcome is a possible result of an experiment.
In probability theory and statistics, the Poisson binomial distribution is the discrete probability distribution of a sum of independent Bernoulli trials that are not necessarily identically distributed.
In probability theory and statistics, the Poisson distribution, named after French mathematician Siméon Denis Poisson, is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant rate and independently of the time since the last event.
In probability theory, the law of rare events or Poisson limit theorem states that the Poisson distribution may be used as an approximation to the binomial distribution, under certain conditions.
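A quick numerical check of the Poisson limit theorem (a standard-library Python sketch; the parameter choices are illustrative): for large n and small p with λ = n·p held moderate, the two pmfs nearly coincide.

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Rare-events regime: n large, p small, lam = n * p = 3.
n, p = 1000, 0.003
diff = max(abs(binomial_pmf(k, n, p) - poisson_pmf(k, n * p))
           for k in range(20))
```

Here the largest pointwise gap between the two pmfs is on the order of a few parts in ten thousand.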
In Bayesian statistical inference, a prior probability distribution, often simply called the prior, of an uncertain quantity is the probability distribution that would express one's beliefs about this quantity before some evidence is taken into account.
Probability is the measure of the likelihood that an event will occur.
In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.
In probability and statistics, a probability mass function (pmf) is a function that gives the probability that a discrete random variable is exactly equal to some value.
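For the binomial case, the pmf is C(n, k)·p^k·(1 − p)^(n − k); a short Python sketch, including the sanity check that the pmf sums to 1 over its support:

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) p^k (1 - p)^(n - k)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# A pmf must sum to 1 over the support {0, 1, ..., n}.
total = sum(binomial_pmf(k, 10, 0.4) for k in range(11))
```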
Probability theory is the branch of mathematics concerned with probability.
In probability theory and statistics, the probit function is the quantile function associated with the standard normal distribution, which is commonly denoted as N(0,1).
A pseudorandom number generator (PRNG), also known as a deterministic random bit generator (DRBG), is an algorithm for generating a sequence of numbers whose properties approximate the properties of sequences of random numbers.
In statistics and probability, quantiles are cut points dividing the range of a probability distribution into contiguous intervals with equal probabilities, or dividing the observations in a sample in the same way.
Random number generation is the generation of a sequence of numbers or symbols that cannot be reasonably predicted better than by a random chance, usually through a hardware random-number generator (RNG).
In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is a variable whose possible values are outcomes of a random phenomenon.
Rounding a numerical value means replacing it by another value that is approximately equal but has a shorter, simpler, or more explicit representation; for example, replacing the fraction 312/937 with 1/3.
The English phrase rule of thumb refers to a principle with broad application that is not intended to be strictly accurate or reliable for every situation.
SEMATECH (from Semiconductor Manufacturing Technology) is a not-for-profit consortium that performs research and development to advance chip manufacturing.
The shannon (symbol: Sh), more commonly known as the bit, is a unit of information and of entropy defined by IEC 80000-13.
Statistical hypothesis testing, sometimes called confirmatory data analysis, tests a statistical hypothesis: a hypothesis that is testable on the basis of observing a process that is modeled via a set of random variables.
Statistical mechanics is one of the pillars of modern physics.
In statistical hypothesis testing, a result has statistical significance when it is very unlikely to have occurred given the null hypothesis.
Statistics is a branch of mathematics dealing with the collection, analysis, interpretation, presentation, and organization of data.
The Doctrine of Chances was the first textbook on probability theory, written by 18th-century French mathematician Abraham de Moivre and first published in 1718.
The Mathematical Gazette is an academic journal of mathematics education, published three times yearly, that publishes "articles about the teaching and learning of mathematics with a focus on the 15–20 age range and expositions of attractive areas of mathematics." It was established in 1894 by Edward Mann Langley as the successor to the Reports of the Association for the Improvement of Geometrical Teaching.
Truth is most often used to mean being in accord with fact or reality, or fidelity to an original or standard.
In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.
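Computing the mean and variance directly from a binomial pmf recovers the closed forms n·p and n·p·(1 − p); a standard-library Python sketch:

```python
import math

def binomial_mean_var(n, p):
    """Mean and variance of Binomial(n, p), computed from the pmf."""
    pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k)
           for k in range(n + 1)]
    mean = sum(k * m for k, m in enumerate(pmf))
    # Variance is the expected squared deviation from the mean.
    var = sum((k - mean) ** 2 * m for k, m in enumerate(pmf))
    return mean, var
```

For n = 20, p = 0.3 this yields mean 6.0 and variance 4.2, matching n·p and n·p·(1 − p).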
Yes and no, or word pairs with a similar usage, are expressions of the affirmative and the negative, respectively, in several languages including English.
In linguistics, a yes–no question, formally known as a polar question or a general question, is a question whose expected answer is either "yes" or "no".
0 (zero) is both a number and the numerical digit used to represent that number in numerals.
1 (one, also called unit, unity, and (multiplicative) identity) is a number, numeral, and glyph.