
Bernoulli distribution

Index Bernoulli distribution

In probability theory and statistics, the Bernoulli distribution, named after Swiss mathematician Jacob Bernoulli, is the discrete probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability q = 1 − p. [1]
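As a minimal sketch (not part of the original article), the probability mass function and a single draw of a Bernoulli(p) variable can be written using only the Python standard library; the function names here are illustrative, not from any particular package:

```python
import random

def bernoulli_pmf(k, p):
    """P(X = k) for X ~ Bernoulli(p): p for k = 1, 1 - p for k = 0, else 0."""
    if k == 1:
        return p
    if k == 0:
        return 1 - p
    return 0.0

def bernoulli_sample(p, rng=random):
    """Draw one Bernoulli(p) value: 1 with probability p, otherwise 0."""
    return 1 if rng.random() < p else 0
```

The sampler works because `rng.random()` is uniform on [0, 1), so the event "draw < p" has probability exactly p.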

Table of Contents

  1. 39 relations: Bernoulli process, Bernoulli sampling, Bernoulli trial, Beta distribution, Binary code, Binary decision diagram, Binary entropy function, Binomial distribution, Bit, Boolean-valued function, Categorical distribution, Coin flipping, Conjugate prior, Expected value, Experiment, Exponential family, False (logic), Geometric distribution, Independence (probability theory), Independent and identically distributed random variables, Jacob Bernoulli, John Tsitsiklis, Kurtosis, Maximum likelihood estimation, Outcome (probability), Probability, Probability distribution, Probability mass function, Probability theory, Random variable, Relationships among probability distributions, Sample mean and covariance, Skewness, Statistics, Stochastic process, Truth value, Variance, Yes and no, Yes–no question.

  2. Conjugate prior distributions
  3. Exponential family distributions

Bernoulli process

In probability and statistics, a Bernoulli process (named after Jacob Bernoulli) is a finite or infinite sequence of binary random variables, so it is a discrete-time stochastic process that takes only two values, canonically 0 and 1.
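A finite realization of such a process can be sketched as a list of independent Bernoulli(p) draws; this helper and its signature are illustrative assumptions, not a standard API:

```python
import random

def bernoulli_process(p, n, seed=None):
    """Return a length-n realization of a Bernoulli(p) process as 0s and 1s."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]
```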

See Bernoulli distribution and Bernoulli process

Bernoulli sampling

In the theory of finite population sampling, Bernoulli sampling is a sampling process where each element of the population is subjected to an independent Bernoulli trial which determines whether the element becomes part of the sample.
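A sketch of that idea, assuming a population given as any iterable (the function is hypothetical): each element independently passes its own Bernoulli(p) trial, so the realized sample size is itself random, Binomial(N, p):

```python
import random

def bernoulli_sample_population(population, p, seed=None):
    """Keep each element independently with probability p (Bernoulli sampling).

    The sample size is not fixed: it is Binomial(N, p) where N = len(population).
    """
    rng = random.Random(seed)
    return [x for x in population if rng.random() < p]
```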

See Bernoulli distribution and Bernoulli sampling

Bernoulli trial

In the theory of probability and statistics, a Bernoulli trial (or binomial trial) is a random experiment with exactly two possible outcomes, "success" and "failure", in which the probability of success is the same every time the experiment is conducted. Bernoulli distribution and Bernoulli trial are discrete distributions.

See Bernoulli distribution and Bernoulli trial

Beta distribution

In probability theory and statistics, the beta distribution is a family of continuous probability distributions defined on the interval [0, 1] or (0, 1) in terms of two positive parameters, denoted by alpha (α) and beta (β), that appear as exponents of the variable and its complement to 1, respectively, and control the shape of the distribution. Bernoulli distribution and beta distribution are conjugate prior distributions and exponential family distributions.

See Bernoulli distribution and Beta distribution

Binary code

A binary code represents text, computer processor instructions, or any other data using a two-symbol system.

See Bernoulli distribution and Binary code

Binary decision diagram

In computer science, a binary decision diagram (BDD) or branching program is a data structure that is used to represent a Boolean function.

See Bernoulli distribution and Binary decision diagram

Binary entropy function

In information theory, the binary entropy function, denoted \operatorname{H}(p) or \operatorname{H}_\text{b}(p), is defined as the entropy of a Bernoulli process (i.i.d. binary variable) with probability p of one of two values, and is given by the formula \operatorname{H}(p) = -p \log p - (1 - p) \log(1 - p). The base of the logarithm corresponds to the choice of units of information; base e corresponds to nats and is mathematically convenient, while base 2 (binary logarithm) corresponds to shannons and is conventional. The values at 0 and 1 are given by the limit \textstyle 0 \log 0 = 0.
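That formula translates directly into code; this sketch (an illustrative helper, handling the 0 log 0 = 0 limit explicitly) defaults to base 2 so the result is in shannons:

```python
import math

def binary_entropy(p, base=2):
    """H(p) = -p*log(p) - (1-p)*log(1-p), with 0*log(0) taken as 0.

    base=2 gives shannons (bits); base=math.e gives nats.
    """
    if p in (0.0, 1.0):
        return 0.0  # limit value: 0*log(0) = 0
    return -p * math.log(p, base) - (1 - p) * math.log(1 - p, base)
```

H(p) is maximized at p = 0.5, where a fair coin carries exactly one bit of entropy.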

See Bernoulli distribution and Binary entropy function

Binomial distribution

In probability theory and statistics, the binomial distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent experiments, each asking a yes–no question, and each with its own Boolean-valued outcome: success (with probability p) or failure (with probability q = 1 − p). Bernoulli distribution and binomial distribution are conjugate prior distributions, discrete distributions and exponential family distributions.
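The connection to the Bernoulli distribution can be made concrete: a Binomial(n, p) variable is a sum of n independent Bernoulli(p) trials, and Bernoulli(p) is the n = 1 special case. A rough sketch (function names are illustrative):

```python
import math
import random

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent Bernoulli(p) trials)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def binomial_draw(n, p, rng=random):
    """A Binomial(n, p) draw built directly as a sum of n Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if rng.random() < p)
```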

See Bernoulli distribution and Binomial distribution

Bit

The bit is the most basic unit of information in computing and digital communication.

See Bernoulli distribution and Bit

Boolean-valued function

A Boolean-valued function (sometimes called a predicate or a proposition) is a function of the type f: X → B, where X is an arbitrary set and where B is a Boolean domain, i.e. a generic two-element set (for example B = {0, 1}).

See Bernoulli distribution and Boolean-valued function

Categorical distribution

In probability theory and statistics, a categorical distribution (also called a generalized Bernoulli distribution, multinoulli distribution) is a discrete probability distribution that describes the possible results of a random variable that can take on one of K possible categories, with the probability of each category separately specified. Bernoulli distribution and categorical distribution are discrete distributions and exponential family distributions.

See Bernoulli distribution and Categorical distribution

Coin flipping

Coin flipping, coin tossing, or heads or tails is the practice of throwing a coin in the air and checking which side is showing when it lands, in order to randomly choose between two alternatives, heads or tails, sometimes used to resolve a dispute between two parties.

See Bernoulli distribution and Coin flipping

Conjugate prior

In Bayesian probability theory, if, given a likelihood function p(x \mid \theta), the posterior distribution p(\theta \mid x) is in the same probability distribution family as the prior probability distribution p(\theta), the prior and posterior are then called conjugate distributions with respect to that likelihood function and the prior is called a conjugate prior for the likelihood function p(x \mid \theta). Bernoulli distribution and conjugate prior are conjugate prior distributions.
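For the Bernoulli likelihood specifically, the beta distribution is the conjugate prior, and the posterior update reduces to simple counting. A sketch of that update (the function is illustrative, not a library API):

```python
def beta_bernoulli_update(alpha, beta, observations):
    """Conjugate update: Beta(alpha, beta) prior on p, Bernoulli observations.

    The posterior is Beta(alpha + #successes, beta + #failures).
    """
    successes = sum(observations)
    failures = len(observations) - successes
    return alpha + successes, beta + failures
```

Starting from the uniform prior Beta(1, 1) and observing three successes and one failure yields the posterior Beta(4, 2), with posterior mean 4/6 for p.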

See Bernoulli distribution and Conjugate prior

Expected value

In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average.
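For a discrete distribution, that weighted average is a sum of value times probability; for a Bernoulli(p) variable it collapses to E[X] = 0·(1−p) + 1·p = p. A small sketch (the `{value: probability}` representation is an assumption made for illustration):

```python
def expected_value(pmf):
    """Expected value of a discrete distribution given as {value: probability}."""
    return sum(value * prob for value, prob in pmf.items())
```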

See Bernoulli distribution and Expected value

Experiment

An experiment is a procedure carried out to support or refute a hypothesis, or determine the efficacy or likelihood of something previously untried.

See Bernoulli distribution and Experiment

Exponential family

In probability and statistics, an exponential family is a parametric set of probability distributions of a certain form, specified below. Bernoulli distribution and exponential family are discrete distributions.

See Bernoulli distribution and Exponential family

False (logic)

In logic, false or untrue is the state of possessing negative truth value and is a nullary logical connective.

See Bernoulli distribution and False (logic)

Geometric distribution

In probability theory and statistics, the geometric distribution is either one of two discrete probability distributions. Bernoulli distribution and geometric distribution are discrete distributions and exponential family distributions.

See Bernoulli distribution and Geometric distribution

Independence (probability theory)

Independence is a fundamental notion in probability theory, as in statistics and the theory of stochastic processes.

See Bernoulli distribution and Independence (probability theory)

Independent and identically distributed random variables

In probability theory and statistics, a collection of random variables is independent and identically distributed if each random variable has the same probability distribution as the others and all are mutually independent.

See Bernoulli distribution and Independent and identically distributed random variables

Jacob Bernoulli

Jacob Bernoulli (also known as James in English or Jacques in French; 6 January 1655 – 16 August 1705) was one of the many prominent mathematicians in the Swiss Bernoulli family.

See Bernoulli distribution and Jacob Bernoulli

John Tsitsiklis

John N. Tsitsiklis (born 1958) is a Greek-American probabilist.

See Bernoulli distribution and John Tsitsiklis

Kurtosis

In probability theory and statistics, kurtosis (from κυρτός, kyrtos or kurtos, meaning "curved, arching") is a measure of the "tailedness" of the probability distribution of a real-valued random variable.

See Bernoulli distribution and Kurtosis

Maximum likelihood estimation

In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data.
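For Bernoulli data the maximum likelihood estimate has a closed form: the likelihood p^k (1−p)^(n−k) is maximized at p̂ = k/n, the sample mean. A sketch of that estimator (function name is illustrative):

```python
def bernoulli_mle(observations):
    """MLE of p for i.i.d. Bernoulli observations: the sample mean.

    Maximizing p**k * (1-p)**(n-k) in p gives p_hat = k / n,
    where k is the number of 1s among n observations.
    """
    return sum(observations) / len(observations)
```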

See Bernoulli distribution and Maximum likelihood estimation

Outcome (probability)

In probability theory, an outcome is a possible result of an experiment or trial.

See Bernoulli distribution and Outcome (probability)

Probability

Probability is the branch of mathematics concerning events and numerical descriptions of how likely they are to occur.

See Bernoulli distribution and Probability

Probability distribution

In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of possible outcomes for an experiment.

See Bernoulli distribution and Probability distribution

Probability mass function

In probability and statistics, a probability mass function (sometimes called probability function or frequency function) is a function that gives the probability that a discrete random variable is exactly equal to some value.

See Bernoulli distribution and Probability mass function

Probability theory

Probability theory or probability calculus is the branch of mathematics concerned with probability.

See Bernoulli distribution and Probability theory

Random variable

A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events.

See Bernoulli distribution and Random variable

Relationships among probability distributions

In probability theory and statistics, there are several relationships among probability distributions.

See Bernoulli distribution and Relationships among probability distributions

Sample mean and covariance

The sample mean (sample average) or empirical mean (empirical average), and the sample covariance or empirical covariance are statistics computed from a sample of data on one or more random variables.

See Bernoulli distribution and Sample mean and covariance

Skewness

In probability theory and statistics, skewness is a measure of the asymmetry of the probability distribution of a real-valued random variable about its mean.
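For a Bernoulli(p) variable the skewness has the closed form (1 − 2p) / √(p(1 − p)): zero for a fair coin, positive when 1 is rare (p < 0.5), negative when 1 is common. A small sketch (the helper is illustrative):

```python
import math

def bernoulli_skewness(p):
    """Skewness of Bernoulli(p): (1 - 2p) / sqrt(p * (1 - p)).

    Zero at p = 0.5 (symmetric); positive for p < 0.5; negative for p > 0.5.
    Undefined at p = 0 or p = 1 (zero variance).
    """
    return (1 - 2 * p) / math.sqrt(p * (1 - p))
```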

See Bernoulli distribution and Skewness

Statistics

Statistics (from German: Statistik, "description of a state, a country") is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data.

See Bernoulli distribution and Statistics

Stochastic process

In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a sequence of random variables in a probability space, where the index of the sequence often has the interpretation of time.

See Bernoulli distribution and Stochastic process

Truth value

In logic and mathematics, a truth value, sometimes called a logical value, is a value indicating the relation of a proposition to truth, which in classical logic has only two possible values (true or false).

See Bernoulli distribution and Truth value

Variance

In probability theory and statistics, variance is the expected value of the squared deviation from the mean of a random variable.
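For a Bernoulli(p) variable that definition gives a closed form: since X² = X, E[X²] = p, so Var(X) = E[X²] − E[X]² = p − p² = p(1 − p), maximized at p = 0.5. A one-line sketch:

```python
def bernoulli_variance(p):
    """Var(X) = E[X^2] - E[X]^2 = p - p**2 = p * (1 - p) for X ~ Bernoulli(p)."""
    return p * (1 - p)
```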

See Bernoulli distribution and Variance

Yes and no

Yes and no, or similar word pairs, are expressions of the affirmative and the negative, respectively, in several languages, including English.

See Bernoulli distribution and Yes and no

Yes–no question

In linguistics, a yes–no question, also known as a binary question, a polar question, or a general question, is a question whose expected answer is one of two choices, one that provides an affirmative answer to the question versus one that provides a negative answer to the question.

See Bernoulli distribution and Yes–no question

See also

Conjugate prior distributions

Exponential family distributions

References

[1] https://en.wikipedia.org/wiki/Bernoulli_distribution

Also known as Bernoulli random variable, Bernoulli RV, two-point distribution.