Differential entropy

Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions. [1]
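
For a random variable X with probability density function f supported on a set S, the differential entropy (here in nats, using the natural logarithm) is

$h(X) = -\int_S f(x)\,\ln f(x)\,dx.$

Unlike discrete Shannon entropy, this quantity can be negative and is not invariant under a change of variables.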

58 relations: Almost everywhere, Beta distribution, Beta function, Bit, Calculus of variations, Cauchy distribution, Change of variables, Chi distribution, Chi-squared distribution, Conditional entropy, Covariance, Digamma function, Edwin Thompson Jaynes, Elementary Principles in Statistical Mechanics, Entropy (information theory), Entropy estimation, Erlang distribution, Estimator, Euler–Mascheroni constant, Exponential distribution, F-distribution, Gamma distribution, Gamma function, Generalized normal distribution, Homeomorphism, If and only if, Independence (probability theory), Information theory, Invariant measure, Jacobian matrix and determinant, Joint entropy, Journal of the Royal Statistical Society, Kullback–Leibler divergence, Laplace distribution, Limiting density of discrete points, Log-normal distribution, Logarithm, Logarithmic scale, Logistic distribution, Maxwell–Boltzmann distribution, Multivariate normal distribution, Mutual information, Nat (unit), Normal distribution, Pareto distribution, Physical Review E, Probability density function, Probability distribution, Quantile function, Quantization (signal processing), Random variable, Rayleigh distribution, Self-information, Student's t-distribution, Support (mathematics), Triangular distribution, Uniform distribution (continuous), Weibull distribution.

Almost everywhere

In measure theory (a branch of mathematical analysis), a property holds almost everywhere if, in a technical sense, the set for which the property holds takes up nearly all possibilities.

Beta distribution

In probability theory and statistics, the beta distribution is a family of continuous probability distributions defined on the interval [0, 1], parametrized by two positive shape parameters, denoted by α and β, that appear as exponents of the random variable and control the shape of the distribution.
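
In nats, the beta distribution's differential entropy ties together several entries in this index:

$h = \ln \mathrm{B}(\alpha,\beta) - (\alpha-1)\,\psi(\alpha) - (\beta-1)\,\psi(\beta) + (\alpha+\beta-2)\,\psi(\alpha+\beta),$

where B is the beta function and ψ is the digamma function.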

Beta function

In mathematics, the beta function, also called the Euler integral of the first kind, is a special function defined by $\mathrm{B}(x,y) = \int_0^1 t^{x-1}(1-t)^{y-1}\,dt$ for $\operatorname{Re}(x) > 0$ and $\operatorname{Re}(y) > 0$.

Bit

The bit (a portmanteau of binary digit) is a basic unit of information used in computing and digital communications.

Calculus of variations

Calculus of variations is a field of mathematical analysis that uses variations, which are small changes in functions and functionals, to find maxima and minima of functionals: mappings from a set of functions to the real numbers.
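
In this article's setting, calculus of variations is the tool that identifies maximum-entropy distributions. For example, to show that the normal distribution maximizes differential entropy among densities with fixed variance σ², one extremizes the functional

$J[f] = -\int f\ln f\,dx \;+\; \lambda_0\Big(\int f\,dx - 1\Big) \;+\; \lambda\Big(\int x^2 f\,dx - \sigma^2\Big),$

and the stationarity condition forces f to be Gaussian.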

Cauchy distribution

The Cauchy distribution, named after Augustin Cauchy, is a continuous probability distribution.

Change of variables

In mathematics, a change of variables is a basic technique used to simplify problems in which the original variables are replaced with functions of other variables.
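
Differential entropy is not invariant under a change of variables: for a smooth invertible map g,

$h(g(X)) = h(X) + \mathbb{E}\big[\ln\lvert g'(X)\rvert\big],$

so in particular $h(aX) = h(X) + \ln\lvert a\rvert$ for scaling by $a \neq 0$.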

Chi distribution

In probability theory and statistics, the chi distribution is the continuous probability distribution of the Euclidean norm of a vector of k independent standard normal random variables; k is its single parameter, the number of degrees of freedom.

Chi-squared distribution

In probability theory and statistics, the chi-squared distribution with k degrees of freedom is the distribution of the sum of the squares of k independent standard normal random variables.

Conditional entropy

In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known.
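
For jointly continuous X and Y with joint density f(x, y), the conditional differential entropy is

$h(Y \mid X) = -\iint f(x,y)\,\ln f(y \mid x)\,dx\,dy = h(X,Y) - h(X).$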

Covariance

In probability theory and statistics, covariance is a measure of the joint variability of two random variables.

Digamma function

In mathematics, the digamma function is defined as the logarithmic derivative of the gamma function: $\psi(x) = \frac{d}{dx}\ln\Gamma(x) = \frac{\Gamma'(x)}{\Gamma(x)}$. It is the first of the polygamma functions.

Edwin Thompson Jaynes

Edwin Thompson Jaynes (July 5, 1922 – April 30, 1998) was the Wayman Crow Distinguished Professor of Physics at Washington University in St. Louis.

Elementary Principles in Statistical Mechanics

Elementary Principles in Statistical Mechanics, published in March 1902, is a work of scientific literature by Josiah Willard Gibbs which is considered to be the foundation of modern statistical mechanics.

Entropy (information theory)

Information entropy is the average rate at which information is produced by a stochastic source of data.
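
For a discrete random variable with probability mass function p, Shannon entropy is

$H(X) = -\sum_x p(x)\,\log p(x);$

differential entropy replaces the sum over outcomes with an integral over a density, which is why some discrete properties, such as nonnegativity, are lost.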

Entropy estimation

In various science and engineering applications, such as independent component analysis, image analysis, genetic analysis, speech recognition, manifold learning, evaluation of the status of biological systems, and time delay estimation, it is useful to estimate the differential entropy of a system or process, given some observations.
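
As a concrete illustration (not from the source), here is a minimal sketch of a naive histogram plug-in estimator of differential entropy in Python; the function name and bin count are illustrative choices, and serious applications use better estimators such as nearest-neighbor methods.

import numpy as np

def histogram_entropy(samples, bins=64):
    # Plug-in estimate of differential entropy (in nats) from a 1-D sample:
    # treat the density as piecewise-constant over histogram bins, then
    # compute -sum_i p_i * ln(p_i / width_i).
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    nz = p > 0  # skip empty bins to avoid log(0)
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

# Standard normal sample: the true value is 0.5*ln(2*pi*e) ~= 1.4189 nats.
rng = np.random.default_rng(0)
print(histogram_entropy(rng.normal(size=100_000)))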

Erlang distribution

In probability theory and statistics, the Erlang distribution is a two-parameter family of continuous probability distributions with positive integer shape k and positive rate λ; it is the distribution of the sum of k independent exponential random variables with mean 1/λ.

Estimator

In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data: thus the rule (the estimator), the quantity of interest (the estimand) and its result (the estimate) are distinguished.

Euler–Mascheroni constant

The Euler–Mascheroni constant (also called Euler's constant) is a mathematical constant recurring in analysis and number theory, usually denoted by the lowercase Greek letter gamma (γ).
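
It is defined by

$\gamma = \lim_{n\to\infty}\Big(\sum_{k=1}^{n}\tfrac{1}{k} - \ln n\Big) \approx 0.57722,$

and it appears in the differential entropies of several distributions in this index, such as the Weibull distribution.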

Exponential distribution

In probability theory and statistics, the exponential distribution is the continuous probability distribution that describes the time between events in a Poisson point process, i.e. a process in which events occur continuously and independently at a constant average rate.

F-distribution

In probability theory and statistics, the F-distribution is a continuous probability distribution that arises as the null distribution of test statistics formed as ratios of scaled chi-squared variables, most notably in the analysis of variance.

Gamma distribution

In probability theory and statistics, the gamma distribution is a two-parameter family of continuous probability distributions.
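
With shape k and scale θ, its differential entropy in nats is

$h = k + \ln\theta + \ln\Gamma(k) + (1-k)\,\psi(k).$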

Gamma function

In mathematics, the gamma function (represented by Γ, the capital Greek letter gamma) is an extension of the factorial function, with its argument shifted down by 1, to real and complex numbers.
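
For positive integers it satisfies $\Gamma(n) = (n-1)!$, and more generally $\Gamma(x+1) = x\,\Gamma(x)$.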

Generalized normal distribution

The generalized normal distribution or generalized Gaussian distribution (GGD) is either of two families of parametric continuous probability distributions on the real line.

Homeomorphism

In the mathematical field of topology, a homeomorphism, topological isomorphism, or bicontinuous function is a continuous function between topological spaces that has a continuous inverse function.

If and only if

In logic and related fields such as mathematics and philosophy, if and only if (shortened iff) is a biconditional logical connective between statements.

Independence (probability theory)

In probability theory, two events are independent, statistically independent, or stochastically independent if the occurrence of one does not affect the probability of occurrence of the other.

Information theory

Information theory studies the quantification, storage, and communication of information.

Invariant measure

In mathematics, an invariant measure is a measure that is preserved by some function.

Jacobian matrix and determinant

In vector calculus, the Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function.
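
The Jacobian determinant governs how differential entropy changes under an invertible linear map: for an invertible matrix A,

$h(AX) = h(X) + \ln\lvert\det A\rvert.$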

Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
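
For jointly continuous variables with joint density f, the joint differential entropy is

$h(X,Y) = -\iint f(x,y)\,\ln f(x,y)\,dx\,dy.$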

Journal of the Royal Statistical Society

The Journal of the Royal Statistical Society is a peer-reviewed scientific journal of statistics.

Kullback–Leibler divergence

In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy) is a measure of how one probability distribution diverges from a second, expected probability distribution.
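
For densities p and q,

$D_{\mathrm{KL}}(p \,\Vert\, q) = \int p(x)\,\ln\frac{p(x)}{q(x)}\,dx.$

Unlike differential entropy itself, this quantity is nonnegative and invariant under invertible changes of variables.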

Laplace distribution

In probability theory and statistics, the Laplace distribution is a continuous probability distribution named after Pierre-Simon Laplace.

Limiting density of discrete points

In information theory, the limiting density of discrete points is an adjustment to the formula of Claude Shannon for differential entropy.
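
Jaynes's adjustment measures the density p against an invariant measure m, giving, for N quantization points as N grows large,

$H(X) \approx \log N - \int p(x)\,\ln\frac{p(x)}{m(x)}\,dx,$

whose integral term, unlike plain differential entropy, is invariant under changes of variables.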

Log-normal distribution

In probability theory, a log-normal (or lognormal) distribution is a continuous probability distribution of a random variable whose logarithm is normally distributed.

Logarithm

In mathematics, the logarithm is the inverse function to exponentiation.

Logarithmic scale

A logarithmic scale is a nonlinear scale used when there is a large range of quantities.

Logistic distribution

In probability theory and statistics, the logistic distribution is a continuous probability distribution.

Maxwell–Boltzmann distribution

In physics (in particular in statistical mechanics), the Maxwell–Boltzmann distribution is a particular probability distribution named after James Clerk Maxwell and Ludwig Boltzmann.

Multivariate normal distribution

In probability theory and statistics, the multivariate normal distribution or multivariate Gaussian distribution is a generalization of the one-dimensional (univariate) normal distribution to higher dimensions.
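
For an n-dimensional Gaussian with covariance matrix Σ, the differential entropy has the closed form

$h(X) = \tfrac{1}{2}\ln\big((2\pi e)^{n}\det\Sigma\big).$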

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
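
For continuous variables, mutual information can be written in terms of differential entropies:

$I(X;Y) = h(X) + h(Y) - h(X,Y) = h(X) - h(X \mid Y);$

unlike differential entropy, it is invariant under invertible transformations of X and Y.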

Nat (unit)

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base-2 logarithms that define the bit.
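
Entropies convert between units by a change of logarithm base: 1 nat = 1/ln 2 ≈ 1.4427 bits.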

Normal distribution

In probability theory, the normal (or Gaussian or Gauss or Laplace–Gauss) distribution is a very common continuous probability distribution.
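
Its differential entropy, $h(X) = \tfrac{1}{2}\ln(2\pi e\,\sigma^2)$ nats, is the maximum possible among all distributions with variance $\sigma^2$.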

Pareto distribution

The Pareto distribution, named after the Italian economist Vilfredo Pareto, is a power-law continuous probability distribution used to describe social, scientific, geophysical, and many other types of observable phenomena.

Physical Review E

Physical Review E is a peer-reviewed scientific journal published monthly by the American Physical Society.

Probability density function

In probability theory, a probability density function (PDF), or density of a continuous random variable, is a function whose value at any given sample (or point) in the sample space (the set of possible values taken by the random variable) can be interpreted as providing a relative likelihood that the value of the random variable would equal that sample.
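
Probabilities are recovered from the density by integration: $P(a \le X \le b) = \int_a^b f(x)\,dx$.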

Probability distribution

In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.

Quantile function

In probability and statistics, the quantile function of a random variable specifies, for a given probability p, the smallest value at which the probability that the random variable is less than or equal to that value reaches p.
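
Formally, $Q(p) = \inf\{x : F(x) \ge p\}$, which reduces to $Q = F^{-1}$ when the cumulative distribution function F is continuous and strictly increasing.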

Quantization (signal processing)

Quantization, in mathematics and digital signal processing, is the process of mapping input values from a large set (often a continuous set) to output values in a smaller, countable set.
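
Quantization links the discrete and continuous entropies: if X is binned at width Δ, then for small Δ

$H(X_\Delta) \approx h(X) - \log\Delta,$

so the discrete entropy diverges as the quantization becomes finer.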

Random variable

In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is a variable whose possible values are outcomes of a random phenomenon.

Rayleigh distribution

In probability theory and statistics, the Rayleigh distribution is a continuous probability distribution for nonnegative-valued random variables; it arises, for example, as the magnitude of a two-dimensional vector whose components are independent, zero-mean normal variables with equal variance.

Self-information

In information theory, self-information or surprisal is a measure of how surprising a particular outcome of a random variable is; rarer outcomes carry more self-information.
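
For an outcome x with probability (or density value) p(x), the self-information is $I(x) = -\log p(x)$.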

Student's t-distribution

In probability and statistics, Student's t-distribution (or simply the t-distribution) is any member of a family of continuous probability distributions that arises when estimating the mean of a normally distributed population in situations where the sample size is small and population standard deviation is unknown.

Support (mathematics)

In mathematics, the support of a real-valued function f is the subset of the domain containing those elements which are not mapped to zero.

Triangular distribution

In probability theory and statistics, the triangular distribution is a continuous probability distribution with lower limit a, upper limit b, and mode c, where $a < b$ and $a \le c \le b$; its density rises linearly from a to the mode c and falls linearly from c to b.

Uniform distribution (continuous)

In probability theory and statistics, the continuous uniform distribution or rectangular distribution is a family of symmetric probability distributions such that for each member of the family, all intervals of the same length on the distribution's support are equally probable.
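
For the uniform distribution on $[a,b]$, $h(X) = \ln(b-a)$. This is the standard example showing that differential entropy can be negative: a uniform distribution on an interval of length 1/2 has $h = -\ln 2$.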

Weibull distribution

In probability theory and statistics, the Weibull distribution is a continuous probability distribution named after Waloddi Weibull; it is widely used to model lifetimes and failure times, with a shape parameter k > 0 and a scale parameter λ > 0.

Redirects here:

Continuous entropy.

References

[1] https://en.wikipedia.org/wiki/Differential_entropy
