
Entropic uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. [1]
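The discrete analogue of this sum is easy to probe numerically: for a unit-norm vector and its unitary discrete Fourier transform, the two Shannon entropies always sum to at least ln N (the finite-dimensional entropic uncertainty bound for the Fourier basis). A minimal NumPy sketch, with illustrative sizes and a random test vector:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats of a discrete distribution p (zero terms ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

N = 64
rng = np.random.default_rng(0)
x = rng.normal(size=N) + 1j * rng.normal(size=N)
x /= np.linalg.norm(x)                 # unit-norm "state"
X = np.fft.fft(x, norm="ortho")        # unitary DFT, so X is also unit-norm

Hx = shannon_entropy(np.abs(x) ** 2)   # "temporal" entropy
HX = shannon_entropy(np.abs(X) ** 2)   # "spectral" entropy
print(Hx + HX, np.log(N))              # the sum never drops below ln N
```

A delta function (all mass on one sample) makes the first entropy zero and the second exactly ln N, saturating the bound.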

26 relations: American Journal of Mathematics, Annals of Mathematics, Babenko–Beckner inequality, Bit, Calculus of variations, Communications in Mathematical Physics, Differential entropy, Entropy (information theory), Fourier analysis, Fourier transform, Hermite polynomials, Hugh Everett III, Inequalities in information theory, Information theory, Isidore Isaac Hirschman Jr., Lebesgue measure, Nat (unit), Normal distribution, Phase space, Plancherel theorem, Rényi entropy, Riesz–Thorin theorem, Uncertainty principle, Variance, Von Neumann entropy, William Beckner (mathematician).

American Journal of Mathematics

The American Journal of Mathematics is a bimonthly mathematics journal published by the Johns Hopkins University Press.

Annals of Mathematics

The Annals of Mathematics is a bimonthly mathematical journal published by Princeton University and the Institute for Advanced Study.

Babenko–Beckner inequality

In mathematics, the Babenko–Beckner inequality (after K. Ivan Babenko and William E. Beckner) is a sharpened form of the Hausdorff–Young inequality having applications to uncertainty principles in the Fourier analysis of Lp spaces.

Bit

The bit (a portmanteau of binary digit) is a basic unit of information used in computing and digital communications.

Calculus of variations

Calculus of variations is a field of mathematical analysis that uses variations, which are small changes in functions and functionals, to find maxima and minima of functionals: mappings from a set of functions to the real numbers.

Communications in Mathematical Physics

Communications in Mathematical Physics is a peer-reviewed academic journal published by Springer.

Differential entropy

Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions.
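For a normal distribution the differential entropy has the standard closed form h = ½ ln(2πeσ²) nats, and unlike discrete Shannon entropy it can be negative for small σ. A short sketch:

```python
import numpy as np

def gaussian_diff_entropy(sigma):
    """Differential entropy of N(mu, sigma^2) in nats: 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

print(gaussian_diff_entropy(1.0))   # ~1.419 nats
print(gaussian_diff_entropy(0.1))   # negative: differential entropy has no floor at 0
```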

Entropy (information theory)

Information entropy is the average rate at which information is produced by a stochastic source of data.
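For a discrete distribution this rate is H = -Σ p log p; a fair coin carries exactly one bit per toss, and any bias lowers the entropy. A minimal sketch:

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution (zero terms ignored)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy_bits([0.9, 0.1]))   # biased coin: roughly 0.469 bits
```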

Fourier analysis

In mathematics, Fourier analysis is the study of the way general functions may be represented or approximated by sums of simpler trigonometric functions.

Fourier transform

The Fourier transform (FT) decomposes a function of time (a signal) into the frequencies that make it up, in a way similar to how a musical chord can be expressed as the frequencies (or pitches) of its constituent notes.
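The chord analogy can be made concrete with a discrete FFT: mix two sinusoids (the illustrative frequencies 50 Hz and 120 Hz below are arbitrary choices) and the spectrum's peaks recover the constituent tones.

```python
import numpy as np

fs = 1000                       # sample rate in Hz (illustrative)
t = np.arange(fs) / fs          # one second of samples
sig = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spec = np.abs(np.fft.rfft(sig))              # magnitude spectrum
freqs = np.fft.rfftfreq(len(sig), 1 / fs)    # frequency axis in Hz
peaks = freqs[spec > spec.max() / 4]         # the constituent "notes"
print(peaks)                                 # [50., 120.]
```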

Hermite polynomials

In mathematics, the Hermite polynomials are a classical orthogonal polynomial sequence.
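In the physicists' convention (one of the two standard normalizations), the sequence is generated by the three-term recurrence H₀ = 1, H₁ = 2x, H_{n+1} = 2x·Hₙ − 2n·H_{n−1}. A small sketch:

```python
def hermite(n, x):
    """Physicists' Hermite polynomial H_n(x) via the recurrence
    H_{n+1}(x) = 2x * H_n(x) - 2n * H_{n-1}(x)."""
    if n == 0:
        return 1.0
    h_prev, h = 1.0, 2.0 * x
    for k in range(1, n):
        h_prev, h = h, 2 * x * h - 2 * k * h_prev
    return h

print([hermite(n, 1.0) for n in range(4)])   # [1.0, 2.0, 2.0, -4.0]
```

The values check against the closed forms H₂ = 4x² − 2 and H₃ = 8x³ − 12x at x = 1.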

Hugh Everett III

Hugh Everett III (November 11, 1930 – July 19, 1982) was an American physicist who first proposed the many-worlds interpretation (MWI) of quantum physics, which he termed his "relative state" formulation.

Inequalities in information theory

Inequalities are central to information theory: many of its fundamental results, such as bounds relating entropy, mutual information, and channel capacity, are stated as inequalities.

Information theory

Information theory studies the quantification, storage, and communication of information.

Isidore Isaac Hirschman Jr.

Isidore Isaac Hirschman Jr. (1922–1990) was an American mathematician and professor at Washington University in St. Louis who worked in analysis.

Lebesgue measure

In measure theory, the Lebesgue measure, named after French mathematician Henri Lebesgue, is the standard way of assigning a measure to subsets of n-dimensional Euclidean space.

Nat (unit)

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or entropy, based on natural logarithms and powers of ''e'', rather than the powers of 2 and base 2 logarithms, which define the bit.
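The conversion between the two units is fixed by the change-of-base rule: 1 nat = log₂(e) ≈ 1.4427 bits, and 1 bit = ln 2 ≈ 0.6931 nats.

```python
import math

NAT_TO_BIT = math.log2(math.e)    # 1 nat is about 1.4427 bits
BIT_TO_NAT = math.log(2)          # 1 bit is about 0.6931 nats

print(NAT_TO_BIT, BIT_TO_NAT)
print(NAT_TO_BIT * BIT_TO_NAT)    # the factors are reciprocals: 1.0
```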

Normal distribution

In probability theory, the normal (or Gaussian or Gauss or Laplace–Gauss) distribution is a continuous probability distribution that is ubiquitous in statistics, in part because the central limit theorem drives sums of many independent random variables toward it.

Phase space

In dynamical system theory, a phase space is a space in which all possible states of a system are represented, with each possible state corresponding to one unique point in the phase space.

Plancherel theorem

In mathematics, the Plancherel theorem is a result in harmonic analysis, proven by Michel Plancherel in 1910.
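The discrete counterpart of the theorem is the Parseval identity for the DFT: the energy of a signal equals the energy of its spectrum, up to the 1/N normalization of NumPy's unnormalized FFT. A quick check:

```python
import numpy as np

x = np.random.default_rng(1).normal(size=256)
X = np.fft.fft(x)

# Discrete Plancherel/Parseval: sum |x|^2 == (1/N) * sum |X|^2
lhs = np.sum(np.abs(x) ** 2)
rhs = np.sum(np.abs(X) ** 2) / len(x)
print(np.isclose(lhs, rhs))   # True
```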

Rényi entropy

In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy, and the min-entropy.
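The family is H_α(p) = ln(Σ pᵢ^α)/(1 − α), with α = 0 giving the Hartley entropy, α → 1 the Shannon entropy, and α = 2 the collision entropy; it is non-increasing in α. A small sketch:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in nats; alpha=1 is the Shannon limit."""
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))          # Shannon entropy
    return float(np.log(np.sum(p ** alpha)) / (1 - alpha))

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0))   # Hartley entropy: ln 3
print(renyi_entropy(p, 1))   # Shannon entropy
print(renyi_entropy(p, 2))   # collision entropy
```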

Riesz–Thorin theorem

In mathematics, the Riesz–Thorin theorem, often referred to as the Riesz–Thorin interpolation theorem or the Riesz–Thorin convexity theorem, is a result about interpolation of operators.

Uncertainty principle

In quantum mechanics, the uncertainty principle (also known as Heisenberg's uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, known as complementary variables, such as position x and momentum p, can be known.

Variance

In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.
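Applied to a finite sample, that definition is the population variance: the mean of squared deviations from the sample mean. A minimal sketch:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = statistics.fmean(data)                          # 5.0
var = sum((x - mean) ** 2 for x in data) / len(data)   # population variance
print(var)                                             # 4.0
assert var == statistics.pvariance(data)               # matches the stdlib
```

Note that `statistics.variance` (the sample variance) would instead divide by n − 1.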

Von Neumann entropy

In quantum statistical mechanics, the von Neumann entropy, named after John von Neumann, is the extension of classical Gibbs entropy concepts to the field of quantum mechanics.
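Concretely, S(ρ) = −Tr(ρ ln ρ) reduces to the Shannon entropy of the density matrix's eigenvalues: zero for a pure state, ln 2 for a maximally mixed qubit. A small sketch:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 * ln 0 = 0
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state |0><0|
mixed = np.eye(2) / 2                       # maximally mixed qubit
print(von_neumann_entropy(pure))            # 0.0
print(von_neumann_entropy(mixed))           # ln 2, about 0.693
```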

William Beckner (mathematician)

William Beckner (born September 15, 1941) is an American mathematician, known for his work in harmonic analysis, especially geometric inequalities.

Redirects here:

Hirchman uncertainty, Hirschman uncertainty.

References

[1] https://en.wikipedia.org/wiki/Entropic_uncertainty
