Entropy in thermodynamics and information theory

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as H, developed by Claude Shannon and Ralph Hartley in the 1940s. [1]
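
A minimal sketch of the parallel (the four-outcome distribution below is made up for illustration): the Gibbs entropy S = −k_B Σ p_i ln p_i and the Shannon entropy H = −Σ p_i log2 p_i are computed from the same distribution and differ only by the constant factor k_B ln 2.

    import math

    # Hypothetical probability distribution over four microstates / symbols.
    p = [0.5, 0.25, 0.125, 0.125]

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    # Gibbs entropy (statistical thermodynamics): S = -k_B * sum(p_i * ln p_i)
    S = -k_B * sum(pi * math.log(pi) for pi in p)

    # Shannon entropy (information theory): H = -sum(p_i * log2 p_i), in bits
    H = -sum(pi * math.log2(pi) for pi in p)

    print(S, H)                        # S ≈ 1.67e-23 J/K, H = 1.75 bits
    print(S - k_B * math.log(2) * H)   # ≈ 0: the two differ only by the factor k_B ln 2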

76 relations: A Mathematical Theory of Communication, Algorithmic cooling, Anton Zeilinger, Base (exponentiation), Bit, Black hole, Black hole information paradox, Black hole thermodynamics, Boltzmann constant, Boltzmann's entropy formula, Brownian motion, Cardinality, Claude Shannon, Commutative property, David Callaway, E (mathematical constant), Entropic uncertainty, Entropy, Entropy (information theory), Entropy (order and disorder), Entropy (statistical thermodynamics), Entropy in thermodynamics and information theory, First law of thermodynamics, Fluctuation theorem, Fundamental thermodynamic relation, Gibbs algorithm, Gilbert N. Lewis, Hartley (unit), Hartley function, Information theory, Intensive and extensive properties, Jacob Bekenstein, Jarzynski equality, Joint entropy, Joint probability distribution, Josiah Willard Gibbs, Landauer's principle, Léon Brillouin, Leo Szilard, Light-year, Logarithm, Ludwig Boltzmann, Maximum entropy thermodynamics, Maxwell's demon, Microcanonical ensemble, Microstate (statistical mechanics), Nat (unit), Natural logarithm, Negentropy, Nuclear magnetic resonance, Orders of magnitude (entropy), Physica (journal), Physical information, Physical Review, Physical Review E, Physical Review Letters, Planck units, Principle of maximum entropy, Probability distribution, Quantum computing, Quantum decoherence, Quantum discord, Quantum entanglement, Quantum mechanics, Ralph Hartley, Rolf Landauer, Rudolf Clausius, Second law of thermodynamics, Shannon (unit), Statistical mechanics, Stephen Hawking, Thermodynamics, Thought experiment, Tim Palmer (physicist), Uncertainty principle, Von Neumann entropy.

A Mathematical Theory of Communication

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.

Algorithmic cooling

Algorithmic cooling is an algorithmic method for transferring heat (or entropy) from some qubits to others or outside the system and into the environment, which results in a cooling effect.

Anton Zeilinger

Anton Zeilinger (born 20 May 1945) is an Austrian quantum physicist who in 2008 received the Inaugural Isaac Newton Medal of the Institute of Physics (UK) for "his pioneering conceptual and experimental contributions to the foundations of quantum physics, which have become the cornerstone for the rapidly-evolving field of quantum information".

Base (exponentiation)

In exponentiation, the base is the number b in an expression of the form b^n.

Bit

The bit (a portmanteau of binary digit) is a basic unit of information used in computing and digital communications.

Black hole

A black hole is a region of spacetime exhibiting such strong gravitational effects that nothing—not even particles and electromagnetic radiation such as light—can escape from inside it.

Black hole information paradox

The black hole information paradox is a puzzle resulting from the combination of quantum mechanics and general relativity.

Black hole thermodynamics

In physics, black hole thermodynamics is the area of study that seeks to reconcile the laws of thermodynamics with the existence of black-hole event horizons.

Boltzmann constant

The Boltzmann constant, which is named after Ludwig Boltzmann, is a physical constant relating the average kinetic energy of particles in a gas with the temperature of the gas.

Boltzmann's entropy formula

In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy S of an ideal gas to the quantity W, the number of real microstates corresponding to the gas's macrostate: S = k_B ln W, where k_B is the Boltzmann constant (also written as simply k), equal to 1.38065 × 10^−23 J/K.
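
As a quick numerical illustration (the microstate count W below is made up), a Python sketch of the formula:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    W = 1e25             # hypothetical number of microstates for the macrostate

    S = k_B * math.log(W)   # Boltzmann's entropy formula: S = k_B * ln(W)
    print(S)                # ≈ 7.9e-22 J/K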

Brownian motion

Brownian motion or pedesis (from πήδησις "leaping") is the random motion of particles suspended in a fluid (a liquid or a gas) resulting from their collision with the fast-moving molecules in the fluid.

Cardinality

In mathematics, the cardinality of a set is a measure of the "number of elements of the set".

Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory".

Commutative property

In mathematics, a binary operation is commutative if changing the order of the operands does not change the result.

David Callaway

David J. E. Callaway is a biological nanophysicist in the New York University School of Medicine, where he is Professor and Laboratory Director.

E (mathematical constant)

The number e is a mathematical constant, approximately equal to 2.71828, which appears in many different settings throughout mathematics.

Entropic uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies.

Entropy

In statistical mechanics, entropy is an extensive property of a thermodynamic system.

Entropy (information theory)

Information entropy is the average rate at which information is produced by a stochastic source of data.

Entropy (order and disorder)

In thermodynamics, entropy is commonly associated with the amount of order, disorder, or chaos in a thermodynamic system.

Entropy (statistical thermodynamics)

In classical statistical mechanics, the entropy function earlier introduced by Rudolf Clausius is interpreted as statistical entropy using probability theory.

Entropy in thermodynamics and information theory

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as H, developed by Claude Shannon and Ralph Hartley in the 1940s.

First law of thermodynamics

The first law of thermodynamics is a version of the law of conservation of energy, adapted for thermodynamic systems.

Fluctuation theorem

The fluctuation theorem (FT), which originated from statistical mechanics, deals with the relative probability that the entropy of a system which is currently away from thermodynamic equilibrium (i.e., maximum entropy) will increase or decrease over a given amount of time.

Fundamental thermodynamic relation

In thermodynamics, the fundamental thermodynamic relation expresses an infinitesimal change in the internal energy of a closed system in thermal equilibrium in terms of infinitesimal changes in entropy and volume: dU = T dS − P dV.

Gibbs algorithm

In statistical mechanics, the Gibbs algorithm, introduced by J. Willard Gibbs in 1902, is a criterion for choosing a probability distribution for the statistical ensemble of microstates of a thermodynamic system by minimizing the average log probability, subject to the probability distribution satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities.

Gilbert N. Lewis

Gilbert Newton Lewis (October 25 (or 23), 1875 – March 23, 1946) was an American physical chemist known for the discovery of the covalent bond and his concept of electron pairs; his Lewis dot structures and other contributions to valence bond theory have shaped modern theories of chemical bonding.

Hartley (unit)

The hartley (symbol Hart), also called a ban, or a dit (short for decimal digit), is a logarithmic unit which measures information or entropy, based on base 10 logarithms and powers of 10, rather than the powers of 2 and base 2 logarithms which define the bit, or shannon.
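
A small Python sketch of the standard conversion factors relating the hartley to the bit (shannon) and the nat:

    import math

    bits_per_hartley = math.log2(10)    # ≈ 3.3219 bits (shannons) per hartley
    bits_per_nat = math.log2(math.e)    # ≈ 1.4427 bits per nat

    # The same uncertainty in different units: a uniform choice among
    # 10 equally likely alternatives carries
    h_hartleys = math.log10(10)   # 1 hartley
    h_bits = math.log2(10)        # ≈ 3.32 bits
    h_nats = math.log(10)         # ≈ 2.30 nats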

Hartley function

The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928.

Information theory

Information theory studies the quantification, storage, and communication of information.

Intensive and extensive properties

Physical properties of materials and systems can often be categorized as being either intensive or extensive quantities, according to how the property changes when the size (or extent) of the system changes.

Jacob Bekenstein

Jacob David Bekenstein (יעקב בקנשטיין; May 1, 1947 – August 16, 2015) was a Mexican-born Israeli-American theoretical physicist who made fundamental contributions to the foundation of black hole thermodynamics and to other aspects of the connections between information and gravitation.

Jarzynski equality

The Jarzynski equality (JE) is an equation in statistical mechanics that relates free energy differences between two states and the irreversible work along an ensemble of trajectories joining the same states.

Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
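
A minimal Python sketch (the joint distribution over two binary variables is made up) of the joint entropy H(X, Y) = −Σ p(x, y) log2 p(x, y):

    import math

    # Hypothetical joint distribution p(x, y) for two binary variables X and Y.
    p_xy = {
        (0, 0): 0.4,
        (0, 1): 0.1,
        (1, 0): 0.2,
        (1, 1): 0.3,
    }

    H_joint = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)
    print(H_joint)   # ≈ 1.85 bits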

Joint probability distribution

Given random variables X, Y, ... defined on a probability space, the joint probability distribution for X, Y, ... is a probability distribution that gives the probability that each of X, Y, ... falls in any particular range or discrete set of values specified for that variable.

Josiah Willard Gibbs

Josiah Willard Gibbs (February 11, 1839 – April 28, 1903) was an American scientist who made important theoretical contributions to physics, chemistry, and mathematics.

Landauer's principle

Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation.
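
A back-of-the-envelope Python sketch of the Landauer bound, k_B T ln 2 of dissipated energy per erased bit, evaluated at an assumed temperature of 300 K:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed ambient temperature, K

    E_min = k_B * T * math.log(2)   # minimum energy dissipated per erased bit
    print(E_min)                    # ≈ 2.87e-21 J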

Léon Brillouin

Léon Nicolas Brillouin (August 7, 1889 – October 4, 1969) was a French physicist.

Leo Szilard

Leo Szilard (Szilárd Leó; Leo Spitz until age 2; February 11, 1898 – May 30, 1964) was a Hungarian-German-American physicist and inventor.

Light-year

The light-year is a unit of length used to express astronomical distances and measures about 9.5 trillion kilometres or 5.9 trillion miles.

Logarithm

In mathematics, the logarithm is the inverse function to exponentiation.

Ludwig Boltzmann

Ludwig Eduard Boltzmann (February 20, 1844 – September 5, 1906) was an Austrian physicist and philosopher whose greatest achievement was in the development of statistical mechanics, which explains and predicts how the properties of atoms (such as mass, charge, and structure) determine the physical properties of matter (such as viscosity, thermal conductivity, and diffusion).

Maximum entropy thermodynamics

In physics, maximum entropy thermodynamics (colloquially, MaxEnt thermodynamics) views equilibrium thermodynamics and statistical mechanics as inference processes.

Maxwell's demon

In the philosophy of thermal and statistical physics, Maxwell's demon is a thought experiment created by the physicist James Clerk Maxwell in which he suggested how the second law of thermodynamics might hypothetically be violated.

Microcanonical ensemble

In statistical mechanics, a microcanonical ensemble is the statistical ensemble that is used to represent the possible states of a mechanical system which has an exactly specified total energy.

Microstate (statistical mechanics)

In statistical mechanics, a microstate is a specific microscopic configuration of a thermodynamic system that the system may occupy with a certain probability in the course of its thermal fluctuations.

Nat (unit)

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the bit.

Natural logarithm

The natural logarithm of a number is its logarithm to the base of the mathematical constant e, where e is an irrational and transcendental number approximately equal to 2.718281828.

Negentropy

Negentropy has different meanings in information theory and theoretical biology.

Nuclear magnetic resonance

Nuclear magnetic resonance (NMR) is a physical phenomenon in which nuclei in a magnetic field absorb and re-emit electromagnetic radiation.

Orders of magnitude (entropy)

The following list shows different orders of magnitude of entropy.

Physica (journal)

Physica is a Dutch series of peer-reviewed scientific journals of physics published by Elsevier.

Physical information

In physics, physical information refers generally to the information that is contained in a physical system.

Physical Review

Physical Review is an American peer-reviewed scientific journal established in 1893 by Edward Nichols.

Physical Review E

Physical Review E is a peer-reviewed, scientific journal, published monthly by the American Physical Society.

Physical Review Letters

Physical Review Letters (PRL), established in 1958, is a peer-reviewed, scientific journal that is published 52 times per year by the American Physical Society.

Planck units

In particle physics and physical cosmology, Planck units are a set of units of measurement defined exclusively in terms of five universal physical constants, in such a manner that these five physical constants take on the numerical value of 1 when expressed in terms of these units.

Principle of maximum entropy

The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information).
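
A small numerical illustration (the candidate distributions are made up): with no constraints beyond normalization, the principle selects the uniform distribution, which has the largest Shannon entropy of the candidates.

    import math

    def shannon_entropy(p):
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    candidates = {
        "uniform": [0.25, 0.25, 0.25, 0.25],
        "skewed":  [0.7, 0.1, 0.1, 0.1],
        "peaked":  [0.97, 0.01, 0.01, 0.01],
    }

    for name, p in candidates.items():
        print(name, shannon_entropy(p))
    # uniform -> 2.0 bits, the maximum for four outcomes; the others are lower.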

Probability distribution

In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.

Quantum computing

Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement.

Quantum decoherence

Quantum decoherence is the loss of quantum coherence, typically through irreversible interaction of a quantum system with its environment.

Quantum discord

In quantum information theory, quantum discord is a measure of nonclassical correlations between two subsystems of a quantum system.

Quantum entanglement

Quantum entanglement is a physical phenomenon which occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole.

Quantum mechanics

Quantum mechanics (QM; also known as quantum physics, quantum theory, the wave mechanical model, or matrix mechanics), including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles.

Ralph Hartley

Ralph Vinton Lyon Hartley (November 30, 1888 – May 1, 1970) was an electronics researcher.

Rolf Landauer

Rolf William Landauer (February 4, 1927 – April 28, 1999) was a German-American physicist who made important contributions in diverse areas of the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media.

Rudolf Clausius

Rudolf Julius Emanuel Clausius (2 January 1822 – 24 August 1888) was a German physicist and mathematician and is considered one of the central founders of the science of thermodynamics.

Second law of thermodynamics

The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time.

Shannon (unit)

The shannon (symbol: Sh), more commonly known as the bit, is a unit of information and of entropy defined by IEC 80000-13.

Statistical mechanics

Statistical mechanics is one of the pillars of modern physics; it uses probability theory to relate the microscopic behaviour of large assemblies of particles to macroscopic, observable properties such as temperature and pressure.

Stephen Hawking

Stephen William Hawking (8 January 1942 – 14 March 2018) was an English theoretical physicist, cosmologist, and author, who was director of research at the Centre for Theoretical Cosmology at the University of Cambridge at the time of his death.

Thermodynamics

Thermodynamics is the branch of physics concerned with heat and temperature and their relation to energy and work.

Thought experiment

A thought experiment (Gedankenexperiment, Gedanken-Experiment or Gedankenerfahrung) considers some hypothesis, theory, or principle for the purpose of thinking through its consequences.

Tim Palmer (physicist)

Timothy Noel Palmer CBE FRS (born 31 December 1952) is a mathematical physicist by training.

Uncertainty principle

In quantum mechanics, the uncertainty principle (also known as Heisenberg's uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, known as complementary variables, such as position x and momentum p, can be known.

Von Neumann entropy

In quantum statistical mechanics, the von Neumann entropy, named after John von Neumann, is the extension of classical Gibbs entropy concepts to the field of quantum mechanics.
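
A minimal Python/NumPy sketch (the 2 × 2 density matrix is an arbitrary example) of S(ρ) = −Tr(ρ ln ρ), computed from the eigenvalues of ρ:

    import numpy as np

    # Hypothetical density matrix: a partially mixed qubit state.
    rho = np.array([[0.75, 0.0],
                    [0.0,  0.25]])

    eigvals = np.linalg.eigvalsh(rho)
    S = -sum(lam * np.log(lam) for lam in eigvals if lam > 0)   # in nats
    print(S)   # ≈ 0.562 nats (≈ 0.811 bits)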

Redirects here:

Szilard engine, Szilard's engine, Zeilinger's principle, Zeilinger’s principle.

References

[1] https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
