
Information theory

Index Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information. [1]

Table of Contents

  1. 216 relations: A Mathematical Theory of Communication, Acta Astronautica, Active networking, Additive white Gaussian noise, Alan Turing, Algorithmic information theory, Algorithmic probability, Andrey Kolmogorov, Anomaly detection, Bayesian approaches to brain function, Bayesian inference, Bell Labs, Bell Labs Technical Journal, Binary erasure channel, Binary logarithm, Binary symmetric channel, Binding problem, Bioinformatics, Bit, Black hole, Black hole information paradox, Block cipher, Boltzmann constant, Broadcasting, Brute-force attack, Byte, Causality, Channel capacity, Charles Sanders Peirce, Charles Seife, Cipher, Ciphertext, Claude Shannon, Code (cryptography), Coding theory, Cognitive neuroscience, Cognitive science, Coin flipping, Common logarithm, Communication channel, Communication source, Communication theory, Compact disc, Computer network, Computer science, Conditional entropy, Conditional mutual information, Conditional probability, Constructor theory, Content similarity detection, … (166 more)

  2. Claude Shannon
  3. Computer-related introductions in 1948
  4. Formal sciences

A Mathematical Theory of Communication

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948. Information theory and a Mathematical Theory of Communication are Claude Shannon.

See Information theory and A Mathematical Theory of Communication

Acta Astronautica

Acta Astronautica is a monthly peer-reviewed scientific journal covering all fields of physical, engineering, life, and social sciences related to the peaceful scientific exploration of space.

See Information theory and Acta Astronautica

Active networking

Active networking is a communication pattern that allows packets flowing through a telecommunications network to dynamically modify the operation of the network.

See Information theory and Active networking

Additive white Gaussian noise

Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature.

See Information theory and Additive white Gaussian noise

Alan Turing

Alan Mathison Turing (23 June 1912 – 7 June 1954) was an English mathematician, computer scientist, logician, cryptanalyst, philosopher and theoretical biologist.

See Information theory and Alan Turing

Algorithmic information theory

Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated), such as strings or any other data structure.

See Information theory and Algorithmic information theory

Algorithmic probability

In algorithmic information theory, algorithmic probability, also known as Solomonoff probability, is a mathematical method of assigning a prior probability to a given observation.

See Information theory and Algorithmic probability

Andrey Kolmogorov

Andrey Nikolaevich Kolmogorov (a, 25 April 1903 – 20 October 1987) was a Soviet mathematician who contributed to the mathematics of probability theory, topology, intuitionistic logic, turbulence, classical mechanics, algorithmic information theory and computational complexity.

See Information theory and Andrey Kolmogorov

Anomaly detection

In data analysis, anomaly detection (also referred to as outlier detection and sometimes as novelty detection) is generally understood to be the identification of rare items, events or observations which deviate significantly from the majority of the data and do not conform to a well defined notion of normal behavior.

See Information theory and Anomaly detection

Bayesian approaches to brain function

Bayesian approaches to brain function investigate the capacity of the nervous system to operate in situations of uncertainty in a fashion that is close to the optimal prescribed by Bayesian statistics.

See Information theory and Bayesian approaches to brain function

Bayesian inference

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.

See Information theory and Bayesian inference

Bell Labs

Bell Labs is an American industrial research and scientific development company credited with the development of radio astronomy, the transistor, the laser, the photovoltaic cell, the charge-coupled device (CCD), information theory, the Unix operating system, and the programming languages B, C, C++, S, SNOBOL, AWK, AMPL, and others.

See Information theory and Bell Labs

Bell Labs Technical Journal

The Bell Labs Technical Journal was the in-house scientific journal for scientists of Nokia Bell Labs, published yearly by the IEEE society.

See Information theory and Bell Labs Technical Journal

Binary erasure channel

In coding theory and information theory, a binary erasure channel (BEC) is a communications channel model.

See Information theory and Binary erasure channel

Binary logarithm

In mathematics, the binary logarithm of a number is the power to which 2 must be raised to obtain that number.

See Information theory and Binary logarithm

Binary symmetric channel

A binary symmetric channel (or BSCp) is a common communications channel model used in coding theory and information theory.
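
As an illustrative sketch (not part of the original entry), a binary symmetric channel with crossover probability p can be simulated by flipping each transmitted bit independently with probability p:

```python
import random

def bsc(bits, p, rng=random):
    """Send a sequence of 0/1 bits through a binary symmetric channel:
    each bit is flipped independently with crossover probability p."""
    return [b ^ (rng.random() < p) for b in bits]

sent = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
received = bsc(sent, 0.1)   # on average about 10% of the bits differ from `sent`
print(received)
```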

See Information theory and Binary symmetric channel

Binding problem

The consciousness and binding problem is the problem of how objects, background, and abstract or emotional features are combined into a single experience.

See Information theory and Binding problem

Bioinformatics

Bioinformatics is an interdisciplinary field of science that develops methods and software tools for understanding biological data, especially when the data sets are large and complex.

See Information theory and Bioinformatics

Bit

The bit is the most basic unit of information in computing and digital communication.

See Information theory and Bit

Black hole

A black hole is a region of spacetime where gravity is so strong that nothing, not even light and other electromagnetic waves, is capable of possessing enough energy to escape it.

See Information theory and Black hole

Black hole information paradox

The black hole information paradox is a paradox that appears when the predictions of quantum mechanics and general relativity are combined.

See Information theory and Black hole information paradox

Block cipher

In cryptography, a block cipher is a deterministic algorithm that operates on fixed-length groups of bits, called blocks.

See Information theory and Block cipher

Boltzmann constant

The Boltzmann constant is the proportionality factor that relates the average relative thermal energy of particles in a gas with the thermodynamic temperature of the gas.

See Information theory and Boltzmann constant

Broadcasting

Broadcasting is the distribution of audio or video content to a dispersed audience via any electronic mass communications medium, but typically one using the electromagnetic spectrum (radio waves), in a one-to-many model.

See Information theory and Broadcasting

Brute-force attack

In cryptography, a brute-force attack consists of an attacker submitting many passwords or passphrases with the hope of eventually guessing correctly.

See Information theory and Brute-force attack

Byte

The byte is a unit of digital information that most commonly consists of eight bits.

See Information theory and Byte

Causality

Causality is an influence by which one event, process, state, or object (a cause) contributes to the production of another event, process, state, or object (an effect) where the cause is partly responsible for the effect, and the effect is partly dependent on the cause.

See Information theory and Causality

Channel capacity

Channel capacity, in electrical engineering, computer science, and information theory, is the theoretical maximum rate at which information can be reliably transmitted over a communication channel.

See Information theory and Channel capacity

Charles Sanders Peirce

Charles Sanders Peirce (September 10, 1839 – April 19, 1914) was an American scientist, mathematician, logician, and philosopher who is sometimes known as "the father of pragmatism".

See Information theory and Charles Sanders Peirce

Charles Seife

Charles Seife is an American author, journalist, and professor at New York University.

See Information theory and Charles Seife

Cipher

In cryptography, a cipher (or cypher) is an algorithm for performing encryption or decryption—a series of well-defined steps that can be followed as a procedure.

See Information theory and Cipher

Ciphertext

In cryptography, ciphertext or cyphertext is the result of encryption performed on plaintext using an algorithm, called a cipher.

See Information theory and Ciphertext

Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist and cryptographer known as the "father of information theory" and as the "father of the Information Age".

See Information theory and Claude Shannon

Code (cryptography)

In cryptology, a code is a method used to encrypt a message that operates at the level of meaning; that is, words or phrases are converted into something else.

See Information theory and Code (cryptography)

Coding theory

Coding theory is the study of the properties of codes and their respective fitness for specific applications.

See Information theory and Coding theory

Cognitive neuroscience

Cognitive neuroscience is the scientific field that is concerned with the study of the biological processes and aspects that underlie cognition, with a specific focus on the neural connections in the brain which are involved in mental processes.

See Information theory and Cognitive neuroscience

Cognitive science

Cognitive science is the interdisciplinary, scientific study of the mind and its processes.

See Information theory and Cognitive science

Coin flipping

Coin flipping, coin tossing, or heads or tails is the practice of throwing a coin in the air and checking which side is showing when it lands, in order to randomly choose between two alternatives, heads or tails, sometimes used to resolve a dispute between two parties.

See Information theory and Coin flipping

Common logarithm

In mathematics, the common logarithm is the logarithm with base 10.

See Information theory and Common logarithm

Communication channel

A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking.

See Information theory and Communication channel

Communication source

A source or sender is one of the basic concepts of communication and information processing.

See Information theory and Communication source

Communication theory

Communication theory is a proposed description of communication phenomena, the relationships among them, a storyline describing these relationships, and an argument for these three elements.

See Information theory and Communication theory

Compact disc

The compact disc (CD) is a digital optical disc data storage format that was codeveloped by Philips and Sony to store and play digital audio recordings.

See Information theory and Compact disc

Computer network

A computer network is a set of computers sharing resources located on or provided by network nodes.

See Information theory and Computer network

Computer science

Computer science is the study of computation, information, and automation. Information theory and Computer science are formal sciences.

See Information theory and Computer science

Conditional entropy

In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known.
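
A minimal sketch of the definition, assuming the joint distribution is given as a dictionary of (x, y) probabilities (names are illustrative):

```python
from math import log2

def conditional_entropy(joint):
    """H(Y|X) = -sum over (x, y) of p(x, y) * log2( p(x, y) / p(x) ),
    for a joint pmf given as {(x, y): probability}."""
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return -sum(p * log2(p / px[x]) for (x, _), p in joint.items() if p > 0)

# Y is a noisy copy of a fair bit X (correct with probability 0.9)
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(conditional_entropy(joint))  # ~0.469 bits
```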

See Information theory and Conditional entropy

Conditional mutual information

In probability theory, particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.

See Information theory and Conditional mutual information

Conditional probability

In probability theory, conditional probability is a measure of the probability of an event occurring, given that another event (by assumption, presumption, assertion or evidence) is already known to have occurred.

See Information theory and Conditional probability

Constructor theory

Constructor theory is a proposal for a new mode of explanation in fundamental physics in the language of ergodic theory, developed by physicists David Deutsch and Chiara Marletto, at the University of Oxford, since 2012.

See Information theory and Constructor theory

Content similarity detection

Plagiarism detection or content similarity detection is the process of locating instances of plagiarism or copyright infringement within a work or document.

See Information theory and Content similarity detection

Covert channel

In computer security, a covert channel is a type of attack that creates a capability to transfer information objects between processes that are not supposed to be allowed to communicate by the computer security policy.

See Information theory and Covert channel

Cross-entropy

In information theory, the cross-entropy between two probability distributions p and q, over the same underlying set of events, measures the average number of bits needed to identify an event drawn from the set when the coding scheme used for the set is optimized for an estimated probability distribution q, rather than the true distribution p.
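
As an illustrative sketch (not part of the original entry), cross-entropy in bits can be computed directly from two probability mass functions over the same outcomes:

```python
from math import log2

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) * log2 q(x), in bits, where p is the true
    distribution and q is the distribution the code is optimized for."""
    return -sum(px * log2(q[x]) for x, px in p.items() if px > 0)

p = {"a": 0.5, "b": 0.25, "c": 0.25}   # true distribution
q = {"a": 0.25, "b": 0.25, "c": 0.5}   # estimated distribution used for coding
print(cross_entropy(p, p))  # 1.5 bits (equals the entropy of p)
print(cross_entropy(p, q))  # 1.75 bits (extra cost of coding with the wrong model)
```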

See Information theory and Cross-entropy

Cryptanalysis

Cryptanalysis (from the Greek kryptós, "hidden", and analýein, "to analyze") refers to the process of analyzing information systems in order to understand hidden aspects of the systems.

See Information theory and Cryptanalysis

Cryptanalysis of the Enigma

Cryptanalysis of the Enigma ciphering system enabled the western Allies in World War II to read substantial amounts of Morse-coded radio communications of the Axis powers that had been enciphered using Enigma machines.

See Information theory and Cryptanalysis of the Enigma

Cryptographically secure pseudorandom number generator

A cryptographically secure pseudorandom number generator (CSPRNG) or cryptographic pseudorandom number generator (CPRNG) is a pseudorandom number generator (PRNG) with properties that make it suitable for use in cryptography.

See Information theory and Cryptographically secure pseudorandom number generator

Cryptography

Cryptography, or cryptology (from the Greek kryptós, "hidden"), is the practice and study of techniques for secure communication in the presence of adversarial behavior. Information theory and Cryptography are formal sciences.

See Information theory and Cryptography

Cybernetics

Cybernetics is the transdisciplinary study of circular processes such as feedback systems where outputs are also inputs.

See Information theory and Cybernetics

Data compression

In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation.

See Information theory and Data compression

Data storage

Data storage is the recording (storing) of information (data) in a storage medium.

See Information theory and Data storage

David J. C. MacKay

Sir David John Cameron MacKay (22 April 1967 – 14 April 2016) was a British physicist, mathematician, and academic.

See Information theory and David J. C. MacKay

Decoding the Universe

Decoding the Universe: How the New Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes is the third non-fiction book by American author and journalist Charles Seife.

See Information theory and Decoding the Universe

Detection theory

Detection theory or signal detection theory is a means to measure the ability to differentiate between information-bearing patterns (called stimulus in living organisms, signal in machines) and random patterns that distract from the information (called noise, consisting of background stimuli and random activity of the detection machine and of the nervous system of the operator).

See Information theory and Detection theory

Dice

Dice (die or dice) are small, throwable objects with marked sides that can rest in multiple positions.

See Information theory and Dice

Differential entropy

Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy (a measure of average surprisal) of a random variable, to continuous probability distributions.

See Information theory and Differential entropy

Digital signal processing

Digital signal processing (DSP) is the use of digital processing, such as by computers or more specialized digital signal processors, to perform a wide variety of signal processing operations.

See Information theory and Digital signal processing

Digital subscriber line

Digital subscriber line (DSL; originally digital subscriber loop) is a family of technologies that are used to transmit digital data over telephone lines.

See Information theory and Digital subscriber line

Directed information

Directed information is an information theory measure that quantifies the information flow from one random string X^n to another random string Y^n.

See Information theory and Directed information

E (mathematical constant)

The number e is a mathematical constant approximately equal to 2.71828 that can be characterized in many ways.

See Information theory and E (mathematical constant)

Electrical engineering

Electrical engineering is an engineering discipline concerned with the study, design, and application of equipment, devices, and systems which use electricity, electronics, and electromagnetism.

See Information theory and Electrical engineering

Electronic engineering

Electronic engineering is a sub-discipline of electrical engineering that emerged in the early 20th century and is distinguished by the additional use of active components such as semiconductor devices to amplify and control electric current flow.

See Information theory and Electronic engineering

Enigma machine

The Enigma machine is a cipher device developed and used in the early- to mid-20th century to protect commercial, diplomatic, and military communication.

See Information theory and Enigma machine

Entropy (information theory)

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Information theory and entropy (information theory) are data compression.
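
For a discrete random variable with probability mass function p, the entropy is H(X) = -Σ_x p(x) log2 p(x) bits. A minimal sketch (not part of the original entry):

```python
from math import log2

def entropy(pmf):
    """Shannon entropy in bits: H = -sum_x p(x) * log2 p(x)."""
    return -sum(p * log2(p) for p in pmf if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit   (a fair coin flip)
print(entropy([0.9, 0.1]))   # ~0.469 bits (a biased coin is less surprising)
print(entropy([0.25] * 4))   # 2.0 bits  (a fair four-sided die)
```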

See Information theory and Entropy (information theory)

Entropy in thermodynamics and information theory

The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy by Claude Shannon and Ralph Hartley, developed in the 1940s.

See Information theory and Entropy in thermodynamics and information theory

Entropy rate

In the mathematical theory of probability, the entropy rate or source information rate is a function assigning an entropy to a stochastic process.

See Information theory and Entropy rate

Epistemology

Epistemology is the branch of philosophy concerned with knowledge.

See Information theory and Epistemology

Ergodic theory

Ergodic theory is a branch of mathematics that studies statistical properties of deterministic dynamical systems; it is the study of ergodicity.

See Information theory and Ergodic theory

Error correction code

In computing, telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.

See Information theory and Error correction code

Error detection and correction

In information theory and coding theory with applications in computer science and telecommunication, error detection and correction (EDAC) or error control are techniques that enable reliable delivery of digital data over unreliable communication channels.

See Information theory and Error detection and correction

Error exponent

In information theory, the error exponent of a channel code or source code over the block length of the code is the rate at which the error probability decays exponentially with the block length of the code. Information theory and error exponent are data compression.

See Information theory and Error exponent

Estimation theory

Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component.

See Information theory and Estimation theory

Expected value

In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average.

See Information theory and Expected value

Extractor (mathematics)

An (N,M,D,K,\epsilon) -extractor is a bipartite graph with N nodes on the left and M nodes on the right such that each node on the left has D neighbors (on the right), which has the added property that for any subset A of the left vertices of size at least K, the distribution on right vertices obtained by choosing a random node in A and then following a random edge to get a node x on the right side is \epsilon-close to the uniform distribution in terms of total variation distance.

See Information theory and Extractor (mathematics)

Fazlollah Reza

Fazlollah Reza (فضل‌الله رضا; January 1, 1915 – November 19, 2019) was an Iranian university professor.

See Information theory and Fazlollah Reza

Fisher information

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information.

See Information theory and Fisher information

Formal science

Formal science is a branch of science studying disciplines concerned with abstract structures described by formal systems, such as logic, mathematics, statistics, theoretical computer science, artificial intelligence, information theory, game theory, systems theory, decision theory and theoretical linguistics. Information theory and formal science are formal sciences.

See Information theory and Formal science

Free energy principle

The free energy principle is a theoretical framework suggesting that the brain reduces surprise or uncertainty by making predictions based on internal models and updating them using sensory input.

See Information theory and Free energy principle

Fungible information

Fungible information is the information for which the means of encoding is not important.

See Information theory and Fungible information

Gambling

Gambling (also known as betting or gaming) is the wagering of something of value ("the stakes") on a random event with the intent of winning something else of value, where instances of strategy are discounted.

See Information theory and Gambling

Gambling and information theory

Statistical inference might be thought of as gambling theory applied to the world around us.

See Information theory and Gambling and information theory

Gaussian noise

In signal processing theory, Gaussian noise, named after Carl Friedrich Gauss, is a kind of signal noise that has a probability density function (pdf) equal to that of the normal distribution (which is also known as the Gaussian distribution).

See Information theory and Gaussian noise

Gerald Edelman

Gerald Maurice Edelman (July 1, 1929 – May 17, 2014) was an American biologist who shared the 1972 Nobel Prize in Physiology or Medicine for work with Rodney Robert Porter on the immune system.

See Information theory and Gerald Edelman

Giulio Tononi

Giulio Tononi is a neuroscientist and psychiatrist who holds the David P. White Chair in Sleep Medicine, as well as a Distinguished Chair in Consciousness Science, at the University of Wisconsin.

See Information theory and Giulio Tononi

Grammatical Man

Grammatical Man: Information, Entropy, Language, and Life is a 1982 book written by Jeremy Campbell, then Washington correspondent for the Evening Standard.

See Information theory and Grammatical Man

Hamming distance

In information theory, the Hamming distance between two strings or vectors of equal length is the number of positions at which the corresponding symbols are different.
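
As an illustrative sketch (not part of the original entry), the Hamming distance is a direct count of mismatched positions:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length sequences differ."""
    if len(a) != len(b):
        raise ValueError("inputs must have equal length")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("karolin", "kathrin"))        # 3
print(hamming_distance([1, 0, 1, 1], [1, 1, 0, 1]))  # 2
```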

See Information theory and Hamming distance

Harry Nyquist

Harry Nyquist (February 7, 1889 – April 4, 1976) was a Swedish-American physicist and electronic engineer who made important contributions to communication theory.

See Information theory and Harry Nyquist

Hartley (unit)

The hartley (symbol Hart), also called a ban, or a dit (short for "decimal digit"), is a logarithmic unit that measures information or entropy, based on base 10 logarithms and powers of 10.

See Information theory and Hartley (unit)

History of information theory

The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.

See Information theory and History of information theory

Hubert Yockey

Hubert Palmer Yockey (April 15, 1916 – January 31, 2016) was an American physicist and information theorist.

See Information theory and Hubert Yockey

Independence (probability theory)

Independence is a fundamental notion in probability theory, as in statistics and the theory of stochastic processes.

See Information theory and Independence (probability theory)

Independent and identically distributed random variables

In probability theory and statistics, a collection of random variables is independent and identically distributed if each random variable has the same probability distribution as the others and all are mutually independent.

See Information theory and Independent and identically distributed random variables

Inductive probability

Inductive probability attempts to give the probability of future events based on past events.

See Information theory and Inductive probability

Info-metrics

Info-metrics is an interdisciplinary approach to scientific modeling, inference and efficient information processing.

See Information theory and Info-metrics

Information

Information is an abstract concept that refers to something which has the power to inform.

See Information theory and Information

Information algebra

The term "information algebra" refers to mathematical techniques of information processing.

See Information theory and Information algebra

Information asymmetry

In contract theory, mechanism design, and economics, an information asymmetry is a situation where one party has more or better information than the other.

See Information theory and Information asymmetry

Information content

In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable.
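
As an illustrative sketch (not part of the original entry), the self-information of an event with probability p is -log2 p bits:

```python
from math import log2

def self_information(p):
    """Information content (surprisal) of an event with probability p, in bits."""
    return -log2(p)

print(self_information(0.5))    # 1.0 bit    (a fair coin landing heads)
print(self_information(1 / 6))  # ~2.585 bits (rolling a given face of a fair die)
print(self_information(0.99))   # ~0.0145 bits (a nearly certain event is unsurprising)
```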

See Information theory and Information content

Information field theory

Information field theory (IFT) is a Bayesian statistical field theory relating to signal reconstruction, cosmography, and other related areas.

See Information theory and Information field theory

Information fluctuation complexity

Information fluctuation complexity is an information-theoretic quantity defined as the fluctuation of information about entropy.

See Information theory and Information fluctuation complexity

Information geometry

Information geometry is an interdisciplinary field that applies the techniques of differential geometry to study probability theory and statistics.

See Information theory and Information geometry

Information retrieval

Information retrieval (IR) in computing and information science is the task of identifying and retrieving information system resources that are relevant to an information need.

See Information theory and Information retrieval

Information theory and measure theory

This article discusses how information theory (a branch of mathematics studying the transmission, processing and storage of information) is related to measure theory (a branch of mathematics related to integration and probability).

See Information theory and Information theory and measure theory

Information-theoretic security

A cryptosystem is considered to have information-theoretic security (also called unconditional security) if the system is secure against adversaries with unlimited computing resources and time.

See Information theory and Information-theoretic security

Institute of Electrical and Electronics Engineers

The Institute of Electrical and Electronics Engineers (IEEE) is an American 501(c)(3) professional association for electronics engineering, electrical engineering, and other related disciplines.

See Information theory and Institute of Electrical and Electronics Engineers

Integrated information theory

Integrated information theory (IIT) proposes a mathematical model for the consciousness of a system.

See Information theory and Integrated information theory

Intelligence assessment

Intelligence assessment, or simply intel, is the development of behavior forecasts or recommended courses of action to the leadership of an organisation, based on wide ranges of available overt and covert information (intelligence).

See Information theory and Intelligence assessment

International Journal of Computer Mathematics

The International Journal of Computer Mathematics is a monthly peer-reviewed scientific journal covering numerical analysis and scientific computing.

See Information theory and International Journal of Computer Mathematics

James Gleick

James Gleick (born August 1, 1954) is an American author and historian of science whose work has chronicled the cultural impact of modern technology.

See Information theory and James Gleick

James Massey

James Lee Massey (February 11, 1934 – June 16, 2013) was an American information theorist and cryptographer, Professor Emeritus of Digital Technology at ETH Zurich.

See Information theory and James Massey

John R. Pierce

John Robinson Pierce (March 27, 1910 – April 2, 2002), was an American engineer and author.

See Information theory and John R. Pierce

Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.

See Information theory and Joint entropy

Josiah Willard Gibbs

Josiah Willard Gibbs (February 11, 1839 – April 28, 1903) was an American scientist who made significant theoretical contributions to physics, chemistry, and mathematics.

See Information theory and Josiah Willard Gibbs

Karl J. Friston

Karl John Friston FRS FMedSci FRSB (born 12 July 1959) is a British neuroscientist and theoretician at University College London.

See Information theory and Karl J. Friston

Key (cryptography)

A key in cryptography is a piece of information, usually a string of numbers or letters that are stored in a file, which, when processed through a cryptographic algorithm, can encode or decode cryptographic data.

See Information theory and Key (cryptography)

Kolmogorov complexity

In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output. Information theory and Kolmogorov complexity are data compression.

See Information theory and Kolmogorov complexity

Kullback–Leibler divergence

In mathematical statistics, the Kullback–Leibler (KL) divergence (also called relative entropy and I-divergence), denoted D_{\text{KL}}(P \parallel Q), is a type of statistical distance: a measure of how one probability distribution is different from a second, reference probability distribution.
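
A minimal sketch (not part of the original entry) of D_KL(P ∥ Q) = Σ_x p(x) log2( p(x) / q(x) ) for distributions given as dictionaries:

```python
from math import log2

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x p(x) * log2( p(x) / q(x) ), in bits, for pmfs
    over the same outcomes; q must be nonzero wherever p is nonzero."""
    return sum(px * log2(px / q[x]) for x, px in p.items() if px > 0)

p = {"a": 0.5, "b": 0.25, "c": 0.25}
q = {"a": 0.25, "b": 0.25, "c": 0.5}
print(kl_divergence(p, q))  # 0.25 bits
print(kl_divergence(p, p))  # 0.0 (identical distributions)
```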

See Information theory and Kullback–Leibler divergence

Likelihood-ratio test

In statistics, the likelihood-ratio test is a hypothesis test that involves comparing the goodness of fit of two competing statistical models, typically one found by maximization over the entire parameter space and another found after imposing some constraint, based on the ratio of their likelihoods.

See Information theory and Likelihood-ratio test

Linear network coding

In computer networking, linear network coding is a program in which intermediate nodes transmit data from source nodes to sink nodes by means of linear combinations.

See Information theory and Linear network coding

List of unsolved problems in information theory

This article lists notable unsolved problems in information theory.

See Information theory and List of unsolved problems in information theory

Logic of information

The logic of information, or the logical theory of information, considers the information content of logical signs and expressions along the lines initially developed by Charles Sanders Peirce.

See Information theory and Logic of information

Lossless compression

Lossless compression is a class of data compression that allows the original data to be perfectly reconstructed from the compressed data with no loss of information. Information theory and Lossless compression are data compression.

See Information theory and Lossless compression

Lossy compression

In information technology, lossy compression or irreversible compression is the class of data compression methods that uses inexact approximations and partial data discarding to represent the content. Information theory and lossy compression are data compression.

See Information theory and Lossy compression

Ludwig Boltzmann

Ludwig Eduard Boltzmann (20 February 1844 – 5 September 1906) was an Austrian physicist and philosopher.

See Information theory and Ludwig Boltzmann

Mathematics

Mathematics is a field of study that discovers and organizes abstract objects, methods, theories and theorems that are developed and proved for the needs of empirical sciences and mathematics itself. Information theory and mathematics are formal sciences.

See Information theory and Mathematics

Memorylessness

In probability and statistics, memorylessness is a property of certain probability distributions.

See Information theory and Memorylessness

Metric space

In mathematics, a metric space is a set together with a notion of distance between its elements, usually called points.

See Information theory and Metric space

Min-entropy

The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the most likely outcome.

See Information theory and Min-entropy

Minimum description length

Minimum Description Length (MDL) is a model selection principle where the shortest description of the data is the best model.

See Information theory and Minimum description length

Minimum message length

Minimum message length (MML) is a Bayesian information-theoretic method for statistical model comparison and selection.

See Information theory and Minimum message length

Molecular dynamics

Molecular dynamics (MD) is a computer simulation method for analyzing the physical movements of atoms and molecules.

See Information theory and Molecular dynamics

Multinomial distribution

In probability theory, the multinomial distribution is a generalization of the binomial distribution.

See Information theory and Multinomial distribution

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
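
As an illustrative sketch (not part of the original entry), mutual information can be computed from a joint probability mass function via I(X;Y) = Σ p(x,y) log2( p(x,y) / (p(x)p(y)) ):

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits for a joint pmf given as {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Y is a noisy copy of a fair bit X (flipped with probability 0.1)
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(mutual_information(joint))  # ~0.531 bits = H(Y) - H(Y|X) = 1 - 0.469
```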

See Information theory and Mutual information

Nat (unit)

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base-2 logarithms, which define the shannon.

See Information theory and Nat (unit)

Natural logarithm

The natural logarithm of a number is its logarithm to the base of the mathematical constant e, which is an irrational and transcendental number approximately equal to 2.71828.

See Information theory and Natural logarithm

Neuroscience

Neuroscience is the scientific study of the nervous system (the brain, spinal cord, and peripheral nervous system), its functions and disorders.

See Information theory and Neuroscience

Noise (electronics)

In electronics, noise is an unwanted disturbance in an electrical signal.

See Information theory and Noise (electronics)

Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit), establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel.
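
For the binary symmetric channel, the maximum rate guaranteed by the theorem is C = 1 − H2(p), where H2 is the binary entropy function. A small illustrative sketch (not part of the original entry):

```python
from math import log2

def binary_entropy(p):
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H2(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0  (noiseless channel)
print(bsc_capacity(0.1))  # ~0.531
print(bsc_capacity(0.5))  # 0.0  (the output says nothing about the input)
```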

See Information theory and Noisy-channel coding theorem

Numerical digit

A numerical digit (often shortened to just digit) or numeral is a single symbol used alone (such as "1") or in combinations (such as "15"), to represent numbers in a positional numeral system.

See Information theory and Numerical digit

One-time pad

In cryptography, the one-time pad (OTP) is an encryption technique that cannot be cracked, but requires the use of a single-use pre-shared key that is larger than or equal to the size of the message being sent.
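
A minimal sketch (not part of the original entry) of the XOR form of a one-time pad; the key must be truly random, used only once, and at least as long as the message:

```python
import secrets

def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # single-use random key, same length as the message
ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)    # XOR with the same key undoes the encryption
assert recovered == message
```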

See Information theory and One-time pad

Pattern recognition

Pattern recognition is the task of assigning a class to an observation based on patterns extracted from data. Information theory and pattern recognition are formal sciences.

See Information theory and Pattern recognition

Pearson's chi-squared test

Pearson's chi-squared test or Pearson's \chi^2 test is a statistical test applied to sets of categorical data to evaluate how likely it is that any observed difference between the sets arose by chance.

See Information theory and Pearson's chi-squared test

Perception

Perception is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment.

See Information theory and Perception

Perplexity

In information theory, perplexity is a measure of uncertainty in the value of a sample from a discrete probability distribution.
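
As an illustrative sketch (not part of the original entry), perplexity is 2 raised to the entropy of the distribution, i.e. the effective number of equally likely outcomes:

```python
from math import log2

def perplexity(pmf):
    """Perplexity = 2 ** H(p) for a discrete probability mass function."""
    h = -sum(p * log2(p) for p in pmf if p > 0)
    return 2 ** h

print(perplexity([0.25, 0.25, 0.25, 0.25]))  # 4.0 (a fair four-sided die)
print(perplexity([0.7, 0.1, 0.1, 0.1]))      # ~2.56 (skewed, so effectively fewer outcomes)
```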

See Information theory and Perplexity

Philosophy of information

The philosophy of information (PI) is a branch of philosophy that studies topics relevant to information processing, representational system and consciousness, cognitive science, computer science, information science and information technology.

See Information theory and Philosophy of information

Physics

Physics is the natural science of matter, involving the study of matter, its fundamental constituents, its motion and behavior through space and time, and the related entities of energy and force.

See Information theory and Physics

Plaintext

In cryptography, plaintext usually means unencrypted information pending input into cryptographic algorithms, usually encryption algorithms.

See Information theory and Plaintext

Pointwise mutual information

In statistics, probability theory and information theory, pointwise mutual information (PMI), or point mutual information, is a measure of association.

See Information theory and Pointwise mutual information

Posterior probability

The posterior probability is a type of conditional probability that results from updating the prior probability with information summarized by the likelihood via an application of Bayes' rule.

See Information theory and Posterior probability

Pragmatic theory of information

The pragmatic theory of information is derived from Charles Sanders Peirce's general theory of signs and inquiry.

See Information theory and Pragmatic theory of information

Prior probability

A prior probability distribution of an uncertain quantity, often simply called the prior, is its assumed probability distribution before some evidence is taken into account.

See Information theory and Prior probability

Probability distribution

In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of possible outcomes for an experiment.

See Information theory and Probability distribution

Probability mass function

In probability and statistics, a probability mass function (sometimes called probability function or frequency function) is a function that gives the probability that a discrete random variable is exactly equal to some value.

See Information theory and Probability mass function

Probability theory

Probability theory or probability calculus is the branch of mathematics concerned with probability.

See Information theory and Probability theory

Pseudorandom number generator

A pseudorandom number generator (PRNG), also known as a deterministic random bit generator (DRBG), is an algorithm for generating a sequence of numbers whose properties approximate the properties of sequences of random numbers.

See Information theory and Pseudorandom number generator

Public-key cryptography

Public-key cryptography, or asymmetric cryptography, is the field of cryptographic systems that use pairs of related keys.

See Information theory and Public-key cryptography

Quantification (science)

In mathematics and empirical science, quantification (or quantitation) is the act of counting and measuring that maps human sense observations and experiences into quantities.

See Information theory and Quantification (science)

Quantities of information

The mathematical theory of information is based on probability theory and statistics, and measures information with several quantities of information.

See Information theory and Quantities of information

Quantum computing

A quantum computer is a computer that exploits quantum mechanical phenomena.

See Information theory and Quantum computing

Quantum information science

Quantum information science is a field that combines the principles of quantum mechanics with information theory to study the processing, analysis, and transmission of information.

See Information theory and Quantum information science

Ralph Hartley

Ralph Vinton Lyon Hartley (November 30, 1888 – May 1, 1970) was an American electronics researcher.

See Information theory and Ralph Hartley

Random seed

A random seed (or seed state, or just seed) is a number (or vector) used to initialize a pseudorandom number generator.

See Information theory and Random seed

Random variable

A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events.

See Information theory and Random variable

Rate–distortion theory

Rate–distortion theory is a major branch of information theory which provides the theoretical foundations for lossy data compression; it addresses the problem of determining the minimal number of bits per symbol, as measured by the rate R, that should be communicated over a channel, so that the source (input signal) can be approximately reconstructed at the receiver (output signal) without exceeding an expected distortion D. Information theory and Rate–distortion theory are data compression.

See Information theory and Rate–distortion theory

Rényi entropy

In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy.

See Information theory and Rényi entropy

Real-time computing

Real-time computing (RTC) is the computer science term for hardware and software systems subject to a "real-time constraint", for example from event to system response.

See Information theory and Real-time computing

Receiver (information theory)

The receiver in information theory is the receiving end of a communication channel.

See Information theory and Receiver (information theory)

Redundancy (information theory)

In information theory, redundancy measures the fractional difference between the entropy of an ensemble, and its maximum possible value \log(|\mathcal{X}|). Information theory and redundancy (information theory) are data compression.

See Information theory and Redundancy (information theory)

Reflection seismology

Reflection seismology (or seismic reflection) is a method of exploration geophysics that uses the principles of seismology to estimate the properties of the Earth's subsurface from reflected seismic waves.

See Information theory and Reflection seismology

Relay channel

In information theory, a relay channel is a probability model of the communication between a sender and a receiver aided by one or more intermediate relay nodes.

See Information theory and Relay channel

Robert K. Logan

Robert K. Logan (born August 31, 1939), originally trained as a physicist, is a media ecologist.

See Information theory and Robert K. Logan

Robert McEliece

Robert J. McEliece (May 21, 1942 – May 8, 2019) was the Allen E. Puckett Professor and a professor of electrical engineering at the California Institute of Technology (Caltech) best known for his work in error-correcting coding and information theory.

See Information theory and Robert McEliece

Rolf Landauer

Rolf William Landauer (February 4, 1927 – April 27, 1999) was a German-American physicist who made important contributions in diverse areas of the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media.

See Information theory and Rolf Landauer

Scientific American

Scientific American, informally abbreviated SciAm or sometimes SA, is an American popular science magazine.

See Information theory and Scientific American

Search for extraterrestrial intelligence

The search for extraterrestrial intelligence (SETI) is a collective term for scientific searches for intelligent extraterrestrial life, for example, monitoring electromagnetic radiation for signs of transmissions from civilizations on other planets.

See Information theory and Search for extraterrestrial intelligence

Semiotics

Semiotics is the systematic study of sign processes and the communication of meaning. Information theory and Semiotics are cybernetics.

See Information theory and Semiotics

Shannon (unit)

The shannon (symbol: Sh) is a unit of information named after Claude Shannon, the founder of information theory. Information theory and the shannon (unit) are both associated with Claude Shannon.

See Information theory and Shannon (unit)

Shannon's source coding theorem

In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the statistical limits to possible data compression for data whose source is an independent identically-distributed random variable, and the operational meaning of the Shannon entropy. Information theory and Shannon's source coding theorem are data compression.

See Information theory and Shannon's source coding theorem

Shannon–Hartley theorem

In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. Information theory and the Shannon–Hartley theorem are both associated with Claude Shannon.
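
The theorem states C = B · log2(1 + S/N), with bandwidth B in hertz and the signal-to-noise ratio S/N expressed as a power ratio (not in decibels). A minimal worked sketch (not part of the original entry):

```python
from math import log2

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """C = B * log2(1 + S/N) in bits per second; snr_linear is a power ratio."""
    return bandwidth_hz * log2(1 + snr_linear)

# A 3 kHz telephone-like channel with 30 dB SNR (power ratio 1000)
print(shannon_hartley_capacity(3000, 1000))  # ~29,902 bits per second
```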

See Information theory and Shannon–Hartley theorem

Signal

Signal refers to both the process and the result of transmission of data over some media accomplished by embedding some variation.

See Information theory and Signal

Stationary process

In mathematics and statistics, a stationary process (or a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time.

See Information theory and Stationary process

Statistical inference

Statistical inference is the process of using data analysis to infer properties of an underlying distribution of probability.

See Information theory and Statistical inference

Statistics

Statistics (from German: Statistik, "description of a state, a country") is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data. Information theory and Statistics are formal sciences.

See Information theory and Statistics

Stochastic process

In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a sequence of random variables in a probability space, where the index of the sequence often has the interpretation of time.

See Information theory and Stochastic process

String (computer science)

In computer programming, a string is traditionally a sequence of characters, either as a literal constant or as some kind of variable.

See Information theory and String (computer science)

Symmetric function

In mathematics, a function of n variables is symmetric if its value is the same no matter the order of its arguments.

See Information theory and Symmetric function

Symmetric-key algorithm

Symmetric-key algorithms are algorithms for cryptography that use the same cryptographic keys for both the encryption of plaintext and the decryption of ciphertext.

See Information theory and Symmetric-key algorithm

Telecommunications

Telecommunication, often used in its plural form or abbreviated as telecom, is the transmission of information with an immediacy comparable to face-to-face communication.

See Information theory and Telecommunications

The Information: A History, a Theory, a Flood

The Information: A History, a Theory, a Flood is a book by science history writer James Gleick, published in March 2011, which covers the genesis of the current Information Age.

See Information theory and The Information: A History, a Theory, a Flood

Thermal physics

Thermal physics is the combined study of thermodynamics, statistical mechanics, and kinetic theory of gases.

See Information theory and Thermal physics

Thermodynamics

Thermodynamics is a branch of physics that deals with heat, work, and temperature, and their relation to energy, entropy, and the physical properties of matter and radiation.

See Information theory and Thermodynamics

Timeline of information theory

A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error correcting codes and related subjects.

See Information theory and Timeline of information theory

Triangle inequality

In mathematics, the triangle inequality states that for any triangle, the sum of the lengths of any two sides must be greater than or equal to the length of the remaining side.

See Information theory and Triangle inequality

Tsallis entropy

In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy.

See Information theory and Tsallis entropy

Ultra (cryptography)

Ultra was the designation adopted by British military intelligence in June 1941 for wartime signals intelligence obtained by breaking high-level encrypted enemy radio and teleprinter communications at the Government Code and Cypher School (GC&CS) at Bletchley Park.

See Information theory and Ultra (cryptography)

Umberto Eco

Umberto Eco (5 January 1932 – 19 February 2016) was an Italian medievalist, philosopher, semiotician, novelist, cultural critic, and political and social commentator.

See Information theory and Umberto Eco

Unicity distance

In cryptography, unicity distance is the length of an original ciphertext needed to break the cipher by reducing the number of possible spurious keys to zero in a brute force attack.

See Information theory and Unicity distance

Unit of measurement

A unit of measurement, or unit of measure, is a definite magnitude of a quantity, defined and adopted by convention or by law, that is used as a standard for measurement of the same kind of quantity.

See Information theory and Unit of measurement

University of Illinois Press

The University of Illinois Press (UIP) is an American university press and is part of the University of Illinois system.

See Information theory and University of Illinois Press

Urbana, Illinois

Urbana is a city in and the county seat of Champaign County, Illinois, United States.

See Information theory and Urbana, Illinois

Vannevar Bush

Vannevar Bush (March 11, 1890 – June 28, 1974) was an American engineer, inventor and science administrator, who during World War II headed the U.S. Office of Scientific Research and Development (OSRD), through which almost all wartime military R&D was carried out, including important developments in radar and the initiation and early administration of the Manhattan Project.

See Information theory and Vannevar Bush

Variety (cybernetics)

In cybernetics, the term variety denotes the total number of distinguishable elements of a set, most often the set of states, inputs, or outputs of a finite-state machine or transformation, or the binary logarithm of the same quantity. Information theory and variety (cybernetics) are cybernetics.

See Information theory and Variety (cybernetics)

Venona project

The Venona project was a United States counterintelligence program initiated during World War II by the United States Army's Signal Intelligence Service and later absorbed by the National Security Agency (NSA), that ran from February 1, 1943, until October 1, 1980.

See Information theory and Venona project

Victory in Europe Day

Victory in Europe Day is the day celebrating the formal acceptance by the Allies of World War II of Germany's unconditional surrender of its armed forces on Tuesday, 8 May 1945; it marked the official end of World War II in Europe on the Eastern Front, with the last known shots fired on 11 May.

See Information theory and Victory in Europe Day

Voyager program

The Voyager program is an American scientific program that employs two interstellar probes, Voyager 1 and Voyager 2.

See Information theory and Voyager program

Wiley (publisher)

John Wiley & Sons, Inc., commonly known as Wiley, is an American multinational publishing company that focuses on academic publishing and instructional materials.

See Information theory and Wiley (publisher)

Winfried Nöth

Winfried Nöth (born September 12, 1944 in Gerolzhofen) is a German linguist and semiotician.

See Information theory and Winfried Nöth

ZIP (file format)

ZIP is an archive file format that supports lossless data compression.
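
A minimal sketch (not part of the original entry) of a lossless round trip with DEFLATE, the compression method ZIP most commonly uses, via Python's standard zlib module:

```python
import zlib

data = b"abracadabra " * 100            # repetitive data compresses well
compressed = zlib.compress(data, level=9)
restored = zlib.decompress(compressed)

assert restored == data                  # lossless: the original is recovered exactly
print(len(data), "->", len(compressed), "bytes")
```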

See Information theory and ZIP (file format)

See also

Claude Shannon

Formal sciences

References

[1] https://en.wikipedia.org/wiki/Information_theory

Also known as Applications of information theory, Classical information theory, Information theorist, Information-theoretic, Semiotic information, Semiotic information theory, Shannon information theory, Shannon theory, Shannon's information theory, Shannons theory.

Covert channel, Cross-entropy, Cryptanalysis, Cryptanalysis of the Enigma, Cryptographically secure pseudorandom number generator, Cryptography, Cybernetics, Data compression, Data storage, David J. C. MacKay, Decoding the Universe, Detection theory, Dice, Differential entropy, Digital signal processing, Digital subscriber line, Directed information, E (mathematical constant), Electrical engineering, Electronic engineering, Enigma machine, Entropy (information theory), Entropy in thermodynamics and information theory, Entropy rate, Epistemology, Ergodic theory, Error correction code, Error detection and correction, Error exponent, Estimation theory, Expected value, Extractor (mathematics), Fazlollah Reza, Fisher information, Formal science, Free energy principle, Fungible information, Gambling, Gambling and information theory, Gaussian noise, Gerald Edelman, Giulio Tononi, Grammatical Man, Hamming distance, Harry Nyquist, Hartley (unit), History of information theory, Hubert Yockey, Independence (probability theory), Independent and identically distributed random variables, Inductive probability, Info-metrics, Information, Information algebra, Information asymmetry, Information content, Information field theory, Information fluctuation complexity, Information geometry, Information retrieval, Information theory and measure theory, Information-theoretic security, Institute of Electrical and Electronics Engineers, Integrated information theory, Intelligence assessment, International Journal of Computer Mathematics, James Gleick, James Massey, John R. Pierce, Joint entropy, Josiah Willard Gibbs, Karl J. Friston, Key (cryptography), Kolmogorov complexity, Kullback–Leibler divergence, Likelihood-ratio test, Linear network coding, List of unsolved problems in information theory, Logic of information, Lossless compression, Lossy compression, Ludwig Boltzmann, Mathematics, Memorylessness, Metric space, Min-entropy, Minimum description length, Minimum message length, Molecular dynamics, Multinomial distribution, Mutual information, Nat (unit), Natural logarithm, Neuroscience, Noise (electronics), Noisy-channel coding theorem, Numerical digit, One-time pad, Pattern recognition, Pearson's chi-squared test, Perception, Perplexity, Philosophy of information, Physics, Plaintext, Pointwise mutual information, Posterior probability, Pragmatic theory of information, Prior probability, Probability distribution, Probability mass function, Probability theory, Pseudorandom number generator, Public-key cryptography, Quantification (science), Quantities of information, Quantum computing, Quantum information science, Ralph Hartley, Random seed, Random variable, Rate–distortion theory, Rényi entropy, Real-time computing, Receiver (information theory), Redundancy (information theory), Reflection seismology, Relay channel, Robert K. Logan, Robert McEliece, Rolf Landauer, Scientific American, Search for extraterrestrial intelligence, Semiotics, Shannon (unit), Shannon's source coding theorem, Shannon–Hartley theorem, Signal, Stationary process, Statistical inference, Statistics, Stochastic process, String (computer science), Symmetric function, Symmetric-key algorithm, Telecommunications, The Information: A History, a Theory, a Flood, Thermal physics, Thermodynamics, Timeline of information theory, Triangle inequality, Tsallis entropy, Ultra (cryptography), Umberto Eco, Unicity distance, Unit of measurement, University of Illinois Press, Urbana, Illinois, Vannevar Bush, Variety (cybernetics), Venona project, Victory in Europe Day, Voyager program, Wiley (publisher), Winfried Nöth, ZIP (file format).