Channel capacity

Channel capacity, in electrical engineering, computer science, and information theory, is the theoretical maximum rate at which information can be reliably transmitted over a communication channel. [1]
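
Formally, for a discrete memoryless channel with input X, output Y, and transition probabilities p(y|x), the capacity is the supremum of the mutual information over all input distributions:

    C = \sup_{p_X(x)} I(X; Y)

By the noisy-channel coding theorem, reliable communication is possible at any rate below C and impossible above it.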

Table of Contents

  1. 51 relations: Additive white Gaussian noise, Alphabet (formal languages), Autoregressive model, Bandwidth (computing), Bandwidth (signal processing), Bit rate, Blahut–Arimoto algorithm, Claude Shannon, Code rate, Communication channel, Computer science, Conditional probability distribution, Cooperative diversity, Data compression, Decibel, Deep learning, Directed information, Electrical engineering, Entropy (information theory), Error correction code, Error exponent, Fading, Generative adversarial network, Graph (discrete mathematics), Hertz, IEEE Transactions on Information Theory, Infimum and supremum, Information, Information theory, James Massey, Joint probability distribution, Logarithm, Lovász number, Marginal distribution, Markov decision process, MIMO, Mutual information, Nat (unit), Natural logarithm, Negentropy, Network throughput, Noisy-channel coding theorem, Nyquist rate, Receiver (information theory), Redundancy (information theory), Sender (telephony), Shannon–Hartley theorem, Signal-to-noise ratio, Spectral density, Spectral efficiency, Water filling algorithm.

Additive white Gaussian noise

Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature.
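
As a minimal sketch (not from the source; the function name and interface are illustrative), an AWGN channel can be simulated by adding independent Gaussian samples whose power is set by an assumed SNR in decibels:

    import numpy as np

    def awgn(signal, snr_db, rng=None):
        """Add white Gaussian noise to `signal` at the given SNR (dB)."""
        rng = np.random.default_rng() if rng is None else rng
        signal = np.asarray(signal, dtype=float)
        signal_power = np.mean(signal ** 2)               # average signal power
        noise_power = signal_power / 10 ** (snr_db / 10)  # from the dB ratio
        return signal + rng.normal(0.0, np.sqrt(noise_power), signal.shape)

    # Example: a short BPSK sequence at 10 dB SNR
    received = awgn([1.0, -1.0, 1.0, 1.0], snr_db=10)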

See Channel capacity and Additive white Gaussian noise

Alphabet (formal languages)

In formal language theory, an alphabet, sometimes called a vocabulary, is a non-empty set of indivisible symbols/characters/glyphs, typically thought of as representing letters, characters, digits, phonemes, or even words.

See Channel capacity and Alphabet (formal languages)

Autoregressive model

In statistics, econometrics, and signal processing, an autoregressive (AR) model is a representation of a type of random process; as such, it can be used to describe certain time-varying processes in nature, economics, behavior, etc.
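
For reference, an AR(p) model expresses the current value as a linear combination of its own p most recent values plus white noise:

    X_t = c + \varphi_1 X_{t-1} + \cdots + \varphi_p X_{t-p} + \varepsilon_t

where the \varphi_i are the model parameters, c is a constant, and \varepsilon_t is a zero-mean noise term.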

See Channel capacity and Autoregressive model

Bandwidth (computing)

In computing, bandwidth is the maximum rate of data transfer across a given path. Channel capacity and bandwidth (computing) are information theory.

See Channel capacity and Bandwidth (computing)

Bandwidth (signal processing)

Bandwidth is the difference between the upper and lower frequencies in a continuous band of frequencies. Channel capacity and Bandwidth (signal processing) are telecommunication theory.

See Channel capacity and Bandwidth (signal processing)

Bit rate

In telecommunications and computing, bit rate (bitrate or as a variable R) is the number of bits that are conveyed or processed per unit of time.

See Channel capacity and Bit rate

Blahut–Arimoto algorithm

The term Blahut–Arimoto algorithm is often used to refer to a class of algorithms for numerically computing either the information-theoretic capacity of a channel, the rate-distortion function of a source, or a source encoding (i.e., compression to remove redundancy).
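
A minimal sketch of the capacity-computing variant, assuming the channel is given as a transition matrix P[x, y] = p(y|x) (the function name and interface are illustrative, not from the source):

    import numpy as np

    def blahut_arimoto(P, tol=1e-9, max_iter=10000):
        """Capacity (in bits) of a discrete memoryless channel.

        P[x, y] = p(y | x); each row of P must sum to 1.
        Returns (capacity, capacity-achieving input distribution).
        """
        P = np.asarray(P, dtype=float)
        mask = P > 0                                # avoid log(0) on impossible outputs
        r = np.full(P.shape[0], 1.0 / P.shape[0])   # start from the uniform input
        for _ in range(max_iter):
            q = r[:, None] * P                      # joint p(x) p(y|x) ...
            q /= q.sum(axis=0, keepdims=True)       # ... normalized to the posterior q(x|y)
            log_q = np.zeros_like(q)
            log_q[mask] = np.log(q[mask])
            r_new = np.exp((P * log_q).sum(axis=1)) # r(x) ∝ prod_y q(x|y)^p(y|x)
            r_new /= r_new.sum()
            done = np.abs(r_new - r).max() < tol
            r = r_new
            if done:
                break
        q = r[:, None] * P
        q /= q.sum(axis=0, keepdims=True)
        R = np.broadcast_to(r[:, None], P.shape)
        terms = np.zeros_like(q)
        terms[mask] = R[mask] * P[mask] * np.log2(q[mask] / R[mask])
        return float(terms.sum()), r

    # Binary symmetric channel with crossover 0.1: capacity ≈ 0.531 bits
    C, p_in = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])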

See Channel capacity and Blahut–Arimoto algorithm

Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist and cryptographer known as the "father of information theory" and as the "father of the Information Age". Channel capacity and Claude Shannon are information theory.

See Channel capacity and Claude Shannon

Code rate

In telecommunication and information theory, the code rate (or information rate) of a forward error correction code is the proportion of the data-stream that is useful (non-redundant). Channel capacity and code rate are information theory.
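
For example, a code that maps k = 4 information bits onto n = 7 transmitted bits, such as the Hamming(7,4) code, has code rate R = k/n = 4/7 ≈ 0.571.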

See Channel capacity and Code rate

Communication channel

A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking. Channel capacity and communication channel are information theory, telecommunication theory and Television terminology.

See Channel capacity and Communication channel

Computer science

Computer science is the study of computation, information, and automation.

See Channel capacity and Computer science

Conditional probability distribution

In probability theory and statistics, the conditional probability distribution is a probability distribution that describes the probability of an outcome given the occurrence of a particular event.

See Channel capacity and Conditional probability distribution

Cooperative diversity

Cooperative diversity is a cooperative multiple-antenna technique for improving or maximising total network channel capacity for any given set of bandwidths; it exploits user diversity by decoding the combination of the relayed signal and the direct signal in wireless multihop networks.

See Channel capacity and Cooperative diversity

Data compression

In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation. Channel capacity and data compression are information theory.

See Channel capacity and Data compression

Decibel

The decibel (symbol: dB) is a relative unit of measurement equal to one tenth of a bel (B).
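
For power quantities, the level in decibels is L_{dB} = 10 \log_{10}(P / P_0): a hundredfold power ratio is 20 dB, and 3 dB corresponds to roughly doubling the power.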

See Channel capacity and Decibel

Deep learning

Deep learning is a subset of machine learning methods based on neural networks with representation learning.

See Channel capacity and Deep learning

Directed information

Directed information is an information theory measure that quantifies the information flow from the random string X^n = (X_1, \ldots, X_n) to the random string Y^n = (Y_1, \ldots, Y_n). Channel capacity and Directed information are information theory.
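
Massey's definition, for reference, sums causally conditioned mutual-information terms:

    I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1})

Unlike mutual information, directed information is not symmetric in X^n and Y^n, which makes it suited to channels with feedback.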

See Channel capacity and Directed information

Electrical engineering

Electrical engineering is an engineering discipline concerned with the study, design, and application of equipment, devices, and systems which use electricity, electronics, and electromagnetism.

See Channel capacity and Electrical engineering

Entropy (information theory)

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Channel capacity and entropy (information theory) are information theory.
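
For a discrete random variable X with probability mass function p(x), the entropy is H(X) = -\sum_x p(x) \log_2 p(x) bits; a fair coin flip, for example, has entropy of exactly 1 bit.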

See Channel capacity and Entropy (information theory)

Error correction code

In computing, telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.

See Channel capacity and Error correction code

Error exponent

In information theory, the error exponent of a channel code or source code over the block length of the code is the rate at which the error probability decays exponentially with the block length of the code. Channel capacity and error exponent are information theory.
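
Concretely, if the block error probability of a good code at rate R decays as P_e \approx e^{-n E(R)} in the block length n, the error exponent is E(R) = \lim_{n \to \infty} -\tfrac{1}{n} \ln P_e(n, R).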

See Channel capacity and Error exponent

Fading

In wireless communications, fading is the variation of signal attenuation over variables like time, geographical position, and radio frequency.

See Channel capacity and Fading

Generative adversarial network

A generative adversarial network (GAN) is a class of machine learning frameworks and a prominent approach to generative AI.

See Channel capacity and Generative adversarial network

Graph (discrete mathematics)

In discrete mathematics, particularly in graph theory, a graph is a structure consisting of a set of objects where some pairs of the objects are in some sense "related".

See Channel capacity and Graph (discrete mathematics)

Hertz

The hertz (symbol: Hz) is the unit of frequency in the International System of Units (SI), equivalent to one event (or cycle) per second.

See Channel capacity and Hertz

IEEE Transactions on Information Theory

IEEE Transactions on Information Theory is a monthly peer-reviewed scientific journal published by the IEEE Information Theory Society. Channel capacity and IEEE Transactions on Information Theory are information theory.

See Channel capacity and IEEE Transactions on Information Theory

Infimum and supremum

In mathematics, the infimum (abbreviated inf; plural infima) of a subset S of a partially ordered set P is the greatest element in P that is less than or equal to each element of S, if such an element exists.

See Channel capacity and Infimum and supremum

Information

Information is an abstract concept that refers to something which has the power to inform. Channel capacity and Information are information theory.

See Channel capacity and Information

Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information.

See Channel capacity and Information theory

James Massey

James Lee Massey (February 11, 1934 – June 16, 2013) was an American information theorist and cryptographer, Professor Emeritus of Digital Technology at ETH Zurich.

See Channel capacity and James Massey

Joint probability distribution

Given two random variables that are defined on the same probability space, the joint probability distribution is the corresponding probability distribution on all possible pairs of outputs.

See Channel capacity and Joint probability distribution

Logarithm

In mathematics, the logarithm is the inverse function to exponentiation.

See Channel capacity and Logarithm

Lovász number

In graph theory, the Lovász number of a graph is a real number that is an upper bound on the Shannon capacity of the graph. Channel capacity and Lovász number are information theory.

See Channel capacity and Lovász number

Marginal distribution

In probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset.

See Channel capacity and Marginal distribution

Markov decision process

In mathematics, a Markov decision process (MDP) is a discrete-time stochastic control process.

See Channel capacity and Markov decision process

MIMO

In radio, multiple-input and multiple-output (MIMO) is a method for multiplying the capacity of a radio link using multiple transmission and receiving antennas to exploit multipath propagation. Channel capacity and MIMO are information theory.
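
Under common idealized assumptions (an n_t \times n_r channel matrix H known at the receiver, equal power per transmit antenna, and AWGN), MIMO capacity takes the log-det form

    C = \log_2 \det\left( I_{n_r} + \tfrac{\rho}{n_t} H H^{*} \right)

in bit/s/Hz, where \rho is the SNR and H^{*} is the conjugate transpose of H.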

See Channel capacity and MIMO

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. Channel capacity and mutual information are information theory.
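
As a small sketch (the function name is illustrative), mutual information can be computed directly from a joint probability table, with zero cells contributing nothing:

    import numpy as np

    def mutual_information(joint):
        """I(X; Y) in bits from a joint distribution table p(x, y)."""
        p_xy = np.asarray(joint, dtype=float)
        p_x = p_xy.sum(axis=1, keepdims=True)     # marginal of X
        p_y = p_xy.sum(axis=0, keepdims=True)     # marginal of Y
        mask = p_xy > 0                           # skip log(0) cells
        terms = np.zeros_like(p_xy)
        terms[mask] = p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])
        return float(terms.sum())

    # Two perfectly correlated fair bits share I(X; Y) = 1 bit
    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))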

See Channel capacity and Mutual information

Nat (unit)

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base-2 logarithms, which define the shannon.
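
Since \ln 2 \approx 0.6931, one nat equals 1 / \ln 2 \approx 1.4427 bits (shannons), and one bit equals \ln 2 \approx 0.6931 nats.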

See Channel capacity and Nat (unit)

Natural logarithm

The natural logarithm of a number is its logarithm to the base of the mathematical constant e, which is an irrational and transcendental number approximately equal to 2.71828.

See Channel capacity and Natural logarithm

Negentropy

In information theory and statistics, negentropy is used as a measure of distance to normality.

See Channel capacity and Negentropy

Network throughput

Network throughput (or just throughput, when in context) refers to the rate of message delivery over a communication channel, such as Ethernet or packet radio, in a communication network. Channel capacity and network throughput are information theory.

See Channel capacity and Network throughput

Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. Channel capacity and noisy-channel coding theorem are information theory and telecommunication theory.

See Channel capacity and Noisy-channel coding theorem

Nyquist rate

In signal processing, the Nyquist rate, named after Harry Nyquist, is a value equal to twice the highest frequency (bandwidth) of a given function or signal. Channel capacity and Nyquist rate are telecommunication theory.
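
For a signal whose highest frequency component is B hertz, the Nyquist rate is f_N = 2B samples per second; audio band-limited to 20 kHz, for example, has a Nyquist rate of 40 kHz.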

See Channel capacity and Nyquist rate

Receiver (information theory)

The receiver in information theory is the receiving end of a communication channel. Channel capacity and receiver (information theory) are information theory.

See Channel capacity and Receiver (information theory)

Redundancy (information theory)

In information theory, redundancy measures the fractional difference between the entropy of an ensemble and its maximum possible value \log|\mathcal{X}|. Channel capacity and redundancy (information theory) are information theory.

See Channel capacity and Redundancy (information theory)

Sender (telephony)

A sender is a type of circuit and system module in 20th-century electromechanical telephone exchanges.

See Channel capacity and Sender (telephony)

Shannon–Hartley theorem

In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. Channel capacity and Shannon–Hartley theorem are information theory and telecommunication theory.
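
The theorem gives the capacity of an AWGN channel as

    C = B \log_2\left(1 + \tfrac{S}{N}\right)

with C in bit/s, B the bandwidth in hertz, and S/N the linear signal-to-noise power ratio. For example, a 3,000 Hz telephone-grade channel at 30 dB SNR (S/N = 1000) supports at most about 3000 \cdot \log_2(1001) \approx 29.9 kbit/s.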

See Channel capacity and Shannon–Hartley theorem

Signal-to-noise ratio

Signal-to-noise ratio (SNR or S/N) is a measure used in science and engineering that compares the level of a desired signal to the level of background noise.

See Channel capacity and Signal-to-noise ratio

Spectral density

In signal processing, the power spectrum S_{xx}(f) of a continuous-time signal x(t) describes the distribution of power into frequency components f composing that signal.

See Channel capacity and Spectral density

Spectral efficiency

Spectral efficiency, spectrum efficiency or bandwidth efficiency refers to the information rate that can be transmitted over a given bandwidth in a specific communication system. Channel capacity and Spectral efficiency are information theory and telecommunication theory.
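
Numerically, spectral efficiency is the ratio \eta = R / B in (bit/s)/Hz; a link carrying 20 Mbit/s in a 5 MHz channel, for instance, achieves 4 (bit/s)/Hz.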

See Channel capacity and Spectral efficiency

Water filling algorithm

Water filling algorithm is a general name given to the ideas in communication systems design and practice for equalization strategies on communications channels. Channel capacity and Water filling algorithm are information theory and telecommunication theory.
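
A minimal sketch of the classic power-allocation form, assuming parallel Gaussian subchannels with noise levels N_i and a total power budget P (the function name and bisection approach are illustrative): the water level \mu with \sum_i \max(\mu - N_i, 0) = P is found numerically, and subchannel i receives power p_i = \max(\mu - N_i, 0).

    import numpy as np

    def water_filling(noise, total_power, iters=100):
        """Water-filling power allocation over parallel Gaussian subchannels."""
        noise = np.asarray(noise, dtype=float)
        lo, hi = noise.min(), noise.max() + total_power   # bracket the water level
        for _ in range(iters):                            # bisect on the water level mu
            mu = 0.5 * (lo + hi)
            if np.maximum(mu - noise, 0.0).sum() > total_power:
                hi = mu
            else:
                lo = mu
        return np.maximum(0.5 * (lo + hi) - noise, 0.0)

    # Three subchannels, budget 3: the noisiest subchannel may get no power
    noise = np.array([1.0, 2.0, 4.0])
    powers = water_filling(noise, 3.0)                    # ≈ [2.0, 1.0, 0.0]
    capacity = 0.5 * np.log2(1.0 + powers / noise).sum()  # bits per channel use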

See Channel capacity and Water filling algorithm

References

[1] https://en.wikipedia.org/wiki/Channel_capacity

Also known as Capacity (information theory), Information capacity, Shannon capacity, System capacity.
