
Binary symmetric channel

A binary symmetric channel (or BSC) is a common communications channel model used in coding theory and information theory. [1]
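
The channel's behaviour is easy to simulate: each transmitted bit is flipped independently with a fixed crossover probability p. A minimal Python sketch (the name bsc_transmit is illustrative, not a standard API):

```python
import random

def bsc_transmit(bits, p, rng=None):
    """Pass a bit sequence through a binary symmetric channel:
    each bit is flipped independently with crossover probability p."""
    rng = rng or random.Random()
    return [b ^ (rng.random() < p) for b in bits]

# Edge cases: p = 0 is a noiseless channel, p = 1 inverts every bit.
print(bsc_transmit([0, 1, 1, 0], 0.0))  # [0, 1, 1, 0]
print(bsc_transmit([0, 1, 1, 0], 1.0))  # [1, 0, 0, 1]
```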

27 relations: Binary entropy function, Binary erasure channel, Bit, Boole's inequality, Channel capacity, Chernoff bound, Code, Coding theory, Communication channel, Communication theory, Concatenated error correction code, Conditional entropy, Conditional probability, Data transmission, Expected value, Hamming distance, Information theory, Linear code, Low-density parity-check code, Markov's inequality, Mutual information, Noise (electronics), Probabilistic method, Probability, Random variable, Reduction (complexity), Z-channel (information theory).

Binary entropy function

In information theory, the binary entropy function, denoted H(p) or H_b(p), is defined as the entropy of a Bernoulli process with probability p of one of two values.
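
Concretely, H(p) = -p log2(p) - (1-p) log2(1-p), with H(0) = H(1) = 0 by convention. A minimal Python sketch:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0 -- maximum uncertainty at p = 1/2
print(binary_entropy(0.11))  # about 0.5 bits
```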

Binary erasure channel

A binary erasure channel (or BEC) is a common communications channel model used in coding theory and information theory.
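
In this model each transmitted bit is either received correctly or erased with probability p; unlike the BSC, a received bit is never wrong. A hedged Python sketch (bec_transmit and the use of None as the erasure symbol are illustrative choices):

```python
import random

def bec_transmit(bits, p, rng=None):
    """Binary erasure channel: each bit is erased (replaced by None)
    independently with probability p; otherwise it arrives intact."""
    rng = rng or random.Random()
    return [None if rng.random() < p else b for b in bits]

print(bec_transmit([0, 1, 1, 0], 0.0))  # [0, 1, 1, 0] -- nothing erased
print(bec_transmit([0, 1], 1.0))        # [None, None] -- everything erased
```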

Bit

The bit (a portmanteau of binary digit) is a basic unit of information used in computing and digital communications.

Boole's inequality

In probability theory, Boole's inequality, also known as the union bound, says that for any finite or countable set of events, the probability that at least one of the events happens is no greater than the sum of the probabilities of the individual events.
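The inequality can be checked directly on a small finite probability space (the events below are arbitrary examples):

```python
# A finite check of Boole's inequality (the union bound) on a
# ten-point uniform sample space.
omega = set(range(10))
events = [{0, 1, 2}, {2, 3}, {3, 4, 5}]

def prob(event):
    return len(event) / len(omega)

p_union = prob(set().union(*events))         # P(A1 u A2 u A3) = 0.6
sum_of_probs = sum(prob(e) for e in events)  # sum P(Ai) = 0.8
assert p_union <= sum_of_probs               # the union bound holds
```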

Channel capacity

Channel capacity, in electrical engineering, computer science and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.
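
For the binary symmetric channel itself the capacity has a closed form: C = 1 - H(p), where H is the binary entropy function. A minimal Python sketch (bsc_capacity is an illustrative name):

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p), where H is the binary entropy function."""
    if p in (0.0, 1.0):
        return 1.0
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - h

print(bsc_capacity(0.0))  # 1.0 -- noiseless channel carries one bit per use
print(bsc_capacity(0.5))  # 0.0 -- output independent of input, no information
```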

Chernoff bound

In probability theory, the Chernoff bound, named after Herman Chernoff but due to Herman Rubin, gives exponentially decreasing bounds on tail distributions of sums of independent random variables.
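
One standard bound of this type (the Hoeffding form, for sums of independent variables bounded in [0, 1]) is P(S >= E[S] + t) <= exp(-2 t^2 / n); it can be checked against the exact binomial tail for n fair coin flips:

```python
import math

# Hoeffding form of a Chernoff-type bound for a sum S of n independent
# [0, 1]-valued variables: P(S >= E[S] + t) <= exp(-2 * t**2 / n).
# For n fair coin flips, E[S] = n/2 and the exact tail is binomial.
n, t = 100, 20
exact_tail = sum(math.comb(n, k) for k in range(n // 2 + t, n + 1)) / 2**n
bound = math.exp(-2 * t * t / n)  # exp(-8), about 3.4e-4
assert exact_tail <= bound
```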

Code

In communications and information processing, code is a system of rules to convert information—such as a letter, word, sound, image, or gesture—into another form or representation, sometimes shortened or secret, for communication through a communication channel or storage in a storage medium.

Coding theory

Coding theory is the study of the properties of codes and their respective fitness for specific applications.

Communication channel

A communication channel or simply channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking.

Communication theory

Communication theory is a field of information theory and mathematics that studies the technical process of transmitting information and the process of human communication.

Concatenated error correction code

In coding theory, concatenated codes form a class of error-correcting codes that are derived by combining an inner code and an outer code.

Conditional entropy

In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known.

Conditional probability

In probability theory, conditional probability is a measure of the probability of an event (some particular situation occurring) given that (by assumption, presumption, assertion or evidence) another event has occurred.
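
The defining formula P(A|B) = P(A ∩ B) / P(B) can be illustrated on a fair six-sided die:

```python
# Conditional probability on a finite uniform space:
# P(A | B) = P(A intersect B) / P(B), with a fair six-sided die.
omega = set(range(1, 7))
A = {2, 4, 6}   # "roll is even"
B = {4, 5, 6}   # "roll is at least 4"

p_B = len(B) / len(omega)
p_A_and_B = len(A & B) / len(omega)
p_A_given_B = p_A_and_B / p_B
assert abs(p_A_given_B - 2 / 3) < 1e-12  # of the rolls {4, 5, 6}, two are even
```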

Data transmission

Data transmission (also data communication or digital communications) is the transfer of data (a digital bitstream or a digitized analog signal) over a point-to-point or point-to-multipoint communication channel.

Expected value

In probability theory, the expected value of a random variable, intuitively, is the long-run average value of repetitions of the experiment it represents.

Hamming distance

In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different.
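
A direct implementation, with two small example pairs:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length strings differ."""
    if len(a) != len(b):
        raise ValueError("strings must have equal length")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("karolin", "kathrin"))  # 3
print(hamming_distance("1011101", "1001001"))  # 2
```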

Information theory

Information theory studies the quantification, storage, and communication of information.

Linear code

In coding theory, a linear code is an error-correcting code for which any linear combination of codewords is also a codeword.

Low-density parity-check code

In information theory, a low-density parity-check (LDPC) code is a linear error correcting code, a method of transmitting a message over a noisy transmission channel.

Markov's inequality

In probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant.
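
For a non-negative random variable X and constant a > 0, the inequality reads P(X >= a) <= E[X] / a; it can be checked on a small discrete distribution (values chosen arbitrarily):

```python
# A finite check of Markov's inequality, P(X >= a) <= E[X] / a,
# for a small non-negative discrete distribution.
values = [0, 1, 2, 10]
probs = [0.4, 0.3, 0.2, 0.1]

expectation = sum(v * p for v, p in zip(values, probs))  # E[X] = 1.7
a = 5
tail = sum(p for v, p in zip(values, probs) if v >= a)   # P(X >= 5) = 0.1
assert tail <= expectation / a                           # 0.1 <= 0.34
```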

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
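
The defining sum I(X;Y) = Σ p(x,y) log2[ p(x,y) / (p(x) p(y)) ] can be computed directly for a small joint distribution (mutual_information is an illustrative name):

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))),
    for a joint distribution given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent variables carry zero mutual information:
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(indep))  # 0.0
# Perfectly correlated bits share exactly one bit:
copy = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(copy))   # 1.0
```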

Noise (electronics)

In electronics, noise is an unwanted disturbance in an electrical signal.

Probabilistic method

The probabilistic method is a nonconstructive method, primarily used in combinatorics and pioneered by Paul Erdős, for proving the existence of a prescribed kind of mathematical object.

Probability

Probability is the measure of the likelihood that an event will occur.

Random variable

In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is a variable whose possible values are outcomes of a random phenomenon.

Reduction (complexity)

In computability theory and computational complexity theory, a reduction is an algorithm for transforming one problem into another problem.

Z-channel (information theory)

A Z-channel is a communications channel used in coding theory and information theory to model the behaviour of some data storage systems.
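
In the usual formulation, a transmitted 0 is always received correctly, while a transmitted 1 flips to 0 with probability p; this asymmetry is what makes the model suitable for certain storage systems. A hedged Python sketch (z_channel_transmit is an illustrative name):

```python
import random

def z_channel_transmit(bits, p, rng=None):
    """Z-channel: a transmitted 0 always arrives as 0, while a
    transmitted 1 flips to 0 with probability p (asymmetric noise)."""
    rng = rng or random.Random()
    return [0 if b == 1 and rng.random() < p else b for b in bits]

print(z_channel_transmit([0, 1, 0, 1], 0.0))  # [0, 1, 0, 1] -- noiseless
print(z_channel_transmit([0, 1, 0, 1], 1.0))  # [0, 0, 0, 0] -- every 1 decays
```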

References

[1] "Binary symmetric channel", Wikipedia. https://en.wikipedia.org/wiki/Binary_symmetric_channel
