
Binary symmetric channel


A binary symmetric channel (or BSC_p) is a common communications channel model used in coding theory and information theory. In this model, a transmitter sends a bit (a zero or a one) and the receiver receives a bit; each bit is flipped with the "crossover probability" p, and otherwise is received correctly. [1]
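As an illustrative sketch (not part of the source article; the function name bsc is our own), the channel can be simulated in a few lines of Python by flipping each bit independently with probability p:

```python
import random

def bsc(bits, p, rng=random):
    """Pass a sequence of 0/1 bits through a binary symmetric channel:
    each bit is flipped independently with crossover probability p."""
    return [b ^ (rng.random() < p) for b in bits]

# Example: send 10 bits through a BSC with crossover probability 0.1.
sent = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
received = bsc(sent, p=0.1)
```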

Table of Contents

30 relations: Binary entropy function, Binary erasure channel, Bit, Boole's inequality, Cell division, Channel capacity, Chernoff bound, Code, Coding theory, Communication channel, Communication theory, Concatenated error correction code, Conditional entropy, Conditional probability, Disk storage, DNA, Expected value, Hamming distance, Information theory, Linear code, Low-density parity-check code, Markov's inequality, Mutual information, Noise (electronics), Noisy-channel coding theorem, Probabilistic method, Probability, Random variable, Reduction (complexity), Z-channel (information theory).

Binary entropy function

In information theory, the binary entropy function, denoted \operatorname{H}(p) or \operatorname{H}_\text{b}(p), is defined as the entropy of a Bernoulli process (an i.i.d. binary variable) with probability p of taking one of its two values, and is given by the formula

\operatorname{H}_\text{b}(p) = -p \log p - (1 - p) \log (1 - p).

The base of the logarithm corresponds to the choice of units of information: base e corresponds to nats and is mathematically convenient, while base 2 (the binary logarithm) corresponds to shannons and is conventional. The values at p = 0 and p = 1 are given by the limit \textstyle 0 \log 0 := \lim_{x \to 0^{+}} x \log x = 0.
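A minimal Python sketch of this function (our own illustration; the name binary_entropy is not from the source), using the conventional base-2 logarithm:

```python
import math

def binary_entropy(p):
    """Binary entropy H_b(p) in bits (base-2 logarithm).
    By convention H_b(0) = H_b(1) = 0, from the limit 0*log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair coin flip carries exactly one bit of entropy.
print(binary_entropy(0.5))   # 1.0
print(binary_entropy(0.11))  # ~0.5
```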

See Binary symmetric channel and Binary entropy function

Binary erasure channel

In coding theory and information theory, a binary erasure channel (BEC) is a communications channel model in which each transmitted bit is either received correctly or "erased", with the receiver knowing that an erasure occurred. Both the binary symmetric channel and the binary erasure channel are coding-theory topics.

See Binary symmetric channel and Binary erasure channel

Bit

The bit is the most basic unit of information in computing and digital communication.

See Binary symmetric channel and Bit

Boole's inequality

In probability theory, Boole's inequality, also known as the union bound, says that for any finite or countable set of events, the probability that at least one of the events happens is no greater than the sum of the probabilities of the individual events.
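Stated symbolically (a standard formulation, added here for reference): for events A_1, A_2, \ldots,

\Pr\left( \bigcup_{i} A_i \right) \le \sum_{i} \Pr(A_i).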

See Binary symmetric channel and Boole's inequality

Cell division

Cell division is the process by which a parent cell divides into two daughter cells.

See Binary symmetric channel and Cell division

Channel capacity

Channel capacity, in electrical engineering, computer science, and information theory, is the theoretical maximum rate at which information can be reliably transmitted over a communication channel.
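For the binary symmetric channel in particular, the capacity has a well-known closed form in terms of the binary entropy function (a standard result, stated here for context):

C_{\text{BSC}} = 1 - \operatorname{H}_\text{b}(p),

so, for example, a BSC with crossover probability p = 0.11 can reliably carry roughly 0.5 bits of information per channel use.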

See Binary symmetric channel and Channel capacity

Chernoff bound

In probability theory, a Chernoff bound is an exponentially decreasing upper bound on the tail of a random variable based on its moment generating function.
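In its generic form (a standard statement, not quoted from the source), for a random variable X and any threshold a,

\Pr(X \ge a) \le \inf_{t > 0} \frac{\operatorname{E}\!\left[ e^{tX} \right]}{e^{ta}}.

In coding-theory arguments it is typically applied to the number of bit flips a BSC introduces, which is a sum of independent Bernoulli variables.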

See Binary symmetric channel and Chernoff bound

Code

In communications and information processing, code is a system of rules to convert information—such as a letter, word, sound, image, or gesture—into another form, sometimes shortened or secret, for communication through a communication channel or storage in a storage medium.

See Binary symmetric channel and Code

Coding theory

Coding theory is the study of the properties of codes and their respective fitness for specific applications.

See Binary symmetric channel and Coding theory

Communication channel

A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking.

See Binary symmetric channel and Communication channel

Communication theory

Communication theory is a proposed description of communication phenomena, the relationships among them, a storyline describing these relationships, and an argument for these three elements.

See Binary symmetric channel and Communication theory

Concatenated error correction code

In coding theory, concatenated codes form a class of error-correcting codes that are derived by combining an inner code and an outer code. Both the binary symmetric channel and concatenated error correction codes are coding-theory topics.

See Binary symmetric channel and Concatenated error correction code

Conditional entropy

In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known.
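For discrete random variables, the usual definition reads (standard notation, added for reference):

\operatorname{H}(Y \mid X) = -\sum_{x, y} p(x, y) \log p(y \mid x).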

See Binary symmetric channel and Conditional entropy

Conditional probability

In probability theory, conditional probability is a measure of the probability of an event occurring, given that another event (by assumption, presumption, assertion or evidence) is already known to have occurred.
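For events A and B with \Pr(B) > 0, it is defined as (standard definition, added for reference):

\Pr(A \mid B) = \frac{\Pr(A \cap B)}{\Pr(B)}.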

See Binary symmetric channel and Conditional probability

Disk storage

Disk storage (also sometimes called drive storage) is a data storage mechanism based on a rotating disk.

See Binary symmetric channel and Disk storage

DNA

Deoxyribonucleic acid (DNA) is a polymer composed of two polynucleotide chains that coil around each other to form a double helix.

See Binary symmetric channel and DNA

Expected value

In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average.
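For a discrete random variable X, it is given by (standard definition, added for reference):

\operatorname{E}[X] = \sum_{x} x \Pr(X = x).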

See Binary symmetric channel and Expected value

Hamming distance

In information theory, the Hamming distance between two strings or vectors of equal length is the number of positions at which the corresponding symbols differ. Both the binary symmetric channel and the Hamming distance are coding-theory topics.
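A minimal Python sketch (our own illustration; the name hamming_distance is not from the source):

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length sequences differ."""
    if len(a) != len(b):
        raise ValueError("sequences must have equal length")
    return sum(x != y for x, y in zip(a, b))

# The BSC flips bits, so the Hamming distance between the sent and
# received words counts exactly how many errors the channel introduced.
print(hamming_distance("10110", "10011"))  # 2
```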

See Binary symmetric channel and Hamming distance

Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information.

See Binary symmetric channel and Information theory

Linear code

In coding theory, a linear code is an error-correcting code for which any linear combination of codewords is also a codeword. Both the binary symmetric channel and linear codes are coding-theory topics.

See Binary symmetric channel and Linear code

Low-density parity-check code

In information theory, a low-density parity-check (LDPC) code is a linear error-correcting code, a method of transmitting a message over a noisy transmission channel. Both the binary symmetric channel and low-density parity-check codes are coding-theory topics.

See Binary symmetric channel and Low-density parity-check code

Markov's inequality

In probability theory, Markov's inequality gives an upper bound on the probability that a non-negative random variable is greater than or equal to some positive constant.
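In symbols (standard statement, added for reference): for a non-negative random variable X and any a > 0,

\Pr(X \ge a) \le \frac{\operatorname{E}[X]}{a}.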

See Binary symmetric channel and Markov's inequality

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
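In terms of entropies (standard identities, added for context):

\operatorname{I}(X; Y) = \operatorname{H}(Y) - \operatorname{H}(Y \mid X),

and for a BSC with a uniform input this reduces to \operatorname{I}(X; Y) = 1 - \operatorname{H}_\text{b}(p), which is exactly the channel's capacity.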

See Binary symmetric channel and Mutual information

Noise (electronics)

In electronics, noise is an unwanted disturbance in an electrical signal.

See Binary symmetric channel and Noise (electronics)

Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes called Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. Both the binary symmetric channel and the noisy-channel coding theorem are coding-theory topics.

See Binary symmetric channel and Noisy-channel coding theorem

Probabilistic method

In mathematics, the probabilistic method is a nonconstructive method, primarily used in combinatorics and pioneered by Paul Erdős, for proving the existence of a prescribed kind of mathematical object.

See Binary symmetric channel and Probabilistic method

Probability

Probability is the branch of mathematics concerning events and numerical descriptions of how likely they are to occur.

See Binary symmetric channel and Probability

Random variable

A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events.

See Binary symmetric channel and Random variable

Reduction (complexity)

In computability theory and computational complexity theory, a reduction is an algorithm for transforming one problem into another problem.

See Binary symmetric channel and Reduction (complexity)

Z-channel (information theory)

In coding theory and information theory, a Z-channel or binary asymmetric channel is a communications channel used to model the behaviour of some data storage systems. Both the binary symmetric channel and the Z-channel are coding-theory topics.
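An illustrative Python sketch (our own, under one common convention in which only transmitted ones can be corrupted; the name z_channel is not from the source):

```python
import random

def z_channel(bits, p, rng=random):
    """Z-channel: a transmitted 0 is always received correctly;
    a transmitted 1 is received as 0 with probability p."""
    return [0 if (b == 1 and rng.random() < p) else b for b in bits]

# Contrast with the BSC, where both symbols are equally vulnerable.
received = z_channel([1, 1, 0, 1, 0], p=0.2)
```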

See Binary symmetric channel and Z-channel (information theory)

References

[1] "Binary symmetric channel", Wikipedia, https://en.wikipedia.org/wiki/Binary_symmetric_channel