Table of Contents
10 relations: Bernoulli distribution, Binary entropy function, Binary symmetric channel, Channel capacity, Code, Coding theory, Communication channel, Conditional probability, Information theory, Random variable.
Bernoulli distribution
In probability theory and statistics, the Bernoulli distribution, named after Swiss mathematician Jacob Bernoulli, is the discrete probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability q = 1 − p.
See Z-channel (information theory) and Bernoulli distribution
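As an illustrative sketch (not part of the article), a Bernoulli(p) variable can be sampled directly from its definition:

```python
import random

def bernoulli(p):
    """Sample a Bernoulli(p) random variable: 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

# The empirical mean over many draws approaches p.
samples = [bernoulli(0.3) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 0.3
```

This is the input model typically assumed for a binary channel such as the Z-channel: each transmitted bit is a Bernoulli draw.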
Binary entropy function
In information theory, the binary entropy function, denoted \operatorname H(p) or \operatorname H_\text{b}(p), is defined as the entropy of a Bernoulli process (i.i.d. binary variable) with probability p of one of two values, and is given by the formula \operatorname H_\text{b}(p) = -p \log p - (1-p) \log (1-p). The base of the logarithm corresponds to the choice of units of information; base e corresponds to nats and is mathematically convenient, while base 2 (binary logarithm) corresponds to shannons and is conventional. The values at 0 and 1 are given by the limit \textstyle 0 \log 0 = 0.
See Z-channel (information theory) and Binary entropy function
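A minimal sketch of the definition above (base 2, so the result is in shannons), handling the 0 log 0 = 0 limit explicitly:

```python
import math

def binary_entropy(p):
    """H_b(p) = -p log2 p - (1-p) log2 (1-p) in shannons, with 0 log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

print(binary_entropy(0.5))   # 1.0: a fair coin is maximally uncertain
print(binary_entropy(0.11))  # about 0.5
```

The function is symmetric about p = 0.5, where it attains its maximum of one shannon.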
Binary symmetric channel
A binary symmetric channel (or BSC_p, with crossover probability p) is a common communications channel model used in coding theory and information theory. Z-channel (information theory) and binary symmetric channel are coding theory.
See Z-channel (information theory) and Binary symmetric channel
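A minimal simulation sketch contrasting the two models (assuming the common convention that the Z-channel corrupts only transmitted 1s):

```python
import random

def bsc(bits, p):
    """Binary symmetric channel: every bit flips independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def z_channel(bits, p):
    """Z-channel: a transmitted 1 is received as 0 with probability p;
    transmitted 0s always pass through unchanged (hence the asymmetry)."""
    return [0 if (b == 1 and random.random() < p) else b for b in bits]
```

The asymmetry is what distinguishes the Z-channel: errors occur in one direction only, whereas the BSC treats both symbols identically.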
Channel capacity
Channel capacity, in electrical engineering, computer science, and information theory, is the theoretical maximum rate at which information can be reliably transmitted over a communication channel. Z-channel (information theory) and channel capacity are information theory.
See Z-channel (information theory) and Channel capacity
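As a sketch (assuming the commonly cited closed forms), the capacities of the BSC and the Z-channel with the same crossover probability p can be compared; C = 1 − H_b(p) for the BSC, and C = log2(1 + (1 − p)·p^(p/(1−p))) for the Z-channel:

```python
import math

def h2(p):
    """Binary entropy in shannons, with 0 log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: C = 1 - H_b(p)."""
    return 1.0 - h2(p)

def z_channel_capacity(p):
    """Capacity of the Z-channel (1 -> 0 crossover probability p):
    C = log2(1 + (1 - p) * p**(p / (1 - p))) -- commonly cited closed form."""
    if p == 1.0:
        return 0.0
    return math.log2(1.0 + (1.0 - p) * p ** (p / (1.0 - p)))

print(bsc_capacity(0.5))        # 0.0: a fully noisy BSC carries no information
print(z_channel_capacity(0.5))  # positive: the Z-channel still carries information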
Code
In communications and information processing, code is a system of rules to convert information—such as a letter, word, sound, image, or gesture—into another form, sometimes shortened or secret, for communication through a communication channel or storage in a storage medium.
See Z-channel (information theory) and Code
Coding theory
Coding theory is the study of the properties of codes and their respective fitness for specific applications. Z-channel (information theory) and Coding theory are information theory.
See Z-channel (information theory) and Coding theory
Communication channel
A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking. Z-channel (information theory) and communication channel are information theory.
See Z-channel (information theory) and Communication channel
Conditional probability
In probability theory, conditional probability is a measure of the probability of an event occurring, given that another event (by assumption, presumption, assertion or evidence) is already known to have occurred.
See Z-channel (information theory) and Conditional probability
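As a worked example with illustrative, assumed parameter values (not from the article), the definition P(A|B) = P(A ∩ B) / P(B) connects a Z-channel's output back to its input:

```python
# Input X ~ Bernoulli(q); a transmitted 1 is received as 0 with probability p.
q, p = 0.5, 0.1  # assumed values for illustration

p_y0 = (1.0 - q) * 1.0 + q * p         # P(Y = 0): 0s always pass, 1s flip with prob p
p_x1_and_y0 = q * p                    # P(X = 1 and Y = 0)
p_x1_given_y0 = p_x1_and_y0 / p_y0     # P(X = 1 | Y = 0), by definition
print(p_x1_given_y0)                   # 0.05 / 0.55, about 0.09
```

A received 0 is therefore only weak evidence that a 1 was sent, which is exactly the kind of asymmetric inference the Z-channel model is used to capture.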
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information.
See Z-channel (information theory) and Information theory
Random variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events.
See Z-channel (information theory) and Random variable
Also known as Binary asymmetric channel, Z Channel (information theory).