Table of Contents
134 relations: A Mathematical Theory of Communication, Adversary (cryptography), Algebraic geometry code, Alphabet (formal languages), Analog signal, Analog signal processing, Analogue electronics, Authentication, Automated teller machine, Automatic repeat request, Baseband, BCH code, Bell Labs, Binary Golay code, Bipolar encoding, Brain, Cambridge University Press, Claude Shannon, Code, Code word (communication), Code-division multiple access, Coding gain, Communication protocol, Communications system, Compact Disc Digital Audio, Computational hardness assumption, Computer, Computer data storage, Computer science, Confidentiality, Convolution, Covering code, Cross-interleaved Reed–Solomon coding, Cryptography, Cyclic code, Data communication, Data compression, Data integrity, David J. C. MacKay, Decoding methods, Digital data, Digital signal, Dirty paper coding, Discrete cosine transform, Distance, E-commerce, Electrical engineering, Elwyn Berlekamp, Encryption, Entropy (information theory), … (84 more).
A Mathematical Theory of Communication
"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.
See Coding theory and A Mathematical Theory of Communication
Adversary (cryptography)
In cryptography, an adversary (rarely opponent, enemy) is a malicious entity whose aim is to prevent the users of the cryptosystem from achieving their goal (primarily privacy, integrity, and availability of data).
See Coding theory and Adversary (cryptography)
Algebraic geometry code
Algebraic geometry codes, often abbreviated AG codes, are a type of linear code that generalize Reed–Solomon codes.
See Coding theory and Algebraic geometry code
Alphabet (formal languages)
In formal language theory, an alphabet, sometimes called a vocabulary, is a non-empty set of indivisible symbols/characters/glyphs, typically thought of as representing letters, characters, digits, phonemes, or even words.
See Coding theory and Alphabet (formal languages)
Analog signal
An analog signal is any continuous-time signal representing some other quantity, i.e., analogous to another quantity.
See Coding theory and Analog signal
Analog signal processing
Analog signal processing is a type of signal processing conducted on continuous analog signals by some analog means (as opposed to the discrete digital signal processing where the signal processing is carried out by a digital process).
See Coding theory and Analog signal processing
Analogue electronics
Analogue electronics (analog electronics) are electronic systems with a continuously variable signal, in contrast to digital electronics where signals usually take only two levels.
See Coding theory and Analogue electronics
Authentication
Authentication (from Greek αὐθεντικός authentikos, "real, genuine", from αὐθέντης authentes, "author") is the act of proving an assertion, such as the identity of a computer system user.
See Coding theory and Authentication
Automated teller machine
An automated teller machine (ATM) is an electronic telecommunications device that enables customers of financial institutions to perform financial transactions, such as cash withdrawals, deposits, funds transfers, balance inquiries or account information inquiries, at any time and without the need for direct interaction with bank staff.
See Coding theory and Automated teller machine
Automatic repeat request
Automatic repeat request (ARQ), also known as automatic repeat query, is an error-control method for data transmission that uses acknowledgements (messages sent by the receiver indicating that it has correctly received a message) and timeouts (specified periods of time allowed to elapse before an acknowledgment is to be received) to achieve reliable data transmission over an unreliable communication channel.
See Coding theory and Automatic repeat request
Baseband
In telecommunications and signal processing, baseband is the range of frequencies occupied by a signal that has not been modulated to higher frequencies.
See Coding theory and Baseband
BCH code
In coding theory, the Bose–Chaudhuri–Hocquenghem codes (BCH codes) form a class of cyclic error-correcting codes that are constructed using polynomials over a finite field (also called a Galois field).
See Coding theory and BCH code
Bell Labs
Bell Labs is an American industrial research and scientific development company credited with the development of radio astronomy, the transistor, the laser, the photovoltaic cell, the charge-coupled device (CCD), information theory, the Unix operating system, and the programming languages B, C, C++, S, SNOBOL, AWK, AMPL, and others.
See Coding theory and Bell Labs
Binary Golay code
In mathematics and electronics engineering, a binary Golay code is a type of linear error-correcting code used in digital communications.
See Coding theory and Binary Golay code
Bipolar encoding
In telecommunication, bipolar encoding is a type of return-to-zero (RZ) line code, where two nonzero values are used, so that the three values are +, −, and zero.
See Coding theory and Bipolar encoding
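As an illustration of the alternate mark inversion (AMI) flavour of bipolar encoding, here is a minimal sketch: zeros map to the zero level and successive ones alternate between + and −. The function name and the +1/−1 levels are illustrative choices, not taken from any particular standard.

```python
def ami_encode(bits):
    """Alternate mark inversion: 0 -> 0, successive 1s alternate between +1 and -1."""
    out, level = [], 1
    for b in bits:
        if b == 0:
            out.append(0)
        else:
            out.append(level)   # transmit the current nonzero level
            level = -level      # alternate polarity for the next 1
    return out

print(ami_encode([1, 0, 1, 1, 0, 1]))  # [1, 0, -1, 1, 0, -1]
```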
Brain
The brain is an organ that serves as the center of the nervous system in all vertebrate and most invertebrate animals.
See Coding theory and Brain
Cambridge University Press
Cambridge University Press is the university press of the University of Cambridge.
See Coding theory and Cambridge University Press
Claude Shannon
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist and cryptographer known as the "father of information theory" and as the "father of the Information Age".
See Coding theory and Claude Shannon
Code
In communications and information processing, a code is a system of rules to convert information—such as a letter, word, sound, image, or gesture—into another form, sometimes shortened or secret, for communication through a communication channel or storage in a storage medium.
See Coding theory and Code
Code word (communication)
In communication, a code word is an element of a standardized code or protocol.
See Coding theory and Code word (communication)
Code-division multiple access
Code-division multiple access (CDMA) is a channel access method used by various radio communication technologies.
See Coding theory and Code-division multiple access
Coding gain
In coding theory, telecommunications engineering, and related fields, coding gain is the difference between the signal-to-noise ratio (SNR) levels required by an uncoded system and by a coded system to reach the same bit error rate (BER) when an error-correcting code (ECC) is used.
See Coding theory and Coding gain
Communication protocol
A communication protocol is a system of rules that allows two or more entities of a communications system to transmit information via any variation of a physical quantity.
See Coding theory and Communication protocol
Communications system
A communications system or communication system is a collection of individual telecommunications networks, transmission systems, relay stations, tributary stations, and terminal equipment usually capable of interconnection and interoperation to form an integrated whole.
See Coding theory and Communications system
Compact Disc Digital Audio
Compact Disc Digital Audio (CDDA or CD-DA), also known as Digital Audio Compact Disc or simply as Audio CD, is the standard format for audio compact discs.
See Coding theory and Compact Disc Digital Audio
Computational hardness assumption
In computational complexity theory, a computational hardness assumption is the hypothesis that a particular problem cannot be solved efficiently (where efficiently typically means "in polynomial time").
See Coding theory and Computational hardness assumption
Computer
A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation).
See Coding theory and Computer
Computer data storage
Computer data storage or digital data storage is a technology consisting of computer components and recording media that are used to retain digital data.
See Coding theory and Computer data storage
Computer science
Computer science is the study of computation, information, and automation.
See Coding theory and Computer science
Confidentiality
Confidentiality involves a set of rules or a promise usually executed through confidentiality agreements that limits the access to or places restrictions on distribution of certain types of information.
See Coding theory and Confidentiality
Convolution
In mathematics (in particular, functional analysis), convolution is a mathematical operation on two functions (f and g) that produces a third function (f*g).
See Coding theory and Convolution
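For the discrete case most relevant to coding and signal processing, a minimal sketch computed directly from the defining sum (f*g)[n] = sum_k f[k]·g[n−k]; the sample sequences are arbitrary illustrations.

```python
def convolve(f, g):
    """Discrete convolution of two finite sequences: (f*g)[n] = sum_k f[k] * g[n-k]."""
    out = [0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out

print(convolve([1, 2, 3], [0, 1, 0.5]))  # [0, 1, 2.5, 4.0, 1.5]
```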
Covering code
In coding theory, a covering code is a set of elements (called codewords) in a space, with the property that every element of the space is within a fixed distance of some codeword.
See Coding theory and Covering code
Cross-interleaved Reed–Solomon coding
In the compact disc system, cross-interleaved Reed–Solomon code (CIRC) provides error detection and error correction.
See Coding theory and Cross-interleaved Reed–Solomon coding
Cryptography
Cryptography, or cryptology (from Ancient Greek κρυπτός, kryptós, "hidden, secret"), is the practice and study of techniques for secure communication in the presence of adversarial behavior.
See Coding theory and Cryptography
Cyclic code
In coding theory, a cyclic code is a block code in which any circular shift of a codeword yields another codeword.
See Coding theory and Cyclic code
Data communication
Data communication, including data transmission and data reception, is the transfer of data, transmitted and received over a point-to-point or point-to-multipoint communication channel.
See Coding theory and Data communication
Data compression
In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation.
See Coding theory and Data compression
Data integrity
Data integrity is the maintenance of, and the assurance of, data accuracy and consistency over its entire life-cycle.
See Coding theory and Data integrity
David J. C. MacKay
Sir David John Cameron MacKay (22 April 1967 – 14 April 2016) was a British physicist, mathematician, and academic.
See Coding theory and David J. C. MacKay
Decoding methods
In coding theory, decoding is the process of translating received messages into codewords of a given code.
See Coding theory and Decoding methods
Digital data
Digital data, in information theory and information systems, is information represented as a string of discrete symbols, each of which can take on one of only a finite number of values from some alphabet, such as letters or digits.
See Coding theory and Digital data
Digital signal
A digital signal is a signal that represents data as a sequence of discrete values; at any given time it can only take on, at most, one of a finite number of values.
See Coding theory and Digital signal
Dirty paper coding
In telecommunications, dirty paper coding (DPC) or Costa precoding is a technique for efficient transmission of digital data through a channel subjected to some interference known to the transmitter.
See Coding theory and Dirty paper coding
Discrete cosine transform
A discrete cosine transform (DCT) expresses a finite sequence of data points in terms of a sum of cosine functions oscillating at different frequencies.
See Coding theory and Discrete cosine transform
Distance
Distance is a numerical or occasionally qualitative measurement of how far apart objects, points, people, or ideas are.
See Coding theory and Distance
E-commerce
E-commerce (electronic commerce) is the activity of electronically buying or selling products on online services or over the Internet.
See Coding theory and E-commerce
Electrical engineering
Electrical engineering is an engineering discipline concerned with the study, design, and application of equipment, devices, and systems which use electricity, electronics, and electromagnetism.
See Coding theory and Electrical engineering
Elwyn Berlekamp
Elwyn Ralph Berlekamp (September 6, 1940 – April 9, 2019) was a professor of mathematics and computer science at the University of California, Berkeley.
See Coding theory and Elwyn Berlekamp
Encryption
In cryptography, encryption is the process of transforming (more specifically, encoding) information in a way that, ideally, only authorized parties can decode.
See Coding theory and Encryption
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes.
See Coding theory and Entropy (information theory)
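A minimal sketch, computing H(X) = −sum p(x) log2 p(x) in bits for a given distribution; the example distributions are illustrative.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p), ignoring zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit (fair coin)
print(entropy([0.9, 0.1]))   # ≈ 0.469 bits (biased coin)
```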
Entropy coding
In information theory, an entropy coding (or entropy encoding) is any lossless data compression method that attempts to approach the lower bound declared by Shannon's source coding theorem, which states that any lossless data compression method must have an expected code length greater than or equal to the entropy of the source.
See Coding theory and Entropy coding
Error correction code
In computing, telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.
See Coding theory and Error correction code
Error detection and correction
In information theory and coding theory with applications in computer science and telecommunication, error detection and correction (EDAC) or error control are techniques that enable reliable delivery of digital data over unreliable communication channels.
See Coding theory and Error detection and correction
Fading
In wireless communications, fading is the variation of signal attenuation over variables like time, geographical position, and radio frequency.
See Coding theory and Fading
Fax
Fax (short for facsimile), sometimes called telecopying or telefax (short for telefacsimile), is the telephonic transmission of scanned printed material (both text and images), normally to a telephone number connected to a printer or other output device.
See Coding theory and Fax
Folded Reed–Solomon code
In coding theory, folded Reed–Solomon codes are variants of Reed–Solomon codes obtained by bundling groups of m consecutive codeword symbols into single symbols over a larger alphabet.
See Coding theory and Folded Reed–Solomon code
Group testing
In statistics and combinatorial mathematics, group testing is any procedure that breaks up the task of identifying certain objects into tests on groups of items, rather than on individual ones.
See Coding theory and Group testing
Hamming bound
In mathematics and computer science, in the field of coding theory, the Hamming bound is a limit on the parameters of an arbitrary block code: it is also known as the sphere-packing bound or the volume bound from an interpretation in terms of packing balls in the Hamming metric into the space of all possible words.
See Coding theory and Hamming bound
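A small sketch that evaluates the sphere-packing bound |C| ≤ q^n / sum_{k=0}^{t} C(n,k)(q−1)^k with t = ⌊(d−1)/2⌋, using Python's math.comb; the two parameter sets below are classical cases where the bound is met with equality (perfect codes).

```python
from math import comb

def hamming_bound(q, n, d):
    """Upper bound on the size of a q-ary code of length n and minimum distance d."""
    t = (d - 1) // 2                                             # number of correctable errors
    ball = sum(comb(n, k) * (q - 1) ** k for k in range(t + 1))  # volume of a Hamming ball of radius t
    return q ** n // ball                                        # floor is safe: code sizes are integers

print(hamming_bound(2, 7, 3))    # 16   -- met by the Hamming(7,4) code
print(hamming_bound(2, 23, 7))   # 4096 -- met by the binary Golay code
```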
Hamming code
In computer science and telecommunication, Hamming codes are a family of linear error-correcting codes.
See Coding theory and Hamming code
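As a sketch of the smallest member of the family, the following encodes 4 data bits into a Hamming(7,4) codeword and corrects any single flipped bit via the syndrome; the bit layout (parity in positions 1, 2 and 4) is one common convention chosen here for illustration.

```python
def hamming74_encode(d):
    """Encode 4 data bits. Positions 1, 2, 4 hold parity; positions 3, 5, 6, 7 hold data."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4      # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4      # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4      # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_correct(c):
    """Return (corrected codeword, data bits), fixing at most one flipped bit."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4        # 1-based position of the error, or 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return c, [c[2], c[4], c[5], c[6]]

word = hamming74_encode([1, 0, 1, 1])
word[5] ^= 1                               # flip one bit in transit
print(hamming74_correct(word)[1])          # [1, 0, 1, 1] -- data recovered despite the error
```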
Hamming distance
In information theory, the Hamming distance between two strings or vectors of equal length is the number of positions at which the corresponding symbols are different.
See Coding theory and Hamming distance
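A minimal sketch that simply counts differing positions; "karolin"/"kathrin" is the usual textbook illustration.

```python
def hamming_distance(x, y):
    """Number of positions at which two equal-length sequences differ."""
    if len(x) != len(y):
        raise ValueError("sequences must have equal length")
    return sum(a != b for a, b in zip(x, y))

print(hamming_distance("karolin", "kathrin"))        # 3
print(hamming_distance([1, 0, 1, 1], [1, 1, 0, 1]))  # 2
```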
Hamming weight
The Hamming weight of a string is the number of symbols that are different from the zero-symbol of the alphabet used.
See Coding theory and Hamming weight
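A tiny sketch: the weight is the number of nonzero symbols, equivalently the Hamming distance to the all-zero string of the same length; the zero parameter is an illustrative convenience.

```python
def hamming_weight(x, zero=0):
    """Count symbols different from the zero-symbol (for bits, the number of 1s)."""
    return sum(s != zero for s in x)

print(hamming_weight([1, 0, 1, 1, 0]))       # 3
print(hamming_weight("00101000", zero="0"))  # 2
```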
Information
Information is an abstract concept that refers to something which has the power to inform.
See Coding theory and Information
Information security
Information security, sometimes shortened to infosec, is the practice of protecting information by mitigating information risks.
See Coding theory and Information security
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information.
See Coding theory and Information theory
Information-theoretic security
A cryptosystem is considered to have information-theoretic security (also called unconditional security) if the system is secure against adversaries with unlimited computing resources and time.
See Coding theory and Information-theoretic security
Injective function
In mathematics, an injective function (also known as injection, or one-to-one function) is a function that maps distinct elements of its domain to distinct elements; that is, f(x_1) = f(x_2) implies x_1 = x_2.
See Coding theory and Injective function
Integer factorization
In number theory, integer factorization is the decomposition of a positive integer into a product of integers.
See Coding theory and Integer factorization
Internet Engineering Task Force
The Internet Engineering Task Force (IETF) is a standards organization for the Internet and is responsible for the technical standards that make up the Internet protocol suite (TCP/IP).
See Coding theory and Internet Engineering Task Force
Introduction to the Theory of Error-Correcting Codes
Introduction to the Theory of Error-Correcting Codes is a textbook on error-correcting codes, by Vera Pless.
See Coding theory and Introduction to the Theory of Error-Correcting Codes
Johnson–Nyquist noise
Johnson–Nyquist noise (thermal noise, Johnson noise, or Nyquist noise) is the electronic noise generated by the thermal agitation of the charge carriers (usually the electrons) inside an electrical conductor at equilibrium, which happens regardless of any applied voltage.
See Coding theory and Johnson–Nyquist noise
Joint source and channel coding
In information theory, joint source–channel coding is the encoding of a redundant information source for transmission over a noisy channel, and the corresponding decoding, using a single code instead of the more conventional steps of source coding followed by channel coding.
See Coding theory and Joint source and channel coding
JPEG
JPEG (short for Joint Photographic Experts Group) is a commonly used method of lossy compression for digital images, particularly for those images produced by digital photography.
See Coding theory and JPEG
K. R. Rao
Kamisetty Ramamohan Rao (1931–2021) was an Indian-American electrical engineer.
See Coding theory and K. R. Rao
Lee distance
In coding theory, the Lee distance is a distance between two strings x_1 x_2 \dots x_n and y_1 y_2 \dots y_n of equal length n over the q-ary alphabet \{0, 1, \dots, q-1\} of size q \ge 2.
See Coding theory and Lee distance
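A minimal sketch following the defining sum of min(|x_i − y_i|, q − |x_i − y_i|) over the coordinates: each coordinate contributes the shorter way around a cycle of length q, and for q = 2 or 3 the result coincides with the Hamming distance. The example values are arbitrary.

```python
def lee_distance(x, y, q):
    """Lee distance over the q-ary alphabet {0, ..., q-1}: each coordinate
    contributes the shorter way around the cycle of length q."""
    return sum(min(abs(a - b), q - abs(a - b)) for a, b in zip(x, y))

print(lee_distance([3, 1, 4, 0], [2, 5, 4, 5], q=6))  # 1 + 2 + 0 + 1 = 4
```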
Line code
In telecommunication, a line code is a pattern of voltage, current, or photons used to represent digital data transmitted down a communication channel or written to a storage medium.
See Coding theory and Line code
Linear time-invariant system
In system analysis, among other fields of study, a linear time-invariant (LTI) system is a system that produces an output signal from any input signal subject to the constraints of linearity and time-invariance; these terms are briefly defined in the overview below.
See Coding theory and Linear time-invariant system
Linearity
In mathematics, the term linear is used in two distinct senses for two different properties.
See Coding theory and Linearity
Linguistics
Linguistics is the scientific study of language.
See Coding theory and Linguistics
List of algebraic coding theory topics
This is a list of algebraic coding theory topics.
See Coding theory and List of algebraic coding theory topics
Lossy compression
In information technology, lossy compression or irreversible compression is the class of data compression methods that uses inexact approximations and partial data discarding to represent the content.
See Coding theory and Lossy compression
Low-density parity-check code
In information theory, a low-density parity-check (LDPC) code is a linear error correcting code, a method of transmitting a message over a noisy transmission channel.
See Coding theory and Low-density parity-check code
Manchester code
In telecommunication and data storage, Manchester code (also known as phase encoding, or PE) is a line code in which the encoding of each data bit is either low then high, or high then low, for equal time.
See Coding theory and Manchester code
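A minimal sketch under the IEEE 802.3 convention (0 becomes high-then-low, 1 becomes low-then-high); the opposite (G. E. Thomas) convention is also in use, so the mapping below is one illustrative choice.

```python
def manchester_encode(bits):
    """IEEE 802.3 convention: 0 -> (1, 0), 1 -> (0, 1); each bit becomes two half-bit levels."""
    out = []
    for b in bits:
        out.extend((0, 1) if b else (1, 0))
    return out

print(manchester_encode([1, 0, 0, 1]))  # [0, 1, 1, 0, 1, 0, 0, 1]
```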
Mathematics
Mathematics is a field of study that discovers and organizes abstract objects, methods, theories and theorems that are developed and proved for the needs of empirical sciences and mathematics itself.
See Coding theory and Mathematics
MIMO
In radio, multiple-input and multiple-output (MIMO) is a method for multiplying the capacity of a radio link using multiple transmission and receiving antennas to exploit multipath propagation.
See Coding theory and MIMO
Moving Picture Experts Group
The Moving Picture Experts Group (MPEG) is an alliance of working groups established jointly by ISO and IEC that sets standards for media coding, including compression coding of audio, video, graphics, and genomic data; and transmission and file formats for various applications.
See Coding theory and Moving Picture Experts Group
MP3
MP3 (formally MPEG-1 Audio Layer III or MPEG-2 Audio Layer III) is a coding format for digital audio developed largely by the Fraunhofer Society in Germany under the lead of Karlheinz Brandenburg, with support from other digital scientists in other countries.
See Coding theory and MP3
NASA Deep Space Network
The NASA Deep Space Network (DSN) is a worldwide network of spacecraft communication ground segment facilities, located in the United States (California), Spain (Madrid), and Australia (Canberra), that supports NASA's interplanetary spacecraft missions.
See Coding theory and NASA Deep Space Network
Nasir Ahmed (engineer)
Nasir Ahmed (born 1940) is an Indian-American electrical engineer and computer scientist.
See Coding theory and Nasir Ahmed (engineer)
Neural coding
Neural coding (or neural representation) is a neuroscience field concerned with characterising the hypothetical relationship between the stimulus and the neuronal responses, and the relationship among the electrical activities of the neurons in the ensemble.
See Coding theory and Neural coding
Neural network
A neural network is a group of interconnected units called neurons that send signals to one another.
See Coding theory and Neural network
Neuron
A neuron, neurone, or nerve cell is an excitable cell that fires electric signals called action potentials across a neural network in the nervous system.
See Coding theory and Neuron
Neuroscience
Neuroscience is the scientific study of the nervous system (the brain, spinal cord, and peripheral nervous system), its functions and disorders.
See Coding theory and Neuroscience
Non-repudiation
In law, non-repudiation is a situation where a statement's author cannot successfully dispute its authorship or the validity of an associated contract.
See Coding theory and Non-repudiation
Non-return-to-zero
In telecommunication, a non-return-to-zero (NRZ) line code is a binary code in which ones are represented by one significant condition, usually a positive voltage, while zeros are represented by some other significant condition, usually a negative voltage, with no other neutral or rest condition.
See Coding theory and Non-return-to-zero
Nonsense
Nonsense is a form of communication, via speech, writing, or any other symbolic system, that lacks any coherent meaning.
See Coding theory and Nonsense
Norbert Wiener
Norbert Wiener (November 26, 1894 – March 18, 1964) was an American computer scientist, mathematician and philosopher.
See Coding theory and Norbert Wiener
One-time pad
In cryptography, the one-time pad (OTP) is an encryption technique that cannot be cracked, but requires the use of a single-use pre-shared key that is at least as long as the message being sent.
See Coding theory and One-time pad
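A minimal sketch: with a uniformly random, never-reused key at least as long as the message, XOR-ing byte by byte gives information-theoretic secrecy, and applying the same operation with the same key decrypts. Function names here are illustrative.

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte; decryption is the same operation."""
    if len(key) < len(message):
        raise ValueError("one-time pad key must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # fresh random key, used only once
ct = otp_encrypt(msg, key)
print(otp_encrypt(ct, key))           # b'attack at dawn'
```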
Parity bit
A parity bit, or check bit, is a bit added to a string of binary code.
See Coding theory and Parity bit
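A minimal sketch of even parity: the appended bit makes the total number of 1s even, so any single flipped bit is detected (though not located). The helper names and the even-parity choice are illustrative.

```python
def add_parity(bits, even=True):
    """Append a parity bit so the total number of 1s is even (or odd)."""
    ones = sum(bits)
    parity = ones % 2 if even else (ones + 1) % 2
    return bits + [parity]

def parity_ok(bits, even=True):
    """Check a received word: for even parity the count of 1s must be even."""
    return sum(bits) % 2 == (0 if even else 1)

word = add_parity([1, 0, 1, 1])   # [1, 0, 1, 1, 1]
print(parity_ok(word))            # True
word[2] ^= 1                      # a single bit error
print(parity_ok(word))            # False: detected, but not locatable
```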
Password
A password, sometimes called a passcode, is secret data, typically a string of characters, usually used to confirm a user's identity.
See Coding theory and Password
Phase (waves)
In physics and mathematics, the phase (symbol φ or ϕ) of a wave or other periodic function F of some real variable t (such as time) is an angle-like quantity representing the fraction of the cycle covered up to t. It is expressed in such a scale that it varies by one full turn as the variable t goes through each period (and F(t) goes through each complete cycle).
See Coding theory and Phase (waves)
Polynomial code
In coding theory, a polynomial code is a type of linear code whose set of valid code words consists of those polynomials (usually of some fixed length) that are divisible by a given fixed polynomial (of shorter length, called the generator polynomial).
See Coding theory and Polynomial code
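A small CRC-style sketch over GF(2): the message polynomial is shifted and the remainder modulo a generator polynomial is appended, so every valid codeword is divisible by the generator. Bits are listed highest degree first, and the generator x^3 + x + 1 is just an illustrative choice.

```python
def mod2_div_remainder(dividend, divisor):
    """Remainder of GF(2) polynomial long division (bit lists, highest degree first)."""
    rem = list(dividend)
    for i in range(len(dividend) - len(divisor) + 1):
        if rem[i]:
            for j, d in enumerate(divisor):
                rem[i + j] ^= d
    return rem[-(len(divisor) - 1):]

def poly_encode(message, generator):
    """Systematic encoding: append the remainder of message * x^(deg g) divided by g."""
    padded = message + [0] * (len(generator) - 1)
    return message + mod2_div_remainder(padded, generator)

g = [1, 0, 1, 1]                    # x^3 + x + 1
cw = poly_encode([1, 1, 0, 1], g)
print(cw)                           # [1, 1, 0, 1, 0, 0, 1]
print(mod2_div_remainder(cw, g))    # [0, 0, 0]: the codeword is divisible by the generator
```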
Random variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events.
See Coding theory and Random variable
Redundancy (information theory)
In information theory, redundancy measures the fractional difference between the entropy of an ensemble and its maximum possible value \log(|\mathcal{X}|).
See Coding theory and Redundancy (information theory)
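A minimal sketch of the fractional form 1 − H(X)/log2(|X|) for a source over a finite alphabet: a uniform source has zero redundancy, and the skewed distribution is an arbitrary illustration.

```python
from math import log2

def entropy_bits(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Fractional redundancy: 1 - H(X) / log2(|alphabet|)."""
    return 1 - entropy_bits(probs) / log2(len(probs))

print(redundancy([0.25, 0.25, 0.25, 0.25]))         # 0.0 (uniform source)
print(round(redundancy([0.7, 0.1, 0.1, 0.1]), 3))   # ≈ 0.322
```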
Reed–Muller code
Reed–Muller codes are error-correcting codes that are used in wireless communications applications, particularly in deep-space communication.
See Coding theory and Reed–Muller code
Reed–Solomon error correction
Reed–Solomon codes are a group of error-correcting codes that were introduced by Irving S. Reed and Gustave Solomon in 1960.
See Coding theory and Reed–Solomon error correction
Regular number
Regular numbers are numbers that evenly divide powers of 60 (or, equivalently, powers of 30).
See Coding theory and Regular number
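A minimal sketch: a number divides a power of 60 exactly when its only prime factors are 2, 3 and 5 (it is 5-smooth), so the test just strips those factors. Names are illustrative.

```python
def is_regular(n):
    """A regular (5-smooth) number has no prime factor other than 2, 3, 5."""
    if n < 1:
        return False
    for p in (2, 3, 5):
        while n % p == 0:
            n //= p
    return n == 1

print([k for k in range(1, 21) if is_regular(k)])
# [1, 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, 16, 18, 20]
```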
Repetition code
In coding theory, the repetition code is one of the most basic linear error-correcting codes.
See Coding theory and Repetition code
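A minimal sketch of the rate-1/n binary repetition code: each bit is sent n times and decoded by majority vote, so up to ⌊(n−1)/2⌋ errors per block are corrected. The helper names are illustrative.

```python
def rep_encode(bits, n=3):
    """Repeat each bit n times."""
    return [b for b in bits for _ in range(n)]

def rep_decode(received, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) * 2 > n) for i in range(0, len(received), n)]

sent = rep_encode([1, 0, 1])   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
noisy = sent[:]
noisy[1] ^= 1                  # one error per block is correctable
print(rep_decode(noisy))       # [1, 0, 1]
```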
Richard Hamming
Richard Wesley Hamming (February 11, 1915 – January 7, 1998) was an American mathematician whose work had many implications for computer engineering and telecommunications.
See Coding theory and Richard Hamming
Run-length encoding
Run-length encoding (RLE) is a form of lossless data compression in which runs of data (consecutive occurrences of the same data value) are stored as a single occurrence of that data value and a count of its consecutive occurrences, rather than as the original run.
See Coding theory and Run-length encoding
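A minimal sketch using itertools.groupby: each run of identical symbols is stored as a (symbol, count) pair. The sample string is an arbitrary illustration.

```python
from itertools import groupby

def rle_encode(data):
    """Collapse each run of identical symbols into a (symbol, run-length) pair."""
    return [(sym, len(list(run))) for sym, run in groupby(data)]

def rle_decode(pairs):
    """Expand (symbol, run-length) pairs back into the original string."""
    return "".join(sym * count for sym, count in pairs)

encoded = rle_encode("aaaabbbcca")
print(encoded)               # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
print(rle_decode(encoded))   # aaaabbbcca
```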
Secure communication
Secure communication is when two entities are communicating and do not want a third party to listen in.
See Coding theory and Secure communication
Shaping codes
In digital communications, shaping codes are a method of encoding that changes the distribution of signals to improve efficiency.
See Coding theory and Shaping codes
Signal transmission
In telecommunications, transmission is the process of sending or propagating an analog or digital signal via a medium that is wired, wireless, or fiber-optic.
See Coding theory and Signal transmission
Space–time code
A space–time code (STC) is a method employed to improve the reliability of data transmission in wireless communication systems using multiple transmit antennas.
See Coding theory and Space–time code
Spatial multiplexing
Spatial multiplexing or space-division multiplexing (SM, SDM or SMX) is a multiplexing technique in MIMO wireless communication, fiber-optic communication and other communications technologies used to transmit independent channels separated in space.
See Coding theory and Spatial multiplexing
Sphere packing
In geometry, a sphere packing is an arrangement of non-overlapping spheres within a containing space.
See Coding theory and Sphere packing
Stimulus (physiology)
In physiology, a stimulus is a detectable change in the physical or chemical structure of an organism's internal or external environment.
See Coding theory and Stimulus (physiology)
Synchronization
Synchronization is the coordination of events to operate a system in unison.
See Coding theory and Synchronization
Synchronous Data Link Control
Synchronous Data Link Control (SDLC) is a computer serial communications protocol first introduced by IBM as part of its Systems Network Architecture (SNA).
See Coding theory and Synchronous Data Link Control
Syphilis
Syphilis is a sexually transmitted infection caused by the bacterium Treponema pallidum subspecies pallidum.
See Coding theory and Syphilis
Timeline of information theory
A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error correcting codes and related subjects.
See Coding theory and Timeline of information theory
Transmission Control Protocol
The Transmission Control Protocol (TCP) is one of the main protocols of the Internet protocol suite.
See Coding theory and Transmission Control Protocol
Turbo code
In information theory, turbo codes (originally in French Turbocodes) are a class of high-performance forward error correction (FEC) codes developed around 1990–91, but first published in 1993.
See Coding theory and Turbo code
Turing Award
The ACM A. M. Turing Award is an annual prize given by the Association for Computing Machinery (ACM) for contributions of lasting and major technical importance to computer science.
See Coding theory and Turing Award
Unipolar encoding
Unipolar encoding is a line code in which a positive voltage represents a binary 1 and zero volts represents a binary 0.
See Coding theory and Unipolar encoding
United States Army Air Forces
The United States Army Air Forces (USAAF or AAF) was the major land-based aerial warfare service component of the United States Army and de facto aerial warfare service branch of the United States during and immediately after World War II (1941–1947).
See Coding theory and United States Army Air Forces
Vera Pless
Vera Pless (née Stepen; March 5, 1931 – March 2, 2020) was an American mathematician who specialized in combinatorics and coding theory.
See Coding theory and Vera Pless
Viterbi algorithm
The Viterbi algorithm is a dynamic programming algorithm for obtaining the maximum a posteriori probability estimate of the most likely sequence of hidden states—called the Viterbi path—that results in a sequence of observed events.
See Coding theory and Viterbi algorithm
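In coding theory the Viterbi algorithm is best known for decoding convolutional codes on a trellis, but the dynamic program itself is easiest to see on a small hidden Markov model. The sketch below uses a common textbook "Healthy/Fever" toy HMM; its states, probabilities and observation names are illustrative and not taken from this article.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for a small HMM (probabilities as nested dicts)."""
    # V[t][s] = (probability of the best path ending in state s at time t, predecessor state)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p) for p in states
            )
            V[t][s] = (prob, prev)
    # Backtrack from the most probable final state.
    path = [max(V[-1], key=lambda s: V[-1][s][0])]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))

states = ("Healthy", "Fever")
start = {"Healthy": 0.6, "Fever": 0.4}
trans = {"Healthy": {"Healthy": 0.7, "Fever": 0.3},
         "Fever": {"Healthy": 0.4, "Fever": 0.6}}
emit = {"Healthy": {"normal": 0.5, "cold": 0.4, "dizzy": 0.1},
        "Fever": {"normal": 0.1, "cold": 0.3, "dizzy": 0.6}}
print(viterbi(["normal", "cold", "dizzy"], states, start, trans, emit))
# ['Healthy', 'Healthy', 'Fever']
```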
Waveform
In electronics, acoustics, and related fields, the waveform of a signal is the shape of its graph as a function of time, independent of its time and magnitude scales and of any displacement in time.
See Coding theory and Waveform
Wide area network
A wide area network (WAN) is a telecommunications network that extends over a large geographic area.
See Coding theory and Wide area network
Window function
In signal processing and statistics, a window function (also known as an apodization function or tapering function) is a mathematical function that is zero-valued outside of some chosen interval.
See Coding theory and Window function
World War II
World War II or the Second World War (1 September 1939 – 2 September 1945) was a global conflict between two alliances: the Allies and the Axis powers.
See Coding theory and World War II
X.25
X.25 is an ITU-T standard protocol suite for packet-switched data communication in wide area networks (WAN).
See Coding theory and X.25
XOR gate
XOR gate (sometimes EOR or EXOR, pronounced "exclusive OR") is a digital logic gate that gives a true (1 or HIGH) output when the number of true inputs is odd.
See Coding theory and XOR gate
ZIP (file format)
ZIP is an archive file format that supports lossless data compression.
See Coding theory and ZIP (file format)
Also known as Algebraic Coding Theory, Analog coding, Analog encryption, Channel code, Code theory, Frequency coding theory.