49 relations: Adler-32, Algorithm, Analysis of algorithms, Authentication, Bank account, BSD checksum, Byte, Check digit, Cksum, Computer data storage, Cryptographic hash function, Cryptographic primitive, Cyclic redundancy check, Damm algorithm, Data, Data degradation, Data integrity, Digital data, Error correction code, Error detection and correction, Exclusive or, File verification, Fingerprint (computing), Fletcher's checksum, Frame check sequence, Gematria, Hamming code, Hash function, HMAC, IPv4 header checksum, Isopsephy, List of hash functions, Longitudinal redundancy check, Luhn algorithm, Md5sum, Parchive, Parity bit, Randomization function, Rolling hash, SAE J1708, Sha1sum, Social Security number, Sum (Unix), SYSV checksum, Telecommunication, Two's complement, Verhoeff algorithm, Word (computer architecture), ZFS.
Adler-32 is a checksum algorithm which was invented by Mark Adler in 1995, and is a modification of the Fletcher checksum.
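Adler-32 can be sketched in a few lines: it keeps two running sums modulo 65521 (the largest prime below 2^16) and packs them into a 32-bit result. This is a minimal illustrative version; the standard library's `zlib.adler32` computes the same value.

```python
def adler32(data: bytes) -> int:
    # Two running sums modulo 65521, the largest prime below 2**16.
    MOD = 65521
    a, b = 1, 0
    for byte in data:
        a = (a + byte) % MOD   # simple sum of bytes (seeded with 1)
        b = (b + a) % MOD      # sum of the running sums: position-dependent
    return (b << 16) | a       # high 16 bits: b, low 16 bits: a
```

The second sum `b` is what distinguishes Adler-32 from a plain byte sum: it makes the checksum sensitive to byte order, as in Fletcher's checksum.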
In mathematics and computer science, an algorithm is an unambiguous specification of how to solve a class of problems.
In computer science, the analysis of algorithms is the determination of the computational complexity of algorithms, that is the amount of time, storage and/or other resources necessary to execute them.
Authentication (from Greek αὐθεντικός authentikos, "real, genuine", from αὐθέντης authentes, "author") is the act of confirming the truth of an attribute of a single piece of data claimed true by an entity.
A bank account is a financial account maintained by a bank for a customer.
The BSD checksum algorithm is a commonly used, legacy checksum algorithm.
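The BSD algorithm maintains a 16-bit checksum that is rotated right by one bit before each byte is added, which a plain sum would not do. A minimal sketch of that core loop (ignoring the block-count output of the real `sum` utility):

```python
def bsd_checksum(data: bytes) -> int:
    # Rotate the 16-bit checksum right by one bit, then add the next byte.
    checksum = 0
    for byte in data:
        checksum = (checksum >> 1) | ((checksum & 1) << 15)  # 16-bit rotate right
        checksum = (checksum + byte) & 0xFFFF                # add byte, keep 16 bits
    return checksum
```

The rotation makes the result depend on byte positions, unlike a simple modular sum.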
The byte is a unit of digital information that most commonly consists of eight bits, representing a binary number.
A check digit is a form of redundancy check used for error detection on identification numbers, such as bank account numbers, which are used in an application where they will at least sometimes be input manually.
cksum is a command in Unix-like operating systems that generates a checksum value for a file or stream of data.
Computer data storage, often called storage or memory, is a technology consisting of computer components and recording media that are used to retain digital data.
A cryptographic hash function is a special class of hash function that has certain properties which make it suitable for use in cryptography.
Cryptographic primitives are well-established, low-level cryptographic algorithms that are frequently used to build cryptographic protocols for computer security systems.
A cyclic redundancy check (CRC) is an error-detecting code commonly used in digital networks and storage devices to detect accidental changes to raw data.
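A CRC treats the message as a polynomial over GF(2) and outputs the remainder of dividing it by a fixed generator polynomial. A minimal bitwise sketch of CRC-32 with the reflected polynomial 0xEDB88320 (the variant used by zlib, PNG, and Ethernet); production code would use a lookup table instead:

```python
def crc32(data: bytes) -> int:
    # Bit-by-bit CRC-32, reflected polynomial 0xEDB88320.
    crc = 0xFFFFFFFF              # standard initial value
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # If the low bit is set, shift and XOR in the polynomial.
            crc = (crc >> 1) ^ (0xEDB88320 if crc & 1 else crc >> 1 ^ crc >> 1)
            # (equivalent to: crc = (crc >> 1) ^ poly if lsb else crc >> 1)
    return crc ^ 0xFFFFFFFF       # final XOR
```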
In error detection, the Damm algorithm is a check digit algorithm that detects all single-digit errors and all adjacent transposition errors.
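The Damm algorithm folds each digit into an interim value through a totally anti-symmetric quasigroup of order 10; a number is valid when the final interim value is 0. A sketch using the quasigroup table from Damm's work (the table below is the commonly published one; verify against an authoritative source before relying on it):

```python
# Order-10 totally anti-symmetric quasigroup (commonly published Damm table).
DAMM_TABLE = [
    [0, 3, 1, 7, 5, 9, 8, 6, 4, 2],
    [7, 0, 9, 2, 1, 5, 4, 8, 6, 3],
    [4, 2, 0, 6, 8, 7, 1, 3, 5, 9],
    [1, 7, 5, 0, 9, 8, 3, 4, 2, 6],
    [6, 1, 2, 3, 0, 4, 5, 9, 7, 8],
    [3, 6, 7, 4, 2, 0, 9, 5, 8, 1],
    [5, 8, 6, 9, 7, 2, 0, 1, 3, 4],
    [8, 9, 4, 5, 3, 6, 2, 0, 1, 7],
    [9, 4, 3, 8, 6, 1, 7, 2, 0, 5],
    [2, 5, 8, 1, 4, 3, 6, 7, 9, 0],
]

def damm_interim(number: str) -> int:
    # Fold every digit through the quasigroup; 0 at the end means valid.
    interim = 0
    for ch in number:
        interim = DAMM_TABLE[interim][int(ch)]
    return interim
```

For "572" the interim value is 4, so the check digit is 4, and "5724" folds back to 0.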
Data is a set of values of qualitative or quantitative variables.
Data degradation is the gradual corruption of computer data due to an accumulation of non-critical failures in a data storage device.
Data integrity is the maintenance of, and the assurance of the accuracy and consistency of, data over its entire life-cycle, and is a critical aspect to the design, implementation and usage of any system which stores, processes, or retrieves data.
Digital data, in information theory and information systems, is the discrete, discontinuous representation of information or works.
In computing, telecommunication, information theory, and coding theory, an error correction code or error correcting code (ECC) is used for controlling errors in data over unreliable or noisy communication channels.

In information theory and coding theory with applications in computer science and telecommunication, error detection and correction or error control are techniques that enable reliable delivery of digital data over unreliable communication channels.
Exclusive or or exclusive disjunction is a logical operation that outputs true only when inputs differ (one is true, the other is false).
File verification is the process of using an algorithm for verifying the integrity of a computer file.
In computer science, a fingerprinting algorithm is a procedure that maps an arbitrarily large data item (such as a computer file) to a much shorter bit string, its fingerprint, that uniquely identifies the original data for all practical purposes.
The Fletcher checksum is an algorithm for computing a position-dependent checksum devised by John G. Fletcher (1934–2012) at Lawrence Livermore Labs in the late 1970s.
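The 16-bit variant, Fletcher-16, works like Adler-32 but with both sums taken modulo 255. A minimal sketch:

```python
def fletcher16(data: bytes) -> int:
    # Two running sums modulo 255; the second sum makes the
    # checksum depend on byte positions, not just byte values.
    sum1 = sum2 = 0
    for byte in data:
        sum1 = (sum1 + byte) % 255
        sum2 = (sum2 + sum1) % 255
    return (sum2 << 8) | sum1
```

Swapping two bytes changes `sum2` even though `sum1` stays the same, which is what makes the checksum position-dependent.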
A frame check sequence (FCS) refers to the extra error-detecting code added to a frame in a communications protocol.
Gematria (גמטריא, plural gematriot) originated as an Assyro-Babylonian-Greek system of alphanumeric code or cipher later adopted into Jewish culture that assigns numerical value to a word, name, or phrase in the belief that words or phrases with identical numerical values bear some relation to each other or bear some relation to the number itself as it may apply to Nature, a person's age, the calendar year, or the like.
In telecommunication, Hamming codes are a family of linear error-correcting codes.
A hash function is any function that can be used to map data of arbitrary size to data of a fixed size.
In cryptography, an HMAC (sometimes expanded as either keyed-hash message authentication code or hash-based message authentication code) is a specific type of message authentication code (MAC) involving a cryptographic hash function and a secret cryptographic key.
The IPv4 header checksum is a simple checksum used in version 4 of the Internet Protocol (IPv4) to protect the header of IPv4 data packets against data corruption.
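The IPv4 header checksum is the one's complement of the one's-complement sum of the header's 16-bit words (with the checksum field treated as zero while computing). A sketch following RFC 1071's fold-the-carry approach:

```python
def ipv4_checksum(header: bytes) -> int:
    # One's-complement sum of 16-bit big-endian words, then complement.
    if len(header) % 2:
        header += b"\x00"  # pad odd-length input to whole 16-bit words
    total = 0
    for i in range(0, len(header), 2):
        total += (header[i] << 8) | header[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold carry back in
    return ~total & 0xFFFF
```

For the textbook header `4500 0073 0000 4000 4011 0000 c0a8 0001 c0a8 00c7` (checksum field zeroed), this yields 0xB861.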
Isopsephy (ἴσος isos meaning "equal" and ψῆφος psephos meaning "pebble") or isopsephism is the practice of adding up the number values of the letters in a word to form a single number.
This is a list of hash functions, including cyclic redundancy checks, checksum functions, and cryptographic hash functions.
In telecommunication, a longitudinal redundancy check (LRC), or horizontal redundancy check, is a form of redundancy check that is applied independently to each of a parallel group of bit streams.
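One common byte-wise form of LRC (used, for example, in some serial protocols) is the XOR of all bytes: each bit of the result is the parity of that bit position across the whole stream. A minimal sketch of that variant (other LRC definitions use a modular sum instead):

```python
def lrc_xor(data: bytes) -> int:
    # XOR all bytes: bit i of the result is the parity of bit i
    # across every byte in the stream.
    result = 0
    for byte in data:
        result ^= byte
    return result
```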
The Luhn algorithm or Luhn formula, also known as the "modulus 10" or "mod 10" algorithm, is a simple checksum formula used to validate a variety of identification numbers, such as credit card numbers, IMEI numbers, National Provider Identifier numbers in the United States, Canadian Social Insurance Numbers, Israel ID Numbers and Greek Social Security Numbers (ΑΜΚΑ).
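The Luhn check doubles every second digit from the right, subtracts 9 from any doubled digit above 9, and requires the total to be a multiple of 10. A minimal validator:

```python
def luhn_valid(number: str) -> bool:
    # Double every second digit from the right; a doubled digit
    # above 9 has 9 subtracted (equivalent to summing its digits).
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

The classic test number "79927398713" validates; changing any single digit breaks it, which is the error class the formula is designed to catch.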
md5sum is a computer program that calculates and verifies 128-bit MD5 hashes, as described in RFC 1321.
Parchive (a portmanteau of parity archive, and formally known as Parity Volume Set Specification) is an erasure code system that produces par files for checksum verification of data integrity, with the capability to perform data recovery operations that can repair or regenerate corrupted or missing data.
A parity bit, or check bit, is a bit added to a string of binary code to ensure that the total number of 1-bits in the string is even or odd.
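Computing a parity bit amounts to counting 1-bits: with even parity the added bit makes the total count of 1s even, with odd parity it makes the count odd. A minimal sketch:

```python
def parity_bit(data: bytes, even: bool = True) -> int:
    # Count the 1-bits; the parity bit is whatever value makes the
    # total even (even parity) or odd (odd parity).
    ones = sum(bin(byte).count("1") for byte in data)
    return (ones % 2) if even else (ones % 2) ^ 1
```

Equivalently, the even-parity bit is the XOR of all the data bits.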
In computer science, a randomization function or randomizing function is an algorithm or procedure that implements a randomly chosen function between two specific sets, suitable for use in a randomized algorithm.
A rolling hash (also known as recursive hashing or rolling checksum) is a hash function where the input is hashed in a window that moves through the input.
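The defining property of a rolling hash is that sliding the window one byte forward costs O(1): remove the outgoing byte's contribution, then mix in the incoming byte. A sketch of a Rabin-Karp-style polynomial rolling hash (the base and modulus here are illustrative choices, not from any standard):

```python
class RollingHash:
    # Polynomial hash over a fixed-size window; BASE and MOD are
    # illustrative constants, not from any particular specification.
    BASE, MOD = 256, 1_000_003

    def __init__(self, window: bytes):
        self.size = len(window)
        self.value = 0
        for byte in window:
            self.value = (self.value * self.BASE + byte) % self.MOD
        # Weight of the window's oldest byte, precomputed so
        # removing it on each slide costs O(1).
        self.msb_weight = pow(self.BASE, self.size - 1, self.MOD)

    def slide(self, outgoing: int, incoming: int) -> int:
        self.value = (self.value - outgoing * self.msb_weight) % self.MOD
        self.value = (self.value * self.BASE + incoming) % self.MOD
        return self.value
```

Sliding from `b"abc"` to `b"bcd"` via `slide(ord("a"), ord("d"))` yields the same hash as rehashing `b"bcd"` from scratch, which is the property rsync-style tools and string search exploit.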
SAE J1708 is a standard used for serial communications between ECUs on a heavy-duty vehicle, and also between a computer and the vehicle.
sha1sum is a computer program that calculates and verifies SHA-1 hashes.
In the United States, a Social Security number (SSN) is a nine-digit number issued to U.S. citizens, permanent residents, and temporary (working) residents under section 205(c)(2) of the Social Security Act.
sum is a core utility available on all Unix and Linux distributions.
The SYSV checksum algorithm is a commonly used, legacy checksum algorithm.
Telecommunication is the transmission of signs, signals, messages, words, writings, images and sounds or information of any nature by wire, radio, optical or other electromagnetic systems.
Two's complement is a mathematical operation on binary numbers, best known for its role in computing as a method of signed number representation.
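In n-bit two's complement, the bit pattern of a negative value -x is 2^n - x, which is also bitwise NOT plus one. A one-line sketch showing how Python's arbitrary-precision integers map onto a fixed-width two's-complement pattern:

```python
def to_twos_complement(x: int, bits: int = 8) -> int:
    # Masking with 2**bits - 1 yields the n-bit two's-complement
    # bit pattern: -1 -> 0xFF, -5 -> 0xFB for bits=8.
    return x & ((1 << bits) - 1)
```

This is why checksums computed with two's-complement arithmetic can ignore sign entirely: addition modulo 2^n gives the same bit patterns either way.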
The Verhoeff algorithm is a checksum formula for error detection developed by the Dutch mathematician Jacobus Verhoeff and was first published in 1969.
In computing, a word is the natural unit of data used by a particular processor design.
ZFS is a combined file system and logical volume manager designed by Sun Microsystems and now owned by Oracle Corporation.