
Data integrity

Index Data integrity

Data integrity is the maintenance and assurance of the accuracy and consistency of data over its entire life-cycle; it is a critical aspect of the design, implementation, and usage of any system that stores, processes, or retrieves data. [1]

58 relations: ACID, Assertion (software development), Btrfs, Check constraint, Checksum, Clustered file system, Comparison of relational database management systems, Computing, Consistency model, Correctness (computer science), Corrosion, Cryptographic hash function, Damm algorithm, Data, Data corruption, Data quality, Data retention, Data security, Data validation, Database, Database schema, ECC memory, Electromechanics, Entity integrity, Exclusive or, Extended file system, Fatigue (material), File system, Food and Drug Administration, Foreign key, Forward error correction, G-force, Hash function, Human error, Information lifecycle management, ISO 13485, ISO 14155, JFS (file system), Luhn algorithm, Message authentication, Metadata, National Information Assurance Glossary, NTFS, Null (SQL), Primary key, Radiation hardening, RAID, Rationality, Redundancy (engineering), Referential integrity, Relational database, Safety-critical system, Software bug, Uninterruptible power supply, Unix File System, Watchdog timer, XFS, ZFS.

ACID

In computer science, ACID (Atomicity, Consistency, Isolation, Durability) is a set of properties of database transactions intended to guarantee validity even in the event of errors, power failures, etc.
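A minimal sketch of the atomicity property, using Python's standard sqlite3 module (SQLite provides ACID transactions); the table and account names are hypothetical illustration choices:

```python
import sqlite3

# Hypothetical example: an in-memory SQLite database. A simulated crash
# mid-transfer rolls back the whole transaction, so no partial update survives.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    with conn:  # the with-block is one atomic transaction
        conn.execute("UPDATE accounts SET balance = balance - 70 "
                     "WHERE name = 'alice'")
        raise RuntimeError("simulated crash before crediting bob")
except RuntimeError:
    pass  # the debit above was rolled back automatically

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # alice still has 100: atomicity preserved
```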


Assertion (software development)

In computer programming, an assertion is a statement that a predicate (Boolean-valued function, i.e. a true–false expression) is always true at that point in code execution.
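A small sketch of assertions as pre- and post-conditions; the function is a hypothetical example, not from the source:

```python
def checked_mean(values):
    # Pre-condition: the predicate must be true whenever execution reaches it
    assert len(values) > 0, "mean of an empty sequence is undefined"
    mean = sum(values) / len(values)
    # Post-condition: the mean always lies between the extremes of the data
    assert min(values) <= mean <= max(values)
    return mean

print(checked_mean([2, 4, 9]))  # 5.0
```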


Btrfs

Btrfs (pronounced as "butter fuss", "better F S", "butter F S", "b-tree F S", or simply by spelling it out) is a file system based on the copy-on-write (COW) principle, initially designed at Oracle Corporation for use in Linux.


Check constraint

A check constraint is a type of integrity constraint in SQL which specifies a requirement that must be met by each row in a database table.
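A sketch of a CHECK constraint using Python's sqlite3 module (SQLite supports CHECK); the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical example: every row must satisfy the CHECK predicate;
# a violating insert is rejected with an integrity error.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employees (
        name   TEXT NOT NULL,
        salary INTEGER CHECK (salary >= 0)  -- requirement each row must meet
    )
""")
conn.execute("INSERT INTO employees VALUES ('alice', 50000)")  # passes
try:
    conn.execute("INSERT INTO employees VALUES ('bob', -1)")   # violates CHECK
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```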


Checksum

A checksum is a small-sized datum derived from a block of digital data for the purpose of detecting errors which may have been introduced during its transmission or storage.
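A toy illustration of the idea, assuming a deliberately simple 8-bit additive checksum; real systems use stronger functions such as CRC32 or SHA-256, but the detection principle is the same:

```python
# Hypothetical toy checksum: the low 8 bits of the byte sum.
def checksum8(data: bytes) -> int:
    return sum(data) % 256

payload = b"hello world"
stored = checksum8(payload)            # transmitted alongside the data

corrupted = b"hellp world"             # one byte altered "in transit"
print(checksum8(payload) == stored)    # True
print(checksum8(corrupted) == stored)  # False: mismatch reveals corruption
```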


Clustered file system

A clustered file system is a file system which is shared by being simultaneously mounted on multiple servers.


Comparison of relational database management systems

The following tables compare general and technical information for a number of relational database management systems.


Computing

Computing is any goal-oriented activity requiring, benefiting from, or creating computers.


Consistency model

In computer science, consistency models are used in distributed systems such as distributed shared memory systems or distributed data stores (file systems, databases, optimistic replication systems, or Web caches).


Correctness (computer science)

In theoretical computer science, correctness of an algorithm is asserted when it is said that the algorithm is correct with respect to a specification.


Corrosion

Corrosion is a natural process, which converts a refined metal to a more chemically-stable form, such as its oxide, hydroxide, or sulfide.


Cryptographic hash function

A cryptographic hash function is a special class of hash function that has certain properties which make it suitable for use in cryptography.
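A minimal demonstration using SHA-256 from Python's standard hashlib: any change to the input yields a completely different fixed-size digest, which is what makes such functions useful for integrity checking.

```python
import hashlib

# SHA-256 digests of the original and a tampered message.
digest = hashlib.sha256(b"data integrity").hexdigest()
tampered = hashlib.sha256(b"data integrity!").hexdigest()

print(len(digest))         # 64 hex characters (256 bits), regardless of input size
print(digest != tampered)  # True: even a one-byte change alters the digest
```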


Damm algorithm

In error detection, the Damm algorithm is a check digit algorithm that detects all single-digit errors and all adjacent transposition errors.
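A sketch of the algorithm using the standard published order-10 quasigroup operation table; a string of digits is reduced to an interim digit, and a number is valid when the interim digit of the full string (including the check digit) is zero:

```python
# The totally anti-symmetric quasigroup table from Damm's published algorithm.
TABLE = [
    [0, 3, 1, 7, 5, 9, 8, 6, 4, 2],
    [7, 0, 9, 2, 1, 5, 4, 8, 6, 3],
    [4, 2, 0, 6, 8, 7, 1, 3, 5, 9],
    [1, 7, 5, 0, 9, 8, 3, 4, 2, 6],
    [6, 1, 2, 3, 0, 4, 5, 9, 7, 8],
    [3, 6, 7, 4, 2, 0, 9, 5, 8, 1],
    [5, 8, 6, 9, 7, 2, 0, 1, 3, 4],
    [8, 9, 4, 5, 3, 6, 2, 0, 1, 7],
    [9, 4, 3, 8, 6, 1, 7, 2, 0, 5],
    [2, 5, 8, 1, 4, 3, 6, 7, 9, 0],
]

def damm_check_digit(number: str) -> int:
    interim = 0
    for ch in number:
        interim = TABLE[interim][int(ch)]
    return interim

def damm_valid(number_with_check: str) -> bool:
    return damm_check_digit(number_with_check) == 0

print(damm_check_digit("572"))  # 4
print(damm_valid("5724"))       # True
print(damm_valid("5274"))       # False: adjacent transposition detected
```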


Data

Data is a set of values of qualitative or quantitative variables.


Data corruption

Data corruption refers to errors in computer data that occur during writing, reading, storage, transmission, or processing, which introduce unintended changes to the original data.


Data quality

Data quality refers to the condition of a set of values of qualitative or quantitative variables.


Data retention

Data retention defines the policies of persistent data and records management for meeting legal and business data archival requirements; although the terms are sometimes used interchangeably, it should not be confused with the Data Protection Act 1998.


Data security

Data security means protecting digital data, such as those in a database, from destructive forces and from the unwanted actions of unauthorized users, such as a cyberattack or a data breach.


Data validation

In computer science, data validation is the process of ensuring data have undergone data cleansing to ensure they have data quality, that is, that they are both correct and useful.


Database

A database is an organized collection of data, stored and accessed electronically.


Database schema

The database schema of a database system is its structure described in a formal language supported by the database management system (DBMS).


ECC memory

Error-correcting code memory (ECC memory) is a type of computer data storage that can detect and correct the most common kinds of internal data corruption.


Electromechanics

In engineering, electromechanics combines processes and procedures drawn from electrical engineering and mechanical engineering.


Entity integrity

Entity integrity is concerned with ensuring that each row of a table has a unique and non-null primary key value; this is the same as saying that each row in a table represents a single instance of the entity type modelled by the table.
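A sketch of the uniqueness half of entity integrity, using Python's sqlite3 module with a hypothetical table: a second row with the same primary key value is rejected.

```python
import sqlite3

# Hypothetical example: duplicate primary key values violate entity integrity
# and are rejected by the database engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")
try:
    conn.execute("INSERT INTO users VALUES (1, 'bob')")  # duplicate key
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```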


Exclusive or

Exclusive or or exclusive disjunction is a logical operation that outputs true only when inputs differ (one is true, the other is false).
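A short illustration of why XOR matters for integrity: applying it twice with the same operand is the identity, which is the basis of parity checks and parity-based reconstruction.

```python
# XOR is true only where the inputs differ, and it is its own inverse.
a, b = 0b1010, 0b0110
parity = a ^ b           # 0b1100: the differing bit positions
print(bin(parity))
print(parity ^ b == a)   # True: XORing again recovers the other operand
print(parity ^ a == b)   # True
```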


Extended file system

The extended file system, or ext, was implemented in April 1992 as the first file system created specifically for the Linux kernel.


Fatigue (material)

In materials science, fatigue is the weakening of a material caused by repeatedly applied loads.


File system

In computing, a file system or filesystem controls how data is stored and retrieved.


Food and Drug Administration

The Food and Drug Administration (FDA or USFDA) is a federal agency of the United States Department of Health and Human Services, one of the United States federal executive departments.


Foreign key

In the context of relational databases, a foreign key is a field (or collection of fields) in one table that uniquely identifies a row of another table or the same table.
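A sketch of foreign key enforcement using Python's sqlite3 module; the schema is hypothetical, and note that SQLite only enforces foreign keys when the pragma is enabled:

```python
import sqlite3

# Hypothetical example: a row referencing a non-existent parent is rejected.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires this opt-in
conn.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY)")
conn.execute("""
    CREATE TABLE books (
        title     TEXT,
        author_id INTEGER REFERENCES authors(id)
    )
""")
conn.execute("INSERT INTO authors VALUES (1)")
conn.execute("INSERT INTO books VALUES ('Valid', 1)")  # referenced row exists
try:
    conn.execute("INSERT INTO books VALUES ('Orphan', 99)")  # no author 99
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```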


Forward error correction

In telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.
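A sketch of the simplest form of forward error correction, a triple-repetition code with majority-vote decoding; real FEC codes (Hamming, Reed–Solomon, LDPC) are far more efficient, but the principle of correcting errors without retransmission is the same:

```python
# Hypothetical toy FEC: send each bit three times, decode by majority vote,
# which corrects any single bit error per group.
def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

message = [1, 0, 1]
sent = encode(message)          # [1,1,1, 0,0,0, 1,1,1]
sent[1] ^= 1                    # the channel flips one bit
print(decode(sent) == message)  # True: the error was corrected
```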


G-force

The gravitational force, or more commonly, g-force, is a measurement of the type of acceleration that causes a perception of weight.


Hash function

A hash function is any function that can be used to map data of arbitrary size to data of a fixed size.
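A toy illustration, assuming a simple polynomial hash (the same shape of function as Java's String.hashCode); it maps input of any length into a fixed 32-bit range:

```python
# Hypothetical toy hash: base-31 polynomial over the bytes, reduced mod 2^32.
def hash32(data: bytes) -> int:
    h = 0
    for byte in data:
        h = (h * 31 + byte) % 2**32
    return h

print(hash32(b"data"))
print(hash32(b"a much longer input still maps into the same fixed range"))
```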


Human error

Human error has been cited as a primary cause or contributing factor in disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation (see pilot error), space exploration (e.g., the Space Shuttle Challenger and Space Shuttle Columbia disasters), and medicine (see medical error).


Information lifecycle management

Information lifecycle management (ILM) refers to strategies for administering storage systems on computing devices.


ISO 13485

ISO 13485 Medical devices -- Quality management systems -- Requirements for regulatory purposes is an International Organization for Standardization (ISO) standard published for the first time in 1996; it represents the requirements for a comprehensive quality management system for the design and manufacture of medical devices.


ISO 14155

ISO 14155, Clinical investigation of medical devices for human subjects -- Good clinical practice, is an international standard that addresses good clinical practice for the design, conduct, recording and reporting of clinical investigations carried out in human subjects to assess the safety and performance of medical devices for regulatory purposes.


JFS (file system)

Journaled File System or JFS is a 64-bit journaling file system created by IBM.


Luhn algorithm

The Luhn algorithm or Luhn formula, also known as the "modulus 10" or "mod 10" algorithm, is a simple checksum formula used to validate a variety of identification numbers, such as credit card numbers, IMEI numbers, National Provider Identifier numbers in the United States, Canadian Social Insurance Numbers, Israel ID Numbers and Greek Social Security Numbers (ΑΜΚΑ).
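A compact sketch of the mod 10 check: starting from the rightmost digit, every second digit is doubled (subtracting 9 when the result exceeds 9), and the number is valid when the digit sum is divisible by 10. The test number below is the commonly used Luhn example value.

```python
def luhn_valid(number: str) -> bool:
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:   # every second digit from the right
            d *= 2
            if d > 9:    # equivalent to summing the two digits of d
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))  # True
print(luhn_valid("79927398710"))  # False
```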


Message authentication

In information security, message authentication or data origin authentication is a property that a message has not been modified while in transit (data integrity) and that the receiving party can verify the source of the message.


Metadata

Metadata is "data that provides information about other data".


National Information Assurance Glossary

The National Information Assurance Glossary, published as a Committee on National Security Systems Instruction, defines terms used in U.S. information assurance.


NTFS

NTFS (New Technology File System) is a proprietary file system developed by Microsoft.


Null (SQL)

Null (or NULL) is a special marker used in Structured Query Language to indicate that a data value does not exist in the database.
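A short demonstration via Python's sqlite3 module of why NULL is a marker rather than a value: comparing NULL with `=` yields NULL (unknown) rather than true, so SQL provides `IS NULL` instead.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# NULL = NULL evaluates to NULL (unknown), surfaced in Python as None.
print(conn.execute("SELECT NULL = NULL").fetchone()[0])   # None, not True
# IS NULL is the correct test for a missing value.
print(conn.execute("SELECT NULL IS NULL").fetchone()[0])  # 1 (true)
```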


Primary key

In the relational model of databases, a primary key is a specific choice of a minimal set of attributes (columns) that uniquely specify a tuple (row) in a relation (table).


Radiation hardening

Radiation hardening is the act of making electronic components and systems resistant to damage or malfunctions caused by ionizing radiation (particle radiation and high-energy electromagnetic radiation), such as those encountered in outer space and high-altitude flight, around nuclear reactors and particle accelerators, or during nuclear accidents or nuclear warfare.


RAID

RAID (Redundant Array of Independent Disks, originally Redundant Array of Inexpensive Disks) is a data storage virtualization technology that combines multiple physical disk drive components into one or more logical units for the purposes of data redundancy, performance improvement, or both.
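A toy sketch of the parity idea behind RAID levels such as RAID 5: the XOR of the data blocks is stored as a parity block, so any single lost block can be rebuilt from the survivors. The block contents here are hypothetical.

```python
# Hypothetical RAID-style parity over two data "disks".
def xor_blocks(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

disk1 = b"\x01\x02\x03"
disk2 = b"\x0a\x0b\x0c"
parity = xor_blocks(disk1, disk2)      # written to a third disk

recovered = xor_blocks(parity, disk2)  # disk1 fails; rebuild it
print(recovered == disk1)              # True
```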


Rationality

Rationality is the quality or state of being rational – that is, being based on or agreeable to reason.


Redundancy (engineering)

In engineering, redundancy is the duplication of critical components or functions of a system with the intention of increasing reliability of the system, usually in the form of a backup or fail-safe, or to improve actual system performance, such as in the case of GNSS receivers, or multi-threaded computer processing.


Referential integrity

Referential integrity is a property of data stating references within it are valid.


Relational database

A relational database is a digital database based on the relational model of data, as proposed by E. F. Codd in 1970.


Safety-critical system

A safety-critical system or life-critical system is a system whose failure or malfunction may result in death or serious injury to people, loss of or severe damage to equipment or property, or environmental harm.


Software bug

A software bug is an error, flaw, failure or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways.


Uninterruptible power supply

An uninterruptible power supply or uninterruptible power source (UPS) is an electrical apparatus that provides emergency power to a load when the input power source or mains power fails.


Unix File System

The Unix file system (UFS; also called the Berkeley Fast File System, the BSD Fast File System or FFS) is a file system supported by many Unix and Unix-like operating systems.


Watchdog timer

A watchdog timer (sometimes called a computer operating properly or COP timer, or simply a watchdog) is an electronic timer that is used to detect and recover from computer malfunctions.


XFS

XFS is a high-performance 64-bit journaling file system created by Silicon Graphics, Inc (SGI) in 1993.


ZFS

ZFS is a combined file system and logical volume manager designed by Sun Microsystems and now owned by Oracle Corporation.


Redirects here:

Database integrity, Domain integrity, Integrity constraint, Integrity constraints, Integrity protection, User-defined integrity.

References

[1] https://en.wikipedia.org/wiki/Data_integrity
