Information theory

Information theory studies the quantification, storage, and communication of information. [1]

794 relations: A Mathematical Theory of Communication, Aaron D. Wyner, Abiogenesis, Abraham Lempel, Abraham Moles, Accelerando, Active listening, Additive white Gaussian noise, Adjusted mutual information, Aesthetics, Akaike information criterion, Alcatel-Lucent, Aleksandr Khinchin, Alex Hankey, Alfréd Rényi, Algorithmic cooling, Algorithmic information theory, Alignment-free sequence analysis, Alternating bit protocol, Andrew Targowski, Antiscience, Antony Garrett Lisi, Apoorva D. Patel, April 1916, Arbitrarily varying channel, Archives and Museum Informatics, Arieh Ben-Naim, Arnold Zellner, Artificial intelligence, Artificial intuition, Arturo Carsetti, Ascendency, Asher Peres, Association for Logic, Language and Information, Asymptotic equipartition property, Athanasios Papoulis, Babak Hassibi, Bachelor of Computer Science, Background noise, Backus–Naur form, Bandwidth (signal processing), Bar product, Bayesian estimation of templates in computational anatomy, Bayesian model of computational anatomy, Belief propagation, Bell Labs, Bell System Technical Journal, Benjamin Schumacher, Benjamin Weiss, Benoit Mandelbrot, ..., Bible code, Biclustering, Binary entropy function, Binary erasure channel, Binary logarithm, Binary symmetric channel, Binary tree, Bioinformatics, Biological network inference, Biomedical cybernetics, Bit, Bit error rate, Blackwell channel, Blahut–Arimoto algorithm, Blind signal separation, Block code, Brain, Branches of science, Brenda McCowan, Broadcasting (networking), Bursting, Business chess, Cable modem, Calcium encoding, Calvin Mooers, Carlo Kopp, Category utility, Channel capacity, Chaos theory, Charles H. Bennett (computer scientist), Cheung–Marks theorem, Chow–Liu tree, Chris Adami, Chris Mann (composer), Chris Wallace (computer scientist), Chunking (psychology), Claude E. Shannon Award, Claude Shannon, Cluster analysis, Cluster labeling, Code, Code rate, CODEC Acceleration, Coding theory, Coding tree unit, Cognitive bias, Cognitive philology, Cognitive psychology, Coherent information, Collisionless, Combinatorics, Communication, Communication channel, Communication source, Communication theory, Communication Theory of Secrecy Systems, Comparison sort, Complex random variable, Complex system, Complex systems biology, Complexity, Component video, Computational anatomy, Computational learning theory, Computational neuroscience, Computational science, Computational semiotics, Computational sociology, Computationally bounded adversary, Computer performance, Computer scientist, Concept learning, Conditional entropy, Conditional mutual information, Conditional quantum entropy, Conduit metaphor, Conference on Neural Information Processing Systems, Confidence, Confirmation bias, Conservatism (belief revision), Constraint (information theory), Constructor theory, Consumer education, Contingency table, Cross entropy, Cryptographically secure pseudorandom number generator, Cryptography, Cybernetical physics, Cybernetics, Cybernetics and Human Knowing, Cybernetics: Or Control and Communication in the Animal and the Machine, Damerau–Levenshtein distance, Danny Hillis, Data compression, Data conversion, Data differencing, Data processing inequality, Data synchronization, Data transmission, Dave Forney, David A. Huffman, David Blackwell, David Chalmers, David J. C. 
MacKay, David Wolpert, Deaths in January 2016, Decision tree learning, Decoding Reality, Decoding the Universe, Deletion channel, Descriptive knowledge, Determining the number of clusters in a data set, Dialectical materialism, Differential entropy, Differential geometry, Digital data, Digital philosophy, Digital physics, Digital signal processing, Dimensionality reduction, Directed information, Dirty paper coding, Discrete mathematics, Discrete Universal Denoiser, Distributed source coding, Distributive property, DNA binding site, Document dump, Domon group, Don Byrd, Donald MacCrimmon MacKay, Dual total correlation, Dutton Speedwords, Dynamic spectrum management, Dynamical neuroscience, Earth–Moon–Earth communication, Earthscore, Econophysics, Ecosystem model, Ed Posner, Edgar Morin, Educational toy, Edward Kofler, Edward Linfoot, Edwin Thompson Jaynes, Effective action, Efficient coding hypothesis, Elwyn Berlekamp, Encyclopedia of Cryptography and Security, Encyclopedia of Cybernetics, Engineer, Engineering physics, Ensemble average (statistical mechanics), Entropic uncertainty, Entropic vector, Entropy, Entropy (arrow of time), Entropy (journal), Entropy (statistical thermodynamics), Entropy and life, Entropy compression, Entropy encoding, Entropy in thermodynamics and information theory, Entropy of mixing, Entropy power inequality, Environment (systems), Erasure channel, Eric Lander, Error correction code, Error detection and correction, Error exponent, Error-correcting codes with feedback, Erwin Lutwak, Estimation theory, Etienne Vermeersch, Evolution of sexual reproduction, Exergy, Exformation, External memory algorithm, Extreme physical information, Fano's inequality, Fazlollah Reza, Feature Selection Toolbox, Fiber-optic communication, First-move advantage in chess, First-order inductive learner, Fisher information, Fisher information metric, Flemming Topsøe, Flow network, Forest informatics, Formal science, Formation matrix, Forward error correction, Foundations and Trends in Communications and Information Theory, Fourier–Motzkin elimination, Fred Dretske, Frederick Jelinek, Freedman's paradox, Frieder Nake, Functional decomposition, Functional load, Fungible information, Gabor wavelet, Gambling and information theory, Gary Anderson (designer), Gaylord, Michigan, Generalized distributive law, Generalized entropy index, Generalized relative entropy, Generative art, Generative science, Genetic code, George Armitage Miller, George Gilder, George Klir, Gibbs' inequality, Gleason's theorem, Glossary of electrical and electronics engineering, Golomb ruler, GOR method, Gottfried Ungerboeck, Gottfried Wilhelm Leibniz Prize, Grammatical Man, Graph coloring, Graph entropy, Gregory Raleigh, Group testing, Groupe des Dix, H. K. Kesavan, H1 neuron, Hamming distance, Hamming weight, Hanns Malissa, Hans Grassmann, Hard-core predicate, Harold Pender Award, Harry Nyquist, Harvey Jerome Brudner, Hendrik C. Ferreira, Hendrik Wade Bode, Henri Atlan, Henri Daniel Rathgeber, Henri Lefebvre, Henry Earl Singleton, Henry Landau, Henry O. 
Pollak, Henry Quastler, Hick's law, Hideki Imai, His Master's Voice (novel), History of artificial intelligence, History of communication studies, History of computer science, History of entropy, History of information theory, History of machine translation, History of mathematics, History of molecular biology, History of randomness, History of science, History of the Internet, History of thermodynamics, Holographic principle, Homeostasis, Horizontal correlation, Hubert Yockey, Huffman coding, Human performance modeling, Hypnosis, Ideal observer analysis, Ideal tasks, IEEE International Symposium on Information Theory, IEEE Transactions on Information Theory, IFISC, Ilan Sadeh, Illusory superiority, IMA Journal of Mathematical Control and Information, Implicit data structure, Imre Csiszár, Income inequality metrics, Incomplete Nature, Index of electrical engineering articles, Index of information theory articles, Index of optics articles, Index of philosophy articles (I–Q), Inductive probability, Inequalities in information theory, Info-metrics, Informatics, Information, Information (disambiguation), Information Age, Information algebra, Information bottleneck method, Information capital, Information diagram, Information dimension, Information distance, Information exchange, Information flow (disambiguation), Information flow (information theory), Information gain in decision trees, Information history, Information processor, Information projection, Information revolution, Information science, Information theory and measure theory, Information-theoretic death, Information-theoretic security, Ingleton's inequality, Inquiry, Intelligent design, Interactive media, International Conference on Information Processing in Sensor Networks, Iran Workshop on Communication and Information Theory, Irving S. Reed, István Vincze (mathematician), Ivan Sutherland, Jack Wolf, Jacob Wolfowitz, Jaikumar Radhakrishnan, James A. Krumhansl, James Massey, James Tenney, János Aczél (mathematician), Jim K. Omura, Joachim Hagenauer, John Larry Kelly Jr., John Preskill, John R. Pierce, John Scales Avery, John Wozencraft, Joint entropy, Joint quantum entropy, Joint source and channel coding, Jorma Rissanen, Joseph Kampé de Fériet, Josiah Willard Gibbs, Juan Gualterio Roederer, Kadir–Brady saliency detector, Katalin Marton, Kazimierz Urbanik, Kees Schouhamer Immink, Keith Martin Ball, Kenneth A. Loparo, Kenneth M. Sayre, Key size, Klaus Scherrer, Knowledge management, Knowledge retrieval, Kraft–McMillan inequality, Krichevsky–Trofimov estimator, Kullback's inequality, Landauer's principle, Large deviations theory, Lawrence J. 
Fogel, Léon Brillouin, Learning, Leo Szilard, Leonard Schulman, Leonid Levin, Levenshtein distance, Library of Congress Classification:Class Q -- Science, Library science, Limiting density of discrete points, Linear programming decoding, Linear response function, List of academic fields, List of African-American inventors and scientists, List of amateur radio modes, List of awards named after people, List of cognitive biases, List of Columbia University alumni, List of Columbia University alumni and attendees, List of computer scientists, List of cryptographers, List of game theorists, List of IEEE awards, List of important publications in computer science, List of important publications in cryptography, List of important publications in theoretical computer science, List of inequalities, List of Internet pioneers, List of mathematical theories, List of people considered father or mother of a scientific field, List of people from South Orange, New Jersey, List of people in systems and control, List of pioneers in computer science, List of quantitative analysts, List of Queens College people, List of Russian mathematicians, List of Russian scientists, List of social psychologists, List of statistics articles, List of Swedish Americans, List of theorems, List of University of Michigan alumni, List of University of North Dakota people, List of University of Utah people, List of unsolved problems in information theory, Lists of mathematics topics, Lloyd R. Welch, Lloyd's algorithm, Log probability, Log sum inequality, Logarithm, Logarithmic scale, Logarithmic Schrödinger equation, Logic of information, Lossless compression, Lossy compression, Low-density parity-check code, Lp space, Luciano Floridi, Ludwig von Bertalanffy, Macy conferences, Many-worlds interpretation, Map communication model, Marcel J. E. Golay, Marcel-Paul Schützenberger, Marcin Schroeder, Marcus Hutter, Mark Henry Hansen, Mark Semenovich Pinsker, Markov chain, Massachusetts Institute of Technology, Master of Science in Business Analytics, Mathematical beauty, Mathematical constant, Mathematical methods in electronics, Mathematical psychology, Mathematics, Mathematics of radio engineering, Max Bense, Maximum entropy, Maximum entropy probability distribution, Maximum entropy spectral estimation, Maximum entropy thermodynamics, Mean field particle methods, Meaning (semiotics), Measurement, Medford, Massachusetts, Media studies, Melodic expectation, Mental chronometry, Mesoeconomics, Middle European Cooperation in Statistical Physics, MIMO, Min entropy, Mind, Minimum description length, Minimum Fisher information, Minimum message length, Minivac 601, MIT Electrical Engineering and Computer Science Department, Models of collaborative tagging, Mohammad Reza Aref, Mooney Face Test, Morse code, Most frequent k characters, MPEG-1, MRK (visual artist), Multiple description coding, Multivariate mutual information, Music learning theory, Mutual information, N-gram, Nariman Farvardin, Nathaniel Rochester (computer scientist), Negentropy, Neural network, Nicolas J. 
Cerf, Ninoslav Marina, Noise (signal processing), Noisy-channel coding theorem, Noisy-storage model, Norbert Wiener, Norm (mathematics), Nucleic acid design, Numbers (season 2), Objections to evolution, Observer, Observer (special relativity), Occam's razor, Occupations in electrical/electronics engineering, Olivier Costa de Beauregard, One-time pad, Open system (systems theory), Optimal design, Orange Poodle, Ordinal data, Oregon State University, Outage probability, Outline of academic disciplines, Outline of automation, Outline of communication, Outline of computing, Outline of discrete mathematics, Outline of electrical engineering, Outline of mathematics, Outline of radio science, Outline of science, Parity (mathematics), Partition function (mathematics), Partition function (statistical mechanics), Password strength, Paul Vitányi, Pedometric mapping, Pedro Crespo, Perceptual paradox, Perplexity, Peter Elias, Peter Franaszek, Petoskey, Michigan, Phi Kappa Phi, Philip Woodward, Philosophy of information, Philosophy of thermal and statistical physics, Physical information, Pinsker's inequality, Plan, Planar separator theorem, Point estimation, Pointwise mutual information, Polar code (coding theory), Positive-definite kernel, Post-industrial society, Pragmatic theory of information, Prediction in language comprehension, Principle of maximum entropy, Prior probability, Probabilistic method, Problem domain, Protein structure prediction, Prototype theory, Pseudorandomness, Quantities of information, Quantum channel, Quantum entanglement, Quantum information, Quantum key distribution, Quantum reference frame, Quantum teleportation, R. Luke DuBois, Rafael Capurro, Ralph Hartley, Ramon Margalef, Randal A. Koene, Randall Dougherty, Randomness, Rate–distortion theory, Ray C. Dougherty, Rényi entropy, Receiver (information theory), Reductionism, Redundancy (information theory), Reinforcement learning, Relational quantum mechanics, Relational theory, Relay channel, Research Institute of Computer Science and Random Systems, Retina, Richard Blahut, Richard Leibler, Robert Calderbank, Robert Fano, Robert G. Gallager, Robert M. Gray, Robert McEliece, Robert Tienwen Chien, Robert Ulanowicz, Robert Vallée, Roland Dobrushin, Roulette, Rudolf Ahlswede, Sackur–Tetrode equation, Sanjeev Kulkarni, Sanov's theorem, Santa Fe Institute, Sara Imari Walker, Scene statistics, Science, Science and technology in Russia, Scientia Iranica, Scientific method, Score (statistics), Scoring rule, Secret sharing, Secure two-party computation, Self-information, Self-organization, Self-organized criticality, Semantic compression, Semiotic information theory, Semiotics, Sepp Hochreiter, Sergey Bobkov, Sergio Albeverio, Sergio Verdú, Set redundancy compression, Shannon's source coding theorem, Shannon–Fano coding, Shannon–Fano–Elias coding, Shannon–Hartley theorem, Shannon–Weaver model, Shearer's inequality, Shlomo Shamai, Sidney Dancoff, Signal, Signal processing, Signal-to-interference-plus-noise ratio, Sinc function, Slepian–Wolf coding, Smart grid, Sneakernet, Social network, Soft heap, Soft-decision decoder, Solèr's theorem, Solomon Kullback, Solving chess, Spatial correlation, Specific-information, Specified complexity, Spekkens toy model, Spiking neural network, Spin–spin relaxation, Splice site mutation, Squashed entanglement, Statistical distance, Statistical inference, Statistical machine translation, Steganography, Stephen O. Rice, Steven H. 
Simon, Stochastic, Stochastic geometry models of wireless networks, Stochastic process, Structural information theory, Structured expert judgment: the classical model, Subadditivity effect, Subhash Kak, Succinct data structure, Supertask, Surprisal analysis, Surround suppression, Symbol rate, Symbolic dynamics, Synergy, System analysis, Systems science, Szeged index, T-symmetry, Tadao Kasami, Te Sun Han, Telecommunications engineering, Television standards conversion, Temporal information retrieval, Terence Tao, Tf–idf, The Information: A History, a Theory, a Flood, The Ingenuity Gap, The Pattern on the Stone, Theil index, Theoretical computer science, Theoretical ecology, Theoretical physics, Theory, Theory of Visualization, Thermodynamic beta, Thomas Huang, Thomas Kailath, Thomas M. Cover, Timeline of communication technology, Timeline of cryptography, Timeline of information theory, Timeline of machine translation, Timeline of mathematics, Timeline of quantum computing, Timeline of scientific discoveries, Timeline of scientific thought, Timeline of thermodynamics, Toby Berger, Too Big to Know, Total correlation, Trust (emotion), Trusted system, Tsallis entropy, Tunstall coding, Turbo code, Twenty Questions, Typical set, Typical subspace, Ulam's game, Uncertainty, Uncertainty reduction theory, Unicycle, Units of information, Universal portfolio algorithm, University of Michigan, University of North Dakota, University of Rijeka, University of Utah College of Engineering, Uplift modelling, Useless machine, Value of information, Variable-order Markov model, Variant of uncertain significance, Variation of information, Variety (cybernetics), Venti, Visual Information Fidelity, Vitold Belevitch, Vladimir Kotelnikov, Vladimir Levenshtein, Volume of an n-ball, Von Neumann entropy, Warren Weaver, Wassim Michael Haddad, Watchmaker analogy, Weissman score, Werner Meyer-Eppler, William A. Dembski, William Bialek, William Lucas Root, Witsenhausen's counterexample, Wojciech Szpankowski, Worse-than-average effect, Xiaodong Wang (electrical engineer), Yaakov Ziv, Yuri Linnik, Z-channel (information theory), Zellig Harris, Zenon Pylyshyn, Zipf's law, 1889, 1916 in science, 1916 in the United States, 1948 in science, 1976, 2013 in science, 20th century in science. Expand index (744 more) »

A Mathematical Theory of Communication

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.


Aaron D. Wyner

Aaron D. Wyner (1939–1997) was an American information theorist who spent much of his career at Bell Laboratories and is known for contributions including the wiretap channel and Wyner–Ziv coding.


Abiogenesis

Abiogenesis, or informally the origin of life (occasionally also called biopoiesis), is the natural process by which life arises from non-living matter, such as simple organic compounds.


Abraham Lempel

Abraham Lempel (אברהם למפל, born 10 February 1936) is an Israeli computer scientist and one of the fathers of the LZ family of lossless data compression algorithms.


Abraham Moles

Abraham Moles (1920 – 22 May 1992) was an engineer of electrical engineering and acoustics, and a doctor of physics and philosophy.


Accelerando

Accelerando is a 2005 science fiction novel consisting of a series of interconnected short stories written by British author Charles Stross.


Active listening

Active listening is a communication technique that is used in counseling, training, and conflict resolution.


Additive white Gaussian noise

Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature.
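
As a rough illustration only (not from the article), the sketch below adds white Gaussian noise to a signal at a chosen signal-to-noise ratio; the function name add_awgn and the NumPy-based approach are assumptions.

```python
import numpy as np

def add_awgn(signal, snr_db, rng=None):
    """Add white Gaussian noise so the result has roughly the requested SNR (in dB)."""
    rng = np.random.default_rng() if rng is None else rng
    signal_power = np.mean(np.abs(signal) ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)
    return signal + noise

# Example: a 5 Hz sine wave observed through a 10 dB AWGN channel.
t = np.linspace(0.0, 1.0, 1000)
noisy = add_awgn(np.sin(2 * np.pi * 5 * t), snr_db=10)
```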


Adjusted mutual information

In probability theory and information theory, adjusted mutual information, a variation of mutual information may be used for comparing clusterings.


Aesthetics

Aesthetics (also spelled esthetics) is a branch of philosophy that explores the nature of art, beauty, and taste, with the creation and appreciation of beauty.


Akaike information criterion

The Akaike information criterion (AIC) is an estimator of the relative quality of statistical models for a given set of data.
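
A minimal sketch of how the criterion is used (illustrative values, not from the article): AIC = 2k - 2 ln L, where k is the number of estimated parameters and L the maximized likelihood; the model with the lower AIC is preferred.

```python
def aic(log_likelihood, num_params):
    """Akaike information criterion: AIC = 2k - 2*ln(L); lower is better."""
    return 2 * num_params - 2 * log_likelihood

# Hypothetical example: model B fits slightly better but uses three extra parameters.
print(aic(log_likelihood=-120.0, num_params=3))  # 246.0
print(aic(log_likelihood=-118.5, num_params=6))  # 249.0 -> the simpler model wins here
```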


Alcatel-Lucent

Alcatel-Lucent S.A. was a French global telecommunications equipment company, headquartered in Boulogne-Billancourt, France.


Aleksandr Khinchin

Aleksandr Yakovlevich Khinchin (Алекса́ндр Я́ковлевич Хи́нчин, Alexandre Khintchine; July 19, 1894 – November 18, 1959) was a Soviet mathematician and one of the most significant people in the Soviet school of probability theory.


Alex Hankey

Alex Hankey (born 18 August 1947) is a theoretical physicist trained at Massachusetts Institute of Technology and Cambridge University.


Alfréd Rényi

Alfréd Rényi (20 March 1921 – 1 February 1970) was a Hungarian mathematician who made contributions in combinatorics, graph theory, number theory but mostly in probability theory.


Algorithmic cooling

Algorithmic cooling is an algorithmic method for transferring heat (or entropy) from some qubits to others or outside the system and into the environment, which results in a cooling effect.


Algorithmic information theory

Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information.


Alignment-free sequence analysis

In bioinformatics, alignment-free sequence analysis approaches to molecular sequence and structure data provide alternatives over alignment-based approaches.


Alternating bit protocol

Alternating bit protocol (ABP) is a simple network protocol operating at the data link layer that retransmits lost or corrupted messages using FIFO semantics.


Andrew Targowski

Andrew (Andrzej) Stanislaw Targowski (born October 9, 1937 in Warsaw, Poland) is a Polish-American computer scientist specializing in enterprise computing, societal computing, information technology impact upon civilization, information theory, wisdom theory, and civilization theory.


Antiscience

Antiscience is a position that rejects science and the scientific method.


Antony Garrett Lisi

Antony Garrett Lisi (born January 24, 1968), known as Garrett Lisi, is an American theoretical physicist and adventure sports enthusiast.


Apoorva D. Patel

Apoorva D. Patel is a Professor at the Centre for High Energy Physics, Indian Institute of Science, Bangalore.


April 1916

The following events occurred in April 1916.


Arbitrarily varying channel

An arbitrarily varying channel (AVC) is a communication channel model used in coding theory, and was first introduced by Blackwell, Breiman, and Thomasian.


Archives and Museum Informatics

Archives and Museum Informatics is a journal published by Springer.


Arieh Ben-Naim

Arieh Ben-Naim (Hebrew: אריה בן-נאים; Jerusalem, 11 July 1934) is a professor of physical chemistry who retired in 2003 from the Hebrew University of Jerusalem.


Arnold Zellner

Arnold Zellner (January 2, 1927 – August 11, 2010) was an American economist and statistician specializing in the fields of Bayesian probability and econometrics.


Artificial intelligence

Artificial intelligence (AI, also machine intelligence, MI) is intelligence demonstrated by machines, in contrast to the natural intelligence (NI) displayed by humans and other animals.


Artificial intuition

The theoretical concept of artificial intuition is the capacity of an artificial object or software to function with the factor of consciousness known as intuition: a machine-based system that has some capacity to function analogously to human intuition.


Arturo Carsetti

Arturo Carsetti is an Italian Philosopher of sciences and former Professor of philosophy of science at the University of Bari and the University of Rome Tor Vergata.


Ascendency

Ascendency is a quantitative attribute of an ecosystem, defined as a function of the ecosystem's trophic network.


Asher Peres

Asher Peres (אשר פרס; January 30, 1934 – January 1, 2005) was an Israeli physicist, considered a pioneer in quantum information theory, as well as the connections between quantum mechanics and the theory of relativity.


Association for Logic, Language and Information

The Association for Logic, Language and Information (FoLLI) is an international, especially European, learned society administered from Nancy-Université in France.


Asymptotic equipartition property

In information theory, the asymptotic equipartition property (AEP) is a general property of the output samples of a stochastic source.


Athanasios Papoulis

Athanasios Papoulis (Αθανάσιος Παπούλης; 1921 – April 25, 2002) was a Greek-American engineer and applied mathematician.


Babak Hassibi

Babak Hassibi (بابک حسیبی, born in Tehran, Iran) is an Iranian-American electrical engineer who is the inaugural Mose and Lillian S. Bohn Professor of Electrical Engineering at the California Institute of Technology (Caltech).


Bachelor of Computer Science

The Bachelor of Computer Science or Bachelor of Science in Computer Science (abbreviated BCompSc or BCS or BS CS or B.Sc. CS) is a type of bachelor's degree, usually awarded after three or four years of collegiate study in computer science, but possibly awarded in fewer years depending on factors such as an institution's course requirements and academic calendar.


Background noise

Background noise or ambient noise is any sound other than the sound being monitored (primary sound).


Backus–Naur form

In computer science, Backus–Naur form or Backus normal form (BNF) is a notation technique for context-free grammars, often used to describe the syntax of languages used in computing, such as computer programming languages, document formats, instruction sets and communication protocols.


Bandwidth (signal processing)

Bandwidth is the difference between the upper and lower frequencies in a continuous band of frequencies.


Bar product

In information theory, the bar product of two linear codes C2 ⊆ C1 is defined as C1 | C2 = { (c1 | c1 + c2) : c1 ∈ C1, c2 ∈ C2 }, where (a | b) denotes the concatenation of a and b. If the code words in C1 are of length n, then the code words in C1 | C2 are of length 2n.
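
A small illustrative sketch of the (c1 | c1 + c2) construction above, assuming codewords are given as equal-length binary tuples; the function name bar_product is a placeholder.

```python
def bar_product(C1, C2):
    """Codewords of C1 | C2 = {(u | u + v) : u in C1, v in C2}, with bitwise addition mod 2."""
    return sorted({tuple(u) + tuple((a + b) % 2 for a, b in zip(u, v))
                   for u in C1 for v in C2})

# Example with the length-2 repetition code as a subcode of the full length-2 code:
C2 = [(0, 0), (1, 1)]                  # repetition code, contained in C1
C1 = [(0, 0), (0, 1), (1, 0), (1, 1)]  # all length-2 words
print(bar_product(C1, C2))             # 8 codewords, each of length 4
```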


Bayesian estimation of templates in computational anatomy

Statistical shape analysis and statistical shape theory in computational anatomy (CA) is performed relative to templates, therefore it is a local theory of statistics on shape.


Bayesian model of computational anatomy

Computational anatomy (CA) is a discipline within medical imaging focusing on the study of anatomical shape and form at the visible or gross anatomical scale of morphology.


Belief propagation

Belief propagation, also known as sum-product message passing, is a message-passing algorithm for performing inference on graphical models, such as Bayesian networks and Markov random fields.


Bell Labs

Nokia Bell Labs (formerly named AT&T Bell Laboratories, Bell Telephone Laboratories and Bell Labs) is an American research and scientific development company, owned by Finnish company Nokia.


Bell System Technical Journal

The Bell System Technical Journal was a periodical publication by the American Telephone and Telegraph Company (AT&T) in New York devoted to the scientific and engineering aspects of electrical communication.


Benjamin Schumacher

Benjamin "Ben" Schumacher is an American theoretical physicist, working mostly in the field of quantum information theory.


Benjamin Weiss

Benjamin Weiss (בנימין ווייס; born 1941 in New York City) is an American-Israeli mathematician known for his contributions to ergodic theory, topological dynamics, probability theory, game theory, and descriptive set theory.


Benoit Mandelbrot

Benoit B. Mandelbrot (20 November 1924 – 14 October 2010) was a Polish-born, French and American mathematician and polymath with broad interests in the practical sciences, especially regarding what he labeled as "the art of roughness" of physical phenomena and "the uncontrolled element in life".


Bible code

The Bible code (הצופן התנ"כי, hatzofen hatanachi), also known as the Torah code, is a purported set of secret messages encoded within the Hebrew text of the Torah.


Biclustering

Biclustering, block clustering, co-clustering, or two-mode clustering is a data mining technique which allows simultaneous clustering of the rows and columns of a matrix.


Binary entropy function

In information theory, the binary entropy function, denoted H(p) or H_b(p), is defined as the entropy of a Bernoulli process with probability p of one of two values.
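
A short sketch of this definition, assuming the usual convention that H_b(0) = H_b(1) = 0 (function name is illustrative):

```python
import math

def binary_entropy(p):
    """H_b(p) = -p*log2(p) - (1 - p)*log2(1 - p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0 bit: a fair coin flip is maximally uncertain
print(binary_entropy(0.11))  # about 0.5 bits: a biased coin carries less information
```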


Binary erasure channel

A binary erasure channel (or BEC) is a common communications channel model used in coding theory and information theory.


Binary logarithm

In mathematics, the binary logarithm (log2 n) is the power to which the number 2 must be raised to obtain the value n.


Binary symmetric channel

A binary symmetric channel (or BSC) is a common communications channel model used in coding theory and information theory.
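
A toy simulation of the channel (names and parameters are illustrative): each transmitted bit is flipped independently with crossover probability p, and the capacity of this channel works out to 1 - H_b(p) bits per use.

```python
import random

def bsc(bits, p, rng=random.Random(0)):
    """Flip each bit independently with probability p (binary symmetric channel)."""
    return [bit ^ 1 if rng.random() < p else bit for bit in bits]

sent = [0, 1, 1, 0, 1, 0, 0, 1] * 4
received = bsc(sent, p=0.1)
print(sum(s != r for s, r in zip(sent, received)), "bit errors out of", len(sent))
```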


Binary tree

In computer science, a binary tree is a tree data structure in which each node has at most two children, which are referred to as the left child and the right child.


Bioinformatics

Bioinformatics is an interdisciplinary field that develops methods and software tools for understanding biological data.


Biological network inference

Biological network inference is the process of making inferences and predictions about biological networks.


Biomedical cybernetics

Biomedical cybernetics investigates signal processing, decision making and control structures in living organisms.


Bit

The bit (a portmanteau of binary digit) is a basic unit of information used in computing and digital communications.


Bit error rate

In digital transmission, the number of bit errors is the number of received bits of a data stream over a communication channel that have been altered due to noise, interference, distortion or bit synchronization errors.
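
A minimal sketch of the corresponding ratio, the bit error rate, computed from a transmitted and a received bit stream (illustrative):

```python
def bit_error_rate(sent, received):
    """Number of differing bit positions divided by the total number of bits."""
    if len(sent) != len(received):
        raise ValueError("streams must have equal length")
    return sum(s != r for s, r in zip(sent, received)) / len(sent)

print(bit_error_rate([0, 1, 1, 0, 1], [0, 1, 0, 0, 1]))  # 0.2
```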


Blackwell channel

The Blackwell channel is a deterministic broadcast channel model used in coding theory and information theory.


Blahut–Arimoto algorithm

The term Blahut–Arimoto algorithm is often used to refer to a class of algorithms for numerically computing either the information-theoretic capacity of a channel or the rate–distortion function of a source.
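
A compact sketch of the capacity-computing variant under the usual formulation, with transition matrix W[x][y] = p(y|x); the helper names and fixed iteration count are assumptions rather than part of any reference implementation.

```python
import numpy as np

def kl_bits(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, ignoring zero-probability terms of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def blahut_arimoto_capacity(W, iters=200):
    """Iteratively reweight the input distribution toward the capacity-achieving one."""
    n_inputs = W.shape[0]
    p = np.full(n_inputs, 1.0 / n_inputs)
    for _ in range(iters):
        q = p @ W                                  # induced output distribution
        d = np.array([kl_bits(W[x], q) for x in range(n_inputs)])
        p = p * np.exp2(d)                         # multiplicative update
        p /= p.sum()
    q = p @ W
    return sum(p[x] * kl_bits(W[x], q) for x in range(n_inputs))  # mutual information in bits

# Binary symmetric channel with crossover 0.1: capacity is 1 - H_b(0.1), about 0.531 bits/use.
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(blahut_arimoto_capacity(W))
```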


Blind signal separation

Blind signal separation (BSS), also known as blind source separation, is the separation of a set of source signals from a set of mixed signals, without the aid of information (or with very little information) about the source signals or the mixing process.


Block code

In coding theory, a block code is any member of the large and important family of error-correcting codes that encode data in blocks.


Brain

The brain is an organ that serves as the center of the nervous system in all vertebrate and most invertebrate animals.


Branches of science

The branches of science, also referred to as sciences, "scientific fields", or "scientific disciplines" are commonly divided into three major groups.


Brenda McCowan

Brenda McCowan is a research behaviorist interested in evolutionary, biological, and ecological aspects of animal behavior and communication.


Broadcasting (networking)

In computer networking, telecommunication and information theory, broadcasting is a method of transferring a message to all recipients simultaneously.


Bursting

Bursting, or burst firing, is an extremely diverse general phenomenon of the activation patterns of neurons in the central nervous system and spinal cord where periods of rapid action potential spiking are followed by G0 phase quiescent periods.


Business chess

Business chess is a variant of chess played in teams.


Cable modem

A cable modem is a type of network bridge that provides bi-directional data communication via radio frequency channels on a hybrid fibre-coaxial (HFC) and radio frequency over glass (RFoG) infrastructure.


Calcium encoding

Calcium encoding (also referred to as Ca2+ encoding or calcium information processing) is an intracellular signaling pathway used by many cells to transfer, process and encode external information detected by the cell.


Calvin Mooers

Calvin Northrup Mooers (October 24, 1919 – December 1, 1994), was an American computer scientist known for his work in information retrieval and for the programming language TRAC.


Carlo Kopp

Carlo Kopp is an Australian freelance defence analyst and academic who has published some 300 articles in defense security publications.


Category utility

Category utility is a measure of "category goodness" introduced by Gluck and Corter (1985).


Channel capacity

Channel capacity, in electrical engineering, computer science and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.
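
For the common case of a band-limited channel with additive white Gaussian noise, the Shannon–Hartley theorem gives this bound in closed form; a small illustrative calculation (the parameter values are made up):

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """C = B * log2(1 + S/N), in bits per second, for a band-limited AWGN channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz channel at 30 dB SNR (S/N = 1000) supports roughly 30 kbit/s:
print(shannon_hartley_capacity(3000, 1000))  # about 29,900 bits per second
```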


Chaos theory

Chaos theory is a branch of mathematics focusing on the behavior of dynamical systems that are highly sensitive to initial conditions.


Charles H. Bennett (computer scientist)

Charles Henry Bennett (b. 1943) is a physicist, information theorist and IBM Fellow at IBM Research.


Cheung–Marks theorem

In information theory, the Cheung–Marks theorem, named after K. F. Cheung and Robert J. Marks II, specifies conditions under which restoring a signal from its samples via the Papoulis generalized sampling expansion becomes an ill-posed problem (J. L. Brown and S. D. Cabrera, "On well-posedness of the Papoulis generalized sampling expansion," IEEE Transactions on Circuits and Systems, vol. 38, no. 5, May 1991).


Chow–Liu tree

In probability theory and statistics, a Chow–Liu tree is an efficient method for constructing a second-order product approximation of a joint probability distribution, first described in a paper by Chow and Liu (1968).


Chris Adami

Christoph Carl Herbert "Chris" Adami (born August 30, 1962) is a professor of Microbiology and Molecular Genetics, as well as professor of Physics and Astronomy, at Michigan State University.


Chris Mann (composer)

Chris Mann (born 1949) is an Australian-American composer, poet and performer specializing in the emerging field of compositional linguistics, coined by Kenneth Gaburo and described by Mann as "the mechanism whereby you understand what I'm thinking better than I do".


Chris Wallace (computer scientist)

Christopher Stewart "Chris" Wallace (26 October 1933 – 7 August 2004) was an Australian computer scientist and physicist.


Chunking (psychology)

In cognitive psychology, chunking is a process by which individual pieces of information are bound together into a meaningful whole (Neath & Surprenant, 2003).


Claude E. Shannon Award

The Claude E. Shannon Award of the IEEE Information Theory Society was created to honor consistent and profound contributions to the field of information theory.


Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory".


Cluster analysis

Cluster analysis or clustering is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar (in some sense) to each other than to those in other groups (clusters).


Cluster labeling

In natural language processing and information retrieval, cluster labeling is the problem of picking descriptive, human-readable labels for the clusters produced by a document clustering algorithm; standard clustering algorithms do not typically produce any such labels.


Code

In communications and information processing, code is a system of rules to convert information—such as a letter, word, sound, image, or gesture—into another form or representation, sometimes shortened or secret, for communication through a communication channel or storage in a storage medium.


Code rate

In telecommunication and information theory, the code rate (or information rate) of a forward error correction code is the proportion of the data-stream that is useful (non-redundant).


CODEC Acceleration

Codec Acceleration describes computer hardware that offloads the computationally intensive compression or decompression.


Coding theory

Coding theory is the study of the properties of codes and their respective fitness for specific applications.


Coding tree unit

Coding tree unit (CTU) is the basic processing unit of the High Efficiency Video Coding (HEVC) video standard and conceptually corresponds in structure to macroblock units that were used in several previous video standards.


Cognitive bias

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment.


Cognitive philology

Cognitive philology is the science that studies written and oral texts as the product of human mental processes.


Cognitive psychology

Cognitive psychology is the study of mental processes such as "attention, language use, memory, perception, problem solving, creativity, and thinking".


Coherent information

Coherent information is an entropy measure used in quantum information theory.


Collisionless

Collisionless has multiple meanings.


Combinatorics

Combinatorics is an area of mathematics primarily concerned with counting, both as a means and an end in obtaining results, and certain properties of finite structures.


Communication

Communication (from Latin commūnicāre, meaning "to share") is the act of conveying intended meanings from one entity or group to another through the use of mutually understood signs and semiotic rules.


Communication channel

A communication channel or simply channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking.


Communication source

A source or sender is one of the basic concepts of communication and information processing.


Communication theory

Communication theory is a field of information theory and mathematics that studies the technical process of information and the process of human communication.


Communication Theory of Secrecy Systems

"Communication Theory of Secrecy Systems" is a paper published in 1949 by Claude Shannon discussing cryptography from the viewpoint of information theory.


Comparison sort

A comparison sort is a type of sorting algorithm that only reads the list elements through a single abstract comparison operation (often a "less than or equal to" operator or a three-way comparison) that determines which of two elements should occur first in the final sorted list.
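
The connection to information theory is the counting lower bound: each comparison yields at most one bit, so distinguishing all n! possible orderings requires at least ceil(log2(n!)) comparisons in the worst case. A small illustrative check:

```python
import math

def comparison_lower_bound(n):
    """Worst-case comparisons any comparison sort needs for n distinct keys."""
    return math.ceil(math.log2(math.factorial(n)))

print(comparison_lower_bound(5))    # 7
print(comparison_lower_bound(100))  # 525
```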


Complex random variable

In probability theory and statistics, complex random variables are a generalization of real-valued random variables to complex numbers, i.e. the possible values a complex random variable may take are complex numbers.


Complex system

A complex system is a system composed of many components which may interact with each other.


Complex systems biology

Complex systems biology (CSB) is a branch or subfield of mathematical and theoretical biology concerned with complexity of both structure and function in biological organisms, as well as the emergence and evolution of organisms and species, with emphasis being placed on the complex interactions of, and within, bionetworks, and on the fundamental relations and relational patterns that are essential to life.


Complexity

Complexity characterises the behaviour of a system or model whose components interact in multiple ways and follow local rules, meaning there is no reasonable higher instruction to define the various possible interactions.


Component video

Component video is a video signal that has been split into two or more component channels.


Computational anatomy

Computational anatomy is an interdisciplinary field of biology focused on quantitative investigation and modelling of anatomical shape variability.


Computational learning theory

In computer science, computational learning theory (or just learning theory) is a subfield of Artificial Intelligence devoted to studying the design and analysis of machine learning algorithms.


Computational neuroscience

Computational neuroscience (also known as theoretical neuroscience or mathematical neuroscience) is a branch of neuroscience which employs mathematical models, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.


Computational science

Computational science (also scientific computing or scientific computation (SC)) is a rapidly growing multidisciplinary field that uses advanced computing capabilities to understand and solve complex problems.


Computational semiotics

Computational semiotics is an interdisciplinary field that applies, conducts, and draws on research in logic, mathematics, the theory and practice of computation, formal and natural language studies, the cognitive sciences generally, and semiotics proper.


Computational sociology

Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena.


Computationally bounded adversary

In information theory, the computationally bounded adversary problem is a different way of looking at the problem of sending data over a noisy channel.


Computer performance

Computer performance is the amount of work accomplished by a computer system.


Computer scientist

A computer scientist is a person who has acquired the knowledge of computer science, the study of the theoretical foundations of information and computation and their application.


Concept learning

Concept learning, also known as category learning, concept attainment, and concept formation, is defined by Bruner, Goodnow, & Austin (1967) as "the search for and listing of attributes that can be used to distinguish exemplars from non exemplars of various categories".


Conditional entropy

In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known.
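
A small sketch computing H(Y | X) from a joint distribution given as a dictionary; the example numbers describe a fair input bit sent through a channel that flips it 10% of the time (everything here is illustrative):

```python
import math

def conditional_entropy(joint):
    """H(Y|X) = -sum over (x, y) of p(x, y) * log2(p(x, y) / p(x)), in bits."""
    p_x = {}
    for (x, _y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    return -sum(p * math.log2(p / p_x[x]) for (x, _y), p in joint.items() if p > 0)

joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(conditional_entropy(joint))  # about 0.469 bits, i.e. the binary entropy of 0.1
```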


Conditional mutual information

In probability theory, particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.


Conditional quantum entropy

The conditional quantum entropy is an entropy measure used in quantum information theory.


Conduit metaphor

In linguistics, the conduit metaphor is a dominant class of figurative expressions used when discussing communication itself (metalanguage).


Conference on Neural Information Processing Systems

The Conference and Workshop on Neural Information Processing Systems (NIPS) is a machine learning and computational neuroscience conference held every December.


Confidence

Confidence has a common meaning of a certainty about handling something, such as work, family, social events, or relationships.


Confirmation bias

Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs. (The term "myside bias" was coined by David Perkins, a professor and researcher at the Harvard Graduate School of Education, to refer to a preference for "my" side of an issue.)


Conservatism (belief revision)

In cognitive psychology and decision science, conservatism or conservatism bias is a bias in human information processing, which refers to the tendency to revise one's belief insufficiently when presented with new evidence.


Constraint (information theory)

Constraint in information theory is the degree of statistical dependence between or among variables.


Constructor theory

Constructor theory is a proposal for a new mode of explanation in fundamental physics, first sketched out by David Deutsch, a quantum physicist at the University of Oxford, in 2012.


Consumer education

Consumer education is the preparation of an individual through skills, concepts and understanding that are required for everyday living to achieve maximum satisfaction and utilization of his/her resources.


Contingency table

In statistics, a contingency table (also known as a cross tabulation or crosstab) is a type of table in a matrix format that displays the (multivariate) frequency distribution of the variables.


Cross entropy

In information theory, the cross entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set, if a coding scheme is used that is optimized for an "unnatural" probability distribution q, rather than the "true" distribution p. The cross entropy for the distributions p and q over a given set is defined as H(p, q) = H(p) + D_KL(p || q), where H(p) is the entropy of p, and D_KL(p || q) is the Kullback–Leibler divergence of q from p (also known as the relative entropy of p with respect to q; note the reversal of emphasis).
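
A minimal sketch of the discrete case, H(p, q) = -sum over x of p(x) log2 q(x), with illustrative distributions:

```python
import math

def cross_entropy(p, q):
    """Average bits needed to code samples from p with a code optimized for q."""
    return -sum(p_x * math.log2(q[x]) for x, p_x in p.items() if p_x > 0)

p = {"a": 0.5, "b": 0.5}
q = {"a": 0.9, "b": 0.1}
print(cross_entropy(p, p))  # 1.0 bit: just the entropy of p
print(cross_entropy(p, q))  # about 1.74 bits: the excess over 1.0 is D_KL(p || q)
```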


Cryptographically secure pseudorandom number generator

A cryptographically secure pseudo-random number generator (CSPRNG) or cryptographic pseudo-random number generator (CPRNG) is a pseudo-random number generator (PRNG) with properties that make it suitable for use in cryptography.
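
For illustration, Python's standard-library secrets module draws from the operating system's CSPRNG and is the usual choice for generating keys and tokens; the sizes below are arbitrary examples.

```python
import secrets

token = secrets.token_hex(16)    # 128 random bits as a hex string, e.g. for a session token
key = secrets.token_bytes(32)    # 256 random bits, e.g. for a symmetric key
pin = secrets.randbelow(10_000)  # uniform integer in [0, 10000)
print(token, len(key), pin)
```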


Cryptography

Cryptography or cryptology (from Greek kryptós, "hidden, secret") is the practice and study of techniques for secure communication in the presence of third parties called adversaries.


Cybernetical physics

Cybernetical physics is a scientific area on the border of cybernetics and physics which studies physical systems with cybernetical methods.


Cybernetics

Cybernetics is a transdisciplinary approach for exploring regulatory systems—their structures, constraints, and possibilities.


Cybernetics and Human Knowing

Cybernetics and Human Knowing: A Journal of Second Order Cybernetics, Autopoiesis & Cyber-Semiotics is a quarterly peer-reviewed academic journal covering autopoiesis, biosemiotics, cognition, complexity, cybersemiotics, hermeneutics, information theory, linguistics, second-order cybernetics, semiotics, and systems theory, among others.


Cybernetics: Or Control and Communication in the Animal and the Machine

Cybernetics: Or Control and Communication in the Animal and the Machine is a book written by Norbert Wiener and published in 1948.


Damerau–Levenshtein distance

In information theory and computer science, the Damerau–Levenshtein distance (named after Frederick J. Damerau and Vladimir I. Levenshtein) is a string metric for measuring the edit distance between two sequences.
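
A sketch of the restricted variant, often called optimal string alignment, which allows insertions, deletions, substitutions, and transpositions of adjacent characters; the unrestricted Damerau–Levenshtein distance needs a slightly larger table. Names are illustrative.

```python
def osa_distance(a, b):
    """Optimal string alignment distance between strings a and b."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i
    for j in range(len(b) + 1):
        d[0][j] = j
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # adjacent transposition
    return d[len(a)][len(b)]

print(osa_distance("kitten", "sitting"))  # 3
print(osa_distance("abcd", "acbd"))       # 1 (one transposition)
```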


Danny Hillis

William Daniel "Danny" Hillis (born September 25, 1956) is an American inventor, entrepreneur, scientist, and writer who is particularly known for his work in computer science.


Data compression

In signal processing, data compression, source coding, or bit-rate reduction involves encoding information using fewer bits than the original representation.
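
A quick lossless-compression illustration using Python's standard-library zlib module (a DEFLATE implementation); the sample text is arbitrary.

```python
import zlib

text = b"abracadabra " * 100
compressed = zlib.compress(text, level=9)
print(len(text), "->", len(compressed), "bytes")  # highly repetitive input shrinks a lot
print(zlib.decompress(compressed) == text)        # True: the original is fully recovered
```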


Data conversion

Data conversion is the conversion of computer data from one format to another.


Data differencing

In computer science and information theory, data differencing or differential compression is producing a technical description of the difference between two sets of data – a source and a target.


Data processing inequality

The Data processing inequality is an information theoretic concept which states that the information content of a signal cannot be increased via a local physical operation.


Data synchronization

Data synchronization is the process of establishing consistency among data from a source to a target data storage and vice versa and the continuous harmonization of the data over time.


Data transmission

Data transmission (also data communication or digital communications) is the transfer of data (a digital bitstream or a digitized analog signal) over a point-to-point or point-to-multipoint communication channel.


Dave Forney

George David "Dave" Forney, Jr. (born March 6, 1940) is an American electrical engineer who made contributions in telecommunication system theory, specifically in coding theory and information theory.

New!!: Information theory and Dave Forney · See more »

David A. Huffman

David Albert Huffman (August 9, 1925 – October 7, 1999) was a pioneer in computer science, known for his Huffman coding.

New!!: Information theory and David A. Huffman · See more »

David Blackwell

David Harold Blackwell (April 24, 1919 – July 8, 2010) was an American statistician and mathematician who made significant contributions to game theory, probability theory, information theory, and Bayesian statistics.

New!!: Information theory and David Blackwell · See more »

David Chalmers

David John Chalmers (born 20 April 1966) is an Australian philosopher and cognitive scientist specializing in the areas of philosophy of mind and philosophy of language.

New!!: Information theory and David Chalmers · See more »

David J. C. MacKay

Sir David John Cameron MacKay (22 April 1967 – 14 April 2016) was a British physicist, mathematician, and academic.

New!!: Information theory and David J. C. MacKay · See more »

David Wolpert

David H. Wolpert is an American mathematician, physicist and computer scientist.

New!!: Information theory and David Wolpert · See more »

Deaths in January 2016

The following is a list of notable deaths in January 2016.

New!!: Information theory and Deaths in January 2016 · See more »

Decision tree learning

Decision tree learning uses a decision tree (as a predictive model) to go from observations about an item (represented in the branches) to conclusions about the item's target value (represented in the leaves).

New!!: Information theory and Decision tree learning · See more »

Decoding Reality

Decoding Reality: The Universe as Quantum Information is a popular science book by Vlatko Vedral published by Oxford University Press in 2010.

New!!: Information theory and Decoding Reality · See more »

Decoding the Universe

Decoding the Universe: How the New Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes is the third non-fiction book by American author and journalist Charles Seife.

New!!: Information theory and Decoding the Universe · See more »

Deletion channel

A deletion channel is a communications channel model used in coding theory and information theory.

New!!: Information theory and Deletion channel · See more »

Descriptive knowledge

Descriptive knowledge, also declarative knowledge or propositional knowledge, is the type of knowledge that is, by its very nature, expressed in declarative sentences or indicative propositions.

New!!: Information theory and Descriptive knowledge · See more »

Determining the number of clusters in a data set

Determining the number of clusters in a data set, a quantity often labelled k as in the ''k''-means algorithm, is a frequent problem in data clustering, and is a distinct issue from the process of actually solving the clustering problem.

New!!: Information theory and Determining the number of clusters in a data set · See more »

Dialectical materialism

Dialectical materialism (sometimes abbreviated diamat) is a philosophy of science and nature developed in Europe and based on the writings of Karl Marx and Friedrich Engels.

New!!: Information theory and Dialectical materialism · See more »

Differential entropy

Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions.
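
For a continuous random variable X with probability density function f, the differential entropy is usually written as

h(X) = -\int f(x)\,\log f(x)\,dx.

Unlike the discrete Shannon entropy, h(X) can be negative and is not invariant under a rescaling of the variable, which is one reason the extension to continuous distributions is more delicate than the discrete case.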

New!!: Information theory and Differential entropy · See more »

Differential geometry

Differential geometry is a mathematical discipline that uses the techniques of differential calculus, integral calculus, linear algebra and multilinear algebra to study problems in geometry.

New!!: Information theory and Differential geometry · See more »

Digital data

Digital data, in information theory and information systems, is the discrete, discontinuous representation of information or works.

New!!: Information theory and Digital data · See more »

Digital philosophy

Digital philosophy is a direction in philosophy and cosmology advocated by certain mathematicians and theoretical physicists, including: Edward Fredkin, Konrad Zuse, Stephen Wolfram, Rudy Rucker, Gregory Chaitin, and Seth Lloyd.

New!!: Information theory and Digital philosophy · See more »

Digital physics

In physics and cosmology, digital physics (also referred to as digital ontology or digital philosophy) is a collection of theoretical perspectives based on the premise that the universe is describable by information.

New!!: Information theory and Digital physics · See more »

Digital signal processing

Digital signal processing (DSP) is the use of digital processing, such as by computers or more specialized digital signal processors, to perform a wide variety of signal processing operations.

New!!: Information theory and Digital signal processing · See more »

Dimensionality reduction

In statistics, machine learning, and information theory, dimensionality reduction or dimension reduction is the process of reducing the number of random variables under consideration by obtaining a set of principal variables.

New!!: Information theory and Dimensionality reduction · See more »

Directed information

Directed information, I(X^n\to Y^n), is a measure of information theory and it measures the amount of information that flows from the process X^n to Y^n, where X^n denotes the vector X_1,X_2,...,X_n and Y^n denotes Y_1,Y_2,...,Y_n.
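
In terms of conditional mutual information, the usual causal decomposition is

I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),

which, in contrast to the ordinary mutual information I(X^n; Y^n), conditions each output Y_i only on past and present inputs X^i rather than on the entire input sequence.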

New!!: Information theory and Directed information · See more »

Dirty paper coding

In telecommunications, dirty paper coding (DPC) or Costa precoding is a technique for efficient transmission of digital data through a channel subjected to some interference known to the transmitter.

New!!: Information theory and Dirty paper coding · See more »

Discrete mathematics

Discrete mathematics is the study of mathematical structures that are fundamentally discrete rather than continuous.

New!!: Information theory and Discrete mathematics · See more »

Discrete Universal Denoiser

In information theory and signal processing, the Discrete Universal Denoiser (DUDE) is a denoising scheme for recovering sequences over a finite alphabet, which have been corrupted by a discrete memoryless channel.

New!!: Information theory and Discrete Universal Denoiser · See more »

Distributed source coding

Distributed source coding (DSC) is an important problem in information theory and communication.

New!!: Information theory and Distributed source coding · See more »

Distributive property

In abstract algebra and formal logic, the distributive property of binary operations generalizes the distributive law from boolean algebra and elementary algebra.

New!!: Information theory and Distributive property · See more »

DNA binding site

DNA binding sites are a type of binding site found in DNA where other molecules may bind.

New!!: Information theory and DNA binding site · See more »

Document dump

A document dump is the act of responding to an adversary's request for information by presenting the adversary with a large quantity of data that is transferred in a manner that indicates unfriendliness, hostility, or a legal conflict between the transmitter and the receiver of the information.

New!!: Information theory and Document dump · See more »

Domon group

The Domon Group, or Domon Research Group, is an interdisciplinary research group founded by former IBM researcher Eduard Domon in 1973.

New!!: Information theory and Domon group · See more »

Don Byrd

Donald J. Byrd is a poet, sound artist, and Professor of English at the State University of New York at Albany.

New!!: Information theory and Don Byrd · See more »

Donald MacCrimmon MacKay

Donald MacCrimmon MacKay (9 August 1922 – 6 February 1987) was a British physicist, and professor at the Department of Communication and Neuroscience at Keele University in Staffordshire, England, known for his contributions to information theory and the theory of brain organisation.

New!!: Information theory and Donald MacCrimmon MacKay · See more »

Dual total correlation

In information theory, dual total correlation (Han 1978), excess entropy (Olbrich 2008), or binding information (Abdallah and Plumbley 2010) is one of the two known non-negative generalizations of mutual information.

New!!: Information theory and Dual total correlation · See more »

Dutton Speedwords

Dutton Speedwords, sometimes called rapmotz, is an international auxiliary language as well as a shorthand writing system for all the languages of the world.

New!!: Information theory and Dutton Speedwords · See more »

Dynamic spectrum management

Dynamic spectrum management (DSM), also referred to as dynamic spectrum access (DSA), is a set of techniques based on theoretical concepts in network information theory and game theory that is being researched and developed to improve the performance of a communication network as a whole.

New!!: Information theory and Dynamic spectrum management · See more »

Dynamical neuroscience

The dynamical systems approach to neuroscience is a branch of mathematical biology that utilizes nonlinear dynamics to understand and model the nervous system and its functions.

New!!: Information theory and Dynamical neuroscience · See more »

Earth–Moon–Earth communication

Earth–Moon–Earth communication (EME), also known as moon bounce, is a radio communications technique that relies on the propagation of radio waves from an Earth-based transmitter directed via reflection from the surface of the Moon back to an Earth-based receiver.

New!!: Information theory and Earth–Moon–Earth communication · See more »

Earthscore

Earthscore is a notational system that enables collaborating videographers to produce a shared perception of environmental realities.

New!!: Information theory and Earthscore · See more »

Econophysics

Econophysics is an interdisciplinary research field, applying theories and methods originally developed by physicists in order to solve problems in economics, usually those including uncertainty or stochastic processes and nonlinear dynamics.

New!!: Information theory and Econophysics · See more »

Ecosystem model

An ecosystem model is an abstract, usually mathematical, representation of an ecological system (ranging in scale from an individual population, to an ecological community, or even an entire biome), which is studied to better understand the real system.

New!!: Information theory and Ecosystem model · See more »

Ed Posner

Edward Charles "Ed" Posner (August 10, 1933 – June 15, 1993) was an American information theorist and neural network researcher who became chief technologist at the Jet Propulsion Laboratory and founded the Conference on Neural Information Processing Systems.

New!!: Information theory and Ed Posner · See more »

Edgar Morin

Edgar Morin (born Edgar Nahoum on 8 July 1921) is a French philosopher and sociologist who has been internationally recognized for his work on complexity and "complex thought" (pensée complexe), and for his scholarly contributions to such diverse fields as media studies, politics, sociology, visual anthropology, ecology, education, and systems biology.

New!!: Information theory and Edgar Morin · See more »

Educational toy

Educational toys (sometimes called "instructive toys") are objects of play, generally designed for children, which are expected to stimulate learning.

New!!: Information theory and Educational toy · See more »

Edward Kofler

Edward Kofler (November 16, 1911 – April 22, 2007) was a mathematician who made important contributions to game theory and fuzzy logic by working out the theory of linear partial information.

New!!: Information theory and Edward Kofler · See more »

Edward Linfoot

Edward Hubert Linfoot was a British mathematician, primarily known for his work on optics, but also noted for his work in pure mathematics.

New!!: Information theory and Edward Linfoot · See more »

Edwin Thompson Jaynes

Edwin Thompson Jaynes (July 5, 1922 – April 30, 1998) was the Wayman Crow Distinguished Professor of Physics at Washington University in St. Louis.

New!!: Information theory and Edwin Thompson Jaynes · See more »

Effective action

In quantum field theory, the effective action is a modified expression for the action, which takes into account quantum-mechanical corrections, in the following sense: In classical mechanics, the equations of motion can be derived from the action by the principle of stationary action.

New!!: Information theory and Effective action · See more »

Efficient coding hypothesis

The efficient coding hypothesis was proposed by Horace Barlow in 1961 as a theoretical model of sensory coding in the brain.

New!!: Information theory and Efficient coding hypothesis · See more »

Elwyn Berlekamp

Elwyn Ralph Berlekamp (born September 6, 1940) is an American mathematician.

New!!: Information theory and Elwyn Berlekamp · See more »

Encyclopedia of Cryptography and Security

The Encyclopedia of Cryptography and Security is a comprehensive work on Cryptography for both information security professionals and experts in the fields of Computer Science, Applied Mathematics, Engineering, Information Theory, Data Encryption, etc.

New!!: Information theory and Encyclopedia of Cryptography and Security · See more »

Encyclopedia of Cybernetics

The Encyclopedia of Cybernetics (Енциклопедія кібернетики) is a Ukrainian language encyclopedia of computer science first published in Kiev in 1973, with Victor Glushkov serving as its chief editor.

New!!: Information theory and Encyclopedia of Cybernetics · See more »

Engineer

Engineers, as practitioners of engineering, are people who invent, design, analyze, build, and test machines, systems, structures and materials to fulfill objectives and requirements while considering the limitations imposed by practicality, regulation, safety, and cost.

New!!: Information theory and Engineer · See more »

Engineering physics

Engineering physics or engineering science refers to the study of the combined disciplines of physics, mathematics and engineering, particularly computer, nuclear, electrical, electronic, materials or mechanical engineering.

New!!: Information theory and Engineering physics · See more »

Ensemble average (statistical mechanics)

In statistical mechanics, the ensemble average is defined as the mean of a quantity that is a function of the microstate of a system (the ensemble of possible states), according to the distribution of the system on its micro-states in this ensemble.

New!!: Information theory and Ensemble average (statistical mechanics) · See more »

Entropic uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies.

New!!: Information theory and Entropic uncertainty · See more »

Entropic vector

The entropic vector or entropic function is a concept arising in information theory.

New!!: Information theory and Entropic vector · See more »

Entropy

In statistical mechanics, entropy is an extensive property of a thermodynamic system.

New!!: Information theory and Entropy · See more »

Entropy (arrow of time)

Entropy is the only quantity in the physical sciences (apart from certain rare interactions in particle physics; see below) that requires a particular direction for time, sometimes called an arrow of time.

New!!: Information theory and Entropy (arrow of time) · See more »

Entropy (journal)

Entropy is a peer-reviewed open access scientific journal covering research on all aspects of entropy and information theory.

New!!: Information theory and Entropy (journal) · See more »

Entropy (statistical thermodynamics)

In classical statistical mechanics, the entropy function earlier introduced by Rudolf Clausius is interpreted as statistical entropy using probability theory.

New!!: Information theory and Entropy (statistical thermodynamics) · See more »

Entropy and life

Research concerning the relationship between the thermodynamic quantity entropy and the evolution of life began around the turn of the 20th century.

New!!: Information theory and Entropy and life · See more »

Entropy compression

In mathematics and theoretical computer science, entropy compression is an information theoretic method for proving that a random process terminates, originally used by Robin Moser to prove an algorithmic version of the Lovász local lemma.

New!!: Information theory and Entropy compression · See more »

Entropy encoding

In information theory an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium.

New!!: Information theory and Entropy encoding · See more »

Entropy in thermodynamics and information theory

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as H, of Claude Shannon and Ralph Hartley developed in the 1940s.

New!!: Information theory and Entropy in thermodynamics and information theory · See more »

Entropy of mixing

In thermodynamics the entropy of mixing is the increase in the total entropy when several initially separate systems of different composition, each in a thermodynamic state of internal equilibrium, are mixed without chemical reaction by the thermodynamic operation of removal of impermeable partition(s) between them, followed by a time for establishment of a new thermodynamic state of internal equilibrium in the new unpartitioned closed system.

New!!: Information theory and Entropy of mixing · See more »

Entropy power inequality

In information theory, the entropy power inequality is a result that relates to so-called "entropy power" of random variables.
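
In its standard form, for independent random vectors X and Y in \mathbb{R}^n with well-defined differential entropies, the inequality reads

e^{2h(X+Y)/n} \geq e^{2h(X)/n} + e^{2h(Y)/n},

where h denotes differential entropy; dividing each term by 2\pi e gives the conventionally normalized entropy powers.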

New!!: Information theory and Entropy power inequality · See more »

Environment (systems)

In science and engineering, a system is the part of the universe that is being studied, while the environment is the remainder of the universe that lies outside the boundaries of the system.

New!!: Information theory and Environment (systems) · See more »

Erasure channel

In information theory and telecommunications, an erasure channel is a communication channel model in which transmission errors are described as erasures: a transmitted symbol either arrives correctly or is replaced by a special "erased" symbol.

New!!: Information theory and Erasure channel · See more »

Eric Lander

Eric Steven Lander (born February 3, 1957), a mathematician and geneticist, is a Professor of Biology at the Massachusetts Institute of Technology (MIT), former member of the Whitehead Institute, and founding director of the Broad Institute of MIT and Harvard.

New!!: Information theory and Eric Lander · See more »

Error correction code

In computing, telecommunication, information theory, and coding theory, an error correction code, sometimes error correcting code, (ECC) is used for controlling errors in data over unreliable or noisy communication channels.

New!!: Information theory and Error correction code · See more »

Error detection and correction

In information theory and coding theory with applications in computer science and telecommunication, error detection and correction or error control are techniques that enable reliable delivery of digital data over unreliable communication channels.

New!!: Information theory and Error detection and correction · See more »

Error exponent

In information theory, the error exponent of a channel code or source code is the rate at which the error probability decays exponentially with the block length of the code.
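
Concretely, if the error probability of a code family behaves as P_e(n) \approx e^{-nE} in the block length n, the error exponent is E; when the limit exists it can be written as

E = \lim_{n \to \infty} -\frac{1}{n} \ln P_e(n).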

New!!: Information theory and Error exponent · See more »

Error-correcting codes with feedback

In mathematics, computer science, telecommunication, information theory, and searching theory, error-correcting codes with feedback refers to error correcting codes designed to work in the presence of feedback from the receiver to the sender.

New!!: Information theory and Error-correcting codes with feedback · See more »

Erwin Lutwak

Erwin Lutwak (born 9 February 1946 in Chernivtsi, now Ukraine) is a mathematician.

New!!: Information theory and Erwin Lutwak · See more »

Estimation theory

Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component.

New!!: Information theory and Estimation theory · See more »

Etienne Vermeersch

Etienne Vermeersch (born May 2, 1934 in Sint-Michiels, nowadays part of Bruges) is a Belgian moral philosopher, skeptic, opinion maker, and debater.

New!!: Information theory and Etienne Vermeersch · See more »

Evolution of sexual reproduction

The evolution of sexual reproduction describes how sexually reproducing animals, plants, fungi and protists evolved from a common ancestor that was a single celled eukaryotic species.

New!!: Information theory and Evolution of sexual reproduction · See more »

Exergy

In thermodynamics, the exergy (in older usage, available work or availability) of a system is the maximum useful work possible during a process that brings the system into equilibrium with a heat reservoir.

New!!: Information theory and Exergy · See more »

Exformation

Exformation (originally spelled eksformation in Danish) is a term coined by Danish science writer Tor Nørretranders in his book The User Illusion, published in English in 1998.

New!!: Information theory and Exformation · See more »

External memory algorithm

In computing, external memory algorithms or out-of-core algorithms are algorithms that are designed to process data that is too large to fit into a computer's main memory at one time.

New!!: Information theory and External memory algorithm · See more »

Extreme physical information

Extreme physical information (EPI) is a principle first described and formulated in 1998 by B. Roy Frieden in Physics from Fisher Information: A Unification (1st ed.).

New!!: Information theory and Extreme physical information · See more »

Fano's inequality

In information theory, Fano's inequality (also known as the Fano converse and the Fano lemma) relates the average information lost in a noisy channel to the probability of the categorization error.
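
One standard statement: if \hat{X} is an estimate of a discrete random variable X (taking values in a finite alphabet \mathcal{X}) formed from an observation Y, and P_e = \Pr(\hat{X} \neq X), then

H(X \mid Y) \leq H_b(P_e) + P_e \log(|\mathcal{X}| - 1),

where H_b is the binary entropy function; a large residual uncertainty H(X \mid Y) therefore rules out a small probability of error.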

New!!: Information theory and Fano's inequality · See more »

Fazlollah Reza

Fazlollah Reza (فضل‌الله رضا) (born January 1, 1915) is an Iranian university professor.

New!!: Information theory and Fazlollah Reza · See more »

Feature Selection Toolbox

Feature Selection Toolbox (FST) is software primarily for feature selection in the machine learning domain, written in C++, developed at the Institute of Information Theory and Automation (UTIA), of the Czech Academy of Sciences.

New!!: Information theory and Feature Selection Toolbox · See more »

Fiber-optic communication

Fiber-optic communication is a method of transmitting information from one place to another by sending pulses of light through an optical fiber.

New!!: Information theory and Fiber-optic communication · See more »

First-move advantage in chess

The first-move advantage in chess is the inherent advantage of the player (White) who makes the first move in chess.

New!!: Information theory and First-move advantage in chess · See more »

First-order inductive learner

In machine learning, first-order inductive learner (FOIL) is a rule-based learning algorithm.

New!!: Information theory and First-order inductive learner · See more »

Fisher information

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information.
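
For a density f(x;\theta) satisfying the usual regularity conditions, it can be written as

I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right] = -\,\mathbb{E}\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X;\theta)\right],

and it appears in the Cramér–Rao bound as the reciprocal of the smallest achievable variance of an unbiased estimator of \theta.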

New!!: Information theory and Fisher information · See more »

Fisher information metric

In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space.

New!!: Information theory and Fisher information metric · See more »

Flemming Topsøe

Flemming Topsøe (born 25 August 1938 in Aarhus, Denmark) is a Danish mathematician, mathematics docent at University of Copenhagen where he has worked since 1964 and author of several mathematical science works, among them works about analysis, probability theory and information theory.

New!!: Information theory and Flemming Topsøe · See more »

Flow network

In graph theory, a flow network (also known as a transportation network) is a directed graph where each edge has a capacity and each edge receives a flow.

New!!: Information theory and Flow network · See more »

Forest informatics

Forest informatics is the combined science of forestry and informatics, with a special emphasis on the collection, management, and processing of data, information, and knowledge, and on the incorporation of informatics concepts and theories to enrich forest management and forest science; it bears a similar relationship to library science and information science.

New!!: Information theory and Forest informatics · See more »

Formal science

Formal sciences are formal language disciplines concerned with formal systems, such as logic, mathematics, statistics, theoretical computer science, robotics, information theory, game theory, systems theory, decision theory, and theoretical linguistics.

New!!: Information theory and Formal science · See more »

Formation matrix

In statistics and information theory, the expected formation matrix of a likelihood function L(\theta) is the matrix inverse of the Fisher information matrix of L(\theta), while the observed formation matrix of L(\theta) is the inverse of the observed information matrix of L(\theta).

New!!: Information theory and Formation matrix · See more »

Forward error correction

In telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.

New!!: Information theory and Forward error correction · See more »

Foundations and Trends in Communications and Information Theory

Foundations and Trends in Communications and Information Theory is a peer-reviewed academic journal that publishes long survey and tutorial articles in the field of communication and information theory.

New!!: Information theory and Foundations and Trends in Communications and Information Theory · See more »

Fourier–Motzkin elimination

Fourier–Motzkin elimination, also known as the FME method, is a mathematical algorithm for eliminating variables from a system of linear inequalities.

New!!: Information theory and Fourier–Motzkin elimination · See more »

Fred Dretske

Frederick Irwin "Fred" Dretske (December 9, 1932 – July 24, 2013) was an American philosopher noted for his contributions to epistemology and the philosophy of mind.

New!!: Information theory and Fred Dretske · See more »

Frederick Jelinek

Frederick Jelinek (18 November 1932 – 14 September 2010) was a Czech-American researcher in information theory, automatic speech recognition, and natural language processing.

New!!: Information theory and Frederick Jelinek · See more »

Freedman's paradox

In statistical analysis, Freedman's paradox, named after David Freedman, is a problem in model selection whereby predictor variables with no relationship to the dependent variable can pass tests of significance – both individually via a t-test, and jointly via an F-test for the significance of the regression.

New!!: Information theory and Freedman's paradox · See more »

Frieder Nake

Frieder Nake (born December 16, 1938 in Stuttgart, Germany) is a mathematician, computer scientist, and pioneer of computer art.

New!!: Information theory and Frieder Nake · See more »

Functional decomposition

In mathematics, functional decomposition is the process of resolving a functional relationship into its constituent parts in such a way that the original function can be reconstructed (i.e., recomposed) from those parts by function composition.

New!!: Information theory and Functional decomposition · See more »

Functional load

In linguistics and especially phonology, functional load (also referred to as phonemic load) refers to the importance of certain features in making distinctions in a language.

New!!: Information theory and Functional load · See more »

Fungible information

Fungible information is the information for which the means of encoding is not important.

New!!: Information theory and Fungible information · See more »

Gabor wavelet

Gabor wavelets are wavelets invented by Dennis Gabor using complex functions constructed to serve as a basis for Fourier transforms in information theory applications.

New!!: Information theory and Gabor wavelet · See more »

Gambling and information theory

Statistical inference might be thought of as gambling theory applied to the world around us.

New!!: Information theory and Gambling and information theory · See more »

Gary Anderson (designer)

Gary Dean Anderson (born 1947) is an influential graphic designer and architect.

New!!: Information theory and Gary Anderson (designer) · See more »

Gaylord, Michigan

Gaylord is a city in and the county seat of Otsego County, Michigan, United States.

New!!: Information theory and Gaylord, Michigan · See more »

Generalized distributive law

The generalized distributive law (GDL) is a generalization of the distributive property which gives rise to a general message passing algorithm.

New!!: Information theory and Generalized distributive law · See more »

Generalized entropy index

The generalized entropy index has been proposed as a measure of income inequality in a population.
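
A commonly quoted form of the index, for incomes y_1, \ldots, y_N with mean \bar{y} and a real parameter \alpha \neq 0, 1, is

GE(\alpha) = \frac{1}{N\,\alpha(\alpha-1)} \sum_{i=1}^{N}\left[\left(\frac{y_i}{\bar{y}}\right)^{\alpha} - 1\right],

with the limits \alpha \to 1 and \alpha \to 0 giving the Theil index and the mean log deviation, respectively.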

New!!: Information theory and Generalized entropy index · See more »

Generalized relative entropy

Generalized relative entropy (\epsilon-relative entropy) is a measure of dissimilarity between two quantum states.

New!!: Information theory and Generalized relative entropy · See more »

Generative art

Generative art refers to art that in whole or in part has been created with the use of an autonomous system.

New!!: Information theory and Generative art · See more »

Generative science

Generative science is an area of research that explores the natural world and its complex behaviours.

New!!: Information theory and Generative science · See more »

Genetic code

The genetic code is the set of rules used by living cells to translate information encoded within genetic material (DNA or mRNA sequences) into proteins.

New!!: Information theory and Genetic code · See more »

George Armitage Miller

George Armitage Miller (February 3, 1920 – July 22, 2012) was an American psychologist who was one of the founders of the cognitive psychology field.

New!!: Information theory and George Armitage Miller · See more »

George Gilder

George Franklin Gilder (born November 29, 1939) is an American investor, writer, economist, techno-utopian advocate, and co-founder of the Discovery Institute.

New!!: Information theory and George Gilder · See more »

George Klir

George Jiří Klir (April 22, 1932 Prague, Czechoslovakia – May 27, 2016 Binghamton, USA) was a Czech-American computer scientist and professor of systems sciences at Binghamton University in Binghamton, New York.

New!!: Information theory and George Klir · See more »

Gibbs' inequality

In information theory, Gibbs' inequality is a statement about the mathematical entropy of a discrete probability distribution.
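
For probability distributions p = (p_1, \ldots, p_n) and q = (q_1, \ldots, q_n) it states that

-\sum_{i} p_i \log p_i \;\leq\; -\sum_{i} p_i \log q_i,

with equality if and only if p = q; equivalently, the Kullback–Leibler divergence D(p \| q) is non-negative.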

New!!: Information theory and Gibbs' inequality · See more »

Gleason's theorem

Gleason's theorem (named after Andrew M. Gleason) is a mathematical result which shows that the rule one uses to calculate probabilities in quantum physics follows logically from particular assumptions about how measurements are represented mathematically.

New!!: Information theory and Gleason's theorem · See more »

Glossary of electrical and electronics engineering

Most of the terms listed in Wikipedia glossaries are already defined and explained within Wikipedia itself.

New!!: Information theory and Glossary of electrical and electronics engineering · See more »

Golomb ruler

In mathematics, a Golomb ruler is a set of marks at integer positions along an imaginary ruler such that no two pairs of marks are the same distance apart.

New!!: Information theory and Golomb ruler · See more »

GOR method

The GOR method (Garnier-Osguthorpe-Robson) is an information theory-based method for the prediction of secondary structures in proteins.

New!!: Information theory and GOR method · See more »

Gottfried Ungerboeck

Gottfried Ungerboeck (born 15 March 1940, Vienna) is an Austrian communications engineer.

New!!: Information theory and Gottfried Ungerboeck · See more »

Gottfried Wilhelm Leibniz Prize

The Gottfried Wilhelm Leibniz Prize is a program of the Deutsche Forschungsgemeinschaft (the German Research Foundation) which awards prizes “to exceptional scientists and academics for their outstanding achievements in the field of research.” It was established in 1985 and up to ten prizes are awarded annually to individuals or research groups working at a research institution in Germany or at a German research institution abroad.

New!!: Information theory and Gottfried Wilhelm Leibniz Prize · See more »

Grammatical Man

Grammatical Man: Information, Entropy, Language, and Life is a 1982 book written by the Evening Standard's Washington correspondent, Jeremy Campbell.

New!!: Information theory and Grammatical Man · See more »

Graph coloring

In graph theory, graph coloring is a special case of graph labeling; it is an assignment of labels traditionally called "colors" to elements of a graph subject to certain constraints.

New!!: Information theory and Graph coloring · See more »

Graph entropy

In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused.

New!!: Information theory and Graph entropy · See more »

Gregory Raleigh

Gregory “Greg” Raleigh (born in Orange, California in 1961), is an American radio scientist, inventor, and entrepreneur who has made contributions in the fields of wireless communication, information theory, mobile operating systems, medical devices, and network virtualization.

New!!: Information theory and Gregory Raleigh · See more »

Group testing

In statistics and combinatorial mathematics, group testing is any procedure that breaks up the task of identifying certain objects into tests on groups of items, rather than on individual ones.

New!!: Information theory and Group testing · See more »

Groupe des Dix

The Groupe des Dix (the Group of Ten) was the name given to a group of notable French personalities (mostly philosophers and scientists) who would regularly meet up between 1969 and 1976.

New!!: Information theory and Groupe des Dix · See more »

H. K. Kesavan

Hiremagalur Krishnaswamy Kesavan, known as H.K. Kesavan, (14 June 1926 – 26 November 2014) was Distinguished Professor Emeritus in the Faculty of Engineering at the University of Waterloo, Ontario, Canada.

New!!: Information theory and H. K. Kesavan · See more »

H1 neuron

The H1 neuron is located in the visual cortex of true flies of the order Diptera and mediates motor responses to visual stimuli.

New!!: Information theory and H1 neuron · See more »

Hamming distance

In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different.
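
The definition translates directly into code; a minimal Python sketch (the function name here is chosen only for illustration):

```python
def hamming_distance(a, b):
    """Count the positions at which two equal-length sequences differ."""
    if len(a) != len(b):
        raise ValueError("sequences must have equal length")
    return sum(x != y for x, y in zip(a, b))

# "karolin" and "kathrin" differ in 3 positions; the two bit strings in 2.
assert hamming_distance("karolin", "kathrin") == 3
assert hamming_distance("1011101", "1001001") == 2
```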

New!!: Information theory and Hamming distance · See more »

Hamming weight

The Hamming weight of a string is the number of symbols that are different from the zero-symbol of the alphabet used.

New!!: Information theory and Hamming weight · See more »

Hanns Malissa

Hanns Malissa (October 8, 1920 – June 22, 2010) was an Austrian analytical chemist and environmental chemist who published about 250 scientific papers and several books.

New!!: Information theory and Hanns Malissa · See more »

Hans Grassmann

Hans Grassmann (Bamberg, 21 May 1960) is a German physicist, writer and entrepreneur, who teaches and works in Italy.

New!!: Information theory and Hans Grassmann · See more »

Hard-core predicate

In cryptography, a hard-core predicate of a one-way function f is a predicate b (i.e., a function whose output is a single bit) such that b(x) is easy to compute given x but hard to compute given only f(x).

New!!: Information theory and Hard-core predicate · See more »

Harold Pender Award

The Harold Pender Award, initiated in 1972 and named after founding Dean Harold Pender, is given by the Faculty of the School of Engineering and Applied Science of the University of Pennsylvania to an outstanding member of the engineering profession who has achieved distinction by significant contributions to society.

New!!: Information theory and Harold Pender Award · See more »

Harry Nyquist

Harry Nyquist (born Harry Theodor Nyqvist; February 7, 1889 – April 4, 1976) was a Swedish-born American electronic engineer who made important contributions to communication theory.

New!!: Information theory and Harry Nyquist · See more »

Harvey Jerome Brudner

Harvey Jerome Brudner (May 29, 1931 - September 15, 2009) was a theoretical physicist and engineer.

New!!: Information theory and Harvey Jerome Brudner · See more »

Hendrik C. Ferreira

Hendrik Christoffel Ferreira is a professor in Digital Communications and Information Theory at the University of Johannesburg, Johannesburg, South Africa.

New!!: Information theory and Hendrik C. Ferreira · See more »

Hendrik Wade Bode

Hendrik Wade Bode (1905–1982) was an American engineer and scientist known for his pioneering contributions to control systems theory and telecommunications.

New!!: Information theory and Hendrik Wade Bode · See more »

Henri Atlan

Henri Atlan (born 27 December 1931 in Blida, French Algeria) is a French biophysicist and philosopher.

New!!: Information theory and Henri Atlan · See more »

Henri Daniel Rathgeber

Henri Daniel Rathgeber (11 June 1908 – 20 July 1995) was an Australian physicist who studied cosmic rays but considered his most important contribution to be an economic theory that explains how entropy causes unemployment.

New!!: Information theory and Henri Daniel Rathgeber · See more »

Henri Lefebvre

Henri Lefebvre (16 June 1901 – 29 June 1991) was a French Marxist philosopher and sociologist, best known for pioneering the critique of everyday life, for introducing the concepts of the right to the city and the production of social space, and for his work on dialectics, alienation, and criticism of Stalinism, existentialism, and structuralism.

New!!: Information theory and Henri Lefebvre · See more »

Henry Earl Singleton

Henry Earl Singleton (November 27, 1916 – August 31, 1999) was an American electrical engineer, business executive, and rancher/land owner.

New!!: Information theory and Henry Earl Singleton · See more »

Henry Landau

Henry Jacob Landau is an American mathematician, known for his contributions to information theory, in particular to the theory of bandlimited functions and to moment problems.

New!!: Information theory and Henry Landau · See more »

Henry O. Pollak

Henry Otto Pollak (born December 13, 1927) is an Austrian-American mathematician, known for his contributions to information theory.

New!!: Information theory and Henry O. Pollak · See more »

Henry Quastler

Henry Quastler (November 11, 1908 – July 4, 1963) was an Austrian physician and radiologist who became a pioneer in the field of information theory applied to biology after emigrating to America.

New!!: Information theory and Henry Quastler · See more »

Hick's law

Hick's law, or the Hick–Hyman law, named after British and American psychologists William Edmund Hick and Ray Hyman, describes the time it takes for a person to make a decision as a result of the possible choices he or she has: increasing the number of choices will increase the decision time logarithmically.
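
The law is usually written as

T = b \log_2(n + 1),

where n is the number of equally likely alternatives, b is an empirically fitted constant, and the "+1" reflects the additional uncertainty of whether to respond at all.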

New!!: Information theory and Hick's law · See more »

Hideki Imai

Hideki Imai is an information theorist and cryptographer, currently the director of the Research Center for Information Security (RCIS) at the National Institute of Advanced Industrial Science and Technology (AIST) and a full professor at Chuo University.

New!!: Information theory and Hideki Imai · See more »

His Master's Voice (novel)

His Master's Voice (original Polish title: Głos Pana) is a science fiction novel on the "message from space" theme written by Polish writer Stanisław Lem.

New!!: Information theory and His Master's Voice (novel) · See more »

History of artificial intelligence

The history of Artificial Intelligence (AI) began in antiquity, with myths, stories and rumors of artificial beings endowed with intelligence or consciousness by master craftsmen; as Pamela McCorduck writes, AI began with "an ancient wish to forge the gods." The seeds of modern AI were planted by classical philosophers who attempted to describe the process of human thinking as the mechanical manipulation of symbols.

New!!: Information theory and History of artificial intelligence · See more »

History of communication studies

Various aspects of communication have been the subject of study since ancient times, and the approach eventually developed into the academic discipline known today as communication studies.

New!!: Information theory and History of communication studies · See more »

History of computer science

The history of computer science began long before our modern discipline of computer science.

New!!: Information theory and History of computer science · See more »

History of entropy

The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work.

New!!: Information theory and History of entropy · See more »

History of information theory

The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.

New!!: Information theory and History of information theory · See more »

History of machine translation

Machine translation is a sub-field of computational linguistics that investigates the use of software to translate text or speech from one natural language to another.

New!!: Information theory and History of machine translation · See more »

History of mathematics

The area of study known as the history of mathematics is primarily an investigation into the origin of discoveries in mathematics and, to a lesser extent, an investigation into the mathematical methods and notation of the past.

New!!: Information theory and History of mathematics · See more »

History of molecular biology

The history of molecular biology begins in the 1930s with the convergence of various, previously distinct biological and physical disciplines: biochemistry, genetics, microbiology, virology and physics.

New!!: Information theory and History of molecular biology · See more »

History of randomness

In ancient history, the concepts of chance and randomness were intertwined with that of fate.

New!!: Information theory and History of randomness · See more »

History of science

The history of science is the study of the development of science and scientific knowledge, including both the natural and social sciences.

New!!: Information theory and History of science · See more »

History of the Internet

The history of the Internet begins with the development of electronic computers in the 1950s.

New!!: Information theory and History of the Internet · See more »

History of thermodynamics

The history of thermodynamics is a fundamental strand in the history of physics, the history of chemistry, and the history of science in general.

New!!: Information theory and History of thermodynamics · See more »

Holographic principle

The holographic principle is a principle of string theories and a supposed property of quantum gravity that states that the description of a volume of space can be thought of as encoded on a lower-dimensional boundary to the region—preferably a light-like boundary like a gravitational horizon.

New!!: Information theory and Holographic principle · See more »

Homeostasis

Homeostasis is the tendency of organisms to auto-regulate and maintain their internal environment in a stable state.

New!!: Information theory and Homeostasis · See more »

Horizontal correlation

Horizontal correlation is a methodology for gene sequence analysis.

New!!: Information theory and Horizontal correlation · See more »

Hubert Yockey

Hubert P. Yockey (April 15, 1916 – January 31, 2016) was an American physicist and information theorist.

New!!: Information theory and Hubert Yockey · See more »

Huffman coding

In computer science and information theory, a Huffman code is a particular type of optimal prefix code that is commonly used for lossless data compression.
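
A minimal Python sketch of the greedy construction, which repeatedly merges the two least frequent subtrees (the function name and tie-breaking scheme here are illustrative):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a binary prefix code from the symbol frequencies of `text`."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate case: one distinct symbol
        return {sym: "0" for sym in freq}
    # Heap entries: (weight, tie-breaker, {symbol: codeword built so far})
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)      # two least frequent subtrees
        w2, _, c2 = heapq.heappop(heap)
        # Prepend a bit that distinguishes the two merged subtrees.
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, next_id, merged))
        next_id += 1
    return heap[0][2]

code = huffman_code("this is an example of a huffman tree")
# Frequent symbols such as ' ' receive shorter codewords than rare ones.
```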

New!!: Information theory and Huffman coding · See more »

Human performance modeling

Human performance modeling (HPM) is a method of quantifying human behavior, cognition, and processes; a tool used by human factors researchers and practitioners for both the analysis of human function and for the development of systems designed for optimal user experience and interaction.

New!!: Information theory and Human performance modeling · See more »

Hypnosis

Hypnosis is a state of human consciousness involving focused attention and reduced peripheral awareness and an enhanced capacity to respond to suggestion.

New!!: Information theory and Hypnosis · See more »

Ideal observer analysis

Ideal observer analysis is a method for investigating how information is processed in a perceptual system.

New!!: Information theory and Ideal observer analysis · See more »

Ideal tasks

Ideal tasks arise during task analysis.

New!!: Information theory and Ideal tasks · See more »

IEEE International Symposium on Information Theory

The IEEE International Symposium on Information Theory (ISIT) is the flagship meeting of the IEEE Information Theory Society.

New!!: Information theory and IEEE International Symposium on Information Theory · See more »

IEEE Transactions on Information Theory

IEEE Transactions on Information Theory is a monthly peer-reviewed scientific journal published by the IEEE Information Theory Society.

New!!: Information theory and IEEE Transactions on Information Theory · See more »

IFISC

The Instituto de Fisica Interdisciplinar y Sistemas Complejos (IFISC), or in English the Institute for Cross-Disciplinary Physics and Complex Systems, is a Research Institute in Europe involved in the study of complex systems and complex phenomena, staffed by permanent professors and researchers, non-tenured Postdoctoral researchers and Ph.D. and Master students.

New!!: Information theory and IFISC · See more »

Ilan Sadeh

Ilan Sadeh (born June 1, 1953) is an Israeli IT theoretician, entrepreneur, and human rights activist.

New!!: Information theory and Ilan Sadeh · See more »

Illusory superiority

In the field of social psychology, illusory superiority is a condition of cognitive bias whereby a person overestimates their own qualities and abilities, in relation to the same qualities and abilities of other persons.

New!!: Information theory and Illusory superiority · See more »

IMA Journal of Mathematical Control and Information

The IMA Journal of Mathematical Control and Information is published by Oxford University Press on behalf of the Institute of Mathematics and its Applications.

New!!: Information theory and IMA Journal of Mathematical Control and Information · See more »

Implicit data structure

In computer science, an implicit data structure or space-efficient data structure is a data structure that stores very little information other than the main or required data: a data structure that requires low overhead.

New!!: Information theory and Implicit data structure · See more »

Imre Csiszár

Imre Csiszár is a Hungarian mathematician with contributions to information theory and probability theory.

New!!: Information theory and Imre Csiszár · See more »

Income inequality metrics

Income inequality metrics or income distribution metrics are used by social scientists to measure the distribution of income, and economic inequality among the participants in a particular economy, such as that of a specific country or of the world in general.

New!!: Information theory and Income inequality metrics · See more »

Incomplete Nature

Incomplete Nature: How Mind Emerged from Matter is a 2011 book by biological anthropologist Terrence Deacon.

New!!: Information theory and Incomplete Nature · See more »

Index of electrical engineering articles

This is an alphabetical list of articles pertaining specifically to electrical and electronics engineering.

New!!: Information theory and Index of electrical engineering articles · See more »

Index of information theory articles

This is a list of information theory topics, by Wikipedia page.

New!!: Information theory and Index of information theory articles · See more »

Index of optics articles

Optics is the branch of physics which involves the behavior and properties of light, including its interactions with matter and the construction of instruments that use or detect it.

New!!: Information theory and Index of optics articles · See more »

Index of philosophy articles (I–Q)

No description.

New!!: Information theory and Index of philosophy articles (I–Q) · See more »

Inductive probability

Inductive probability attempts to give the probability of future events based on past events.

New!!: Information theory and Inductive probability · See more »

Inequalities in information theory

Inequalities are very important in the study of information theory.

New!!: Information theory and Inequalities in information theory · See more »

Info-metrics

Info-metrics is an interdisciplinary approach to scientific modeling, inference and efficient information processing.

New!!: Information theory and Info-metrics · See more »

Informatics

Informatics is a branch of information engineering.

New!!: Information theory and Informatics · See more »

Information

Information is any entity or form that provides the answer to a question of some kind or resolves uncertainty.

New!!: Information theory and Information · See more »

Information (disambiguation)

Information is a term with several distinct uses; this disambiguation entry lists related topics.

New!!: Information theory and Information (disambiguation) · See more »

Information Age

The Information Age (also known as the Computer Age, Digital Age, or New Media Age) is a period in human history, beginning in the latter half of the 20th century, characterized by the rapid shift from the traditional industry of the Industrial Revolution to an economy based on information technology.

New!!: Information theory and Information Age · See more »

Information algebra

The term "information algebra" refers to mathematical techniques of information processing.

New!!: Information theory and Information algebra · See more »

Information bottleneck method

The information bottleneck method is a technique in information theory introduced by Naftali Tishby, Fernando C. Pereira, and William Bialek.

New!!: Information theory and Information bottleneck method · See more »

Information capital

Information capital is a concept which asserts that information has intrinsic value which can be shared and leveraged within and between organizations.

New!!: Information theory and Information capital · See more »

Information diagram

An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information.

New!!: Information theory and Information diagram · See more »

Information dimension

In information theory, information dimension is an information measure for random vectors in Euclidean space, based on the normalized entropy of finely quantized versions of the random vectors.

New!!: Information theory and Information dimension · See more »

Information distance

Information distance is the distance between two finite objects (represented as computer files) expressed as the number of bits in the shortest program which transforms one object into the other one or vice versa on a universal computer.

New!!: Information theory and Information distance · See more »

Information exchange

Information exchange or information sharing are informal terms that can either refer to bidirectional information transmission/information transfer in telecommunications and computer science or communication seen from a system-theoretic or information-theoretic point of view.

New!!: Information theory and Information exchange · See more »

Information flow (disambiguation)

Information flow can have one of several meanings.

New!!: Information theory and Information flow (disambiguation) · See more »

Information flow (information theory)

Information flow in an information theoretical context is the transfer of information from a variable x to a variable y in a given process.

New!!: Information theory and Information flow (information theory) · See more »

Information gain in decision trees

In information theory and machine learning, information gain is a synonym for Kullback–Leibler divergence.
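
In the decision-tree setting it is usually computed as

IG(T, a) = H(T) - H(T \mid a),

the reduction in entropy of the target variable in node T obtained by splitting on attribute a, which equals the mutual information between the attribute and the target.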

New!!: Information theory and Information gain in decision trees · See more »

Information history

Information history may refer to the history of each of the categories listed below (or to combinations of them).

New!!: Information theory and Information history · See more »

Information processor

An information processor or information processing system, as its name suggests, is a system (be it electrical, mechanical or biological) which takes information (a sequence of enumerated symbols or states) in one form and processes (transforms) it into another form, e.g. to statistics, by an algorithmic process.

New!!: Information theory and Information processor · See more »

Information projection

In information theory, the information projection or I-projection of a probability distribution q onto a set of distributions P is p^* = \arg\min_{p \in P} D_{\mathrm{KL}}(p \| q), where D_{\mathrm{KL}} denotes the Kullback–Leibler divergence. Viewing the Kullback–Leibler divergence as a measure of distance, the I-projection p^* is the "closest" distribution to q of all the distributions in P. The I-projection is useful in setting up information geometry, notably because of the following inequality, valid when P is convex: D_{\mathrm{KL}}(p \| q) \geq D_{\mathrm{KL}}(p \| p^*) + D_{\mathrm{KL}}(p^* \| q). This inequality can be interpreted as an information-geometric version of Pythagoras' theorem, with the KL divergence playing the role of squared Euclidean distance.

New!!: Information theory and Information projection · See more »

Information revolution

The term information revolution describes current economic, social and technological trends beyond the Industrial Revolution.

New!!: Information theory and Information revolution · See more »

Information science

Information science is a field primarily concerned with the analysis, collection, classification, manipulation, storage, retrieval, movement, dissemination, and protection of information.

New!!: Information theory and Information science · See more »

Information theory and measure theory

This article discusses how information theory (a branch of mathematics studying the transmission, processing and storage of information) is related to measure theory (a branch of mathematics related to integration and probability).

New!!: Information theory and Information theory and measure theory · See more »

Information-theoretic death

Information-theoretic death is the scrambling of information within a brain to such an extent that recovery of the original person becomes theoretically impossible.

New!!: Information theory and Information-theoretic death · See more »

Information-theoretic security

Information-theoretic security is the property of a cryptosystem whose security derives purely from information theory, so that it cannot be broken even by an adversary with unlimited computing power.

New!!: Information theory and Information-theoretic security · See more »

Ingleton's inequality

In mathematics, Ingleton's inequality is an inequality that is satisfied by the rank function of any representable matroid.

New!!: Information theory and Ingleton's inequality · See more »

Inquiry

An inquiry is any process that has the aim of augmenting knowledge, resolving doubt, or solving a problem.

New!!: Information theory and Inquiry · See more »

Intelligent design

Intelligent design (ID) is a religious argument for the existence of God, presented by its proponents as "an evidence-based scientific theory about life's origins" that "captured headlines for its bold attempt to rewrite the basic rules of science and its claim to have found indisputable evidence of a God-like being" (Numbers 2006, p. 373).

New!!: Information theory and Intelligent design · See more »

Interactive media

Interactive media normally refers to products and services on digital computer-based systems which respond to the user's actions by presenting content such as text, moving image, animation, video, audio, and video games.

New!!: Information theory and Interactive media · See more »

International Conference on Information Processing in Sensor Networks

IPSN, the IEEE/ACM International Conference on Information Processing in Sensor Networks, is an academic conference on sensor networks with its main focus on information processing aspects of sensor networks.

New!!: Information theory and International Conference on Information Processing in Sensor Networks · See more »

Iran Workshop on Communication and Information Theory

The Iran Workshop on Communication and Information Theory (IWCIT) (کارگاه ملی نظریه اطلاعات و مخابرات) is an international academic workshop held annually at one of Iran's university campuses.

New!!: Information theory and Iran Workshop on Communication and Information Theory · See more »

Irving S. Reed

Irving Stoy Reed (November 12, 1923 – September 11, 2012) was a mathematician and engineer.

New!!: Information theory and Irving S. Reed · See more »

István Vincze (mathematician)

István Vincze was a Hungarian mathematician, known for his contributions to number theory, non-parametric statistics, empirical distributions, the Cramér–Rao inequality, and information theory.

New!!: Information theory and István Vincze (mathematician) · See more »

Ivan Sutherland

Ivan Edward Sutherland (born May 16, 1938) is an American computer scientist and Internet pioneer, widely regarded as the "father of computer graphics." His early work in computer graphics as well as his teaching with David C. Evans in that subject at the University of Utah in the 1970s was pioneering in the field.

New!!: Information theory and Ivan Sutherland · See more »

Jack Wolf

Jack Keil Wolf (March 14, 1935 – May 12, 2011) was an American researcher in information theory and coding theory.

New!!: Information theory and Jack Wolf · See more »

Jacob Wolfowitz

Jacob Wolfowitz (March 19, 1910 – July 16, 1981) was a Polish-born American statistician and Shannon Award-winning information theorist.

New!!: Information theory and Jacob Wolfowitz · See more »

Jaikumar Radhakrishnan

Jaikumar Radhakrishnan (born 30 May 1964) is an Indian computer scientist specialising in combinatorics and communication complexity.

New!!: Information theory and Jaikumar Radhakrishnan · See more »

James A. Krumhansl

James Arthur "Jim" Krumhansl (August 2, 1919 – May 6, 2004) was an American physicist who specialized in condensed matter physics and materials science.

New!!: Information theory and James A. Krumhansl · See more »

James Massey

James Lee Massey (February 11, 1934 – June 16, 2013) was an information theorist and cryptographer, Professor Emeritus of Digital Technology at ETH Zurich.

New!!: Information theory and James Massey · See more »

James Tenney

James Tenney (August 10, 1934 – August 24, 2006) was an American composer and music theorist.

New!!: Information theory and James Tenney · See more »

János Aczél (mathematician)

János Dezső Aczél (born 26 December 1924) is a Hungarian-Canadian mathematician, who specializes in functional equations and information theory.

New!!: Information theory and János Aczél (mathematician) · See more »

Jim K. Omura

Jimmy K. Omura (born September 8, 1940 in San Jose, California) is an electrical engineer and information theorist.

New!!: Information theory and Jim K. Omura · See more »

Joachim Hagenauer

Joachim Hagenauer (born 29 July 1941) is an information theorist and professor emeritus at Technical University of Munich.

New!!: Information theory and Joachim Hagenauer · See more »

John Larry Kelly Jr.

John Larry Kelly Jr. (1923–1965) was a scientist who worked at Bell Labs.

New!!: Information theory and John Larry Kelly Jr. · See more »

John Preskill

John Phillip Preskill (born January 19, 1953) is an American theoretical physicist and the Richard P. Feynman Professor of Theoretical Physics at the California Institute of Technology (Caltech).

New!!: Information theory and John Preskill · See more »

John R. Pierce

John Robinson Pierce (March 27, 1910 – April 2, 2002) was an American engineer and author.

New!!: Information theory and John R. Pierce · See more »

John Scales Avery

John Scales Avery (born in 1933 in Lebanon to American parents) is a theoretical chemist noted for his research publications in quantum chemistry, thermodynamics, evolution, and history of science.

New!!: Information theory and John Scales Avery · See more »

John Wozencraft

John McReynolds "Jack" Wozencraft (September 30, 1925 – August 31, 2009) was an electrical engineer and information theorist, professor emeritus at the Massachusetts Institute of Technology.

New!!: Information theory and John Wozencraft · See more »

Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
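
For example, a small Python sketch with a made-up joint distribution of two binary variables:

import math

def joint_entropy(joint):
    """H(X, Y) in bits, given a dict {(x, y): probability}."""
    return -sum(p * math.log2(p) for p in joint.values() if p > 0)

# Made-up joint distribution of two binary variables.
pxy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
print(joint_entropy(pxy))  # 1.75 bits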

New!!: Information theory and Joint entropy · See more »

Joint quantum entropy

The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory.

New!!: Information theory and Joint quantum entropy · See more »

Joint source and channel coding

In information theory, joint source–channel coding is the encoding of a redundant information source for transmission over a noisy channel, and the corresponding decoding, using a single code instead of the more conventional steps of source coding followed by channel coding.

New!!: Information theory and Joint source and channel coding · See more »

Jorma Rissanen

Jorma J. Rissanen (born 20 October 1932) is an information theorist, known for inventing the minimum description length principle and practical approaches to arithmetic coding for lossless data compression.

New!!: Information theory and Jorma Rissanen · See more »

Joseph Kampé de Fériet

Marie-Joseph Kampé de Fériet (Paris, 14 May 1893 – Villeneuve d'Ascq, 6 April 1982) was professor at Université Lille Nord de France from 1919 to 1969.

New!!: Information theory and Joseph Kampé de Fériet · See more »

Josiah Willard Gibbs

Josiah Willard Gibbs (February 11, 1839 – April 28, 1903) was an American scientist who made important theoretical contributions to physics, chemistry, and mathematics.

New!!: Information theory and Josiah Willard Gibbs · See more »

Juan Gualterio Roederer

Juan G. Roederer is a professor of physics emeritus at the University of Alaska Fairbanks (UAF).

New!!: Information theory and Juan Gualterio Roederer · See more »

Kadir–Brady saliency detector

The Kadir–Brady saliency detector extracts features of objects in images that are distinct and representative.

New!!: Information theory and Kadir–Brady saliency detector · See more »

Katalin Marton

Katalin Marton is a Hungarian mathematician, born in 1941 in Budapest.

New!!: Information theory and Katalin Marton · See more »

Kazimierz Urbanik

Kazimierz Urbanik (February 5, 1930 – May 29, 2005) was a prominent member of the Polish School of Mathematics.

New!!: Information theory and Kazimierz Urbanik · See more »

Kees Schouhamer Immink

Kornelis Antonie "Kees" Schouhamer Immink (born 18 December 1946) is a Dutch scientist, inventor, and entrepreneur, who pioneered and advanced the era of digital audio, video, and data recording, including popular digital media such as Compact Disc, DVD and Blu-ray Disc.

New!!: Information theory and Kees Schouhamer Immink · See more »

Keith Martin Ball

Keith Martin Ball FRS FRSE (born 26 December 1960) is a mathematician and professor at the University of Warwick.

New!!: Information theory and Keith Martin Ball · See more »

Kenneth A. Loparo

Kenneth A. Loparo is Nord Professor of Engineering and Chair of the Department of Electrical Engineering and Computer Science at Case Western Reserve University, OH, USA, with which he has been affiliated since 1979.

New!!: Information theory and Kenneth A. Loparo · See more »

Kenneth M. Sayre

Kenneth M. Sayre is an American philosopher who spent most of his career at the University of Notre Dame (ND).

New!!: Information theory and Kenneth M. Sayre · See more »

Key size

In cryptography, key size or key length is the number of bits in a key used by a cryptographic algorithm (such as a cipher).

New!!: Information theory and Key size · See more »

Klaus Scherrer

Klaus Scherrer (born 10 December 1931 in Schaffhouse) is a French biologist of Swiss nationality.

New!!: Information theory and Klaus Scherrer · See more »

Knowledge management

Knowledge management (KM) is the process of creating, sharing, using and managing the knowledge and information of an organisation.

New!!: Information theory and Knowledge management · See more »

Knowledge retrieval

Knowledge retrieval (KR) seeks to return information in a structured form, consistent with human cognitive processes as opposed to simple lists of data items.

New!!: Information theory and Knowledge retrieval · See more »

Kraft–McMillan inequality

In coding theory, the Kraft–McMillan inequality gives a necessary and sufficient condition for the existence of a prefix code (in Leon G. Kraft's version) or a uniquely decodable code (in Brockway McMillan's version) for a given set of codeword lengths.
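
For example, a binary prefix code with lengths l_1, ..., l_n exists if and only if the sum of 2^(-l_i) is at most 1; a quick check of the Kraft sum in Python (the codeword lengths are made up):

def kraft_sum(lengths, r=2):
    """Kraft sum for codeword lengths over an r-ary code alphabet."""
    return sum(r ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> a prefix code exists (e.g. 0, 10, 110, 111)
print(kraft_sum([1, 1, 2]))     # 1.25 -> no uniquely decodable code with these lengths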

New!!: Information theory and Kraft–McMillan inequality · See more »

Krichevsky–Trofimov estimator

In information theory, given an unknown stationary source π with alphabet A and a sample w from π, the Krichevsky–Trofimov (KT) estimator produces an estimate p_i(w) of the probability of each symbol i ∈ A.
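
For a finite alphabet the KT estimate is the add-one-half rule p_i(w) = (c_i + 1/2) / (n + |A|/2), where c_i is the number of occurrences of symbol i in w and n is the length of w; a minimal Python sketch with toy counts:

def kt_estimate(counts, symbol):
    """Krichevsky–Trofimov (add-1/2) estimate of the next-symbol probability.

    The keys of `counts` enumerate the alphabet, so symbols that have not yet
    been observed should be present with a count of zero.
    """
    n = sum(counts.values())
    return (counts.get(symbol, 0) + 0.5) / (n + 0.5 * len(counts))

counts = {'a': 3, 'b': 1}          # toy counts over the alphabet {'a', 'b'}
print(kt_estimate(counts, 'a'))    # (3 + 0.5) / (4 + 1) = 0.7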

New!!: Information theory and Krichevsky–Trofimov estimator · See more »

Kullback's inequality

In information theory and statistics, Kullback's inequality is a lower bound on the Kullback–Leibler divergence expressed in terms of the large deviations rate function.

New!!: Information theory and Kullback's inequality · See more »

Landauer's principle

Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation.

New!!: Information theory and Landauer's principle · See more »

Large deviations theory

In probability theory, the theory of large deviations concerns the asymptotic behaviour of remote tails of sequences of probability distributions.

New!!: Information theory and Large deviations theory · See more »

Lawrence J. Fogel

Dr. Lawrence J. Fogel was an American engineer widely regarded as a pioneer of evolutionary programming.

New!!: Information theory and Lawrence J. Fogel · See more »

Léon Brillouin

Léon Nicolas Brillouin (August 7, 1889 – October 4, 1969) was a French physicist.

New!!: Information theory and Léon Brillouin · See more »

Learning

Learning is the process of acquiring new or modifying existing knowledge, behaviors, skills, values, or preferences.

New!!: Information theory and Learning · See more »

Leo Szilard

Leo Szilard (Szilárd Leó; Leo Spitz until age 2; February 11, 1898 – May 30, 1964) was a Hungarian-German-American physicist and inventor.

New!!: Information theory and Leo Szilard · See more »

Leonard Schulman

Leonard J. Y. Schulman (born September 14, 1963) is Professor of Computer Science in the Computing and Mathematical Sciences Department at the California Institute of Technology.

New!!: Information theory and Leonard Schulman · See more »

Leonid Levin

Leonid Anatolievich Levin (Леони́д Анато́льевич Ле́вин; Леоні́д Анато́лійович Ле́він; born November 2, 1948) is a Soviet-American computer scientist.

New!!: Information theory and Leonid Levin · See more »

Levenshtein distance

In information theory, linguistics and computer science, the Levenshtein distance is a string metric for measuring the difference between two sequences.
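
A standard dynamic-programming implementation in Python, keeping only two rows of the edit-distance table:

def levenshtein(a, b):
    """Minimum number of single-character insertions, deletions and
    substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # 3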

New!!: Information theory and Levenshtein distance · See more »

Library of Congress Classification:Class Q -- Science

Class Q: Science is a classification used by the Library of Congress Classification system.

New!!: Information theory and Library of Congress Classification:Class Q -- Science · See more »

Library science

Library science (often termed library studies, library and information science, bibliothecography, library economy) is an interdisciplinary or multidisciplinary field that applies the practices, perspectives, and tools of management, information technology, education, and other areas to libraries; the collection, organization, preservation, and dissemination of information resources; and the political economy of information.

New!!: Information theory and Library science · See more »

Limiting density of discrete points

In information theory, the limiting density of discrete points is an adjustment to the formula of Claude Shannon for differential entropy.

New!!: Information theory and Limiting density of discrete points · See more »

Linear programming decoding

In information theory and coding theory, linear programming decoding (LP decoding) is a decoding method which uses concepts from linear programming (LP) theory to solve decoding problems.

New!!: Information theory and Linear programming decoding · See more »

Linear response function

A linear response function describes the input-output relationship of a signal transducer such as a radio turning electromagnetic waves into music or a neuron turning synaptic input into a response.

New!!: Information theory and Linear response function · See more »

List of academic fields

The following outline is provided as an overview of and topical guide to academic disciplines: An academic discipline or field of study is known as a branch of knowledge.

New!!: Information theory and List of academic fields · See more »

List of African-American inventors and scientists

This list of black inventors and scientists documents many of the African Americans who have invented a multitude of items or made discoveries in the course of their lives.

New!!: Information theory and List of African-American inventors and scientists · See more »

List of amateur radio modes

The following is a list of the modes of radio communication used in the amateur radio hobby.

New!!: Information theory and List of amateur radio modes · See more »

List of awards named after people

This is a list of prizes that are named after people.

New!!: Information theory and List of awards named after people · See more »

List of cognitive biases

Cognitive biases are systematic patterns of deviation from norm or rationality in judgment, and are often studied in psychology and behavioral economics.

New!!: Information theory and List of cognitive biases · See more »

List of Columbia University alumni

This is a sorted list of notable persons who are alumni of Columbia University, New York City.

New!!: Information theory and List of Columbia University alumni · See more »

List of Columbia University alumni and attendees

This is a partial list of notable persons who have had ties to Columbia University.

New!!: Information theory and List of Columbia University alumni and attendees · See more »

List of computer scientists

This is a list of computer scientists, people who do work in computer science, in particular researchers and authors.

New!!: Information theory and List of computer scientists · See more »

List of cryptographers

List of cryptographers.

New!!: Information theory and List of cryptographers · See more »

List of game theorists

This is a list of notable economists, mathematicians, political scientists, and computer scientists whose work has added substantially to the field of game theory.

New!!: Information theory and List of game theorists · See more »

List of IEEE awards

Through its awards program, the Institute of Electrical and Electronics Engineers recognizes contributions that advance the fields of interest to the IEEE.

New!!: Information theory and List of IEEE awards · See more »

List of important publications in computer science

This is a list of important publications in computer science, organized by field.

New!!: Information theory and List of important publications in computer science · See more »

List of important publications in cryptography

This is a list of important publications in cryptography, organized by field.

New!!: Information theory and List of important publications in cryptography · See more »

List of important publications in theoretical computer science

This is a list of important publications in theoretical computer science, organized by field.

New!!: Information theory and List of important publications in theoretical computer science · See more »

List of inequalities

This page lists Wikipedia articles about named mathematical inequalities.

New!!: Information theory and List of inequalities · See more »

List of Internet pioneers

Instead of a single "inventor", the Internet was developed by many people over many years.

New!!: Information theory and List of Internet pioneers · See more »

List of mathematical theories

This is a list of mathematical theories.

New!!: Information theory and List of mathematical theories · See more »

List of people considered father or mother of a scientific field

The following is a list of people who are considered a "father" or "mother" (or "founding father" or "founding mother") of a scientific field.

New!!: Information theory and List of people considered father or mother of a scientific field · See more »

List of people from South Orange, New Jersey

This is a list of notable current and former residents of South Orange, New Jersey.

New!!: Information theory and List of people from South Orange, New Jersey · See more »

List of people in systems and control

This is an alphabetical list of people who have made significant contributions in the fields of system analysis and control theory.

New!!: Information theory and List of people in systems and control · See more »

List of pioneers in computer science

This article presents a list of individuals who made transformative breakthroughs in the creation, development and imagining of what computers and electronics could do.

New!!: Information theory and List of pioneers in computer science · See more »

List of quantitative analysts

This is a list of notable quantitative analysts (by surname); see also List of financial economists.

New!!: Information theory and List of quantitative analysts · See more »

List of Queens College people

This is a list of notable alumni and faculty of Queens College, City University of New York.

New!!: Information theory and List of Queens College people · See more »

List of Russian mathematicians

This list of Russian mathematicians includes the famous mathematicians from the Russian Empire, the Soviet Union and the Russian Federation.

New!!: Information theory and List of Russian mathematicians · See more »

List of Russian scientists

This list of Russian scientists includes notable scientists of the Russian Empire, the Soviet Union and the Russian Federation.

New!!: Information theory and List of Russian scientists · See more »

List of social psychologists

The following is a list of academicians, both past and present, who are widely renowned for their groundbreaking contributions to the field of social psychology.

New!!: Information theory and List of social psychologists · See more »

List of statistics articles

No description.

New!!: Information theory and List of statistics articles · See more »

List of Swedish Americans

The following is a list of notable Swedish Americans, including both original immigrants who obtained American citizenship and their American descendants.

New!!: Information theory and List of Swedish Americans · See more »

List of theorems

This is a list of theorems, by Wikipedia page.

New!!: Information theory and List of theorems · See more »

List of University of Michigan alumni

There are more than 500,000 living alumni of the University of Michigan.

New!!: Information theory and List of University of Michigan alumni · See more »

List of University of North Dakota people

This is a list of notable people associated with the University of North Dakota in Grand Forks, North Dakota.

New!!: Information theory and List of University of North Dakota people · See more »

List of University of Utah people

This list of University of Utah people includes notable alumni, non-graduate former students, faculty, staff, and former university presidents.

New!!: Information theory and List of University of Utah people · See more »

List of unsolved problems in information theory

This article lists some unsolved problems in information theory which are separated into source coding and channel coding.

New!!: Information theory and List of unsolved problems in information theory · See more »

Lists of mathematics topics

This article itemizes the various lists of mathematics topics.

New!!: Information theory and Lists of mathematics topics · See more »

Lloyd R. Welch

Lloyd Richard Welch (born September 28, 1927) is an American information theorist and applied mathematician, and co-inventor of the Baum–Welch algorithm and the Berlekamp–Welch algorithm, also known as the Welch–Berlekamp algorithm.

New!!: Information theory and Lloyd R. Welch · See more »

Lloyd's algorithm

In computer science and electrical engineering, Lloyd's algorithm, also known as Voronoi iteration or relaxation, is an algorithm named after Stuart P. Lloyd for finding evenly spaced sets of points in subsets of Euclidean spaces and partitions of these subsets into well-shaped and uniformly sized convex cells.

New!!: Information theory and Lloyd's algorithm · See more »

Log probability

In computer science, a log probability is simply the logarithm of a probability.
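
A small Python illustration (with toy probabilities) of why log probabilities are preferred in practice: multiplying many small probabilities underflows to zero, while summing their logarithms does not:

import math

probs = [1e-4] * 100

product = 1.0
for p in probs:
    product *= p
print(product)                     # 0.0 after floating-point underflow

log_prob = sum(math.log(p) for p in probs)
print(log_prob)                    # about -921.03, still usable for comparing models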

New!!: Information theory and Log probability · See more »

Log sum inequality

In information theory, the log sum inequality is an inequality which is useful for proving several theorems in information theory.
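
The inequality states that for non-negative numbers a_1, ..., a_n and b_1, ..., b_n, sum_i a_i log(a_i / b_i) >= (sum_i a_i) log(sum_i a_i / sum_i b_i); a quick numerical check in Python (the values are made up):

import math

def lhs(a, b):
    return sum(ai * math.log(ai / bi) for ai, bi in zip(a, b))

def rhs(a, b):
    return sum(a) * math.log(sum(a) / sum(b))

a, b = [0.2, 0.5, 1.3], [0.4, 0.1, 2.0]   # arbitrary non-negative numbers
print(lhs(a, b), ">=", rhs(a, b))          # the left side always dominates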

New!!: Information theory and Log sum inequality · See more »

Logarithm

In mathematics, the logarithm is the inverse function to exponentiation.

New!!: Information theory and Logarithm · See more »

Logarithmic scale

A logarithmic scale is a nonlinear scale used when there is a large range of quantities.

New!!: Information theory and Logarithmic scale · See more »

Logarithmic Schrödinger equation

In theoretical physics, the logarithmic Schrödinger equation (sometimes abbreviated as LNSE or LogSE) is one of the nonlinear modifications of Schrödinger's equation.

New!!: Information theory and Logarithmic Schrödinger equation · See more »

Logic of information

The logic of information, or the logical theory of information, considers the information content of logical signs and expressions along the lines initially developed by Charles Sanders Peirce.

New!!: Information theory and Logic of information · See more »

Lossless compression

Lossless compression is a class of data compression algorithms that allows the original data to be perfectly reconstructed from the compressed data.

New!!: Information theory and Lossless compression · See more »

Lossy compression

In information technology, lossy compression or irreversible compression is the class of data encoding methods that uses inexact approximations and partial data discarding to represent the content.

New!!: Information theory and Lossy compression · See more »

Low-density parity-check code

In information theory, a low-density parity-check (LDPC) code is a linear error correcting code, a method of transmitting a message over a noisy transmission channel.

New!!: Information theory and Low-density parity-check code · See more »

Lp space

In mathematics, the Lp spaces are function spaces defined using a natural generalization of the p-norm for finite-dimensional vector spaces.

New!!: Information theory and Lp space · See more »

Luciano Floridi

Luciano Floridi (born 16 November 1964) is currently Professor of Philosophy and Ethics of Information and Director of the Digital Ethics Lab at the University of Oxford's Oxford Internet Institute, Professorial Fellow of Exeter College, Oxford, Senior Member of the Faculty of Philosophy, Research Associate and Fellow in Information Policy at the Department of Computer Science, University of Oxford, and Distinguished Research Fellow of the Oxford Uehiro Centre for Practical Ethics.

New!!: Information theory and Luciano Floridi · See more »

Ludwig von Bertalanffy

Karl Ludwig von Bertalanffy (19 September 1901 – 12 June 1972) was an Austrian biologist known as one of the founders of general systems theory (GST).

New!!: Information theory and Ludwig von Bertalanffy · See more »

Macy conferences

The Macy Conferences were a set of meetings of scholars from various disciplines held in New York under the direction of Frank Fremont-Smith at the Josiah Macy, Jr. Foundation starting in 1941 and ending in 1960.

New!!: Information theory and Macy conferences · See more »

Many-worlds interpretation

The many-worlds interpretation is an interpretation of quantum mechanics that asserts the objective reality of the universal wavefunction and denies the actuality of wavefunction collapse.

New!!: Information theory and Many-worlds interpretation · See more »

Map communication model

Map Communication Model is a theory in cartography that characterizes mapping as a process of transmitting geographic information via the map from the cartographer to the end-user.

New!!: Information theory and Map communication model · See more »

Marcel J. E. Golay

Marcel Jules Edouard Golay (May 3, 1902 – April 27, 1989) was a Swiss-born mathematician, physicist, and information theorist, who applied mathematics to real-world military and industrial problems.

New!!: Information theory and Marcel J. E. Golay · See more »

Marcel-Paul Schützenberger

Marcel-Paul "Marco" Schützenberger (October 24, 1920 – July 29, 1996) was a French mathematician and Doctor of Medicine.

New!!: Information theory and Marcel-Paul Schützenberger · See more »

Marcin Schroeder

Marcin Schroeder (born January 10, 1953 in Wrocław, Lower Silesian Voivodeship in Poland, son of Jerzy Schroeder and Irena Grudzińska) is a Polish-Japanese mathematician and theoretical physicist, currently a professor and head of basic education and dean of academic affairs at Akita International University, Japan, and President Elect of the International Society for the Study of Information (IS4SI).

New!!: Information theory and Marcin Schroeder · See more »

Marcus Hutter

Marcus Hutter (born April 14, 1967) is a German computer scientist.

New!!: Information theory and Marcus Hutter · See more »

Mark Henry Hansen

Mark Henry Hansen is an American statistician, professor at the Columbia University Graduate School of Journalism and Director of the David and Helen Gurley Brown Institute for Media Innovation.

New!!: Information theory and Mark Henry Hansen · See more »

Mark Semenovich Pinsker

Mark Semenovich Pinsker (Марк Семено́вич Пи́нскер; April 24, 1925 – December 23, 2003) or Mark Shlemovich Pinsker (Марк Шлемо́вич Пи́нскер) was a noted Russian mathematician in the fields of information theory, probability theory, coding theory, ergodic theory, mathematical statistics, and communication networks.

New!!: Information theory and Mark Semenovich Pinsker · See more »

Markov chain

A Markov chain is "a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event".

New!!: Information theory and Markov chain · See more »

Massachusetts Institute of Technology

The Massachusetts Institute of Technology (MIT) is a private research university located in Cambridge, Massachusetts, United States.

New!!: Information theory and Massachusetts Institute of Technology · See more »

Master of Science in Business Analytics

A Master of Science in Business Analytics (MSBA) is an interdisciplinary STEM graduate professional degree that blends concepts from data science, computer science, statistics, business intelligence, and information theory geared towards commercial applications.

New!!: Information theory and Master of Science in Business Analytics · See more »

Mathematical beauty

Mathematical beauty describes the notion that some mathematicians may derive aesthetic pleasure from their work, and from mathematics in general.

New!!: Information theory and Mathematical beauty · See more »

Mathematical constant

A mathematical constant is a special number that is "significantly interesting in some way".

New!!: Information theory and Mathematical constant · See more »

Mathematical methods in electronics

Mathematical methods are integral to the study of electronics.

New!!: Information theory and Mathematical methods in electronics · See more »

Mathematical psychology

Mathematical psychology is an approach to psychological research that is based on mathematical modeling of perceptual, cognitive and motor processes, and on the establishment of law-like rules that relate quantifiable stimulus characteristics with quantifiable behavior.

New!!: Information theory and Mathematical psychology · See more »

Mathematics

Mathematics (from Greek μάθημα máthēma, "knowledge, study, learning") is the study of such topics as quantity, structure, space, and change.

New!!: Information theory and Mathematics · See more »

Mathematics of radio engineering

The mathematics of radio engineering is the application of electromagnetic theory to radio-frequency engineering, using conceptual tools such as vector calculus and complex analysis.

New!!: Information theory and Mathematics of radio engineering · See more »

Max Bense

Max Bense (February 7, 1910 in Strasbourg – April 29, 1990 in Stuttgart) was a German philosopher, writer, and publicist, known for his work in philosophy of science, logic, aesthetics, and semiotics.

New!!: Information theory and Max Bense · See more »

Maximum entropy

Entropy is a concept that originated in thermodynamics, and later, via statistical mechanics, motivated entire branches of information theory, statistics, and machine learning.

New!!: Information theory and Maximum entropy · See more »

Maximum entropy probability distribution

In statistics and information theory, a maximum entropy probability distribution has entropy that is at least as great as that of all other members of a specified class of probability distributions.

New!!: Information theory and Maximum entropy probability distribution · See more »

Maximum entropy spectral estimation

Maximum entropy spectral estimation is a method of spectral density estimation.

New!!: Information theory and Maximum entropy spectral estimation · See more »

Maximum entropy thermodynamics

In physics, maximum entropy thermodynamics (colloquially, MaxEnt thermodynamics) views equilibrium thermodynamics and statistical mechanics as inference processes.

New!!: Information theory and Maximum entropy thermodynamics · See more »

Mean field particle methods

Mean field particle methods are a broad class of interacting-type Monte Carlo algorithms for simulating from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability measures can always be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states.

New!!: Information theory and Mean field particle methods · See more »

Meaning (semiotics)

In semiotics, the meaning of a sign is its place in a sign relation, in other words, the set of roles that it occupies within a given sign relation.

New!!: Information theory and Meaning (semiotics) · See more »

Measurement

Measurement is the assignment of a number to a characteristic of an object or event, which can be compared with other objects or events.

New!!: Information theory and Measurement · See more »

Medford, Massachusetts

Medford is a city 3.2 miles northwest of downtown Boston on the Mystic River in Middlesex County, Massachusetts, United States.

New!!: Information theory and Medford, Massachusetts · See more »

Media studies

Media studies is a discipline and field of study that deals with the content, history, and effects of various media; in particular, the mass media.

New!!: Information theory and Media studies · See more »

Melodic expectation

In music cognition and musical analysis, the study of melodic expectation considers the engagement of the brain's predictive mechanisms in response to music.

New!!: Information theory and Melodic expectation · See more »

Mental chronometry

Mental chronometry is the use of response time in perceptual-motor tasks to infer the content, duration, and temporal sequencing of cognitive operations.

New!!: Information theory and Mental chronometry · See more »

Mesoeconomics

Mesoeconomics or mezzoeconomics is a neologism used to describe the study of economic arrangements which are based neither on the microeconomics of buying and selling and of supply and demand, nor on the macroeconomic reasoning of aggregate totals of demand, but on the importance of the structures under which these forces play out, and on how to measure these effects.

New!!: Information theory and Mesoeconomics · See more »

Middle European Cooperation in Statistical Physics

The Middle European Cooperation in Statistical Physics (MECO) is an international conference on statistical physics which takes place every year in a different country of Europe.

New!!: Information theory and Middle European Cooperation in Statistical Physics · See more »

MIMO

In radio, multiple-input and multiple-output (MIMO) is a method for multiplying the capacity of a radio link using multiple transmit and receive antennas to exploit multipath propagation.

New!!: Information theory and MIMO · See more »

Min entropy

The min entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the most likely outcome.
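
Concretely, H_min(X) = -log2 max_x p(x); a one-line Python sketch:

import math

def min_entropy(probs):
    """Min entropy in bits: negative log2 of the most likely outcome."""
    return -math.log2(max(probs))

print(min_entropy([0.5, 0.25, 0.25]))  # 1.0 bit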

New!!: Information theory and Min entropy · See more »

Mind

The mind is a set of cognitive faculties including consciousness, perception, thinking, judgement, language and memory.

New!!: Information theory and Mind · See more »

Minimum description length

The minimum description length (MDL) principle is a formalization of Occam's razor in which the best hypothesis (a model and its parameters) for a given set of data is the one that leads to the best compression of the data.

New!!: Information theory and Minimum description length · See more »

Minimum Fisher information

In information theory, the principle of minimum Fisher information (MFI) is a variational principle which, when applied with the proper constraints needed to reproduce empirically known expectation values, determines the best probability distribution that characterizes the system.

New!!: Information theory and Minimum Fisher information · See more »

Minimum message length

Minimum message length (MML) is a formal information-theoretic restatement of Occam's razor: even when models are equal in their goodness of fit to the observed data, the one generating the shortest overall message is more likely to be correct (where the message consists of a statement of the model, followed by a statement of the data encoded concisely using that model).

New!!: Information theory and Minimum message length · See more »

Minivac 601

Minivac 601 Digital Computer Kit was an electromechanical digital computer system created by information theory pioneer Claude Shannon as an educational kit using digital circuits.

New!!: Information theory and Minivac 601 · See more »

MIT Electrical Engineering and Computer Science Department

The Electrical Engineering and Computer Science Department at MIT offers academic programs leading to the S.B., S.M., M.Eng.

New!!: Information theory and MIT Electrical Engineering and Computer Science Department · See more »

Models of collaborative tagging

It has been argued that social tagging or collaborative tagging systems can provide navigational cues or "way-finders" for other users to explore information.

New!!: Information theory and Models of collaborative tagging · See more »

Mohammad Reza Aref

Mohammad Reza Aref (محمدرضا عارف, born 19 December 1951) is an Iranian engineer, academic and reformist politician who is currently parliamentary leader of reformists' Hope fraction in the Iranian Parliament, representing Tehran, Rey, Shemiranat and Eslamshahr.

New!!: Information theory and Mohammad Reza Aref · See more »

Mooney Face Test

The Mooney Face Test was developed by Craig M. Mooney, whose results were published in 1957 as "Age in the development of closure ability in children." In the test, participants are shown low-information, two-tone pictures of faces and are asked to identify features and distinguish between real and false faces.

New!!: Information theory and Mooney Face Test · See more »

Morse code

Morse code is a method of transmitting text information as a series of on-off tones, lights, or clicks that can be directly understood by a skilled listener or observer without special equipment.

New!!: Information theory and Morse code · See more »

Most frequent k characters

In information theory, MostFreqKDistance is a string metric technique for quickly estimating how similar two ordered sets or strings are.

New!!: Information theory and Most frequent k characters · See more »

MPEG-1

MPEG-1 is a standard for lossy compression of video and audio.

New!!: Information theory and MPEG-1 · See more »

MRK (visual artist)

MRK (Markos Kay) is a visual artist, creative director, illustrator and lecturer based in London, best known for his artificial life video art experiment "aDiatomea" (2007), a permanent exhibit at the Phyletic Museum in Jena, Germany.

New!!: Information theory and MRK (visual artist) · See more »

Multiple description coding

Multiple description coding (MDC) is a coding technique that fragments a single media stream into n substreams (n ≥ 2) referred to as descriptions.

New!!: Information theory and Multiple description coding · See more »

Multivariate mutual information

In information theory there have been various attempts over the years to extend the definition of mutual information to more than two random variables.

New!!: Information theory and Multivariate mutual information · See more »

Music learning theory

The field of music education contains a number of learning theories that specify how students learn music based on behavioral and cognitive psychology.

New!!: Information theory and Music learning theory · See more »

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
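
For finite alphabets, I(X;Y) is the sum over all pairs (x, y) of p(x,y) log2 [p(x,y) / (p(x) p(y))]; a small Python sketch with a made-up joint distribution:

import math

def mutual_information(joint):
    """I(X;Y) in bits from a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Toy joint distribution: X and Y agree 80% of the time.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(mutual_information(pxy))  # about 0.278 bits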

New!!: Information theory and Mutual information · See more »

N-gram

In the fields of computational linguistics and probability, an n-gram is a contiguous sequence of n items from a given sample of text or speech.
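
A minimal Python sketch for extracting word n-grams from a token list:

def ngrams(tokens, n):
    """All contiguous n-item subsequences of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

print(ngrams("to be or not to be".split(), 2))
# [('to', 'be'), ('be', 'or'), ('or', 'not'), ('not', 'to'), ('to', 'be')]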

New!!: Information theory and N-gram · See more »

Nariman Farvardin

Nariman Farvardin (born July 15, 1956) is an Iranian-American engineer and educator, currently serving as President of Stevens Institute of Technology, Hoboken, New Jersey.

New!!: Information theory and Nariman Farvardin · See more »

Nathaniel Rochester (computer scientist)

Nathaniel Rochester (January 14, 1919 – June 8, 2001) designed the IBM 701, wrote the first assembler and participated in the founding of the field of artificial intelligence.

New!!: Information theory and Nathaniel Rochester (computer scientist) · See more »

Negentropy

Negentropy has different meanings in information theory and theoretical biology.

New!!: Information theory and Negentropy · See more »

Neural network

The term neural network was traditionally used to refer to a network or circuit of neurons.

New!!: Information theory and Neural network · See more »

Nicolas J. Cerf

Nicolas Jean Cerf (born 1965) is a Belgian physicist.

New!!: Information theory and Nicolas J. Cerf · See more »

Ninoslav Marina

Ninoslav Marina (Нинослав Марина; born 25 September 1974) is Rector of the University of Information Science and Technology "St. Paul the Apostle" located in Ohrid, Macedonia and President of the Rectors' Conference of the public universities in the Republic of Macedonia.

New!!: Information theory and Ninoslav Marina · See more »

Noise (signal processing)

In signal processing, noise is a general term for unwanted (and, in general, unknown) modifications that a signal may suffer during capture, storage, transmission, processing, or conversion.

New!!: Information theory and Noise (signal processing) · See more »

Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit), establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel.
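
For the binary symmetric channel with crossover probability p, this maximum rate is C = 1 - H_b(p), where H_b is the binary entropy function; a small Python sketch:

import math

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel
    with crossover probability p: C = 1 - H_b(p)."""
    if p in (0.0, 1.0):
        return 1.0
    hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - hb

print(bsc_capacity(0.11))  # about 0.5 bits per use; rates below this are achievable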

New!!: Information theory and Noisy-channel coding theorem · See more »

Noisy-storage model

The noisy-storage model refers to a cryptographic model employed in quantum cryptography.

New!!: Information theory and Noisy-storage model · See more »

Norbert Wiener

Norbert Wiener (November 26, 1894 – March 18, 1964) was an American mathematician and philosopher.

New!!: Information theory and Norbert Wiener · See more »

Norm (mathematics)

In linear algebra, functional analysis, and related areas of mathematics, a norm is a function that assigns a strictly positive length or size to each vector in a vector space—save for the zero vector, which is assigned a length of zero.

New!!: Information theory and Norm (mathematics) · See more »

Nucleic acid design

Nucleic acid design is the process of generating a set of nucleic acid base sequences that will associate into a desired conformation.

New!!: Information theory and Nucleic acid design · See more »

Numbers (season 2)

Season two of Numbers, an American television series, premiered on September 23, 2005 and its season finale was on May 19, 2006.

New!!: Information theory and Numbers (season 2) · See more »

Objections to evolution

Objections to evolution have been raised since evolutionary ideas came to prominence in the 19th century.

New!!: Information theory and Objections to evolution · See more »

Observer

An observer is one who engages in observation or in watching an experiment.

New!!: Information theory and Observer · See more »

Observer (special relativity)

In special relativity, an observer is a frame of reference from which a set of objects or events are being measured.

New!!: Information theory and Observer (special relativity) · See more »

Occam's razor

Occam's razor (also Ockham's razor or Ocham's razor; Latin: lex parsimoniae, "law of parsimony") is the problem-solving principle that the simplest explanation tends to be the right one.

New!!: Information theory and Occam's razor · See more »

Occupations in electrical/electronics engineering

The field of electrical and electronics engineering has grown to include many related disciplines and occupations.

New!!: Information theory and Occupations in electrical/electronics engineering · See more »

Olivier Costa de Beauregard

Olivier Costa de Beauregard (Paris, 6 November 1911 – Poitiers, 5 February 2007) was a French relativistic and quantum physicist, and philosopher of science.

New!!: Information theory and Olivier Costa de Beauregard · See more »

One-time pad

In cryptography, the one-time pad (OTP) is an encryption technique that cannot be cracked, but requires the use of a one-time pre-shared key the same size as, or longer than, the message being sent.

New!!: Information theory and One-time pad · See more »

Open system (systems theory)

An open system is a system that has external interactions.

New!!: Information theory and Open system (systems theory) · See more »

Optimal design

In the design of experiments, optimal designs (or optimum designs) are a class of experimental designs that are optimal with respect to some statistical criterion.

New!!: Information theory and Optimal design · See more »

Orange Poodle

Orange Poodle was an experimental over-the-horizon radar developed beginning in 1949 and tested experimentally in 1952 and 1953.

New!!: Information theory and Orange Poodle · See more »

Ordinal data

Ordinal data is a categorical, statistical data type where the variables have natural, ordered categories and the distances between the categories are not known.

New!!: Information theory and Ordinal data · See more »

Oregon State University

Oregon State University (OSU) is an international, public research university in the northwest United States, located in Corvallis, Oregon.

New!!: Information theory and Oregon State University · See more »

Outage probability

In information theory, the outage probability of a communication channel is the probability that a given information rate is not supported, because of variable channel capacity.

New!!: Information theory and Outage probability · See more »

Outline of academic disciplines

An academic discipline or field of study is a branch of knowledge that is taught and researched as part of higher education.

New!!: Information theory and Outline of academic disciplines · See more »

Outline of automation

The following outline is provided as an overview of and topical guide to automation: Automation – use of control systems and information technologies to reduce the need for human work in the production of goods and services.

New!!: Information theory and Outline of automation · See more »

Outline of communication

The following outline is provided as an overview of and topical guide to communication: Communication – purposeful activity of exchanging information and meaning across space and time using various technical or natural means, whichever is available or preferred.

New!!: Information theory and Outline of communication · See more »

Outline of computing

The following outline is provided as an overview of and topical guide to computing: Computing – activity of using and improving computer hardware and software.

New!!: Information theory and Outline of computing · See more »

Outline of discrete mathematics

Discrete mathematics is the study of mathematical structures that are fundamentally discrete rather than continuous.

New!!: Information theory and Outline of discrete mathematics · See more »

Outline of electrical engineering

The following outline is provided as an overview of and topical guide to electrical engineering.

New!!: Information theory and Outline of electrical engineering · See more »

Outline of mathematics

Mathematics is a field of study that investigates topics including number, space, structure, and change.

New!!: Information theory and Outline of mathematics · See more »

Outline of radio science

One way of outlining the subject of radio science is listing the topics associated with it by authoritative bodies.

New!!: Information theory and Outline of radio science · See more »

Outline of science

The following outline is provided as a topical overview of science: Science – the systematic effort of acquiring knowledge—through observation and experimentation coupled with logic and reasoning to find out what can be proved or not proved—and the knowledge thus acquired.

New!!: Information theory and Outline of science · See more »

Parity (mathematics)

In mathematics, parity is the property of an integer's inclusion in one of two categories: even or odd.

New!!: Information theory and Parity (mathematics) · See more »

Partition function (mathematics)

The partition function or configuration integral, as used in probability theory, information theory and dynamical systems, is a generalization of the definition of a partition function in statistical mechanics.

New!!: Information theory and Partition function (mathematics) · See more »

Partition function (statistical mechanics)

In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium.

New!!: Information theory and Partition function (statistical mechanics) · See more »

Password strength

Password strength is a measure of the effectiveness of a password against guessing or brute-force attacks.

New!!: Information theory and Password strength · See more »

Paul Vitányi

Paul Michael Béla Vitányi (born 21 July 1944) is a Dutch computer scientist, Professor of Computer Science at the University of Amsterdam and researcher at the Dutch Centrum Wiskunde & Informatica.

New!!: Information theory and Paul Vitányi · See more »

Pedometric mapping

Pedometric mapping, or statistical soil mapping, is data-driven generation of soil property and class maps that is based on use of statistical methods.

New!!: Information theory and Pedometric mapping · See more »

Pedro Crespo

Pedro M. Crespo Bofill is Professor of Electrical Engineering at the Tecnun School of Engineering of the University of Navarre in Spain, and Head of the Electronics & Communications Department at CEIT Research Institute.

New!!: Information theory and Pedro Crespo · See more »

Perceptual paradox

A perceptual paradox illustrates the failure of a theoretical prediction.

New!!: Information theory and Perceptual paradox · See more »

Perplexity

In information theory, perplexity is a measurement of how well a probability distribution or probability model predicts a sample.
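
For a test sample of N items to which the model assigns probabilities p_1, ..., p_N, the perplexity is 2 raised to the per-item cross-entropy in bits; a minimal Python sketch:

import math

def perplexity(probs):
    """Perplexity of a model on a test sample, given the probability the
    model assigned to each observed item: 2 ** (per-item cross-entropy in bits)."""
    cross_entropy = -sum(math.log2(p) for p in probs) / len(probs)
    return 2 ** cross_entropy

# A model that assigns probability 1/8 to every observed word has perplexity 8.
print(perplexity([0.125] * 20))  # 8.0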

New!!: Information theory and Perplexity · See more »

Peter Elias

Peter Elias (November 23, 1923 – December 7, 2001) was a pioneer in the field of information theory.

New!!: Information theory and Peter Elias · See more »

Peter Franaszek

Peter A. Franaszek is an American information theorist, an IEEE Fellow, a research staff member emeritus at the IBM T.J. Watson Research Center and a former member of the IBM Academy of Technology.

New!!: Information theory and Peter Franaszek · See more »

Petoskey, Michigan

Petoskey is a city and coastal resort community in the U.S. state of Michigan.

New!!: Information theory and Petoskey, Michigan · See more »

Phi Kappa Phi

The Honor Society of Phi Kappa Phi (or simply Phi Kappa Phi or ΦΚΦ) is an honor society established in 1897 to recognize and encourage superior scholarship without restriction as to area of study and to promote the "unity and democracy of education".

New!!: Information theory and Phi Kappa Phi · See more »

Philip Woodward

Philip Mayne Woodward (6 September 1919 – 30 January 2018) was a British mathematician, radar engineer and horologist.

New!!: Information theory and Philip Woodward · See more »

Philosophy of information

The philosophy of information (PI) is a branch of philosophy that studies topics relevant to computer science, information science and information technology.

New!!: Information theory and Philosophy of information · See more »

Philosophy of thermal and statistical physics

The philosophy of thermal and statistical physics is that part of the philosophy of physics whose subject matter is classical thermodynamics, statistical mechanics, and related theories.

New!!: Information theory and Philosophy of thermal and statistical physics · See more »

Physical information

In physics, physical information refers generally to the information that is contained in a physical system.

New!!: Information theory and Physical information · See more »

Pinsker's inequality

In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence.
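
With the total variation distance defined as half the sum of |P(x) - Q(x)| and the KL divergence measured in nats, Pinsker's inequality reads delta(P, Q) <= sqrt(D_KL(P||Q) / 2); a quick numerical check in Python (the two distributions are made up):

import math

def total_variation(p, q):
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_nats(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p, q = [0.7, 0.2, 0.1], [0.4, 0.4, 0.2]
print(total_variation(p, q), "<=", math.sqrt(kl_nats(p, q) / 2))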

New!!: Information theory and Pinsker's inequality · See more »

Plan

A plan is typically any diagram or list of steps with details of timing and resources, used to achieve an objective.

New!!: Information theory and Plan · See more »

Planar separator theorem

In graph theory, the planar separator theorem is a form of isoperimetric inequality for planar graphs, that states that any planar graph can be split into smaller pieces by removing a small number of vertices.

New!!: Information theory and Planar separator theorem · See more »

Point estimation

In statistics, point estimation involves the use of sample data to calculate a single value (known as a point estimate or statistic) which is to serve as a "best guess" or "best estimate" of an unknown population parameter (for example, the population mean).

New!!: Information theory and Point estimation · See more »

Pointwise mutual information

Pointwise mutual information (PMI), or point mutual information, is a measure of association used in information theory and statistics.

New!!: Information theory and Pointwise mutual information · See more »

Polar code (coding theory)

In information theory, a polar code is a linear block error correcting code.

New!!: Information theory and Polar code (coding theory) · See more »

Positive-definite kernel

In operator theory, a branch of mathematics, a positive definite kernel is a generalization of a positive definite function or a positive-definite matrix.

New!!: Information theory and Positive-definite kernel · See more »

Post-industrial society

In sociology, the post-industrial society is the stage of society's development when the service sector generates more wealth than the manufacturing sector of the economy.

New!!: Information theory and Post-industrial society · See more »

Pragmatic theory of information

The pragmatic theory of information is derived from Charles Sanders Peirce's general theory of signs and inquiry.

New!!: Information theory and Pragmatic theory of information · See more »

Prediction in language comprehension

Linguistic prediction is a phenomenon in psycholinguistics occurring whenever information about a word or other linguistic unit is activated before that unit is actually encountered.

New!!: Information theory and Prediction in language comprehension · See more »

Principle of maximum entropy

The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information).

New!!: Information theory and Principle of maximum entropy · See more »

Prior probability

In Bayesian statistical inference, a prior probability distribution, often simply called the prior, of an uncertain quantity is the probability distribution that would express one's beliefs about this quantity before some evidence is taken into account.

New!!: Information theory and Prior probability · See more »

Probabilistic method

The probabilistic method is a nonconstructive method, primarily used in combinatorics and pioneered by Paul Erdős, for proving the existence of a prescribed kind of mathematical object.

New!!: Information theory and Probabilistic method · See more »

Problem domain

A problem domain is the area of expertise or application that needs to be examined to solve a problem.

New!!: Information theory and Problem domain · See more »

Protein structure prediction

Protein structure prediction is the inference of the three-dimensional structure of a protein from its amino acid sequence—that is, the prediction of its folding and its secondary and tertiary structure from its primary structure.

New!!: Information theory and Protein structure prediction · See more »

Prototype theory

Prototype theory is a mode of graded categorization in cognitive science, where some members of a category are more central than others.

New!!: Information theory and Prototype theory · See more »

Pseudorandomness

A pseudorandom process is a process that appears to be random but is not.

New!!: Information theory and Pseudorandomness · See more »

Quantities of information

The mathematical theory of information is based on probability theory and statistics, and measures information with several quantities of information.

New!!: Information theory and Quantities of information · See more »

Quantum channel

In quantum information theory, a quantum channel is a communication channel which can transmit quantum information, as well as classical information.

New!!: Information theory and Quantum channel · See more »

Quantum entanglement

Quantum entanglement is a physical phenomenon which occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole.

New!!: Information theory and Quantum entanglement · See more »

Quantum information

In physics and computer science, quantum information is information that is held in the state of a quantum system.

New!!: Information theory and Quantum information · See more »

Quantum key distribution

Quantum key distribution (QKD) is a secure communication method which implements a cryptographic protocol involving components of quantum mechanics.

New!!: Information theory and Quantum key distribution · See more »

Quantum reference frame

A quantum reference frame is a reference frame which is treated quantum theoretically.

New!!: Information theory and Quantum reference frame · See more »

Quantum teleportation

Quantum teleportation is a process by which quantum information (e.g. the exact state of an atom or photon) can be transmitted (exactly, in principle) from one location to another, with the help of classical communication and previously shared quantum entanglement between the sending and receiving location.

New!!: Information theory and Quantum teleportation · See more »

R. Luke DuBois

Roger Luke DuBois (born September 10, 1975, Morristown, New Jersey, United States) is an American composer, performer, conceptual new media artist, programmer, record producer and pedagogue based in New York City.

New!!: Information theory and R. Luke DuBois · See more »

Rafael Capurro

Rafael Capurro (born 20 November 1945) is a Uruguayan philosopher and academic specialist in the field of information ethics.

New!!: Information theory and Rafael Capurro · See more »

Ralph Hartley

Ralph Vinton Lyon Hartley (November 30, 1888 – May 1, 1970) was an electronics researcher.

New!!: Information theory and Ralph Hartley · See more »

Ramon Margalef

Ramón Margalef i López (Barcelona, 16 May 1919 – 23 May 2004) was a Spanish biologist and ecologist.

New!!: Information theory and Ramon Margalef · See more »

Randal A. Koene

Randal A. Koene is a Dutch neuroscientist and neuroengineer, and co-founder of carboncopies.org, the outreach and roadmapping organization for advancing Substrate-Independent Minds (SIM).

New!!: Information theory and Randal A. Koene · See more »

Randall Dougherty

Randall Dougherty (born 1961) is an American mathematician.

New!!: Information theory and Randall Dougherty · See more »

Randomness

Randomness is the lack of pattern or predictability in events.

New!!: Information theory and Randomness · See more »

Rate–distortion theory

Rate–distortion theory is a major branch of information theory which provides the theoretical foundations for lossy data compression; it addresses the problem of determining the minimal number of bits per symbol, as measured by the rate R, that should be communicated over a channel, so that the source (input signal) can be approximately reconstructed at the receiver (output signal) without exceeding a given distortion D.
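
In its standard form, the rate–distortion function for a source X, reconstruction \hat{X}, and distortion measure d is

R(D) = \min_{p(\hat{x} \mid x)\,:\; \mathbb{E}[d(X, \hat{X})] \le D} I(X; \hat{X}),

the smallest mutual information achievable while keeping the expected distortion at or below D.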

New!!: Information theory and Rate–distortion theory · See more »

Ray C. Dougherty

Ray C. Dougherty (born 1940) is an American linguist and was a member of the Arts and Science faculty at New York University until 2014 (retired).

New!!: Information theory and Ray C. Dougherty · See more »

Rényi entropy

In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy and the min entropy.
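
For a discrete distribution p_1, \dots, p_n and order \alpha \ge 0, \alpha \ne 1, the Rényi entropy is

H_\alpha(X) = \frac{1}{1 - \alpha} \log \left( \sum_{i=1}^{n} p_i^{\alpha} \right);

the limit \alpha \to 0 gives the Hartley entropy, \alpha \to 1 the Shannon entropy, \alpha = 2 the collision entropy, and \alpha \to \infty the min-entropy.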

New!!: Information theory and Rényi entropy · See more »

Receiver (information theory)

The receiver in information theory is the receiving end of a communication channel.

New!!: Information theory and Receiver (information theory) · See more »

Reductionism

Reductionism is any of several related philosophical ideas regarding the associations between phenomena which can be described in terms of other simpler or more fundamental phenomena.

New!!: Information theory and Reductionism · See more »

Redundancy (information theory)

In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X and its maximum possible value \log(|\mathcal{X}|).
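
Explicitly, in the sense used here, the (relative) redundancy of a source with entropy H(X) over an alphabet \mathcal{X} is

R = 1 - \frac{H(X)}{\log |\mathcal{X}|},

so a source that uses its alphabet uniformly and independently has zero redundancy.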

New!!: Information theory and Redundancy (information theory) · See more »

Reinforcement learning

Reinforcement learning (RL) is an area of machine learning inspired by behaviourist psychology, concerned with how software agents ought to take actions in an environment so as to maximize some notion of cumulative reward.

New!!: Information theory and Reinforcement learning · See more »

Relational quantum mechanics

Relational quantum mechanics (RQM) is an interpretation of quantum mechanics which treats the state of a quantum system as being observer-dependent, that is, the state is the relation between the observer and the system.

New!!: Information theory and Relational quantum mechanics · See more »

Relational theory

In physics and philosophy, a relational theory is a framework to understand reality or a physical system in such a way that the positions and other properties of objects are only meaningful relative to other objects.

New!!: Information theory and Relational theory · See more »

Relay channel

In information theory, a relay channel is a probability model of the communication between a sender and a receiver aided by one or more intermediate relay nodes.

New!!: Information theory and Relay channel · See more »

Research Institute of Computer Science and Random Systems

The Institut de recherche en informatique et systèmes aléatoires is a joint computer science research center of CNRS, University of Rennes 1, ENS Rennes, INSA Rennes and Inria, located in Rennes in Brittany.

New!!: Information theory and Research Institute of Computer Science and Random Systems · See more »

Retina

The retina is the innermost, light-sensitive layer of tissue of the eye of most vertebrates and some molluscs.

New!!: Information theory and Retina · See more »

Richard Blahut

Richard Blahut is a member of the National Academy of Engineering (Electronics, Communication & Information Systems Engineering and Computer Science & Engineering), elected for pioneering work in coherent emitter signal processing and for contributions to information theory and error control codes.

New!!: Information theory and Richard Blahut · See more »

Richard Leibler

Richard A. Leibler (March 18, 1914, Chicago – October 25, 2003, Reston, Virginia) was an American mathematician and cryptanalyst.

New!!: Information theory and Richard Leibler · See more »

Robert Calderbank

Robert Calderbank (born 28 December 1954) is a professor of Computer Science, Electrical Engineering, and Mathematics and director of the Information Initiative at Duke.

New!!: Information theory and Robert Calderbank · See more »

Robert Fano

Roberto Mario "Robert" Fano (11 November 1917 – 13 July 2016) was an Italian-American computer scientist and professor of electrical engineering and computer science at the Massachusetts Institute of Technology.

New!!: Information theory and Robert Fano · See more »

Robert G. Gallager

Robert Gray Gallager (born May 29, 1931) is an American electrical engineer known for his work on information theory and communications networks.

New!!: Information theory and Robert G. Gallager · See more »

Robert M. Gray

Robert M. Gray (born November 1, 1943) is an American information theorist, and the Alcatel-Lucent Professor of Electrical Engineering at Stanford University in Palo Alto, California.

New!!: Information theory and Robert M. Gray · See more »

Robert McEliece

Robert J. McEliece (born 1942) is a mathematician and engineering professor at the California Institute of Technology (Caltech) best known for his work in information theory.

New!!: Information theory and Robert McEliece · See more »

Robert Tienwen Chien

Robert Tienwen Chien (November 20, 1931 – December 8, 1983) was an American computer scientist concerned largely with research in information theory, fault-tolerance, and artificial intelligence (AI), director of the Coordinated Science Laboratory (CSL) at the University of Illinois at Urbana–Champaign, and known for his invention of the Chien search and seminal contributions to the PMC model in system level fault diagnosis.

New!!: Information theory and Robert Tienwen Chien · See more »

Robert Ulanowicz

Robert Edward Ulanowicz is an American theoretical ecologist and philosopher of Polish descent who in his search for a unified theory of ecology has formulated a paradigm he calls Process Ecology.

New!!: Information theory and Robert Ulanowicz · See more »

Robert Vallée

Robert Vallée (5 October 1922 in Poitiers, France – 1 January 2017, Paris, France) was a French cyberneticist and mathematician.

New!!: Information theory and Robert Vallée · See more »

Roland Dobrushin

Roland Lvovich Dobrushin (Рола́нд Льво́вич Добру́шин) (July 20, 1929 – November 12, 1995) was a mathematician who made important contributions to probability theory, mathematical physics, and information theory.

New!!: Information theory and Roland Dobrushin · See more »

Roulette

Roulette is a casino game named after the French word meaning little wheel.

New!!: Information theory and Roulette · See more »

Rudolf Ahlswede

Rudolf F. Ahlswede (September 15, 1938 – December 18, 2010) was a German mathematician.

New!!: Information theory and Rudolf Ahlswede · See more »

Sackur–Tetrode equation

The Sackur–Tetrode equation is an expression for the entropy of a monatomic classical ideal gas which incorporates quantum considerations which give a more detailed description of its regime of validity.

New!!: Information theory and Sackur–Tetrode equation · See more »

Sanjeev Kulkarni

Sanjeev Ramesh Kulkarni (born September 21, 1963 in Mumbai, India) is an Indian-born American academic.

New!!: Information theory and Sanjeev Kulkarni · See more »

Sanov's theorem

In information theory, Sanov's theorem gives a bound on the probability of observing an atypical sequence of samples from a given probability distribution.

New!!: Information theory and Sanov's theorem · See more »

Santa Fe Institute

The Santa Fe Institute (SFI) is an independent, nonprofit theoretical research institute located in Santa Fe (New Mexico, United States) and dedicated to the multidisciplinary study of the fundamental principles of complex adaptive systems, including physical, computational, biological, and social systems.

New!!: Information theory and Santa Fe Institute · See more »

Sara Imari Walker

Sara Imari Walker is an American theoretical physicist and astrobiologist with research interests in the origins of life, astrobiology, physics of life, emergence, complex and dynamical systems, and artificial life.

New!!: Information theory and Sara Imari Walker · See more »

Scene statistics

Scene statistics is a discipline within the field of perception.

New!!: Information theory and Scene statistics · See more »

Science

Science is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe (see, e.g., R. P. Feynman, The Feynman Lectures on Physics, Vol. 1, Chaps. 1, 2, & 3).

New!!: Information theory and Science · See more »

Science and technology in Russia

Science and technology in Russia have developed rapidly since the Age of Enlightenment, when Peter the Great founded the Russian Academy of Sciences and Saint Petersburg State University and polymath Mikhail Lomonosov founded the Moscow State University, establishing a strong native tradition in learning and innovation.

New!!: Information theory and Science and technology in Russia · See more »

Scientia Iranica

Scientia Iranica is a peer-reviewed scientific journal published by Sharif University of Technology (Tehran, Iran).

New!!: Information theory and Scientia Iranica · See more »

Scientific method

Scientific method is an empirical method of knowledge acquisition, which has characterized the development of natural science since at least the 17th century, involving careful observation, which includes rigorous skepticism about what one observes, given that cognitive assumptions about how the world works influence how one interprets a percept; formulating hypotheses, via induction, based on such observations; experimental testing and measurement of deductions drawn from the hypotheses; and refinement (or elimination) of the hypotheses based on the experimental findings.

New!!: Information theory and Scientific method · See more »

Score (statistics)

In statistics, the score, score function, efficient score or informant indicates how sensitive a likelihood function \mathcal L(\theta; X) is to its parameter \theta.
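
In the one-parameter case the score is simply the gradient of the log-likelihood,

s(\theta) = \frac{\partial}{\partial \theta} \log \mathcal{L}(\theta; X),

which has expectation zero at the true parameter value under the usual regularity conditions.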

New!!: Information theory and Score (statistics) · See more »

Scoring rule

In decision theory, a score function, or scoring rule, measures the accuracy of probabilistic predictions.

New!!: Information theory and Scoring rule · See more »

Secret sharing

Secret sharing (also called secret splitting) refers to methods for distributing a secret amongst a group of participants, each of whom is allocated a share of the secret.

New!!: Information theory and Secret sharing · See more »

Secure two-party computation

Secure two-party computation (2PC) is a sub-problem of secure multi-party computation (MPC) that has received special attention from researchers because of its close relation to many cryptographic tasks.

New!!: Information theory and Secure two-party computation · See more »

Self-information

In information theory, self-information or surprisal is the surprise when a random variable is sampled.
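
For an outcome x with probability p(x), the self-information is

I(x) = -\log p(x),

so observing a fair coin land heads (p = 1/2, base-2 logarithm) carries exactly one bit of surprisal, while rarer outcomes carry more.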

New!!: Information theory and Self-information · See more »

Self-organization

Self-organization, also called (in the social sciences) spontaneous order, is a process where some form of overall order arises from local interactions between parts of an initially disordered system.

New!!: Information theory and Self-organization · See more »

Self-organized criticality

In physics, self-organized criticality (SOC) is a property of dynamical systems that have a critical point as an attractor.

New!!: Information theory and Self-organized criticality · See more »

Semantic compression

In natural language processing, semantic compression is a process of compacting a lexicon used to build a textual document (or a set of documents) by reducing language heterogeneity, while maintaining text semantics.

New!!: Information theory and Semantic compression · See more »

Semiotic information theory

Semiotic information theory considers the information content of signs and expressions as it is conceived within the semiotic or sign-relational framework developed by Charles Sanders Peirce.

New!!: Information theory and Semiotic information theory · See more »

Semiotics

Semiotics (also called semiotic studies) is the study of meaning-making, the study of sign process (semiosis) and meaningful communication.

New!!: Information theory and Semiotics · See more »

Sepp Hochreiter

Sepp Hochreiter (born Josef Hochreiter in 1967) is a German computer scientist.

New!!: Information theory and Sepp Hochreiter · See more »

Sergey Bobkov

Sergey Bobkov (Russian: Cергей Германович Бобков, born March 15, 1961) is a mathematician.

New!!: Information theory and Sergey Bobkov · See more »

Sergio Albeverio

Sergio Albeverio (born 17 January 1939) is a Swiss mathematician and mathematical physicist working in numerous fields of mathematics and its applications.

New!!: Information theory and Sergio Albeverio · See more »

Sergio Verdú

Sergio Verdú (born Barcelona, Spain, August 15, 1958) is the Eugene Higgins Professor of Electrical Engineering at Princeton University, where he teaches and conducts research on Information Theory in the Information Sciences and Systems Group.

New!!: Information theory and Sergio Verdú · See more »

Set redundancy compression

In computer science and information theory, set redundancy compression refers to methods of data compression that exploit redundancy between individual data groups of a set, usually a set of similar images.

New!!: Information theory and Set redundancy compression · See more »

Shannon's source coding theorem

In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the limits to possible data compression, and the operational meaning of the Shannon entropy.

New!!: Information theory and Shannon's source coding theorem · See more »

Shannon–Fano coding

In the field of data compression, Shannon–Fano coding, named after Claude Shannon and Robert Fano, is a technique for constructing a prefix code based on a set of symbols and their probabilities (estimated or measured).
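
As a rough sketch of the construction (an illustrative toy implementation of one common formulation, not a reference one): symbols are sorted by decreasing probability and recursively split into two groups of roughly equal total probability, with one prefix bit appended per split. The function name shannon_fano and the example probabilities below are made up for illustration.

```python
def shannon_fano(probabilities):
    """Return a {symbol: bit string} prefix code for a {symbol: probability} map."""
    items = sorted(probabilities.items(), key=lambda kv: kv[1], reverse=True)
    codes = {symbol: "" for symbol, _ in items}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        running, cut, best_diff = 0.0, 1, float("inf")
        # Choose the split point that best balances the two halves.
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_diff, cut = diff, i
        upper, lower = group[:cut], group[cut:]
        for symbol, _ in upper:   # upper half gets a '0' bit
            codes[symbol] += "0"
        for symbol, _ in lower:   # lower half gets a '1' bit
            codes[symbol] += "1"
        split(upper)
        split(lower)

    split(items)
    return codes

print(shannon_fano({"a": 0.39, "b": 0.18, "c": 0.18, "d": 0.15, "e": 0.10}))
```

For these example probabilities the sketch yields the prefix code a: 00, b: 01, c: 10, d: 110, e: 111; Huffman coding would do at least as well, which is why Shannon–Fano is now mainly of historical interest.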

New!!: Information theory and Shannon–Fano coding · See more »

Shannon–Fano–Elias coding

In information theory, Shannon–Fano–Elias coding is a precursor to arithmetic coding, in which probabilities are used to determine codewords.

New!!: Information theory and Shannon–Fano–Elias coding · See more »

Shannon–Hartley theorem

In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise.
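
The theorem gives the capacity C (in bits per second) of a channel of bandwidth B hertz with signal-to-noise power ratio S/N as

C = B \log_2\!\left(1 + \frac{S}{N}\right);

for example, a 3 kHz channel with S/N = 1000 (30 dB) supports roughly 3000 \times \log_2(1001) \approx 30{,}000 bits per second.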

New!!: Information theory and Shannon–Hartley theorem · See more »

Shannon–Weaver model

The Shannon–Weaver model of communication has been called the "mother of all models." Social scientists use the term to refer to an integrated model of the concepts of information source, message, transmitter, signal, channel, noise, receiver, information destination, probability of error, encoding, decoding, information rate, channel capacity, and so on.

New!!: Information theory and Shannon–Weaver model · See more »

Shearer's inequality

In information theory, Shearer's inequality, named after James Shearer, states that if X_1, \dots, X_d are random variables and S_1, \dots, S_n are subsets of \{1, \dots, d\} such that every integer between 1 and d lies in at least r of these subsets, then H\!\left[(X_1, \dots, X_d)\right] \le \frac{1}{r} \sum_{i=1}^{n} H\!\left[(X_j)_{j \in S_i}\right], where (X_j)_{j \in S_i} is the Cartesian product of the random variables X_j with indices j in S_i (so the dimension of this vector is equal to the size of S_i).

New!!: Information theory and Shearer's inequality · See more »

Shlomo Shamai

Shlomo Shamai (Shitz) (Hebrew: שלמה שמאי (שיץ)) is a distinguished professor in the Department of Electrical Engineering at the Technion − Israel Institute of Technology.

New!!: Information theory and Shlomo Shamai · See more »

Sidney Dancoff

Sidney Michael Dancoff (September 27, 1913 in Philadelphia – August 15, 1951 in Urbana, Illinois) was an American theoretical physicist best known for the Tamm–Dancoff approximation method and for nearly developing a renormalization method for solving quantum electrodynamics (QED).

New!!: Information theory and Sidney Dancoff · See more »

Signal

A signal as referred to in communication systems, signal processing, and electrical engineering is a function that "conveys information about the behavior or attributes of some phenomenon".

New!!: Information theory and Signal · See more »

Signal processing

Signal processing concerns the analysis, synthesis, and modification of signals, which are broadly defined as functions conveying "information about the behavior or attributes of some phenomenon", such as sound, images, and biological measurements.

New!!: Information theory and Signal processing · See more »

Signal-to-interference-plus-noise ratio

In information theory and telecommunication engineering, the signal-to-interference-plus-noise ratio (SINR) (also known as the signal-to-noise-plus-interference ratio (SNIR)) is a quantity used to give theoretical upper bounds on channel capacity (or the rate of information transfer) in wireless communication systems such as cellular networks.

New!!: Information theory and Signal-to-interference-plus-noise ratio · See more »

Sinc function

In mathematics, physics and engineering, the cardinal sine function or sinc function, denoted by sinc(x), has two slightly different definitions.

New!!: Information theory and Sinc function · See more »

Slepian–Wolf coding

In information theory and communication, the Slepian–Wolf coding, also known as the Slepian–Wolf bound, is a result in distributed source coding discovered by David Slepian and Jack Wolf in 1973.

New!!: Information theory and Slepian–Wolf coding · See more »

Smart grid

A smart grid is an electrical grid which includes a variety of operational and energy measures including smart meters, smart appliances, renewable energy resources, and energy efficient resources.

New!!: Information theory and Smart grid · See more »

Sneakernet

Sneakernet is an informal term for the transfer of electronic information by physically moving media such as magnetic tape, floppy disks, compact discs, USB flash drives or external hard drives from one computer to another; rather than transmitting the information over a computer network.

New!!: Information theory and Sneakernet · See more »

Social network

A social network is a social structure made up of a set of social actors (such as individuals or organizations), sets of dyadic ties, and other social interactions between actors.

New!!: Information theory and Social network · See more »

Soft heap

In computer science, a soft heap is a variant on the simple heap data structure that has constant amortized time for 5 types of operations.

New!!: Information theory and Soft heap · See more »

Soft-decision decoder

In information theory, a soft-decision decoder is a kind of decoding method – a class of algorithms used to decode data that has been encoded with an error-correcting code.

New!!: Information theory and Soft-decision decoder · See more »

Solèr's theorem

In mathematics, Solèr's theorem is a result concerning certain infinite-dimensional vector spaces.

New!!: Information theory and Solèr's theorem · See more »

Solomon Kullback

Solomon Kullback (April 3, 1907August 5, 1994) was an American cryptanalyst and mathematician, who was one of the first three employees hired by William F. Friedman at the US Army's Signal Intelligence Service (SIS) in the 1930s, along with Frank Rowlett and Abraham Sinkov.

New!!: Information theory and Solomon Kullback · See more »

Solving chess

Solving chess means finding an optimal strategy for playing chess, i.e. one by which one of the players (White or Black) can always force a victory, or both can force a draw (see Solved game).

New!!: Information theory and Solving chess · See more »

Spatial correlation

Theoretically, the performance of wireless communication systems can be improved by having multiple antennas at the transmitter and the receiver; spatial correlation refers to the statistical correlation between the signals seen by these different antennas, which limits how much of that improvement can be realized in practice.

New!!: Information theory and Spatial correlation · See more »

Specific-information

In information theory, specific-information is the generic name given to a family of state-dependent measures that in expectation converge to the mutual information.

New!!: Information theory and Specific-information · See more »

Specified complexity

Specified complexity is a concept proposed by William Dembski and used by him and others to promote the pseudoscientific arguments of intelligent design.

New!!: Information theory and Specified complexity · See more »

Spekkens toy model

The Spekkens toy model is a conceptually simple toy model introduced by Robert Spekkens in 2004, to argue in favour of the epistemic view of quantum mechanics.

New!!: Information theory and Spekkens toy model · See more »

Spiking neural network

Spiking neural networks (SNNs) fall into the third generation of artificial neural network models, increasing the level of realism in a neural simulation.

New!!: Information theory and Spiking neural network · See more »

Spin–spin relaxation

In physics, spin–spin relaxation is the mechanism by which M_{xy}, the transverse component of the magnetization vector, exponentially decays towards its equilibrium value in nuclear magnetic resonance (NMR) and magnetic resonance imaging (MRI).

New!!: Information theory and Spin–spin relaxation · See more »

Splice site mutation

A splice site mutation is a genetic mutation that inserts, deletes or changes a number of nucleotides in the specific site at which splicing takes place during the processing of precursor messenger RNA into mature messenger RNA.

New!!: Information theory and Splice site mutation · See more »

Squashed entanglement

Squashed entanglement, also called CMI entanglement (CMI can be pronounced "see me"), is an information theoretic measure of quantum entanglement for a bipartite quantum system.

New!!: Information theory and Squashed entanglement · See more »

Statistical distance

In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, or two probability distributions or samples, or the distance can be between an individual sample point and a population or a wider sample of points.

New!!: Information theory and Statistical distance · See more »

Statistical inference

Statistical inference is the process of using data analysis to deduce properties of an underlying probability distribution.

New!!: Information theory and Statistical inference · See more »

Statistical machine translation

Statistical machine translation (SMT) is a machine translation paradigm where translations are generated on the basis of statistical models whose parameters are derived from the analysis of bilingual text corpora.

New!!: Information theory and Statistical machine translation · See more »

Steganography

Steganography is the practice of concealing a file, message, image, or video within another file, message, image, or video.

New!!: Information theory and Steganography · See more »

Stephen O. Rice

Stephen "Steve" Oswald Rice (November 29, 1907 – November 18, 1986) was a pioneer in the related fields of information theory, communications theory, and telecommunications.

New!!: Information theory and Stephen O. Rice · See more »

Steven H. Simon

Steven H. Simon is an American theoretical physics professor at Oxford University and tutorial fellow of Somerville College, Oxford.

New!!: Information theory and Steven H. Simon · See more »

Stochastic

The word stochastic is an adjective in English that describes something that was randomly determined.

New!!: Information theory and Stochastic · See more »

Stochastic geometry models of wireless networks

In mathematics and telecommunications, stochastic geometry models of wireless networks refer to mathematical models based on stochastic geometry that are designed to represent aspects of wireless networks.

New!!: Information theory and Stochastic geometry models of wireless networks · See more »

Stochastic process

In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a collection of random variables.

New!!: Information theory and Stochastic process · See more »

Structural information theory

Structural information theory (SIT) is a theory about human perception and in particular about visual perceptual organization, which is the neuro-cognitive process that enables us to perceive scenes as structured wholes consisting of objects arranged in space.

New!!: Information theory and Structural information theory · See more »

Structured expert judgment: the classical model

Expert Judgment (EJ) denotes a wide variety of techniques ranging from a single undocumented opinion, through preference surveys, to formal elicitation with external validation of expert probability assessments.

New!!: Information theory and Structured expert judgment: the classical model · See more »

Subadditivity effect

The subadditivity effect is the tendency to judge probability of the whole to be less than the probabilities of the parts.

New!!: Information theory and Subadditivity effect · See more »

Subhash Kak

Subhash Kak (born 26 March 1947 in Srinagar) is an Indian American computer scientist.

New!!: Information theory and Subhash Kak · See more »

Succinct data structure

In computer science, a succinct data structure is a data structure which uses an amount of space that is "close" to the information-theoretic lower bound, but (unlike other compressed representations) still allows for efficient query operations.

New!!: Information theory and Succinct data structure · See more »

Supertask

In philosophy, a supertask is a countably infinite sequence of operations that occur sequentially within a finite interval of time.

New!!: Information theory and Supertask · See more »

Surprisal analysis

Surprisal analysis is an information-theoretical analysis technique that integrates and applies principles of thermodynamics and maximal entropy.

New!!: Information theory and Surprisal analysis · See more »

Surround suppression

Surround suppression is a descriptive term referring to observations that the relative firing rate of a neuron may under certain conditions decrease when a particular stimulus is enlarged.

New!!: Information theory and Surround suppression · See more »

Symbol rate

In digital communications, symbol rate, also known as baud rate and modulation rate, is the number of symbol changes, waveform changes, or signaling events, across the transmission medium per time unit using a digitally modulated signal or a line code.

New!!: Information theory and Symbol rate · See more »

Symbolic dynamics

In mathematics, symbolic dynamics is the practice of modeling a topological or smooth dynamical system by a discrete space consisting of infinite sequences of abstract symbols, each of which corresponds to a state of the system, with the dynamics (evolution) given by the shift operator.

New!!: Information theory and Symbolic dynamics · See more »

Synergy

Synergy is the creation of a whole that is greater than the simple sum of its parts.

New!!: Information theory and Synergy · See more »

System analysis

System analysis, in the field of electrical engineering, characterizes electrical systems and their properties.

New!!: Information theory and System analysis · See more »

Systems science

Systems science is an interdisciplinary field that studies the nature of systems—from simple to complex—in nature, society, cognition, and science itself.

New!!: Information theory and Systems science · See more »

Szeged index

In chemical graph theory, the Szeged index is a topological index of a molecule, used in biochemistry.

New!!: Information theory and Szeged index · See more »

T-symmetry

T-symmetry or time reversal symmetry is the theoretical symmetry of physical laws under the time-reversal transformation T: t \mapsto -t.

New!!: Information theory and T-symmetry · See more »

Tadao Kasami

Tadao Kasami was a noted Japanese information theorist who made significant contributions to error correcting codes.

New!!: Information theory and Tadao Kasami · See more »

Te Sun Han

Te Sun Han (born 1941, Kiryū) is a Korean Japanese information theorist and winner of the 2010 Shannon Award.

New!!: Information theory and Te Sun Han · See more »

Telecommunications engineering

Telecommunications engineering is an engineering discipline centered on electrical and computer engineering which seeks to support and enhance telecommunication systems.

New!!: Information theory and Telecommunications engineering · See more »

Television standards conversion

Television standards conversion is the process of changing one type of television system to another.

New!!: Information theory and Television standards conversion · See more »

Temporal information retrieval

Temporal information retrieval (T-IR) is an emerging area of research related to the field of information retrieval (IR) and a considerable number of sub-areas, positioning itself, as an important dimension in the context of the user information needs.

New!!: Information theory and Temporal information retrieval · See more »

Terence Tao

Terence Chi-Shen Tao (born 17 July 1975) is an Australian-American mathematician who has worked in various areas of mathematics.

New!!: Information theory and Terence Tao · See more »

Tf–idf

In information retrieval, tf–idf or TFIDF, short for term frequency–inverse document frequency, is a numerical statistic that is intended to reflect how important a word is to a document in a collection or corpus.
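
A toy computation of one common tf–idf variant (raw term frequency times the logarithmic inverse document frequency); the tiny corpus, the helper names tf, idf and tf_idf, and the printed terms below are invented for illustration, and real systems typically rely on a library implementation and one of several weighting variants.

```python
import math

docs = [
    "information theory studies information",
    "coding theory studies codes",
    "channel capacity limits communication",
]
tokenized = [d.split() for d in docs]
n_docs = len(tokenized)

def tf(term, doc_tokens):
    # Raw term frequency normalized by document length.
    return doc_tokens.count(term) / len(doc_tokens)

def idf(term):
    # Log of (number of documents / number of documents containing the term).
    df = sum(1 for doc in tokenized if term in doc)
    return math.log(n_docs / df) if df else 0.0

def tf_idf(term, doc_index):
    return tf(term, tokenized[doc_index]) * idf(term)

print(round(tf_idf("information", 0), 3))  # high: frequent here, rare elsewhere
print(round(tf_idf("theory", 0), 3))       # lower: appears in two of the documents
```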

New!!: Information theory and Tf–idf · See more »

The Information: A History, a Theory, a Flood

The Information: A History, a Theory, a Flood is a book by science history writer James Gleick published in March 2011 which covers the genesis of our current information age.

New!!: Information theory and The Information: A History, a Theory, a Flood · See more »

The Ingenuity Gap

The Ingenuity Gap is a non-fiction book by Canadian academic Thomas Homer-Dixon.

New!!: Information theory and The Ingenuity Gap · See more »

The Pattern on the Stone

The Pattern on the Stone: The Simple Ideas that Make Computers Work is a book by W. Daniel Hillis, published in 1998 by Basic Books.

New!!: Information theory and The Pattern on the Stone · See more »

Theil index

The Theil index is a statistic primarily used to measure economic inequality and other economic phenomena, though it has also been used to measure racial segregation.

New!!: Information theory and Theil index · See more »

Theoretical computer science

Theoretical computer science, or TCS, is a subset of general computer science and mathematics that focuses on more mathematical topics of computing and includes the theory of computation.

New!!: Information theory and Theoretical computer science · See more »

Theoretical ecology

Theoretical ecology is the scientific discipline devoted to the study of ecological systems using theoretical methods such as simple conceptual models, mathematical models, computational simulations, and advanced data analysis.

New!!: Information theory and Theoretical ecology · See more »

Theoretical physics

Theoretical physics is a branch of physics that employs mathematical models and abstractions of physical objects and systems to rationalize, explain and predict natural phenomena.

New!!: Information theory and Theoretical physics · See more »

Theory

A theory is a contemplative and rational type of abstract or generalizing thinking, or the results of such thinking.

New!!: Information theory and Theory · See more »

Theory of Visualization

Theory is becoming an important topic in visualization, expanding from its traditional origins in low-level perception and statistics to an ever-broader array of fields and subfields.

New!!: Information theory and Theory of Visualization · See more »

Thermodynamic beta

In statistical mechanics, the thermodynamic beta (or occasionally coldness) is the reciprocal of the thermodynamic temperature of a system.

New!!: Information theory and Thermodynamic beta · See more »

Thomas Huang

Thomas Shi-Tao Huang (born June 26, 1936, Shanghai) is a researcher and professor emeritus at the University of Illinois at Urbana-Champaign (UIUC).

New!!: Information theory and Thomas Huang · See more »

Thomas Kailath

Thomas Kailath (born June 7, 1935) is an electrical engineer, information theorist, control engineer, entrepreneur and the Hitachi America Professor of Engineering, Emeritus, at Stanford University.

New!!: Information theory and Thomas Kailath · See more »

Thomas M. Cover

Thomas M. Cover (August 7, 1938 – March 26, 2012) was an information theorist and professor jointly in the Departments of Electrical Engineering and Statistics at Stanford University.

New!!: Information theory and Thomas M. Cover · See more »

Timeline of communication technology

Timeline of communication technology.

New!!: Information theory and Timeline of communication technology · See more »

Timeline of cryptography

Below is a timeline of notable events related to cryptography.

New!!: Information theory and Timeline of cryptography · See more »

Timeline of information theory

A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error correcting codes and related subjects.

New!!: Information theory and Timeline of information theory · See more »

Timeline of machine translation

This is a timeline of machine translation.

New!!: Information theory and Timeline of machine translation · See more »

Timeline of mathematics

This is a timeline of pure and applied mathematics history.

New!!: Information theory and Timeline of mathematics · See more »

Timeline of quantum computing

This is a timeline of quantum computing.

New!!: Information theory and Timeline of quantum computing · See more »

Timeline of scientific discoveries

The timeline below shows the date of publication of possible major scientific theories and discoveries, along with the discoverer.

New!!: Information theory and Timeline of scientific discoveries · See more »

Timeline of scientific thought

This is a list of important landmarks in the history of systematic philosophical inquiry and scientific analysis of phenomena.

New!!: Information theory and Timeline of scientific thought · See more »

Timeline of thermodynamics

A timeline of events related to thermodynamics.

New!!: Information theory and Timeline of thermodynamics · See more »

Toby Berger

Toby Berger (born September 4, 1940) is a noted American information theorist.

New!!: Information theory and Toby Berger · See more »

Too Big to Know

Too Big to Know: Rethinking Knowledge Now That the Facts Aren't the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room is a non-fiction book by the American technology writer David Weinberger published in 2012 by Basic Books.

New!!: Information theory and Too Big to Know · See more »

Total correlation

In probability theory and in particular in information theory, total correlation (Watanabe 1960) is one of several generalizations of the mutual information.
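
For random variables X_1, \dots, X_n it can be written as

C(X_1, \dots, X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1, \dots, X_n),

which reduces to the ordinary mutual information when n = 2 and is zero exactly when the variables are mutually independent.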

New!!: Information theory and Total correlation · See more »

Trust (emotion)

In a social context, trust has several connotations.

New!!: Information theory and Trust (emotion) · See more »

Trusted system

In the security engineering subspecialty of computer science, a trusted system is a system that is relied upon to a specified extent to enforce a specified security policy.

New!!: Information theory and Trusted system · See more »

Tsallis entropy

In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy.
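
For a discrete distribution p_i and entropic index q, it is commonly written as

S_q = \frac{k}{q - 1} \left( 1 - \sum_i p_i^{\,q} \right),

and it recovers the Boltzmann–Gibbs (Shannon) entropy in the limit q \to 1.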

New!!: Information theory and Tsallis entropy · See more »

Tunstall coding

In computer science and information theory, Tunstall coding is a form of entropy coding used for lossless data compression.

New!!: Information theory and Tunstall coding · See more »

Turbo code

In information theory, turbo codes (originally in French Turbocodes) are a class of high-performance forward error correction (FEC) codes developed around 1990–91 (but first published in 1993), which were the first practical codes to closely approach the channel capacity, a theoretical maximum for the code rate at which reliable communication is still possible given a specific noise level.

New!!: Information theory and Turbo code · See more »

Twenty Questions

Twenty Questions is a spoken parlor game which encourages deductive reasoning and creativity.

New!!: Information theory and Twenty Questions · See more »

Typical set

In information theory, the typical set is a set of sequences whose probability is close to two raised to the negative power of the entropy of their source distribution.
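
Concretely, for an i.i.d. source with entropy H(X), the (weakly) typical set for block length n and tolerance \epsilon is

A_\epsilon^{(n)} = \left\{ x^n : \left| -\tfrac{1}{n} \log p(x^n) - H(X) \right| \le \epsilon \right\},

which has total probability close to 1 for large n and contains roughly 2^{n H(X)} sequences.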

New!!: Information theory and Typical set · See more »

Typical subspace

In quantum information theory, the idea of a typical subspace plays an important role in the proofs of many coding theorems (the most prominent example being Schumacher compression).

New!!: Information theory and Typical subspace · See more »

Ulam's game

Ulam's game, or the Rényi–Ulam game, is a mathematical game similar to the popular game of twenty questions where one attempts to guess an unnamed object with yes–no questions, but where some of the answers may be wrong.

New!!: Information theory and Ulam's game · See more »

Uncertainty

Uncertainty has been called "an unintelligible expression without a straightforward description".

New!!: Information theory and Uncertainty · See more »

Uncertainty reduction theory

The uncertainty reduction theory, also known as initial interaction theory, developed in 1975 by Charles Berger and Richard Calabrese, is a communication theory from the post-positivist tradition.

New!!: Information theory and Uncertainty reduction theory · See more »

Unicycle

A unicycle is a vehicle that touches the ground with only one wheel.

New!!: Information theory and Unicycle · See more »

Units of information

In computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels.

New!!: Information theory and Units of information · See more »

Universal portfolio algorithm

The universal portfolio algorithm is a portfolio selection algorithm from the field of machine learning and information theory.

New!!: Information theory and Universal portfolio algorithm · See more »

University of Michigan

The University of Michigan (UM, U-M, U of M, or UMich), often simply referred to as Michigan, is a public research university in Ann Arbor, Michigan.

New!!: Information theory and University of Michigan · See more »

University of North Dakota

The University of North Dakota (also known as UND or North Dakota) is a public research university in Grand Forks, North Dakota.

New!!: Information theory and University of North Dakota · See more »

University of Rijeka

The University of Rijeka (Sveučilište u Rijeci) is in the city of Rijeka with faculties in cities throughout the regions of Primorje, Istria and Lika.

New!!: Information theory and University of Rijeka · See more »

University of Utah College of Engineering

The College of Engineering at the University of Utah is an academic college of the University of Utah in Salt Lake City, Utah.

New!!: Information theory and University of Utah College of Engineering · See more »

Uplift modelling

Uplift modelling, also known as incremental modelling, true lift modelling, or net modelling, is a predictive modelling technique that directly models the incremental impact of a treatment (such as a direct marketing action) on an individual's behaviour.

New!!: Information theory and Uplift modelling · See more »

Useless machine

A useless machine is a device which has a function but no direct purpose.

New!!: Information theory and Useless machine · See more »

Value of information

Value of information (VOI or VoI) is the amount a decision maker would be willing to pay for information prior to making a decision.

New!!: Information theory and Value of information · See more »

Variable-order Markov model

In stochastic processes, variable-order Markov (VOM) models are an important class of models that extend the well known Markov chain models.

New!!: Information theory and Variable-order Markov model · See more »

Variant of uncertain significance

A variant of uncertain (or unknown) significance (VUS) is an allele, or variant form of a gene, which has been identified through genetic testing, but whose significance to the function or health of an organism is not known.

New!!: Information theory and Variant of uncertain significance · See more »

Variation of information

In probability theory and information theory, the variation of information or shared information distance is a measure of the distance between two clusterings (partitions of elements).
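
In terms of entropies and mutual information it is

\operatorname{VI}(X; Y) = H(X) + H(Y) - 2 I(X; Y) = H(X \mid Y) + H(Y \mid X),

and, unlike raw mutual information, it satisfies the triangle inequality and therefore defines a true metric on the space of partitions.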

New!!: Information theory and Variation of information · See more »

Variety (cybernetics)

In cybernetics, the term variety denotes the total number of distinct states of a system.

New!!: Information theory and Variety (cybernetics) · See more »

Venti

Venti is a network storage system that permanently stores data blocks.

New!!: Information theory and Venti · See more »

Visual Information Fidelity

Visual Information Fidelity (VIF) is a full reference image quality assessment index based on natural scene statistics and the notion of image information extracted by the human visual system.

New!!: Information theory and Visual Information Fidelity · See more »

Vitold Belevitch

Vitold Belevitch (2 March 1921 – 26 December 1999) was a Belgian mathematician and electrical engineer of Russian origin who produced some important work in the field of electrical network theory.

New!!: Information theory and Vitold Belevitch · See more »

Vladimir Kotelnikov

Vladimir Aleksandrovich Kotelnikov (Russian Владимир Александрович Котельников, scientific transliteration Vladimir Alexandrovič Kotelnikov, 6 September 1908 in Kazan – 11 February 2005 in Moscow) was an information theory and radar astronomy pioneer from the Soviet Union.

New!!: Information theory and Vladimir Kotelnikov · See more »

Vladimir Levenshtein

Vladimir Iosifovich Levenshtein (March 20, 1935 – September 6, 2017) was a Russian scientist who did research in information theory, error-correcting codes, and combinatorial design.

New!!: Information theory and Vladimir Levenshtein · See more »

Volume of an n-ball

In geometry, a ball is a region in space comprising all points within a fixed distance from a given point; that is, it is the region enclosed by a sphere or hypersphere.

New!!: Information theory and Volume of an n-ball · See more »

Von Neumann entropy

In quantum statistical mechanics, the von Neumann entropy, named after John von Neumann, is the extension of classical Gibbs entropy concepts to the field of quantum mechanics.
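
For a density matrix \rho it is defined as

S(\rho) = -\operatorname{Tr}(\rho \ln \rho),

which equals the Shannon entropy of the eigenvalues of \rho and vanishes exactly for pure states.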

New!!: Information theory and Von Neumann entropy · See more »

Warren Weaver

Warren Weaver (July 17, 1894 – November 24, 1978) was an American scientist, mathematician, and science administrator.

New!!: Information theory and Warren Weaver · See more »

Wassim Michael Haddad

Wassim Michael Haddad (born July 14, 1961) is a Lebanese-Greek-American applied mathematician, scientist, and engineer, with research specialization in the areas of dynamical systems and control.

New!!: Information theory and Wassim Michael Haddad · See more »

Watchmaker analogy

The watchmaker analogy or watchmaker argument is a teleological argument which states, by way of an analogy, that a design implies a designer.

New!!: Information theory and Watchmaker analogy · See more »

Weissman score

The Weissman score is an efficiency metric for lossless compression applications, which was developed for fictional use.

New!!: Information theory and Weissman score · See more »

Werner Meyer-Eppler

Werner Meyer-Eppler (30 April 1913 – 8 July 1960), was a Belgian-born German physicist, experimental acoustician, phoneticist and information theorist.

New!!: Information theory and Werner Meyer-Eppler · See more »

William A. Dembski

William Albert "Bill" Dembski (born July 18, 1960) is an American mathematician, philosopher and theologian.

New!!: Information theory and William A. Dembski · See more »

William Bialek

William Bialek (born 1960 in Los Angeles, California) is a theoretical biophysicist and a professor at Princeton University and The Graduate Center, CUNY.

New!!: Information theory and William Bialek · See more »

William Lucas Root

William Lucas Root (1919 – April 22, 2007) was a noted American information theorist.

New!!: Information theory and William Lucas Root · See more »

Witsenhausen's counterexample

Witsenhausen's counterexample is a deceptively simple toy problem in decentralized stochastic control.

New!!: Information theory and Witsenhausen's counterexample · See more »

Wojciech Szpankowski

Wojciech Szpankowski is the Saul Rosen Professor of Computer Science at Purdue University.

New!!: Information theory and Wojciech Szpankowski · See more »

Worse-than-average effect

The worse-than-average effect or below-average effect is the human tendency to underestimate one's achievements and capabilities in relation to others.

New!!: Information theory and Worse-than-average effect · See more »

Xiaodong Wang (electrical engineer)

Xiaodong Wang is an information theorist and professor of Electrical Engineering at Columbia University.

New!!: Information theory and Xiaodong Wang (electrical engineer) · See more »

Yaakov Ziv

Yaakov Ziv (יעקב זיו; born 1931) is an Israeli electrical engineer who, along with Abraham Lempel, developed the LZ family of lossless data compression algorithms.

New!!: Information theory and Yaakov Ziv · See more »

Yuri Linnik

Yuri Vladimirovich Linnik (Ю́рий Влади́мирович Ли́нник; January 8, 1915 – June 30, 1972) was a Soviet mathematician active in number theory, probability theory and mathematical statistics.

New!!: Information theory and Yuri Linnik · See more »

Z-channel (information theory)

A Z-channel is a communications channel used in coding theory and information theory to model the behaviour of some data storage systems.

New!!: Information theory and Z-channel (information theory) · See more »

Zellig Harris

Zellig Sabbettai Harris (October 23, 1909 – May 22, 1992) was a very influential American linguist, mathematical syntactician, and methodologist of science.

New!!: Information theory and Zellig Harris · See more »

Zenon Pylyshyn

Zenon Walter Pylyshyn (born 1937) is a Canadian cognitive scientist and philosopher.

New!!: Information theory and Zenon Pylyshyn · See more »

Zipf's law

Zipf's law is an empirical law formulated using mathematical statistics that refers to the fact that many types of data studied in the physical and social sciences can be approximated with a Zipfian distribution, one of a family of related discrete power law probability distributions.
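
In its simplest form, the normalized frequency of the k-th most common item among N items is

f(k; s, N) = \frac{1 / k^{s}}{\sum_{n=1}^{N} 1 / n^{s}},

with exponent s close to 1 for word frequencies in natural-language corpora.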

New!!: Information theory and Zipf's law · See more »

1889

No description.

New!!: Information theory and 1889 · See more »

1916 in science

The year 1916 involved a number of significant events in science and technology, some of which are listed below.

New!!: Information theory and 1916 in science · See more »

1916 in the United States

Events from the year 1916 in the United States.

New!!: Information theory and 1916 in the United States · See more »

1948 in science

The year 1948 in science and technology involved some significant events, listed below.

New!!: Information theory and 1948 in science · See more »

1976

No description.

New!!: Information theory and 1976 · See more »

2013 in science

A number of significant scientific events occurred in 2013, including the discovery of numerous Earthlike exoplanets, the development of viable lab-grown ears, teeth, livers and blood vessels, and the atmospheric entry of the most destructive meteor since 1908.

New!!: Information theory and 2013 in science · See more »

20th century in science

Science advanced dramatically during the 20th century.

New!!: Information theory and 20th century in science · See more »

Redirects here:

Classical information theory, Information Theory, Information theorist, Information-theoretic, Shannon information theory, Shannon theory, Shannon's information theory, Shannons theory.

References

[1] https://en.wikipedia.org/wiki/Information_theory
