254 relations: Active learning (machine learning), ADALINE, Adaptive website, Affective computing, Alan Turing, Algorithm, Algorithmic bias, Amazon Web Services, Andrew Ng, Angoss, Apache Mahout, Apache MXNet, Arthur Samuel, Artificial immune system, Artificial intelligence, Artificial neural network, Artificial neuron, ArXiv, Association rule learning, AT&T Labs, Automated machine learning, Automated reasoning, Automated theorem proving, Autonomous car, Ayasdi, Backpropagation, Basis function, Bias–variance tradeoff, Big data, Bioinformatics, Bootstrap aggregating, Brain–computer interface, Cheminformatics, Christopher Bishop, Chromosome (genetic algorithm), Cluster analysis, Computation, Computational anatomy, Computational intelligence, Computational learning theory, Computational neuroscience, Computational statistics, Computer, Computer program, Computer science, Computer vision, Computing Machinery and Intelligence, Conditional independence, Conference on Neural Information Processing Systems, Connectionism, Credit card fraud, Cross-validation (statistics), Crossover (genetic algorithm), Dartmouth workshop, Data, Data analysis, Data breach, Data collection, Data mining, Data modeling, Data science, David J. C. MacKay, David Rumelhart, Decision tree, Deep learning, Deeplearning4j, Density estimation, Developmental robotics, Diagnosis (artificial intelligence), Dimensionality reduction, Directed acyclic graph, Discovery (observation), ECML PKDD, Economics, ELKI, Email filtering, Ensemble averaging (machine learning), Errors and residuals, Ethics of artificial intelligence, Evolutionary algorithm, Existential risk from artificial general intelligence, Expert system, Explanation-based learning, Exploratory data analysis, False positive rate, False positives and false negatives, Feature learning, Financial market, Functional programming, General game playing, Generalized linear model, Genetic algorithm, Geoffrey Hinton, GNU Octave, Google, Google APIs, Graphical model, Graphics processing unit, H2O (software), Handwriting recognition, Heuristic (computer science), IBM, IBM Data Science Experience, IEEE Signal Processing Society, Inductive bias, Inductive logic programming, Inductive programming, Inference, Information retrieval, Insurance, International Conference on Machine Learning, Internet, Internet fraud, Jerome H. Friedman, John Hopfield, Joint probability distribution, Journal of Machine Learning Research, K-SVD, KNIME, Knowledge extraction, KXEN Inc., Learning classifier system, Learning to rank, Leo Breiman, Linguistics, LIONsolver, List of datasets for machine learning research, Logic programming, Logical consequence, Loss function, Machine ethics, Machine Learning (journal), Machine learning control, Machine learning in bioinformatics, Machine perception, Mallet (software project), Map (mathematics), Marketing, Massive Online Analysis, Mathematical model, Mathematical optimization, MATLAB, Meta learning (computer science), Metaheuristic, Michael I. Jordan, Microsoft Azure, Microsoft Cognitive Toolkit, MLPACK (C++ library), Multi-label classification, Multilinear subspace learning, Multimodal sentiment analysis, Mutation (genetic algorithm), Natural language, Natural language processing, Natural language understanding, Natural selection, Netflix, Netflix Prize, Network simulation, Neural circuit, Neural coding, Neural Computation (journal), Neural Designer, Neural network, NeuroSolutions, Noise reduction, Nonlinear dimensionality reduction, Nonlinear system, Nucleic acid sequence, Online advertising, OpenNN, Operational definition, Optical character recognition, Oracle Data Mining, Orange (software), Outline of machine learning, Outline of object recognition, Overfitting, Paraphrase, Pattern recognition, PC game, Pedro Domingos, Perceptron, Peter E. Hart, Piecewise, Precision agriculture, Predictive analytics, Predictive modelling, Principal component analysis, Probability distribution, Probability theory, PyTorch, Quantum machine learning, Random forest, Random variable, RapidMiner, Ray Solomonoff, RCASE, Receiver operating characteristic, Recommender system, Regression analysis, Reinforcement learning, Richard O. Duda, Robert Tibshirani, Robot learning, Robot locomotion, Rule-based machine learning, Scikit-learn, Search algorithm, Semi-supervised learning, Sensitivity and specificity, Sentiment analysis, SequenceL, Sequential pattern mining, Shogun (toolbox), Similarity learning, Software engineering, Software suite, Speech recognition, Splunk, SPSS Modeler, Stanford University, Statistica, Statistical classification, Statistics, Strong NP-completeness, Structural health monitoring, Sun Microsystems, Supervised learning, Symbolic artificial intelligence, Syntactic pattern recognition, Telecommunication, Tensor, TensorFlow, Text corpus, The Master Algorithm, Theoretical computer science, Time complexity, Time series, Tom M. Mitchell, Topic model, Torch (machine learning), Total operating characteristic, Training, test, and validation sets, Translation, Trevor Hastie, Unsupervised learning, User behavior analytics, Vinod Khosla, Web search engine, Weka (machine learning), Wolfram Mathematica, Yooreeka, Yoshua Bengio.
Active learning is a special case of semi-supervised machine learning in which a learning algorithm is able to interactively query the user (or some other information source) to obtain the desired outputs at new data points.
ADALINE (Adaptive Linear Neuron or later Adaptive Linear Element) is an early single-layer artificial neural network and the name of the physical device that implemented this network.
An adaptive website is a website that builds a model of user activity and modifies the information and/or presentation of information to the user in order to better address the user's needs.
Affective computing (sometimes called artificial emotional intelligence, or emotion AI) is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects.
Alan Mathison Turing (23 June 1912 – 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst, philosopher, and theoretical biologist.
In mathematics and computer science, an algorithm is an unambiguous specification of how to solve a class of problems.
Algorithmic bias occurs when a computer system behaves in ways that reflect the implicit values of the humans involved in the collection, selection, or use of the data it relies on.
Amazon Web Services (AWS) is a subsidiary of Amazon.com that provides on-demand cloud computing platforms to individuals, companies and governments, on a paid subscription basis.
Andrew Yan-Tak Ng (born 1976) is a Chinese American computer scientist and entrepreneur.
Angoss Software Corporation, headquartered in Toronto, Ontario, Canada, with offices in the United States and UK, is a provider of predictive analytics systems through software licensing and services.
Apache Mahout is a project of the Apache Software Foundation to produce free implementations of distributed or otherwise scalable machine learning algorithms focused primarily in the areas of collaborative filtering, clustering and classification.
Apache MXNet is a modern open-source deep learning framework used to train and deploy deep neural networks.
Arthur Lee Samuel (December 5, 1901 – July 29, 1990) was an American pioneer in the field of computer gaming and artificial intelligence.
In artificial intelligence, artificial immune systems (AIS) are a class of computationally intelligent, rule-based machine learning systems inspired by the principles and processes of the vertebrate immune system.
Artificial intelligence (AI, also machine intelligence, MI) is intelligence demonstrated by machines, in contrast to the natural intelligence (NI) displayed by humans and other animals.
Artificial neural networks (ANNs) or connectionist systems are computing systems vaguely inspired by the biological neural networks that constitute animal brains.
An artificial neuron is a mathematical function conceived as a model of a biological neuron; artificial neurons are the elementary units of an artificial neural network.
arXiv (pronounced "archive") is a repository of electronic preprints (known as e-prints) approved for posting after moderation, but not full peer review, consisting of scientific papers in the fields of mathematics, physics, astronomy, computer science, quantitative biology, statistics, and quantitative finance, which can be accessed online.
Association rule learning is a rule-based machine learning method for discovering interesting relations between variables in large databases.
AT&T Labs is the research & development division of AT&T.
Automated machine learning (AutoML) is the process of automating the end-to-end application of machine learning to real-world problems.
Automated reasoning is an area of computer science and mathematical logic dedicated to understanding different aspects of reasoning.
Automated theorem proving (also known as ATP or automated deduction) is a subfield of automated reasoning and mathematical logic dealing with proving mathematical theorems by computer programs.
An autonomous car (also known as a driverless car, self-driving car, and robotic car) is a vehicle that is capable of sensing its environment and navigating without human input.
Ayasdi is a machine intelligence software company that offers a software platform and applications to organizations looking to analyze and build predictive models using big data or high-dimensional data sets.
Backpropagation is a method used in artificial neural networks to calculate the gradient of the loss function with respect to the network's weights, which is needed to update those weights during training.
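As an illustration (not drawn from this entry), here is a minimal sketch of the idea for a tiny one-hidden-layer network with squared-error loss; the layer sizes, learning rate, and data are made up for the example.

```python
import numpy as np

# Tiny 2-input network with a sigmoid hidden layer and squared-error loss.
rng = np.random.default_rng(0)
x = np.array([0.5, -1.2])          # single training input
t = np.array([1.0])                # target output
W1 = rng.normal(size=(3, 2))       # hidden-layer weights (3 hidden units)
W2 = rng.normal(size=(1, 3))       # output-layer weights

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(100):
    # Forward pass.
    h = sigmoid(W1 @ x)            # hidden activations
    y = W2 @ h                     # linear output
    loss = 0.5 * np.sum((y - t) ** 2)

    # Backward pass: apply the chain rule layer by layer.
    dy = y - t                     # dL/dy
    dW2 = np.outer(dy, h)          # dL/dW2
    dh = W2.T @ dy                 # dL/dh
    dz = dh * h * (1 - h)          # dL/d(pre-activation), sigmoid derivative
    dW1 = np.outer(dz, x)          # dL/dW1

    # Gradient-descent update using the gradients backpropagation produced.
    W2 -= 0.1 * dW2
    W1 -= 0.1 * dW1
```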
In mathematics, a basis function is an element of a particular basis for a function space.
In statistics and machine learning, the bias–variance tradeoff is the property of a set of predictive models whereby models with a lower bias in parameter estimation have a higher variance of the parameter estimates across samples, and vice versa.
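In symbols, the textbook decomposition for squared error (added here for illustration, for data generated as y = f(x) + ε with noise variance σ²) is:

```latex
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
  = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{Bias}^2}
  + \underbrace{\mathbb{E}\big[(\hat{f}(x) - \mathbb{E}[\hat{f}(x)])^2\big]}_{\text{Variance}}
  + \sigma^2
```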
Big data refers to data sets that are so large and complex that traditional data-processing application software is inadequate to deal with them.
Bioinformatics is an interdisciplinary field that develops methods and software tools for understanding biological data.
Bootstrap aggregating, also called bagging, is a machine learning ensemble meta-algorithm designed to improve the stability and accuracy of machine learning algorithms used in statistical classification and regression.
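A minimal sketch of the mechanism, assuming scikit-learn decision trees as the base learner and the iris data purely for illustration: each model is fit on a bootstrap resample and predictions are aggregated by majority vote.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)
n_models = 25
models = []

# Train each tree on a bootstrap sample (drawn with replacement).
for _ in range(n_models):
    idx = rng.integers(0, len(X), size=len(X))
    models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Aggregate by majority vote across the ensemble.
votes = np.array([m.predict(X) for m in models])          # (n_models, n_samples)
prediction = np.array([np.bincount(col).argmax() for col in votes.T])
```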
A brain–computer interface (BCI), sometimes called a neural-control interface (NCI), mind-machine interface (MMI), direct neural interface (DNI), or brain–machine interface (BMI), is a direct communication pathway between an enhanced or wired brain and an external device.
Cheminformatics (also known as chemoinformatics, chemioinformatics and chemical informatics) is the use of computer and informational techniques applied to a range of problems in the field of chemistry.
Christopher Michael Bishop (born 7 April 1959) is the Laboratory Director at Microsoft Research Cambridge, Professor of Computer Science at the University of Edinburgh and a Fellow of Darwin College, Cambridge.
In genetic algorithms, a chromosome (also sometimes called a genotype) is a set of parameters which define a proposed solution to the problem that the genetic algorithm is trying to solve.
Cluster analysis or clustering is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar (in some sense) to each other than to those in other groups (clusters).
Computation is any type of calculation that includes both arithmetical and non-arithmetical steps and follows a well-defined model, for example an algorithm.
Computational anatomy is an interdisciplinary field of biology focused on quantitative investigation and modelling of the variability of anatomical shapes.
The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation.
In computer science, computational learning theory (or just learning theory) is a subfield of Artificial Intelligence devoted to studying the design and analysis of machine learning algorithms.
Computational neuroscience (also known as theoretical neuroscience or mathematical neuroscience) is a branch of neuroscience which employs mathematical models, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.
Computational statistics, or statistical computing, is the interface between statistics and computer science.
A computer is a device that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming.
A computer program is a collection of instructions that a computer executes to perform a specific task or to solve a specific class of problems.
Computer science deals with the theoretical foundations of information and computation, together with practical techniques for the implementation and application of these foundations.
Computer vision is a field that deals with how computers can be made to gain high-level understanding from digital images or videos.
"Computing Machinery and Intelligence" is a seminal paper written by Alan Turing on the topic of artificial intelligence.
In probability theory, two events R and B are conditionally independent given a third event Y precisely if the occurrence of R and the occurrence of B are independent events in their conditional probability distribution given Y. In other words, R and B are conditionally independent given Y if and only if, given knowledge that Y occurs, knowledge of whether R occurs provides no information on the likelihood of B occurring, and knowledge of whether B occurs provides no information on the likelihood of R occurring.
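Stated as a standard formula (added for illustration), the definition in the entry above is:

```latex
% R and B are conditionally independent given Y (with P(Y) > 0) iff
P(R \cap B \mid Y) = P(R \mid Y)\, P(B \mid Y)
% equivalently, whenever P(B \cap Y) > 0,
P(R \mid B, Y) = P(R \mid Y)
```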
The Conference and Workshop on Neural Information Processing Systems (NIPS) is a machine learning and computational neuroscience conference held every December.
Connectionism is an approach in the field of cognitive science that hopes to represent mental phenomena using artificial neural networks.
Credit card fraud is a wide-ranging term for theft and fraud committed using or involving a payment card, such as a credit card or debit card, as a fraudulent source of funds in a transaction.
Cross-validation, sometimes called rotation estimation or out-of-sample testing, is any of various similar model validation techniques for assessing how the results of a statistical analysis will generalize to an independent data set.
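A minimal k-fold sketch of the procedure; the use of scikit-learn's logistic regression and iris data is an assumption made only to have something concrete to fit, and the fold count is arbitrary.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
k = 5
rng = np.random.default_rng(0)
indices = rng.permutation(len(X))
folds = np.array_split(indices, k)

scores = []
for i in range(k):
    # Hold out fold i for testing, train on the remaining folds.
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(model.score(X[test_idx], y[test_idx]))   # held-out accuracy

print(f"{k}-fold CV accuracy: {np.mean(scores):.3f}")
```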
In genetic algorithms and evolutionary computation, crossover, also called recombination, is a genetic operator used to combine the genetic information of two parents to generate new offspring.
The Dartmouth Summer Research Project on Artificial Intelligence was the name of a 1956 summer workshop now considered by many (though not all) to be the seminal event for artificial intelligence as a field.
Data is a set of values of qualitative or quantitative variables.
Data analysis is a process of inspecting, cleansing, transforming, and modeling data with the goal of discovering useful information, informing conclusions, and supporting decision-making.
A data breach is the intentional or unintentional release of secure or private/confidential information to an untrusted environment.
Data collection is the process of gathering and measuring information on targeted variables in an established systematic fashion, which then enables one to answer relevant questions and evaluate outcomes.
Data mining is the process of discovering patterns in large data sets involving methods at the intersection of machine learning, statistics, and database systems.
Data modeling in software engineering is the process of creating a data model for an information system by applying certain formal techniques.
Data science is an interdisciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from data in various forms, both structured and unstructured, similar to data mining.
Sir David John Cameron MacKay (22 April 1967 – 14 April 2016) was a British physicist, mathematician, and academic.
David Everett Rumelhart (June 12, 1942 – March 13, 2011) was an American psychologist who made many contributions to the formal analysis of human cognition, working primarily within the frameworks of mathematical psychology, symbolic artificial intelligence, and parallel distributed processing.
A decision tree is a decision support tool that uses a tree-like graph or model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility.
Deep learning (also known as deep structured learning or hierarchical learning) is part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms.
Eclipse Deeplearning4j is a deep learning programming library written for Java and the Java virtual machine (JVM) and a computing framework with wide support for deep learning algorithms.
In probability and statistics, density estimation is the construction of an estimate, based on observed data, of an unobservable underlying probability density function.
Developmental robotics (DevRob), sometimes called epigenetic robotics, is a scientific field which aims at studying the developmental mechanisms, architectures and constraints that allow lifelong and open-ended learning of new skills and new knowledge in embodied machines.
In artificial intelligence, diagnosis is concerned with the development of algorithms and techniques that are able to determine whether the behaviour of a system is correct.
In statistics, machine learning, and information theory, dimensionality reduction or dimension reduction is the process of reducing the number of random variables under consideration by obtaining a set of principal variables.
In mathematics and computer science, a directed acyclic graph (DAG), is a finite directed graph with no directed cycles.
Discovery is the act of detecting something new, or something "old" that had been unrecognized as meaningful.
ECML PKDD, the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, is one of the leading academic conferences on machine learning and knowledge discovery, held in Europe every year.
Economics is the social science that studies the production, distribution, and consumption of goods and services.
ELKI (for Environment for DeveLoping KDD-Applications Supported by Index-Structures) is a knowledge discovery in databases (KDD, "data mining") software framework developed for use in research and teaching originally at the database systems research unit of Professor Hans-Peter Kriegel at the Ludwig Maximilian University of Munich, Germany.
Email filtering is the processing of email to organize it according to specified criteria.
In machine learning, particularly in the creation of artificial neural networks, ensemble averaging is the process of creating multiple models and combining them to produce a desired output, as opposed to creating just one model.
In statistics and optimization, errors and residuals are two closely related and easily confused measures of the deviation of an observed value of an element of a statistical sample from its "theoretical value".
The ethics of artificial intelligence is the part of the ethics of technology specific to robots and other artificially intelligent beings.
In artificial intelligence, an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm.
Existential risk from artificial general intelligence is the hypothesis that substantial progress in artificial general intelligence (AI) could someday result in human extinction or some other unrecoverable global catastrophe.
In artificial intelligence, an expert system is a computer system that emulates the decision-making ability of a human expert.
Explanation-based learning (EBL) is a form of machine learning that exploits a very strong, or even perfect, domain theory in order to make generalizations or form concepts from training examples.
In statistics, exploratory data analysis (EDA) is an approach to analyzing data sets to summarize their main characteristics, often with visual methods.
In statistics, when performing multiple comparisons, a false positive ratio (or false alarm ratio) is the probability of falsely rejecting the null hypothesis for a particular test.
In medical testing, and more generally in binary classification, a false positive is an error in data reporting in which a test result improperly indicates presence of a condition, such as a disease (the result is positive), when in reality it is not present, while a false negative is an error in which a test result improperly indicates no presence of a condition (the result is negative), when in reality it is present.
In machine learning, feature learning or representation learning is a set of techniques that allows a system to automatically discover the representations needed for feature detection or classification from raw data.
A financial market is a market in which people trade financial securities and derivatives such as futures and options at low transaction costs.
In computer science, functional programming is a programming paradigm—a style of building the structure and elements of computer programs—that treats computation as the evaluation of mathematical functions and avoids changing-state and mutable data.
General game playing (GGP) is the design of artificial intelligence programs to be able to play more than one game successfully.
In statistics, the generalized linear model (GLM) is a flexible generalization of ordinary linear regression that allows for response variables that have error distribution models other than a normal distribution.
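In symbols (a textbook formulation, added for illustration), a GLM relates the mean of the response to a linear predictor through a link function g:

```latex
\mathbb{E}[Y \mid X] = \mu = g^{-1}(X\beta)
% e.g. logistic regression: Y \sim \text{Bernoulli}(\mu),\quad
% g(\mu) = \log\frac{\mu}{1-\mu}
```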
In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA).
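A minimal sketch of the selection / crossover / mutation loop, using the classic "OneMax" toy problem (maximize the number of 1 bits); the population size, gene count, and mutation rate are illustrative choices, not from the source.

```python
import random

random.seed(0)
GENES, POP, GENERATIONS = 30, 40, 60
fitness = lambda chrom: sum(chrom)                 # OneMax: count the 1 bits

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]

for _ in range(GENERATIONS):
    # Selection: keep the fitter half of the population as parents.
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]

    # Crossover and mutation refill the population with offspring.
    children = []
    while len(parents) + len(children) < POP:
        a, b = random.sample(parents, 2)
        point = random.randrange(1, GENES)         # single-point crossover
        child = a[:point] + b[point:]
        for i in range(GENES):                     # small mutation probability
            if random.random() < 0.01:
                child[i] ^= 1
        children.append(child)
    population = parents + children

print(max(fitness(c) for c in population))         # best fitness found
```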
Geoffrey Everest Hinton (born 6 December 1947) is a British cognitive psychologist and computer scientist, most noted for his work on artificial neural networks.
GNU Octave is software featuring a high-level programming language, primarily intended for numerical computations.
Google LLC is an American multinational technology company that specializes in Internet-related services and products, which include online advertising technologies, search engine, cloud computing, software, and hardware.
Google APIs is a set of application programming interfaces (APIs) developed by Google which allow communication with Google Services and their integration to other services.
A graphical model or probabilistic graphical model (PGM) or structured probabilistic model is a probabilistic model for which a graph expresses the conditional dependence structure between random variables.
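For the directed case (a Bayesian network over a DAG), the conditional-dependence structure corresponds to the standard factorization shown below, added here for illustration:

```latex
P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P\big(X_i \mid \mathrm{parents}(X_i)\big)
```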
A graphics processing unit (GPU) is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device.
H2O is open-source software for big-data analysis.
Handwriting recognition (HWR) is the ability of a computer to receive and interpret intelligible handwritten input from sources such as paper documents, photographs, touch-screens and other devices.
In computer science, artificial intelligence, and mathematical optimization, a heuristic (from Greek εὑρίσκω "I find, discover") is a technique designed for solving a problem more quickly when classic methods are too slow, or for finding an approximate solution when classic methods fail to find any exact solution.
The International Business Machines Corporation (IBM) is an American multinational technology company headquartered in Armonk, New York, United States, with operations in over 170 countries.
Data Science Experience (DSX) is IBM’s platform for data science, a workspace that includes multiple collaboration and open-source tools for use in data science.
The IEEE Signal Processing Society is a society of the IEEE.
The inductive bias (also known as learning bias) of a learning algorithm is the set of assumptions that the learner uses to predict outputs given inputs that it has not encountered.
Inductive logic programming (ILP) is a subfield of machine learning which uses logic programming as a uniform representation for examples, background knowledge and hypotheses.
Inductive programming (IP) is a special area of automatic programming, covering research from artificial intelligence and programming, which addresses learning of typically declarative (logic or functional) and often recursive programs from incomplete specifications, such as input/output examples or constraints.
Inferences are steps in reasoning, moving from premises to logical consequences.
Information retrieval (IR) is the activity of obtaining information system resources relevant to an information need from a collection of information resources.
Insurance is a means of protection from financial loss.
The International Conference on Machine Learning (ICML) is the leading international academic conference in machine learning.
The Internet is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link devices worldwide.
Internet fraud is a type of fraud which makes use of the Internet.
Jerome Harold Friedman (born 1939) is an American statistician, consultant and Professor of Statistics at Stanford University, known for his contributions in the field of statistics and data mining.
John Joseph Hopfield (born July 15, 1933) is an American scientist most widely known for his invention of an associative neural network in 1982.
Given random variables X, Y, ... that are defined on a probability space, the joint probability distribution for X, Y, ... is a probability distribution that gives the probability that each of X, Y, ... falls in any particular range or discrete set of values specified for that variable.
The Journal of Machine Learning Research is a peer-reviewed open access scientific journal covering machine learning.
In applied mathematics, K-SVD is a dictionary learning algorithm for creating a dictionary for sparse representations, via a singular value decomposition approach.
KNIME, the Konstanz Information Miner, is a free and open-source data analytics, reporting and integration platform.
Knowledge extraction is the creation of knowledge from structured (relational databases, XML) and unstructured (text, documents, images) sources.
KXEN was an American software company which existed from 1998 to 2013 when it was acquired by SAP AG.
Learning classifier systems, or LCS, are a paradigm of rule-based machine learning methods that combine a discovery component (e.g. typically a genetic algorithm) with a learning component (performing either supervised learning, reinforcement learning, or unsupervised learning).
Learning to rank, or machine-learned ranking (MLR), is the application of machine learning, typically supervised, semi-supervised, or reinforcement learning, in the construction of ranking models for information retrieval systems.
Leo Breiman (January 27, 1928 – July 5, 2005) was a distinguished statistician at the University of California, Berkeley.
Linguistics is the scientific study of language, and involves an analysis of language form, language meaning, and language in context.
LIONsolver is an integrated software for data mining, business intelligence, analytics, and modeling that adopts the Learning and Intelligent OptimizatioN (LION) and reactive business intelligence approach.
The list of datasets for machine learning research comprises datasets that are used for machine-learning research and have been cited in peer-reviewed academic journals.
Logic programming is a type of programming paradigm which is largely based on formal logic.
Logical consequence (also entailment) is a fundamental concept in logic, which describes the relationship between statements that hold true when one statement logically follows from one or more statements.
In mathematical optimization, statistics, econometrics, decision theory, machine learning and computational neuroscience, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event.
Machine ethics (or machine morality, computational morality, or computational ethics) is a part of the ethics of artificial intelligence concerned with the moral behavior of artificially intelligent beings.
Machine Learning is a peer-reviewed scientific journal, published since 1986.
Machine learning control (MLC) is a subfield of machine learning, intelligent control and control theory which solves optimal control problems with methods of machine learning.
Machine learning, a subfield of computer science involving the development of algorithms that learn how to make predictions based on data, has a number of emerging applications in the field of bioinformatics.
Machine perception is the capability of a computer system to interpret data in a manner that is similar to the way humans use their senses to relate to the world around them.
MALLET is a Java "Machine Learning for Language Toolkit".
In mathematics, the term mapping, sometimes shortened to map, refers to either a function, often with some sort of special structure, or a morphism in category theory, which generalizes the idea of a function.
Marketing is the study and management of exchange relationships.
Massive Online Analysis (MOA) is a free open-source software project specific for data stream mining with concept drift.
A mathematical model is a description of a system using mathematical concepts and language.
In mathematics, computer science and operations research, mathematical optimization or mathematical programming, alternatively spelled optimisation, is the selection of a best element (with regard to some criterion) from some set of available alternatives.
MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and proprietary programming language developed by MathWorks.
Meta learning is a subfield of machine learning where automatic learning algorithms are applied to metadata about machine learning experiments.
In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity.
Michael Irwin Jordan is an American scientist, Professor at the University of California, Berkeley and a researcher in machine learning, statistics, and artificial intelligence.
Microsoft Azure (formerly Windows Azure) is a cloud computing service created by Microsoft for building, testing, deploying, and managing applications and services through a global network of Microsoft-managed data centers.
Microsoft Cognitive Toolkit, previously known as CNTK and sometimes styled as The Microsoft Cognitive Toolkit, is a deep learning framework developed by Microsoft Research.
mlpack is a machine learning software library for C++, built on top of the Armadillo library.
In machine learning, multi-label classification and the strongly related problem of multi-output classification are variants of the classification problem where multiple labels may be assigned to each instance.
Multilinear subspace learning is an approach to dimensionality reduction.
Multimodal sentiment analysis is a new dimension of the traditional text-based sentiment analysis, which goes beyond the analysis of texts, and includes other modalities such as audio and visual data.
Mutation is a genetic operator used to maintain genetic diversity from one generation of a population of genetic algorithm chromosomes to the next.
In neuropsychology, linguistics, and the philosophy of language, a natural language or ordinary language is any language that has evolved naturally in humans through use and repetition without conscious planning or premeditation.
Natural language processing (NLP) is an area of computer science and artificial intelligence concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural language data.
Natural language understanding (NLU) or natural language interpretation (NLI) is a subtopic of natural language processing in artificial intelligence that deals with machine reading comprehension.
Natural selection is the differential survival and reproduction of individuals due to differences in phenotype.
Netflix, Inc. is an American over-the-top media services provider, headquartered in Los Gatos, California.
The Netflix Prize was an open competition for the best collaborative filtering algorithm to predict user ratings for films, based on previous ratings without any other information about the users or films, i.e. without the users or the films being identified except by numbers assigned for the contest.
In computer network research, network simulation is a technique whereby a software program models the behavior of a network by calculating the interaction between the different network entities (routers, switches, nodes, access points, links etc.). Most simulators use discrete event simulation - the modeling of systems in which state variables change at discrete points in time.
A neural circuit is a population of neurons interconnected by synapses to carry out a specific function when activated.
Neural coding is a neuroscience field concerned with characterising the hypothetical relationship between the stimulus and the individual or ensemble neuronal responses and the relationship among the electrical activity of the neurons in the ensemble.
Neural Computation is a monthly peer-reviewed scientific journal covering all aspects of neural computation, including modeling the brain and the design and construction of neurally-inspired information processing systems.
Neural Designer is a software tool for data analytics based on neural networks, a main area of artificial intelligence research, and contains a graphical user interface which simplifies data entry and interpretation of results.
The term neural network was traditionally used to refer to a network or circuit of neurons.
NeuroSolutions is a neural network development environment developed by NeuroDimension.
Noise reduction is the process of removing noise from a signal.
High-dimensional data, meaning data that requires more than two or three dimensions to represent, can be difficult to interpret; nonlinear dimensionality reduction refers to techniques that project such data onto a lower-dimensional representation in a nonlinear way.
In mathematics and science, a nonlinear system is a system in which the change of the output is not proportional to the change of the input.
A nucleic acid sequence is a succession of letters that indicate the order of nucleotides forming alleles within a DNA (using GACT) or RNA (GACU) molecule.
Online advertising, also called online marketing or Internet advertising or web advertising, is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers.
OpenNN (Open Neural Networks Library) is a software library written in the C++ programming language which implements neural networks, a main area of deep learning research.
An operational definition is the articulation of operationalization (or statement of procedures) used in defining the terms of a process (or set of validation tests) needed to determine the nature of an item or phenomenon (a variable, term, or object) and its properties such as duration, quantity, extension in space, chemical composition, etc.
Optical character recognition (also optical character reader, OCR) is the mechanical or electronic conversion of images of typed, handwritten or printed text into machine-encoded text, whether from a scanned document, a photo of a document, a scene-photo (for example the text on signs and billboards in a landscape photo) or from subtitle text superimposed on an image (for example from a television broadcast).
Oracle Data Mining (ODM) is an option of Oracle Corporation's Relational Database Management System (RDBMS) Enterprise Edition (EE).
Orange is an open-source data visualization, machine learning and data mining toolkit.
The following outline is provided as an overview of and topical guide to machine learning: Machine learning – subfield of computer science (more particularly soft computing) that evolved from the study of pattern recognition and computational learning theory in artificial intelligence.
The following outline is provided as an overview of and topical guide to object recognition: Object recognition – technology in the field of computer vision for finding and identifying objects in an image or video sequence.
In statistics, overfitting is "the production of an analysis that corresponds too closely or exactly to a particular set of data, and may therefore fail to fit additional data or predict future observations reliably".
A paraphrase is a restatement of the meaning of a text or passage using other words.
Pattern recognition is a branch of machine learning that focuses on the recognition of patterns and regularities in data, although it is in some cases considered to be nearly synonymous with machine learning.
PC games, also known as computer games or personal computer games, are video games played on a personal computer rather than a dedicated video game console or arcade machine.
Pedro Domingos is a professor at the University of Washington.
In machine learning, the perceptron is an algorithm for supervised learning of binary classifiers (functions that can decide whether an input, represented by a vector of numbers, belongs to some specific class or not).
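A minimal sketch of the perceptron learning rule on a made-up linearly separable data set (the data, learning rate, and epoch count are illustrative assumptions): weights are adjusted only when an example is misclassified.

```python
import numpy as np

# Toy linearly separable data: label is 1 when x1 + x2 > 1, else 0.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 2, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 1).astype(int)

w = np.zeros(2)
b = 0.0
lr = 0.1

# Perceptron learning rule: update weights on misclassified examples only.
for _ in range(20):
    for xi, yi in zip(X, y):
        pred = int(w @ xi + b > 0)
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

accuracy = np.mean((X @ w + b > 0).astype(int) == y)
print(f"training accuracy: {accuracy:.2f}")
```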
Peter E. Hart (born c. 1940s) is an American computer scientist and entrepreneur.
In mathematics, a piecewise-defined function (also called a piecewise function or a hybrid function) is a function defined by multiple sub-functions, each sub-function applying to a certain interval of the main function's domain, a sub-domain.
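A familiar example (added for illustration) is the absolute-value function, defined by two sub-functions on two sub-domains:

```latex
|x| =
\begin{cases}
  x  & \text{if } x \ge 0 \\
  -x & \text{if } x < 0
\end{cases}
```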
Precision agriculture (PA), satellite farming or site-specific crop management (SSCM) is a farming management concept based on observing, measuring and responding to inter- and intra-field variability in crops.
Predictive analytics encompasses a variety of statistical techniques from predictive modelling, machine learning, and data mining that analyze current and historical facts to make predictions about future or otherwise unknown events.
Predictive modelling uses statistics to predict outcomes.
Principal component analysis (PCA) is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.
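One common way to compute the transformation is via the singular value decomposition of the centered data matrix; the sketch below assumes NumPy and random data purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))              # 100 observations, 5 variables

# Center the data; the right singular vectors of the centered matrix
# are the (orthogonal) principal directions.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
components = Vt[:k]                        # top-k principal directions
scores = Xc @ components.T                 # data projected onto them
explained_variance = S[:k] ** 2 / (len(X) - 1)
```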
In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.
Probability theory is the branch of mathematics concerned with probability.
PyTorch is an open source machine learning library for Python, based on Torch, used for applications such as natural language processing.
Quantum machine learning is an emerging interdisciplinary research area at the intersection of quantum physics and machine learning.
Random forests or random decision forests are an ensemble learning method for classification, regression and other tasks, that operate by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes (classification) or mean prediction (regression) of the individual trees.
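A short usage sketch with scikit-learn's implementation (the data set, tree count, and split are illustrative assumptions): each tree is grown on a bootstrap sample with feature subsampling, and the forest predicts by majority vote over the trees.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 100 bootstrapped decision trees, aggregated by majority vote.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
print("test accuracy:", forest.score(X_test, y_test))
```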
In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is a variable whose possible values are outcomes of a random phenomenon.
RapidMiner is a data science software platform developed by the company of the same name that provides an integrated environment for data preparation, machine learning, deep learning, text mining, and predictive analytics.
Ray Solomonoff (July 25, 1926 – December 7, 2009) was the inventor of algorithmic probability and its associated General Theory of Inductive Inference (also known as Universal Inductive Inference).
Root Cause Analysis Solver Engine (informally RCASE) is a proprietary algorithm developed from research originally at the Warwick Manufacturing Group (WMG) at Warwick University.
In statistics, a receiver operating characteristic curve, i.e. ROC curve, is a graphical plot that illustrates the diagnostic ability of a binary classifier system as its discrimination threshold is varied.
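The curve is traced by sweeping the discrimination threshold and recording the true-positive and false-positive rates at each setting; the scores and labels below are made up solely to illustrate the computation.

```python
import numpy as np

# Classifier scores (higher = more confident "positive") and true labels.
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2])
labels = np.array([1,   1,   0,   1,   0,    1,   0,   0  ])

# Sweep the threshold and record (FPR, TPR) pairs: the ROC curve.
points = []
for t in np.unique(scores)[::-1]:
    pred = scores >= t
    tpr = np.sum(pred & (labels == 1)) / np.sum(labels == 1)
    fpr = np.sum(pred & (labels == 0)) / np.sum(labels == 0)
    points.append((fpr, tpr))

print(points)
```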
A recommender system or a recommendation system (sometimes replacing "system" with a synonym such as platform or engine) is a subclass of information filtering system that seeks to predict the "rating" or "preference" a user would give to an item.
In statistical modeling, regression analysis is a set of statistical processes for estimating the relationships among variables.
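The simplest instance is ordinary least squares, shown here as a standard formula for illustration (assuming the design matrix X has full column rank): the coefficients minimize the sum of squared residuals.

```latex
\hat{\beta} = \arg\min_{\beta} \lVert y - X\beta \rVert^2
            = (X^{\top} X)^{-1} X^{\top} y,
\qquad \hat{y} = X\hat{\beta}
```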
Reinforcement learning (RL) is an area of machine learning inspired by behaviourist psychology, concerned with how software agents ought to take actions in an environment so as to maximize some notion of cumulative reward.
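The "cumulative reward" is usually formalized as the discounted return, a standard formulation added here for illustration; the agent seeks a policy that maximizes its expected value.

```latex
G_t = \sum_{k=0}^{\infty} \gamma^{k} R_{t+k+1},
\qquad 0 \le \gamma \le 1
```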
Richard O. Duda is Professor Emeritus of Electrical Engineering at San Jose State University, renowned for his work on sound localization and pattern recognition.
Robert Tibshirani (born July 10, 1956) is a Professor in the Departments of Statistics and Health Research and Policy at Stanford University.
Robot learning is a research field at the intersection of machine learning and robotics.
Robot locomotion is the collective name for the various methods that robots use to transport themselves from place to place.
Rule-based machine learning (RBML) is a term in computer science intended to encompass any machine learning method that identifies, learns, or evolves 'rules' to store, manipulate or apply.
Scikit-learn (formerly scikits.learn) is a free software machine learning library for the Python programming language.
In computer science, a search algorithm is any algorithm which solves the search problem, namely, to retrieve information stored within some data structure, or calculated in the search space of a problem domain.
Semi-supervised learning is a class of supervised learning tasks and techniques that also make use of unlabeled data for training – typically a small amount of labeled data with a large amount of unlabeled data.
Sensitivity and specificity are statistical measures of the performance of a binary classification test, also known in statistics as a classification function.
Opinion mining (sometimes known as sentiment analysis or emotion AI) refers to the use of natural language processing, text analysis, computational linguistics, and biometrics to systematically identify, extract, quantify, and study affective states and subjective information.
SequenceL is a general purpose functional programming language and auto-parallelizing (Parallel computing) compiler and tool set, whose primary design objectives are performance on multi-core processor hardware, ease of programming, platform portability/optimization, and code clarity and readability.
Sequential pattern mining is a topic of data mining concerned with finding statistically relevant patterns between data examples where the values are delivered in a sequence.
Shogun is a free, open source machine learning software library written in C++.
Similarity learning is an area of supervised machine learning in artificial intelligence.
Software engineering is the application of engineering to the development of software in a systematic method.
A software suite or application suite is a collection of computer programs —usually application software or programming software— of related functionality, often sharing a similar user interface and the ability to easily exchange data with each other.
Speech recognition is the inter-disciplinary sub-field of computational linguistics that develops methodologies and technologies that enable the recognition and translation of spoken language into text by computers.
Splunk Inc. is an American multinational corporation based in San Francisco, California, that produces software for searching, monitoring, and analyzing machine-generated big data, via a Web-style interface.
IBM SPSS Modeler is a data mining and text analytics software application from IBM.
Stanford University (officially Leland Stanford Junior University, colloquially the Farm) is a private research university in Stanford, California.
Statistica is an advanced analytics software package originally developed by StatSoft which was acquired by Dell in March 2014.
In machine learning and statistics, classification is the problem of identifying to which of a set of categories (sub-populations) a new observation belongs, on the basis of a training set of data containing observations (or instances) whose category membership is known.
Statistics is a branch of mathematics dealing with the collection, analysis, interpretation, presentation, and organization of data.
In computational complexity, strong NP-completeness is a property of computational problems that is a special case of NP-completeness.
Structural health monitoring (SHM) refers to the process of implementing a damage detection and characterization strategy for engineering structures.
Sun Microsystems, Inc. was an American company that sold computers, computer components, software, and information technology services and created the Java programming language, the Solaris operating system, ZFS, the Network File System (NFS), and SPARC.
Supervised learning is the machine learning task of learning a function that maps an input to an output based on example input-output pairs.
Symbolic artificial intelligence is the term for the collection of all methods in artificial intelligence research that are based on high-level "symbolic" (human-readable) representations of problems, logic and search.
Syntactic pattern recognition or structural pattern recognition is a form of pattern recognition, in which each object can be represented by a variable-cardinality set of symbolic, nominal features.
Telecommunication is the transmission of signs, signals, messages, words, writings, images and sounds or information of any nature by wire, radio, optical or other electromagnetic systems.
In mathematics, tensors are geometric objects that describe linear relations between geometric vectors, scalars, and other tensors.
TensorFlow is an open-source software library for dataflow programming across a range of tasks.
In linguistics, a corpus (plural corpora) or text corpus is a large and structured set of texts (nowadays usually electronically stored and processed).
The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World is a book by Pedro Domingos released in 2015.
Theoretical computer science, or TCS, is a subset of general computer science and mathematics that focuses on more mathematical topics of computing and includes the theory of computation.
In computer science, the time complexity is the computational complexity that describes the amount of time it takes to run an algorithm.
A time series is a series of data points indexed (or listed or graphed) in time order.
Tom Michael Mitchell (born August 9, 1951) is an American computer scientist and E. Fredkin University Professor at the Carnegie Mellon University (CMU).
In machine learning and natural language processing, a topic model is a type of statistical model for discovering the abstract "topics" that occur in a collection of documents.
Torch is an open source machine learning library, a scientific computing framework, and a script language based on the Lua programming language.
The Total Operating Characteristic (TOC) is a statistical method to compare a Boolean variable versus a rank variable.
In machine learning, the study and construction of algorithms that can learn from and make predictions on data is a common task; the data used to build and evaluate such models is commonly split into training, test, and validation sets.
Translation is the communication of the meaning of a source-language text by means of an equivalent target-language text.
Trevor John Hastie (born 27 June 1953) is a South African and American statistician and computer scientist.
Unsupervised machine learning is the machine learning task of inferring a function that describes the structure of "unlabeled" data (i.e. data that has not been classified or categorized).
User behavior analytics ("UBA") as defined by Gartner is a cybersecurity process about detection of insider threats, targeted attacks, and financial fraud.
Vinod Khosla (Gurmukhi: ਵਿਨੋਦ ਖੋਸਲਾ; born 28 January 1955) is an Indian American billionaire engineer, businessman and venture capitalist.
A web search engine is a software system that is designed to search for information on the World Wide Web.
Waikato Environment for Knowledge Analysis (Weka) is a suite of machine learning software written in Java, developed at the University of Waikato, New Zealand.
Wolfram Mathematica (usually termed Mathematica) is a modern technical computing system spanning most areas of technical computing — including neural networks, machine learning, image processing, geometry, data science, visualizations, and others.
Yooreeka is a library for data mining, machine learning, soft computing, and mathematical analysis.
Yoshua Bengio (born 1964 in France) is a Canadian computer scientist, most noted for his work on artificial neural networks and deep learning.