
Adjusted mutual information


In probability theory and information theory, adjusted mutual information (AMI) is a variation of mutual information that may be used for comparing clusterings. [1]
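The definition above can be made concrete with a pure-Python sketch of one common formulation of AMI, which subtracts the mutual information expected by chance (under a hypergeometric model of random labelings) and normalizes by the mean of the two entropies. This "mean" normalization is one of several in use; the function names here are illustrative, not from the source.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (nats) of the empirical label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def mutual_info(u, v):
    """Mutual information (nats) between two label sequences."""
    n = len(u)
    cu, cv, joint = Counter(u), Counter(v), Counter(zip(u, v))
    return sum((nij / n) * math.log(n * nij / (cu[i] * cv[j]))
               for (i, j), nij in joint.items())

def expected_mi(u, v):
    """Expected MI of random labelings with the same cluster sizes,
    averaging over the hypergeometric distribution of cell counts."""
    n = len(u)
    a, b = Counter(u).values(), Counter(v).values()
    emi = 0.0
    for ai in a:
        for bj in b:
            for nij in range(max(1, ai + bj - n), min(ai, bj) + 1):
                p = (math.comb(bj, nij) * math.comb(n - bj, ai - nij)
                     / math.comb(n, ai))
                emi += p * (nij / n) * math.log(n * nij / (ai * bj))
    return emi

def ami(u, v):
    """Adjusted mutual information with 'mean' normalization."""
    mi, emi = mutual_info(u, v), expected_mi(u, v)
    denom = 0.5 * (entropy(u) + entropy(v)) - emi
    return (mi - emi) / denom
```

AMI is 1 for identical clusterings (even under relabeling) and is near 0, possibly negative, for unrelated ones; degenerate cases where the denominator vanishes (e.g. both partitions trivial) are not handled in this sketch.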

11 relations: Cluster analysis, Contingency table, Entropy (information theory), Hypergeometric distribution, Information theory, Mutual information, Partition of a set, Probability theory, Rand index, Similarity measure, Variation of information.

Cluster analysis

Cluster analysis or clustering is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar (in some sense) to each other than to those in other groups (clusters).
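As a minimal illustration of the grouping idea, the sketch below assigns each 1-D point to the nearer of two given cluster centers, so nearby points receive the same label; the function name and data are hypothetical.

```python
def assign_clusters(points, centers):
    """Label each point with the index of its nearest center."""
    return [min(range(len(centers)), key=lambda k: abs(p - centers[k]))
            for p in points]

# Points near 1.0 fall in cluster 0, points near 8.0 in cluster 1.
labels = assign_clusters([1.0, 1.2, 7.9, 8.1], centers=[1.0, 8.0])
```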


Contingency table

In statistics, a contingency table (also known as a cross tabulation or crosstab) is a type of table in a matrix format that displays the (multivariate) frequency distribution of the variables.
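For two clusterings of the same elements, such a table can be built directly from the label pairs; a minimal sketch (function name illustrative):

```python
from collections import Counter

def contingency_table(u, v):
    """Cross-tabulate two label sequences: entry [i][j] counts the
    elements with label i in u and label j in v."""
    counts = Counter(zip(u, v))
    rows, cols = sorted(set(u)), sorted(set(v))
    return [[counts[(i, j)] for j in cols] for i in rows]

table = contingency_table(['a', 'a', 'b', 'b'], ['x', 'y', 'y', 'y'])
```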


Entropy (information theory)

Information entropy is the average rate at which information is produced by a stochastic source of data.
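Concretely, the Shannon entropy of a discrete distribution can be estimated from observed labels; a short sketch (base-2 logarithm, so the result is in bits):

```python
import math
from collections import Counter

def entropy(labels, base=2):
    """Shannon entropy of the empirical distribution of `labels`."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n, base)
                for c in Counter(labels).values())
```

A fair coin yields 1 bit per outcome; a constant source yields 0.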


Hypergeometric distribution

In probability theory and statistics, the hypergeometric distribution is a discrete probability distribution that describes the probability of k successes (random draws for which the object drawn has a specified feature) in n draws, without replacement, from a finite population of size N that contains exactly K objects with that feature, wherein each draw is either a success or a failure.
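The probability mass function stated above translates directly into binomial coefficients; a minimal sketch using the same N, K, n, k notation:

```python
from math import comb

def hypergeom_pmf(k, N, K, n):
    """P(k successes in n draws, without replacement, from a
    population of N objects of which K have the feature)."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)
```

For example, drawing 2 from a population of 4 with 2 marked objects gives exactly 1 marked object with probability 2/3.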


Information theory

Information theory studies the quantification, storage, and communication of information.


Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
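For two clusterings, MI can be estimated from the joint empirical distribution of their labels; a minimal sketch (natural logarithm, so the result is in nats):

```python
import math
from collections import Counter

def mutual_info(u, v):
    """Mutual information between two label sequences, estimated
    from their joint empirical distribution."""
    n = len(u)
    cu, cv, joint = Counter(u), Counter(v), Counter(zip(u, v))
    return sum((nij / n) * math.log(n * nij / (cu[i] * cv[j]))
               for (i, j), nij in joint.items())
```

Identical two-cluster labelings give MI = ln 2; independent labelings give MI = 0.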


Partition of a set

In mathematics, a partition of a set is a grouping of the set's elements into non-empty subsets, in such a way that every element is included in one and only one of the subsets.
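The three defining conditions (non-empty blocks, pairwise disjointness, full coverage) can be checked mechanically; a small sketch with an illustrative function name:

```python
def is_partition(subsets, universe):
    """True iff `subsets` are non-empty, pairwise disjoint,
    and together cover `universe`."""
    seen = set()
    for s in subsets:
        if not s or seen & s:   # empty block, or overlaps a previous block
            return False
        seen |= s
    return seen == universe
```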


Probability theory

Probability theory is the branch of mathematics concerned with probability.


Rand index

The Rand index or Rand measure (named after William M. Rand) in statistics, and in particular in data clustering, is a measure of the similarity between two data clusterings.
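The (unadjusted) Rand index is the fraction of element pairs on which the two clusterings agree, i.e. pairs placed together in both or apart in both; a minimal sketch:

```python
from itertools import combinations

def rand_index(u, v):
    """Fraction of element pairs on which two clusterings agree."""
    pairs = list(combinations(range(len(u)), 2))
    agree = sum((u[i] == u[j]) == (v[i] == v[j]) for i, j in pairs)
    return agree / len(pairs)
```

Like AMI relative to MI, the *adjusted* Rand index further corrects this value for chance agreement.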


Similarity measure

In statistics and related fields, a similarity measure or similarity function is a real-valued function that quantifies the similarity between two objects.
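One familiar example of such a real-valued function is cosine similarity between vectors, sketched here for illustration (it is not the measure the article itself defines):

```python
import math

def cosine_similarity(x, y):
    """Cosine of the angle between two vectors: 1 for parallel
    vectors, 0 for orthogonal ones."""
    dot = sum(a * b for a, b in zip(x, y))
    return dot / (math.hypot(*x) * math.hypot(*y))
```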


Variation of information

In probability theory and information theory, the variation of information or shared information distance is a measure of the distance between two clusterings (partitions of elements).
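Variation of information combines the quantities above: VI(U, V) = H(U) + H(V) − 2 I(U, V), which is zero exactly when the two clusterings coincide. A self-contained sketch:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (nats) of the empirical label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def mutual_info(u, v):
    """Mutual information (nats) between two label sequences."""
    n = len(u)
    cu, cv, joint = Counter(u), Counter(v), Counter(zip(u, v))
    return sum((nij / n) * math.log(n * nij / (cu[i] * cv[j]))
               for (i, j), nij in joint.items())

def variation_of_information(u, v):
    """VI(U, V) = H(U) + H(V) - 2 I(U, V); a metric on partitions."""
    return entropy(u) + entropy(v) - 2 * mutual_info(u, v)
```

Unlike AMI (a chance-corrected similarity), VI is a distance: larger values mean the clusterings share less information.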



References

[1] https://en.wikipedia.org/wiki/Adjusted_mutual_information
