34 relations: Arithmetic mean, Average, Bessel's correction, Bias of an estimator, Central tendency, Commutative property, Convex combination, Covariance matrix, Descriptive statistics, Design matrix, Empty set, Gauss–Markov theorem, Harmonic mean, Independent and identically distributed random variables, Invertible matrix, Least squares, Matrix multiplication, Maximum likelihood estimation, Mean, Normal distribution, Normalizing constant, Probability distribution, Reduced chi-squared statistic, Simpson's paradox, Standard deviation, Standard error, Summary statistics, Variance, Weight function, Weighted arithmetic mean, Weighted average cost of capital, Weighted geometric mean, Weighted median, Weighting.
In mathematics and statistics, the arithmetic mean (stress on third syllable of "arithmetic"), or simply the mean or average when the context is clear, is the sum of a collection of numbers divided by the number of numbers in the collection.
In colloquial language, an average is a middle or typical number of a list of numbers.
In statistics, Bessel's correction is the use of n − 1 instead of n in the formula for the sample variance and sample standard deviation, where n is the number of observations in a sample.
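A minimal Python sketch of Bessel's correction, contrasting the unbiased sample variance (divide by n − 1) with the biased version (divide by n); the data set is illustrative:

```python
def sample_variance(xs):
    """Sample variance with Bessel's correction: divide by n - 1, not n."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

data = [2, 4, 4, 4, 5, 5, 7, 9]          # sample mean is 5
print(sample_variance(data))              # 32/7 ≈ 4.571 (corrected)
print(sum((x - 5) ** 2 for x in data) / len(data))  # 4.0 (uncorrected, biased)
```

The corrected estimate is always larger, compensating for the fact that deviations are measured from the sample mean rather than the unknown true mean.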
In statistics, the bias (or bias function) of an estimator is the difference between this estimator's expected value and the true value of the parameter being estimated.
In statistics, a central tendency (or measure of central tendency) is a central or typical value for a probability distribution.
In mathematics, a binary operation is commutative if changing the order of the operands does not change the result.
In convex geometry, a convex combination is a linear combination of points (which can be vectors, scalars, or more generally points in an affine space) where all coefficients are non-negative and sum to 1.
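A small illustrative sketch of a convex combination of points in the plane; the function name is hypothetical:

```python
def convex_combination(points, weights):
    """Combine points with non-negative weights that sum to 1."""
    assert all(w >= 0 for w in weights)
    assert abs(sum(weights) - 1) < 1e-12
    dim = len(points[0])
    return tuple(sum(w * p[i] for w, p in zip(weights, points))
                 for i in range(dim))

# A point 30% of the way from (0, 0) toward (10, 10):
print(convex_combination([(0, 0), (10, 10)], [0.7, 0.3]))  # (3.0, 3.0)
```

Because the coefficients are non-negative and sum to 1, the result always lies within the convex hull of the input points.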
In probability theory and statistics, a covariance matrix (also known as dispersion matrix or variance–covariance matrix) is a matrix whose element in the i, j position is the covariance between the i-th and j-th elements of a random vector.
A descriptive statistic (in the count noun sense) is a summary statistic that quantitatively describes or summarizes features of a collection of information, while descriptive statistics in the mass noun sense is the process of using and analyzing those statistics.
In statistics, a design matrix, also known as model matrix or regressor matrix, is a matrix of values of explanatory variables of a set of objects, often denoted by X. Each row represents an individual object, with the successive columns corresponding to the variables and their specific values for that object.
In mathematics, and more specifically set theory, the empty set or null set is the unique set having no elements; its size or cardinality (count of elements in a set) is zero.
In statistics, the Gauss–Markov theorem, named after Carl Friedrich Gauss and Andrey Markov, states that in a linear regression model in which the errors have expectation zero and are uncorrelated and have equal variances, the best linear unbiased estimator (BLUE) of the coefficients is given by the ordinary least squares (OLS) estimator, provided it exists.
In mathematics, the harmonic mean (sometimes called the subcontrary mean) is one of several kinds of average, and in particular one of the Pythagorean means.
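A quick Python illustration of the harmonic mean using the standard library, with the classic equal-distance average-speed example:

```python
from statistics import harmonic_mean

# Average speed over two equal-distance legs driven at 40 and 60 km/h
# is the harmonic mean of the speeds, not the arithmetic mean (50):
print(harmonic_mean([40, 60]))  # 48.0
```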
In probability theory and statistics, a sequence or other collection of random variables is independent and identically distributed (i.i.d. or iid or IID) if each random variable has the same probability distribution as the others and all are mutually independent.
In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular or nondegenerate) if there exists an n-by-n square matrix B such that AB = BA = In, where In denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication.
The method of least squares is a standard approach in regression analysis to approximate the solution of overdetermined systems, i.e., sets of equations in which there are more equations than unknowns.
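A minimal sketch of least squares for the simplest case, simple linear regression: fitting y ≈ a + b·x to more points than unknowns using the closed-form slope and intercept. The function name is illustrative:

```python
def ols_line(xs, ys):
    """Least-squares intercept a and slope b for the line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Four points lying exactly on y = 2x are recovered perfectly:
print(ols_line([1, 2, 3, 4], [2, 4, 6, 8]))  # (0.0, 2.0)
```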
In mathematics, matrix multiplication or matrix product is a binary operation that produces a matrix from two matrices with entries in a field, or, more generally, in a ring or even a semiring.
In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of a statistical model, given observations.
In mathematics, mean has several different definitions depending on the context.
In probability theory, the normal (or Gaussian or Gauss or Laplace–Gauss) distribution is a very common continuous probability distribution.
The concept of a normalizing constant arises in probability theory and a variety of other areas of mathematics.
In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.
In statistics, the reduced chi-squared statistic is used extensively in goodness of fit testing.
Simpson's paradox, or the Yule–Simpson effect, is a phenomenon in probability and statistics, in which a trend appears in several different groups of data but disappears or reverses when these groups are combined.
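The effect can be seen numerically; the sketch below uses the often-cited kidney-stone treatment figures (Charig et al.), a standard illustration of the paradox:

```python
# Treatment A has the higher success rate within each stone-size group,
# yet treatment B has the higher success rate when the groups are pooled.
groups = {
    "small stones": {"A": (81, 87), "B": (234, 270)},   # (successes, trials)
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

for g in groups.values():
    assert rate(*g["A"]) > rate(*g["B"])  # A wins in every group...

total = {t: tuple(map(sum, zip(*(g[t] for g in groups.values()))))
         for t in "AB"}
assert rate(*total["A"]) < rate(*total["B"])  # ...but B wins overall
print(total["A"], total["B"])  # (273, 350) (289, 350)
```

The reversal arises because the groups differ in both size and baseline difficulty: A was applied mostly to the harder large-stone cases.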
In statistics, the standard deviation (SD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values.
The standard error (SE) of a statistic (usually an estimate of a parameter) is the standard deviation of its sampling distribution or an estimate of that standard deviation.
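For the sample mean, the standard error is commonly estimated as the sample standard deviation divided by the square root of the sample size; a minimal sketch (the function name is illustrative):

```python
from math import sqrt
from statistics import stdev

def standard_error_of_mean(xs):
    """Estimated standard error of the sample mean: s / sqrt(n)."""
    return stdev(xs) / sqrt(len(xs))

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(standard_error_of_mean(data))  # sqrt(32/7) / sqrt(8) ≈ 0.756
```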
In descriptive statistics, summary statistics are used to summarize a set of observations, in order to communicate the largest amount of information as simply as possible.
In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.
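A short Python illustration of the population variance as the mean squared deviation, and of the standard deviation as its square root, using the standard library:

```python
from statistics import pvariance, pstdev

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # mean is 5.0
print(pvariance(data))  # 4.0 (mean of the squared deviations from 5.0)
print(pstdev(data))     # 2.0 (square root of the variance)
```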
A weight function is a mathematical device used when performing a sum, integral, or average to give some elements more "weight" or influence on the result than other elements in the same set.
The weighted arithmetic mean is similar to an ordinary arithmetic mean (the most common type of average), except that instead of each of the data points contributing equally to the final average, some data points contribute more than others.
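A minimal sketch of the weighted arithmetic mean, sum(wᵢ·xᵢ) / sum(wᵢ), with an illustrative grading example:

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean: sum(w_i * x_i) / sum(w_i)."""
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# Two class sections average 80 and 90, with 20 and 30 students respectively.
# The combined average weights each section mean by its size:
print(weighted_mean([80, 90], [20, 30]))  # 86.0, not the unweighted 85.0
```

With equal weights this reduces to the ordinary arithmetic mean.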
The weighted average cost of capital (WACC) is the rate that a company is expected to pay on average to all its security holders to finance its assets.
In statistics, given a set of data and corresponding weights, the weighted geometric mean is calculated as the product of the values raised to their weights, taken to the power of one over the sum of the weights: x̄ = (x1^w1 · x2^w2 · … · xn^wn)^(1/(w1 + … + wn)). Note that if all the weights are equal, the weighted geometric mean is the same as the geometric mean.
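Equivalently, the weighted geometric mean is the exponential of the weighted arithmetic mean of the logarithms, which is the numerically stable way to compute it; a minimal sketch:

```python
from math import exp, log

def weighted_geometric_mean(values, weights):
    """exp of the weighted arithmetic mean of the logs of the values."""
    return exp(sum(w * log(x) for w, x in zip(weights, values)) / sum(weights))

# With equal weights this reduces to the ordinary geometric mean:
print(weighted_geometric_mean([4, 9], [1, 1]))  # ≈ 6.0, i.e. sqrt(4 * 9)
```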
In statistics, a weighted median of a sample is the 50% weighted percentile.
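A minimal sketch of one common convention, the lower weighted median: the smallest value at which the cumulative weight reaches half the total weight (conventions differ on how ties at exactly half are resolved):

```python
def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half the total weight."""
    total = sum(weights)
    cumulative = 0.0
    for value, weight in sorted(zip(values, weights)):
        cumulative += weight
        if cumulative >= total / 2:
            return value

print(weighted_median([1, 2, 3, 4], [0.1, 0.2, 0.3, 0.4]))  # 3
```

With equal weights this agrees with an ordinary (lower) median.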
The process of weighting involves emphasizing the contribution of some aspects of a phenomenon (or of a set of data) to a final effect or result, giving them more weight in the analysis.