
# Student's t-distribution

In probability and statistics, Student's t-distribution (or simply the t-distribution) is any member of a family of continuous probability distributions that arises when estimating the mean of a normally distributed population in situations where the sample size is small and population standard deviation is unknown. 
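
The small-sample setting described above can be sketched in a few lines: a *t*-based confidence interval for the mean, using only the Python standard library. The data are illustrative, and the critical value 2.776 (the 0.975 quantile of the *t*-distribution with 4 degrees of freedom) is hardcoded from a standard table to keep the sketch dependency-free.

```python
import math
from statistics import mean, stdev

def t_confidence_interval(sample, t_crit):
    """95% CI for the population mean when sigma is unknown.

    t_crit is the 0.975 quantile of Student's t with len(sample) - 1
    degrees of freedom (taken from a table here, not computed).
    """
    n = len(sample)
    m = mean(sample)
    se = stdev(sample) / math.sqrt(n)  # stdev divides by n - 1 (Bessel's correction)
    return (m - t_crit * se, m + t_crit * se)

# Five observations -> 4 degrees of freedom; t_{0.975, 4} is about 2.776.
sample = [4.8, 5.1, 4.9, 5.3, 5.0]
lo, hi = t_confidence_interval(sample, t_crit=2.776)
```

Because the sample is small and the standard deviation is estimated, the *t* critical value is wider than the normal-distribution value of 1.96 would suggest.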

102 relations: Bayes' theorem, Bayesian inference, Bayesian statistics, Bessel function, Bessel's correction, Beta function, Biometrika, Box–Muller transform, Cauchy distribution, Chi-squared distribution, Cochran's theorem, Compound probability distribution, Confidence interval, Conjugate prior, Copula (probability theory), Cumulative distribution function, Curse of dimensionality, Data, Degrees of freedom (statistics), Digamma function, Dublin, Errors and residuals, Expected value, F-distribution, Folded-t and half-t distributions, Friedrich Robert Helmert, Gamma distribution, Gamma function, Generalised hyperbolic distribution, Guinness Brewery, Hotelling's T-squared distribution, Hypergeometric function, Independence (probability theory), Independent and identically distributed random variables, Indeterminate form, Inverse-gamma distribution, Irwin–Hall distribution, Jacob Lüroth, Jeffreys prior, Kurtosis, Linear function, Location parameter, Location–scale family, Marginal distribution, Mathematical Proceedings of the Cambridge Philosophical Society, Maximum entropy probability distribution, Mean, Michael Christopher Wendl, Moment (mathematics), Multivariate t-distribution, … (52 more).

## Bayes' theorem

In probability theory and statistics, Bayes' theorem (also known as Bayes' law, Bayes' rule, or Bayes's theorem) describes the probability of an event, based on prior knowledge of conditions that might be related to the event.

## Bayesian inference

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.

## Bayesian statistics

Bayesian statistics, named for Thomas Bayes (1701–1761), is a theory in the field of statistics in which the evidence about the true state of the world is expressed in terms of degrees of belief known as Bayesian probabilities.

## Bessel function

Bessel functions, first defined by the mathematician Daniel Bernoulli and then generalized by Friedrich Bessel, are the canonical solutions of Bessel's differential equation for an arbitrary complex number, the order of the Bessel function.

## Bessel's correction

In statistics, Bessel's correction is the use of n − 1 instead of n in the formula for the sample variance and sample standard deviation, where n is the number of observations in a sample.
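
A quick illustration using Python's `statistics` module, whose `pvariance` divides by n and whose `variance` divides by n − 1 (Bessel's correction); the data are illustrative.

```python
from statistics import pvariance, variance

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

# Divide by n: biased (downward) as an estimator of the population variance.
biased = pvariance(data)
# Divide by n - 1 (Bessel's correction): unbiased estimator.
unbiased = variance(data)
```

The corrected estimate is always larger than the uncorrected one, compensating for the fact that deviations are measured from the sample mean rather than the true mean.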

## Beta function

In mathematics, the beta function, also called the Euler integral of the first kind, is a special function defined by B(x, y) = ∫₀¹ t^(x−1) (1 − t)^(y−1) dt for Re(x) > 0 and Re(y) > 0.

## Biometrika

Biometrika is a peer-reviewed scientific journal published by Oxford University Press for the Biometrika Trust.

## Box–Muller transform

The Box–Muller transform, by George Edward Pelham Box and Mervin Edgar Muller, is a pseudo-random number sampling method for generating pairs of independent, standard, normally distributed (zero expectation, unit variance) random numbers, given a source of uniformly distributed random numbers.
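
A sketch of the basic (trigonometric) form of the transform, using only the standard library; `1.0 - random.random()` keeps the logarithm's argument in (0, 1].

```python
import math
import random

def box_muller(u1, u2):
    """Map two independent Uniform(0,1] draws to two independent
    standard normal draws (basic, non-polar form of the transform)."""
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

random.seed(42)
samples = []
for _ in range(5000):
    z1, z2 = box_muller(1.0 - random.random(), random.random())
    samples.extend([z1, z2])

mean_est = sum(samples) / len(samples)
var_est = sum((z - mean_est) ** 2 for z in samples) / len(samples)
```

With 10,000 draws the sample mean and variance should sit close to 0 and 1, as expected for a standard normal.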

## Cauchy distribution

The Cauchy distribution, named after Augustin Cauchy, is a continuous probability distribution.

## Chi-squared distribution

In probability theory and statistics, the chi-squared distribution with k degrees of freedom is the distribution of a sum of the squares of k independent standard normal random variables.

## Cochran's theorem

In statistics, Cochran's theorem, devised by William G. Cochran, is a theorem used to justify results relating to the probability distributions of statistics that are used in the analysis of variance.

## Compound probability distribution

In probability and statistics, a compound probability distribution (also known as a mixture distribution or contagious distribution) is the probability distribution that results from assuming that a random variable is distributed according to some parametrized distribution, with (some of) the parameters of that distribution themselves being random variables.

## Confidence interval

In statistics, a confidence interval (CI) is a type of interval estimate, computed from the statistics of the observed data, that might contain the true value of an unknown population parameter.

## Conjugate prior

In Bayesian probability theory, if the posterior distributions p(θ|x) are in the same probability distribution family as the prior probability distribution p(θ), the prior and posterior are then called conjugate distributions, and the prior is called a conjugate prior for the likelihood function.

## Copula (probability theory)

In probability theory and statistics, a copula is a multivariate probability distribution for which the marginal probability distribution of each variable is uniform.

## Cumulative distribution function

In probability theory and statistics, the cumulative distribution function (CDF, also cumulative density function) of a real-valued random variable X, or just distribution function of X, evaluated at x, is the probability that X will take a value less than or equal to x. In the case of a continuous distribution, it gives the area under the probability density function from minus infinity to x. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.
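
For a finite sample, the CDF can be estimated directly: the empirical CDF at x is the fraction of observations less than or equal to x. A minimal stdlib sketch, with illustrative data:

```python
from bisect import bisect_right

def ecdf(data):
    """Return the empirical CDF of a sample:
    F(x) = fraction of observations <= x."""
    xs = sorted(data)
    n = len(xs)
    return lambda x: bisect_right(xs, x) / n

F = ecdf([1, 3, 3, 7])
```

The resulting function is a non-decreasing step function running from 0 to 1, mirroring the defining property P(X ≤ x).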

## Curse of dimensionality

The curse of dimensionality refers to various phenomena that arise when analyzing and organizing data in high-dimensional spaces (often with hundreds or thousands of dimensions) that do not occur in low-dimensional settings such as the three-dimensional physical space of everyday experience.

## Data

Data is a set of values of qualitative or quantitative variables.

## Degrees of freedom (statistics)

In statistics, the number of degrees of freedom is the number of values in the final calculation of a statistic that are free to vary.

## Digamma function

In mathematics, the digamma function is defined as the logarithmic derivative of the gamma function: ψ(x) = d/dx ln Γ(x) = Γ′(x)/Γ(x). It is the first of the polygamma functions.

## Dublin

Dublin is the capital of and largest city in Ireland.

## Errors and residuals

In statistics and optimization, errors and residuals are two closely related and easily confused measures of the deviation of an observed value of an element of a statistical sample from its "theoretical value".

## Expected value

In probability theory, the expected value of a random variable, intuitively, is the long-run average value of repetitions of the experiment it represents.

## F-distribution

In probability theory and statistics, the F-distribution, also known as Snedecor's F distribution or the Fisher–Snedecor distribution, is a continuous probability distribution that arises frequently as the null distribution of a test statistic, most notably in the analysis of variance.

## Folded-t and half-t distributions

In statistics, the folded-t and half-t distributions are derived from Student's *t*-distribution by taking the absolute values of variates.

## Friedrich Robert Helmert

Friedrich Robert Helmert (July 31, 1843 – June 15, 1917) was a German geodesist and an important writer on the theory of errors.

## Gamma distribution

In probability theory and statistics, the gamma distribution is a two-parameter family of continuous probability distributions.

## Gamma function

In mathematics, the gamma function (represented by Γ, the capital Greek letter gamma) is an extension of the factorial function, with its argument shifted down by 1, to real and complex numbers.

## Generalised hyperbolic distribution

The generalised hyperbolic distribution (GH) is a continuous probability distribution defined as the normal variance-mean mixture where the mixing distribution is the generalized inverse Gaussian distribution (GIG).

## Guinness Brewery

St. James's Gate Brewery, generally known as the Guinness Brewery, is a brewery founded in 1759 in Dublin, Ireland, by Arthur Guinness.

## Hotelling's T-squared distribution

In statistics, Hotelling's T-squared distribution (T²) is a multivariate distribution proportional to the *F*-distribution and arises importantly as the distribution of a set of statistics which are natural generalizations of the statistics underlying Student's *t*-distribution.

## Hypergeometric function

In mathematics, the Gaussian or ordinary hypergeometric function 2F1(a,b;c;z) is a special function represented by the hypergeometric series, that includes many other special functions as specific or limiting cases.

## Independence (probability theory)

In probability theory, two events are independent, statistically independent, or stochastically independent if the occurrence of one does not affect the probability of occurrence of the other.

## Independent and identically distributed random variables

In probability theory and statistics, a sequence or other collection of random variables is independent and identically distributed (i.i.d. or iid or IID) if each random variable has the same probability distribution as the others and all are mutually independent.

## Indeterminate form

In calculus and other branches of mathematical analysis, limits involving an algebraic combination of functions in an independent variable may often be evaluated by replacing these functions by their limits; if the expression obtained after this substitution does not give enough information to determine the original limit, it is said to take on an indeterminate form.

## Inverse-gamma distribution

In probability theory and statistics, the inverse gamma distribution is a two-parameter family of continuous probability distributions on the positive real line, which is the distribution of the reciprocal of a variable distributed according to the gamma distribution.

## Irwin–Hall distribution

In probability and statistics, the Irwin–Hall distribution, named after Joseph Oscar Irwin and Philip Hall, is a probability distribution for a random variable defined as the sum of a number of independent random variables, each having a uniform distribution.

## Jacob Lüroth

Jacob Lüroth (18 February 1844, Mannheim, Germany – 14 September 1910, Munich, Germany) was a German mathematician who proved Lüroth's theorem and introduced Lüroth quartics.

## Jeffreys prior

In Bayesian probability, the Jeffreys prior, named after Sir Harold Jeffreys, is a non-informative (objective) prior distribution for a parameter space; it is proportional to the square root of the determinant of the Fisher information matrix: p(θ) ∝ √(det I(θ)). It has the key feature that it is invariant under reparameterization of the parameter vector θ.

## Kurtosis

In probability theory and statistics, kurtosis (from κυρτός, kyrtos or kurtos, meaning "curved, arching") is a measure of the "tailedness" of the probability distribution of a real-valued random variable.

## Linear function

In mathematics, the term linear function refers to two distinct but related notions.

## Location parameter

In statistics, a location family is a class of probability distributions that is parametrized by a scalar- or vector-valued parameter x_0, which determines the "location" or shift of the distribution.

## Location–scale family

In probability theory, especially in mathematical statistics, a location–scale family is a family of probability distributions parametrized by a location parameter and a non-negative scale parameter.

## Marginal distribution

In probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset.

## Mathematical Proceedings of the Cambridge Philosophical Society

Mathematical Proceedings of the Cambridge Philosophical Society is a mathematical journal published by Cambridge University Press for the Cambridge Philosophical Society.

## Maximum entropy probability distribution

In statistics and information theory, a maximum entropy probability distribution has entropy that is at least as great as that of all other members of a specified class of probability distributions.

## Mean

In mathematics, mean has several different definitions depending on the context.

## Michael Christopher Wendl

Michael Christopher Wendl is a mathematician and biomedical engineer who has worked on DNA sequencing theory, covering and matching problems in probability, theoretical fluid mechanics, and co-wrote Phred.

## Moment (mathematics)

In mathematics, a moment is a specific quantitative measure, used in both mechanics and statistics, of the shape of a set of points.

## Multivariate t-distribution

In statistics, the multivariate t-distribution (or multivariate Student distribution) is a multivariate probability distribution.

## Noncentral t-distribution

As with other probability distributions with noncentrality parameters, the noncentral t-distribution generalizes a probability distribution – Student's *t*-distribution – using a noncentrality parameter.

## Noncentrality parameter

Noncentrality parameters are parameters of families of probability distributions that are related to other "central" families of distributions.

## Normal distribution

In probability theory, the normal (or Gaussian or Gauss or Laplace–Gauss) distribution is a very common continuous probability distribution.

## Null hypothesis

In inferential statistics, the term "null hypothesis" is a general statement or default position that there is no relationship between two measured phenomena, or no association among groups.

## One- and two-tailed tests

In statistical significance testing, a one-tailed test and a two-tailed test are alternative ways of computing the statistical significance of a parameter inferred from a data set, in terms of a test statistic.

## P-value

In statistical hypothesis testing, the p-value or probability value or asymptotic significance is the probability for a given statistical model that, when the null hypothesis is true, the statistical summary (such as the sample mean difference between two compared groups) would be the same as or of greater magnitude than the actual observed results.

## Parametrization

Parametrization (or parameterization; also parameterisation, parametrisation) is the process of finding parametric equations of a curve, a surface, or, more generally, a manifold or a variety, defined by an implicit equation.

## Pearson distribution

The Pearson distribution is a family of continuous probability distributions.

## Pivotal quantity

In statistics, a pivotal quantity or pivot is a function of observations and unobservable parameters such that the function's probability distribution does not depend on the unknown parameters (including nuisance parameters).

## Posterior predictive distribution

In Bayesian statistics, the posterior predictive distribution is the distribution of possible unobserved values conditional on the observed values.

## Posterior probability

In Bayesian statistics, the posterior probability of a random event or an uncertain proposition is the conditional probability that is assigned after the relevant evidence or background is taken into account.

## Power (statistics)

The power of a binary hypothesis test is the probability that the test correctly rejects the null hypothesis (H0) when a specific alternative hypothesis (H1) is true.

## Precision (statistics)

In statistics, precision is the reciprocal of the variance, and the precision matrix (also known as concentration matrix) is the matrix inverse of the covariance matrix.
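
As a small pure-Python illustration with a hypothetical 2×2 example: scalar precision is just 1/variance, and for a 2×2 covariance matrix the precision matrix follows from the closed-form adjugate inverse.

```python
def precision(variance):
    """Scalar precision: the reciprocal of the variance."""
    return 1.0 / variance

def precision_matrix_2x2(cov):
    """Invert a 2x2 covariance matrix [[a, b], [c, d]] by the
    closed-form adjugate formula; the result is the precision matrix."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

cov = [[4.0, 1.0], [1.0, 2.0]]
prec = precision_matrix_2x2(cov)
```

Multiplying `cov` by `prec` recovers the identity matrix, confirming that the precision matrix is the matrix inverse of the covariance matrix.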

## Prediction interval

In statistical inference, specifically predictive inference, a prediction interval is an estimate of an interval in which a future observation will fall, with a certain probability, given what has already been observed.

## Prior probability

In Bayesian statistical inference, a prior probability distribution, often simply called the prior, of an uncertain quantity is the probability distribution that would express one's beliefs about this quantity before some evidence is taken into account.

## Probability

Probability is the measure of the likelihood that an event will occur.

## Probability density function

In probability theory, a probability density function (PDF), or density of a continuous random variable, is a function, whose value at any given sample (or point) in the sample space (the set of possible values taken by the random variable) can be interpreted as providing a relative likelihood that the value of the random variable would equal that sample.

## Probability distribution

In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.

## Probability mass function

In probability and statistics, a probability mass function (pmf) is a function that gives the probability that a discrete random variable is exactly equal to some value.

## Prosecutor's fallacy

The prosecutor's fallacy is a fallacy of statistical reasoning, typically used by the prosecution to argue for the guilt of a defendant during a criminal trial.

## Quantile

In statistics and probability quantiles are cut points dividing the range of a probability distribution into contiguous intervals with equal probabilities, or dividing the observations in a sample in the same way.

## Quantile function

In probability and statistics, the quantile function specifies, for a given probability p in the probability distribution of a random variable, the value x at which the probability that the random variable is less than or equal to x equals p; it is the inverse of the cumulative distribution function.
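
As a sketch, a quantile function can be computed by numerically inverting the CDF. Here a bisection search inverts the standard normal CDF (expressed via `math.erf`), using only the standard library.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def normal_quantile(p, lo=-10.0, hi=10.0, tol=1e-10):
    """Invert the CDF by bisection: find x with normal_cdf(x) = p."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

For example, the 0.975 quantile comes out near 1.96, the familiar critical value of the standard normal.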

## Random variable

In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is a variable whose possible values are outcomes of a random phenomenon.

## Real number

In mathematics, a real number is a value of a continuous quantity that can represent a distance along a line.

## Regression analysis

In statistical modeling, regression analysis is a set of statistical processes for estimating the relationships among variables.

## Robust statistics

Robust statistics are statistics with good performance for data drawn from a wide range of probability distributions, especially for distributions that are not normal.

## Ronald Fisher

Sir Ronald Aylmer Fisher (17 February 1890 – 29 July 1962), who published as R. A. Fisher, was a British statistician and geneticist.

## Sample size determination

Sample size determination is the act of choosing the number of observations or replicates to include in a statistical sample.

## Sampling distribution

In statistics, a sampling distribution or finite-sample distribution is the probability distribution of a given random-sample-based statistic.

## Scale parameter

In probability theory and statistics, a scale parameter is a special kind of numerical parameter of a parametric family of probability distributions.

## Scaled inverse chi-squared distribution

The scaled inverse chi-squared distribution is a two-parameter family of continuous probability distributions on the positive real line, closely related to the inverse-gamma distribution; it arises in Bayesian statistics as the conjugate prior for the variance of a normal distribution.

## Science (journal)

Science, also widely referred to as Science Magazine, is the peer-reviewed academic journal of the American Association for the Advancement of Science (AAAS) and one of the world's top academic journals.

## Skewness

In probability theory and statistics, skewness is a measure of the asymmetry of the probability distribution of a real-valued random variable about its mean.

## Spearman's rank correlation coefficient

In statistics, Spearman's rank correlation coefficient or Spearman's rho, named after Charles Spearman and often denoted by the Greek letter \rho (rho) or as r_s, is a nonparametric measure of rank correlation (statistical dependence between the rankings of two variables).
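
A minimal stdlib sketch of the definition: rank both samples (averaging ranks over ties) and take the Pearson correlation of the ranks.

```python
def ranks(values):
    """Rank the values 1..n, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(xs, ys):
    """Pearson correlation of the ranks of xs and ys."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Because only ranks enter the computation, any strictly monotone relationship (even a nonlinear one such as y = x²) yields rho = 1.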

## Spreadsheet

A spreadsheet is an interactive computer application for organization, analysis and storage of data in tabular form.

## Springer Science+Business Media

Springer Science+Business Media or Springer, part of Springer Nature since 2015, is a global publishing company that publishes books, e-books and peer-reviewed journals in science, humanities, technical and medical (STM) publishing.

## Standard deviation

In statistics, the standard deviation (SD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values.

## Standard score

In statistics, the standard score is the signed number of standard deviations by which the value of an observation or data point differs from the mean value of what is being observed or measured.
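
A short stdlib sketch of the definition, with illustrative data; the population standard deviation (`pstdev`) is used here, one common convention for standardizing a full data set.

```python
from statistics import fmean, pstdev

def z_scores(data):
    """Signed number of standard deviations each value lies
    from the mean of the data."""
    mu = fmean(data)
    sigma = pstdev(data)
    return [(x - mu) / sigma for x in data]

zs = z_scores([2.0, 4.0, 6.0, 8.0])
```

Standardized scores always have mean zero, and symmetric data produce symmetric scores.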

## Statistical hypothesis testing

A statistical hypothesis is a hypothesis that is testable on the basis of observing a process that is modeled via a set of random variables; statistical hypothesis testing, sometimes called confirmatory data analysis, is the method of carrying out such tests.

## Statistical population

In statistics, a population is a set of similar items or events which is of interest for some question or experiment.

## Statistical significance

In statistical hypothesis testing, a result has statistical significance when it is very unlikely to have occurred given the null hypothesis.

## Statistics

Statistics is a branch of mathematics dealing with the collection, analysis, interpretation, presentation, and organization of data.

## Statistics in Medicine (journal)

Statistics in Medicine is a peer-reviewed statistics journal published by Wiley.

## Student's t-test

The t-test is any statistical hypothesis test in which the test statistic follows a Student's *t*-distribution under the null hypothesis.

## Studentized residual

In statistics, a studentized residual is the quotient resulting from the division of a residual by an estimate of its standard deviation.

## Symmetric probability distribution

In statistics, a symmetric probability distribution is a probability distribution — an assignment of probabilities to possible occurrences — which is unchanged when its probability density function or probability mass function is reflected around a vertical line at some value of the random variable represented by the distribution.

## T-statistic

In statistics, the t-statistic is the ratio of the departure of the estimated value of a parameter from its hypothesized value to its standard error.
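
For the one-sample case this ratio can be sketched directly with the standard library: the departure of the sample mean from the hypothesized value, divided by the estimated standard error. The data and hypothesized mean are illustrative.

```python
import math
from statistics import mean, stdev

def one_sample_t(sample, mu0):
    """t = (sample mean - hypothesized mean) / standard error,
    where the standard error is s / sqrt(n)."""
    n = len(sample)
    return (mean(sample) - mu0) / (stdev(sample) / math.sqrt(n))

t = one_sample_t([5.1, 4.9, 5.3, 5.2, 4.8, 5.0], mu0=5.0)
```

Under the null hypothesis this statistic follows a Student's *t*-distribution with n − 1 degrees of freedom, which is what a t-test compares it against.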

## Uniform

A uniform is a type of clothing worn by members of an organization while participating in that organization's activity.

## Variance

In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its mean.

## Wilks's lambda distribution

In statistics, Wilks's lambda distribution (named for Samuel S. Wilks), is a probability distribution used in multivariate hypothesis testing, especially with regard to the likelihood-ratio test and multivariate analysis of variance (MANOVA).

## William Sealy Gosset

William Sealy Gosset (13 June 1876 – 16 October 1937) was an English statistician.

## Wishart distribution

In statistics, the Wishart distribution is a generalization to multiple dimensions of the chi-squared distribution, or, in the case of non-integer degrees of freedom, of the gamma distribution.
