17 relations: Adjugate matrix, Carl Gustav Jacob Jacobi, Cayley–Hamilton theorem, Chain rule, Characteristic polynomial, Derivative, Determinant, Differential (infinitesimal), Faddeev–LeVerrier algorithm, Invertible matrix, Kronecker delta, Laplace expansion, Matrix calculus, Matrix exponential, Minor (linear algebra), Trace (linear algebra), Transpose.
In linear algebra, the adjugate, classical adjoint, or adjunct of a square matrix is the transpose of its cofactor matrix.
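As a minimal NumPy sketch (NumPy is assumed here, not named in the original), the adjugate can be built directly from this definition, and it satisfies the key identity A·adj(A) = det(A)·I:

```python
import numpy as np

def adjugate(A):
    """Transpose of the cofactor matrix of a square matrix A."""
    n = A.shape[0]
    C = np.empty((n, n), dtype=float)
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, then take the determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # adjugate = transpose of the cofactor matrix

A = np.array([[1.0, 2.0], [3.0, 4.0]])
# Defining identity: A @ adj(A) = det(A) * I
print(np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(2)))  # True
```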
Carl Gustav Jacob Jacobi (10 December 1804 – 18 February 1851) was a German mathematician, who made fundamental contributions to elliptic functions, dynamics, differential equations, and number theory.
In linear algebra, the Cayley–Hamilton theorem (named after the mathematicians Arthur Cayley and William Rowan Hamilton) states that every square matrix over a commutative ring (such as the real or complex field) satisfies its own characteristic equation.
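The theorem can be checked numerically for a small matrix; the sketch below (using NumPy, an assumption of this example) substitutes a 2×2 matrix into its own characteristic polynomial and finds the zero matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
# np.poly returns the monic characteristic polynomial coefficients;
# for a 2x2 matrix these are [1, -tr(A), det(A)].
c = np.poly(A)
# Substitute A for the variable: p(A) = A^2 - tr(A)*A + det(A)*I
p_of_A = c[0] * A @ A + c[1] * A + c[2] * np.eye(2)
print(np.allclose(p_of_A, np.zeros((2, 2))))  # True
```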
In calculus, the chain rule is a formula for computing the derivative of the composition of two or more functions.
In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots.
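The "eigenvalues as roots" property can be illustrated in a few lines of NumPy (NumPy is an assumption here, not part of the original text):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 5.0]])
coeffs = np.poly(A)            # characteristic polynomial coefficients
roots = np.roots(coeffs)       # roots of that polynomial
eig = np.linalg.eigvals(A)     # eigenvalues of A
print(np.allclose(np.sort(roots), np.sort(eig)))  # True
```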
The derivative of a function of a real variable measures the sensitivity to change of the function value (output value) with respect to a change in its argument (input value).
In linear algebra, the determinant is a scalar value that can be computed from the elements of a square matrix and that encodes certain properties of the linear map the matrix describes, such as whether it is invertible.
The term differential is used in calculus to refer to an infinitesimal (infinitely small) change in some varying quantity.
In mathematics (linear algebra), the Faddeev–LeVerrier algorithm is a recursive method to calculate the coefficients of the characteristic polynomial p(λ) of a square matrix.
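The recursion itself is short enough to sketch in NumPy (an assumption of this example): starting from M_1 = I, one iterates M_k = A·M_{k-1} + c_{k-1}·I with c_k = −tr(A·M_k)/k, which yields the characteristic polynomial coefficients without computing any determinants.

```python
import numpy as np

def faddeev_leverrier(A):
    """Coefficients of p(λ) = λ^n + c1·λ^(n-1) + ... + cn via the
    Faddeev–LeVerrier recursion (no determinants needed)."""
    n = A.shape[0]
    coeffs = [1.0]
    M = np.zeros((n, n))
    c = 1.0
    for k in range(1, n + 1):
        M = A @ M + c * np.eye(n)      # M_k = A M_{k-1} + c_{k-1} I
        c = -np.trace(A @ M) / k       # c_k = -tr(A M_k) / k
        coeffs.append(c)
    return np.array(coeffs)

A = np.array([[2.0, 1.0], [1.0, 3.0]])
# Agrees with NumPy's characteristic polynomial routine:
print(np.allclose(faddeev_leverrier(A), np.poly(A)))  # True
```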
In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular or nondegenerate) if there exists an n-by-n square matrix B such that AB = BA = In, where In denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication.
In mathematics, the Kronecker delta (named after Leopold Kronecker) is a function of two variables, usually non-negative integers, that equals 1 if the variables are equal and 0 otherwise.
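A minimal sketch in Python: the function is 1 when its indices agree and 0 otherwise, so the identity matrix has entries I[i][j] = δ_ij.

```python
def kronecker_delta(i, j):
    """δ_ij: 1 if the indices are equal, 0 otherwise."""
    return 1 if i == j else 0

# The 3x3 identity matrix, built entrywise from the Kronecker delta:
I3 = [[kronecker_delta(i, j) for j in range(3)] for i in range(3)]
print(I3)  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```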
In linear algebra, the Laplace expansion, named after Pierre-Simon Laplace, also called cofactor expansion, is an expression for the determinant |B| of an n × n matrix B that is a weighted sum of the determinants of n sub-matrices of B, each of size (n−1) × (n−1).
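This expansion translates directly into a recursive determinant routine; the sketch below (NumPy is assumed, for the minors and the reference value) expands along the first row:

```python
import numpy as np

def det_laplace(B):
    """Determinant by cofactor (Laplace) expansion along the first row."""
    n = B.shape[0]
    if n == 1:
        return B[0, 0]
    total = 0.0
    for j in range(n):
        # (n-1)x(n-1) sub-matrix: remove row 0 and column j.
        sub = np.delete(np.delete(B, 0, axis=0), j, axis=1)
        total += (-1) ** j * B[0, j] * det_laplace(sub)
    return total

B = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 10.0]])
print(np.isclose(det_laplace(B), np.linalg.det(B)))  # True
```

The recursion is O(n!) and so only practical for small matrices, but it mirrors the definition exactly.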
In mathematics, matrix calculus is a specialized notation for doing multivariable calculus, especially over spaces of matrices.
In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function.
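The analogy with the ordinary exponential carries over to the defining power series exp(A) = Σ A^k/k!; a truncated version of that series is sketched below in NumPy (an assumption of this example), checked on a diagonal matrix, where exp acts entrywise on the diagonal:

```python
import numpy as np

def expm_series(A, terms=30):
    """Matrix exponential via the truncated power series Σ A^k / k!."""
    n = A.shape[0]
    result = np.eye(n)
    term = np.eye(n)
    for k in range(1, terms):
        term = term @ A / k       # A^k / k!, built incrementally
        result = result + term
    return result

A = np.diag([1.0, 2.0])
print(np.allclose(expm_series(A), np.diag(np.exp([1.0, 2.0]))))  # True
```

A truncated series is a sketch only; production code would use a routine such as SciPy's `scipy.linalg.expm`.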
In linear algebra, a minor of a matrix A is the determinant of some smaller square matrix, cut down from A by removing one or more of its rows or columns.
In linear algebra, the trace of an n-by-n square matrix A is defined to be the sum of the elements on the main diagonal (the diagonal from the upper left to the lower right) of A, i.e., tr(A) = a11 + a22 + ⋯ + ann, where aii denotes the entry on the ith row and ith column of A. The trace of a matrix is the sum of the (complex) eigenvalues, and it is invariant with respect to a change of basis.
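Both properties in this definition are easy to confirm numerically; the sketch below (NumPy assumed) computes the diagonal sum and compares it with the sum of the eigenvalues:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
t = np.trace(A)                 # sum of the main-diagonal entries: 1 + 4
print(t)                        # 5.0
# The trace equals the sum of the eigenvalues:
print(np.isclose(t, np.sum(np.linalg.eigvals(A))))  # True
```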
In linear algebra, the transpose of a matrix is an operator which flips a matrix over its diagonal; that is, it switches the row and column indices of the matrix, producing another matrix denoted AT (also written A′, Atr, tA, or At).