48 relations: Basis (linear algebra), Bilinear form, Bivector, Blade (geometry), Block matrix, Carl Gustav Jacob Jacobi, Cayley transform, Characteristic (algebra), Commutator, Complex number, Conjugate transpose, Connected space, Cross product, Curl (mathematics), Determinant, Diagonal matrix, Dimension (vector space), Direct sum of modules, Eigenvalues and eigenvectors, Exponential map (Lie theory), Field (mathematics), Generating function, Imaginary number, Inner product space, Invertible matrix, Lie algebra, Lie group, Linear algebra, Linear map, Main diagonal, Matrix exponential, Matrix similarity, Normal matrix, Orthogonal group, Orthogonal matrix, Pfaffian, Polynomial, Skew-Hermitian matrix, Spectral theorem, Square matrix, Symmetric matrix, Symmetry in mathematics, Symplectic matrix, Tangent space, Trace (linear algebra), Transpose, Unitary matrix, Vector space.
In mathematics, a set of elements (vectors) in a vector space V is called a basis if the vectors are linearly independent and every vector in the vector space is a linear combination of vectors in this set.
In mathematics, more specifically in abstract algebra and linear algebra, a bilinear form on a vector space V is a bilinear map V × V → K, where K is the field of scalars.
In mathematics, a bivector or 2-vector is a quantity in exterior algebra or geometric algebra that extends the idea of scalars and vectors.
In the study of geometric algebras, a blade is a generalization of the concept of scalars and vectors to include simple bivectors, trivectors, etc.
In mathematics, a block matrix or a partitioned matrix is a matrix that is interpreted as having been broken into sections called blocks or submatrices.
Carl Gustav Jacob Jacobi (10 December 1804 – 18 February 1851) was a German mathematician, who made fundamental contributions to elliptic functions, dynamics, differential equations, and number theory.
In mathematics, the Cayley transform, named after Arthur Cayley, is any of a cluster of related things.
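One member of that cluster maps a skew-symmetric matrix A to an orthogonal matrix Q = (I − A)(I + A)⁻¹. A minimal pure-Python sketch of the 2×2 case follows; the helper names (`mat_mul`, `inv2`, `cayley`) are illustrative, not from any library:

```python
def mat_mul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Invert a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def cayley(A):
    """Cayley transform Q = (I - A)(I + A)^-1 of a 2x2 matrix A."""
    I_minus = [[(1.0 if i == j else 0.0) - A[i][j] for j in range(2)] for i in range(2)]
    I_plus  = [[(1.0 if i == j else 0.0) + A[i][j] for j in range(2)] for i in range(2)]
    return mat_mul(I_minus, inv2(I_plus))

A = [[0.0, -0.5], [0.5, 0.0]]   # skew-symmetric: transpose(A) == -A
Q = cayley(A)                   # [[0.6, 0.8], [-0.8, 0.6]], an orthogonal matrix
```

The rows of Q are orthonormal, confirming that the transform carries skew-symmetric matrices to orthogonal ones.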
In mathematics, the characteristic of a ring R, often denoted char(R), is defined to be the smallest number of times one must use the ring's multiplicative identity (1) in a sum to get the additive identity (0) if the sum does indeed eventually attain 0.
In mathematics, the commutator gives an indication of the extent to which a certain binary operation fails to be commutative.
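For square matrices the commutator is [A, B] = AB − BA, which is zero exactly when A and B commute. A small pure-Python illustration (the helper names are mine):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def commutator(A, B):
    """[A, B] = AB - BA, measuring the failure of A and B to commute."""
    AB, BA = mat_mul(A, B), mat_mul(B, A)
    n = len(A)
    return [[AB[i][j] - BA[i][j] for j in range(n)] for i in range(n)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]
C = commutator(A, B)   # [[1, 0], [0, -1]]: A and B do not commute
```

Note that commutator(A, A) is the zero matrix, since every matrix commutes with itself.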
A complex number is a number that can be expressed in the form a + bi, where a and b are real numbers and i is a solution of the equation x2 = −1.
In mathematics, the conjugate transpose or Hermitian transpose of an m-by-n matrix A with complex entries is the n-by-m matrix A∗ obtained from A by taking the transpose and then taking the complex conjugate of each entry.
In topology and related branches of mathematics, a connected space is a topological space that cannot be represented as the union of two or more disjoint nonempty open subsets.
In mathematics and vector algebra, the cross product or vector product (occasionally directed area product to emphasize the geometric significance) is a binary operation on two vectors in three-dimensional space (ℝ³) and is denoted by the symbol ×.
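The cross product a × b can equivalently be computed as a matrix-vector product [a]× b, where [a]× is the skew-symmetric matrix built from the components of a. A pure-Python sketch checking that the two formulations agree (helper names are mine):

```python
def cross(a, b):
    """Component formula for the cross product a x b in R^3."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def skew(a):
    """Skew-symmetric matrix [a]_x such that [a]_x b == a x b."""
    return [[0, -a[2], a[1]],
            [a[2], 0, -a[0]],
            [-a[1], a[0], 0]]

def mat_vec(M, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

a, b = [1, 2, 3], [4, 5, 6]
# cross(a, b) and mat_vec(skew(a), b) both give [-3, 6, -3]
```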
In vector calculus, the curl is a vector operator that describes the infinitesimal rotation of a vector field in three-dimensional Euclidean space.
In linear algebra, the determinant is a value that can be computed from the elements of a square matrix.
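A minimal recursive implementation via Laplace (cofactor) expansion along the first row; it is O(n!) and only suitable for small matrices, not what production libraries use:

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then recurse.
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

# The determinant of a skew-symmetric matrix of odd size is always 0.
S = [[0, 2, -1],
     [-2, 0, 3],
     [1, -3, 0]]
```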
In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero.
In mathematics, the dimension of a vector space V is the cardinality (i.e. the number of vectors) of a basis of V over its base field.
In abstract algebra, the direct sum is a construction which combines several modules into a new, larger module.
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that changes by only a scalar factor when that linear transformation is applied to it.
In the theory of Lie groups, the exponential map is a map from the Lie algebra 𝔤 of a Lie group G to the group, which allows one to recapture the local group structure from the Lie algebra.
In mathematics, a field is a set on which addition, subtraction, multiplication, and division are defined and behave as the corresponding operations on rational and real numbers do.
In mathematics, a generating function is a way of encoding an infinite sequence of numbers (an) by treating them as the coefficients of a power series.
An imaginary number is a complex number that can be written as a real number multiplied by the imaginary unit i, which is defined by its property i2 = −1. (In engineering contexts the symbol j is often used instead, since i can denote electric current.)
In linear algebra, an inner product space is a vector space with an additional structure called an inner product.
In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular or nondegenerate) if there exists an n-by-n square matrix B such that AB = BA = In, where In denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication.
In mathematics, a Lie algebra (pronounced "Lee") is a vector space 𝔤 together with a non-associative, alternating bilinear map 𝔤 × 𝔤 → 𝔤, (x, y) ↦ [x, y], called the Lie bracket, satisfying the Jacobi identity.
In mathematics, a Lie group (pronounced "Lee") is a group that is also a differentiable manifold, with the property that the group operations are compatible with the smooth structure.
Linear algebra is the branch of mathematics concerning linear equations, linear functions, and their representations through matrices and vector spaces.
In mathematics, a linear map (also called a linear mapping, linear transformation or, in some contexts, linear function) is a mapping between two modules (including vector spaces) that preserves the operations of addition and scalar multiplication.
In linear algebra, the main diagonal (sometimes principal diagonal, primary diagonal, leading diagonal, or major diagonal) of a matrix A is the collection of entries Ai,j where i = j.
In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function.
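A truncated power-series sketch, exp(A) ≈ Σ Aᵏ/k!. Applied to the 2×2 skew-symmetric matrix [[0, −θ], [θ, 0]] it reproduces the rotation matrix [[cos θ, −sin θ], [sin θ, cos θ]], illustrating that the matrix exponential carries skew-symmetric matrices to orthogonal ones. The helper names are mine, and real libraries use more robust algorithms than plain series summation:

```python
import math

def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(A, terms=25):
    """Approximate exp(A) by truncating the power series sum_k A^k / k!."""
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity = A^0 term
    power = [row[:] for row in result]
    for k in range(1, terms):
        power = mat_mul(power, A)                       # power = A^k
        result = [[result[i][j] + power[i][j] / math.factorial(k)
                   for j in range(n)] for i in range(n)]
    return result

theta = 0.7
R = mat_exp([[0.0, -theta], [theta, 0.0]])  # approximately a rotation by theta
```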
In linear algebra, two n-by-n matrices A and B are called similar if B = P−1AP for some invertible n-by-n matrix P.
In mathematics, a complex square matrix A is normal if A∗A = AA∗, where A∗ is the conjugate transpose of A.
In mathematics, the orthogonal group in dimension n, denoted O(n), is the group of distance-preserving transformations of a Euclidean space of dimension n that preserve a fixed point, where the group operation is given by composing transformations.
In linear algebra, an orthogonal matrix Q is a square matrix whose columns and rows are orthogonal unit vectors (i.e., orthonormal vectors), i.e. QTQ = QQT = I, where I is the identity matrix.
In mathematics, the determinant of a skew-symmetric matrix can always be written as the square of a polynomial in the matrix entries, a polynomial with integer coefficients that only depend on the size of the matrix; this polynomial is called the Pfaffian of the matrix.
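For a 4×4 skew-symmetric matrix the Pfaffian is pf(A) = a12a34 − a13a24 + a14a23, and pf(A)² = det(A). A pure-Python check, where the `det` helper is a naive cofactor expansion and the function names are mine:

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def pfaffian4(A):
    """Pfaffian of a 4x4 skew-symmetric matrix (0-based indices)."""
    return A[0][1] * A[2][3] - A[0][2] * A[1][3] + A[0][3] * A[1][2]

A = [[0, 1, 2, 3],
     [-1, 0, 4, 5],
     [-2, -4, 0, 6],
     [-3, -5, -6, 0]]
# pfaffian4(A) == 8 and det(A) == 64 == pfaffian4(A) ** 2
```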
In mathematics, a polynomial is an expression consisting of variables (also called indeterminates) and coefficients, that involves only the operations of addition, subtraction, multiplication, and non-negative integer exponents of variables.
In linear algebra, a square matrix A with complex entries is said to be skew-Hermitian or antihermitian if its conjugate transpose is the negative of the original matrix, that is, A∗ = −A.
In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized (that is, represented as a diagonal matrix in some basis).
In mathematics, a square matrix is a matrix with the same number of rows and columns.
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose.
Symmetry occurs not only in geometry, but also in other branches of mathematics.
In mathematics, a symplectic matrix is a 2n×2n matrix M with real entries that satisfies the condition MTΩM = Ω, where MT denotes the transpose of M and Ω is a fixed 2n×2n nonsingular, skew-symmetric matrix.
In mathematics, the tangent space of a manifold facilitates the generalization of vectors from affine spaces to general manifolds, since in the latter case one cannot simply subtract two points to obtain a vector that gives the displacement of the one point from the other.
In linear algebra, the trace of an n-by-n square matrix A is defined to be the sum of the elements on the main diagonal (the diagonal from the upper left to the lower right) of A, i.e., tr(A) = a11 + a22 + ... + ann, where aii denotes the entry on the ith row and ith column of A. The trace of a matrix is the sum of the (complex) eigenvalues, and it is invariant with respect to a change of basis.
In linear algebra, the transpose of a matrix is an operator which flips a matrix over its diagonal, that is it switches the row and column indices of the matrix by producing another matrix denoted as AT (also written A′, Atr, tA or At).
In mathematics, a complex square matrix U is unitary if its conjugate transpose U∗ is also its inverse—that is, if U∗U = UU∗ = I, where I is the identity matrix.
A vector space (also called a linear space) is a collection of objects called vectors, which may be added together and multiplied ("scaled") by numbers, called scalars.