
# Skew-symmetric matrix

In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative; that is, it satisfies the condition A^\textsf{T} = -A. In terms of the entries of the matrix, if a_{ij} denotes the entry in the i-th row and j-th column, then the skew-symmetric condition is a_{ji} = -a_{ij}. For example, the following matrix is skew-symmetric:

\begin{pmatrix} 0 & 2 & -1 \\ -2 & 0 & -4 \\ 1 & 4 & 0 \end{pmatrix}.
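The entrywise condition a_{ji} = -a_{ij} can be checked directly; a minimal sketch in plain Python (the helper name `is_skew_symmetric` is illustrative):

```python
# Check the skew-symmetric condition a_ji = -a_ij for the example matrix.
A = [[0, 2, -1],
     [-2, 0, -4],
     [1, 4, 0]]

def is_skew_symmetric(M):
    """Return True if M is square and its transpose equals its negative."""
    n = len(M)
    if any(len(row) != n for row in M):
        return False
    return all(M[j][i] == -M[i][j] for i in range(n) for j in range(n))

print(is_skew_symmetric(A))  # True
```

Note that the condition forces every diagonal entry to satisfy a_{ii} = -a_{ii}, so the main diagonal is zero (in characteristic other than 2).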

## Basis (linear algebra)

In mathematics, a set of elements (vectors) in a vector space V is called a basis if the vectors are linearly independent and every vector in the vector space is a linear combination of this set.

## Bilinear form

In mathematics, more specifically in abstract algebra and linear algebra, a bilinear form on a vector space V is a bilinear map V \times V \to K, where K is the field of scalars.

## Bivector

In mathematics, a bivector or 2-vector is a quantity in exterior algebra or geometric algebra that extends the idea of scalars and vectors.

In the study of geometric algebras, a blade is a generalization of the concept of scalars and vectors to include simple bivectors, trivectors, etc.

## Block matrix

In mathematics, a block matrix or a partitioned matrix is a matrix that is interpreted as having been broken into sections called blocks or submatrices.

## Carl Gustav Jacob Jacobi

Carl Gustav Jacob Jacobi (10 December 1804 – 18 February 1851) was a German mathematician, who made fundamental contributions to elliptic functions, dynamics, differential equations, and number theory.

## Cayley transform

In mathematics, the Cayley transform, named after Arthur Cayley, is any of a cluster of related things.

## Characteristic (algebra)

In mathematics, the characteristic of a ring R, often denoted char(R), is defined to be the smallest number of times one must use the ring's multiplicative identity (1) in a sum to get the additive identity (0) if the sum does indeed eventually attain 0.

## Commutator

In mathematics, the commutator gives an indication of the extent to which a certain binary operation fails to be commutative.

## Complex number

A complex number is a number that can be expressed in the form a + bi, where a and b are real numbers, and i is a solution of the equation x^2 = -1.

## Conjugate transpose

In mathematics, the conjugate transpose or Hermitian transpose of an m-by-n matrix A with complex entries is the n-by-m matrix A∗ obtained from A by taking the transpose and then taking the complex conjugate of each entry.

## Connected space

In topology and related branches of mathematics, a connected space is a topological space that cannot be represented as the union of two or more disjoint nonempty open subsets.

## Cross product

In mathematics and vector algebra, the cross product or vector product (occasionally directed area product to emphasize the geometric significance) is a binary operation on two vectors in three-dimensional space \left(\mathbb{R}^3\right) and is denoted by the symbol \times.
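The cross product has the componentwise formula (a \times b)_1 = a_2 b_3 - a_3 b_2, and cyclically for the other two components; a minimal sketch in plain Python:

```python
# Cross product of two 3-dimensional vectors from the component formula.
def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

print(cross([1, 0, 0], [0, 1, 0]))  # [0, 0, 1]
```

Swapping the arguments flips the sign, reflecting the anticommutativity a \times b = -(b \times a).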

## Curl (mathematics)

In vector calculus, the curl is a vector operator that describes the infinitesimal rotation of a vector field in three-dimensional Euclidean space.

## Determinant

In linear algebra, the determinant is a value that can be computed from the elements of a square matrix.

## Diagonal matrix

In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero.

## Dimension (vector space)

In mathematics, the dimension of a vector space V is the cardinality (i.e. the number of vectors) of a basis of V over its base field.

## Direct sum of modules

In abstract algebra, the direct sum is a construction which combines several modules into a new, larger module.

## Eigenvalues and eigenvectors

In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that changes by only a scalar factor when that linear transformation is applied to it.

## Exponential map (Lie theory)

In the theory of Lie groups, the exponential map is a map from the Lie algebra \mathfrak g of a Lie group G to the group, which allows one to recapture the local group structure from the Lie algebra.

## Field (mathematics)

In mathematics, a field is a set on which addition, subtraction, multiplication, and division are defined, and behave as when they are applied to rational and real numbers.

## Generating function

In mathematics, a generating function is a way of encoding an infinite sequence of numbers (an) by treating them as the coefficients of a power series.

## Imaginary number

An imaginary number is a complex number that can be written as a real number multiplied by the imaginary unit i, which is defined by the property i^2 = -1. (The symbol j is usually used in engineering contexts, where i has other meanings such as electrical current.)

## Inner product space

In linear algebra, an inner product space is a vector space with an additional structure called an inner product.

## Invertible matrix

In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular or nondegenerate) if there exists an n-by-n square matrix B such that AB = BA = I_n, where I_n denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication.
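In the 2-by-2 case the inverse has a closed form, (1/\det A)\begin{pmatrix} d & -b \\ -c & a \end{pmatrix} for A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, valid exactly when the determinant is nonzero; a minimal sketch in plain Python (the helper name `inverse_2x2` is illustrative):

```python
# Inverse of a 2x2 matrix via the adjugate formula; requires det != 0.
def inverse_2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[4, 7], [2, 6]]
B = inverse_2x2(A)  # multiplying A by B gives the identity matrix
```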

## Lie algebra

In mathematics, a Lie algebra (pronounced "Lee") is a vector space \mathfrak g together with a non-associative, alternating bilinear map \mathfrak g \times \mathfrak g \rightarrow \mathfrak g; (x, y) \mapsto [x, y], called the Lie bracket, satisfying the Jacobi identity.

## Lie group

In mathematics, a Lie group (pronounced "Lee") is a group that is also a differentiable manifold, with the property that the group operations are compatible with the smooth structure.

## Linear algebra

Linear algebra is the branch of mathematics concerning linear equations such as a_1 x_1 + \cdots + a_n x_n = b, linear functions such as (x_1, \ldots, x_n) \mapsto a_1 x_1 + \cdots + a_n x_n, and their representations through matrices and vector spaces.

## Linear map

In mathematics, a linear map (also called a linear mapping, linear transformation or, in some contexts, linear function) is a mapping between two modules (including vector spaces) that preserves (in the sense defined below) the operations of addition and scalar multiplication.

## Main diagonal

In linear algebra, the main diagonal (sometimes principal diagonal, primary diagonal, leading diagonal, or major diagonal) of a matrix A is the collection of entries A_{ij} where i = j.

## Matrix exponential

In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function.

## Matrix similarity

In linear algebra, two n-by-n matrices A and B are called similar if B = P^{-1} A P for some invertible n-by-n matrix P.

## Normal matrix

In mathematics, a complex square matrix A is normal if A^* A = A A^*, where A^* is the conjugate transpose of A.

## Orthogonal group

In mathematics, the orthogonal group in dimension n, denoted O(n), is the group of distance-preserving transformations of a Euclidean space of dimension n that preserve a fixed point, where the group operation is given by composing transformations.

## Orthogonal matrix

In linear algebra, an orthogonal matrix is a square matrix whose columns and rows are orthogonal unit vectors (i.e., orthonormal vectors), i.e. Q^\textsf{T} Q = Q Q^\textsf{T} = I, where I is the identity matrix.
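Orthogonality can be verified by checking that the columns are orthonormal, i.e. that each entry of Q^\textsf{T} Q matches the identity; a minimal sketch in plain Python, using a rotation matrix as the example (the helper name `is_orthogonal` is illustrative):

```python
# A matrix Q is orthogonal when Q^T Q = I, i.e. its columns are orthonormal.
import math

def is_orthogonal(Q, tol=1e-9):
    n = len(Q)
    for i in range(n):
        for j in range(n):
            dot = sum(Q[k][i] * Q[k][j] for k in range(n))  # (Q^T Q)[i][j]
            if abs(dot - (1.0 if i == j else 0.0)) > tol:
                return False
    return True

theta = 0.7
rotation = [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta), math.cos(theta)]]
print(is_orthogonal(rotation))  # True
```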

## Pfaffian

In mathematics, the determinant of a skew-symmetric matrix can always be written as the square of a polynomial in the matrix entries, a polynomial with integer coefficients that only depend on the size of the matrix. This polynomial is called the Pfaffian of the matrix.
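The smallest nontrivial case makes the relation det(A) = Pf(A)^2 concrete: a 2-by-2 skew-symmetric matrix \begin{pmatrix} 0 & a \\ -a & 0 \end{pmatrix} has determinant a^2 and Pfaffian a. A minimal sketch in plain Python (the helper names are illustrative):

```python
# For a 2x2 skew-symmetric matrix [[0, a], [-a, 0]]:
# determinant = a^2, Pfaffian = a, illustrating det(A) = Pf(A)^2.
def det_2x2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def pfaffian_2x2(M):
    return M[0][1]  # Pf([[0, a], [-a, 0]]) = a

A = [[0, 3], [-3, 0]]
print(det_2x2(A), pfaffian_2x2(A) ** 2)  # 9 9
```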

## Polynomial

In mathematics, a polynomial is an expression consisting of variables (also called indeterminates) and coefficients, that involves only the operations of addition, subtraction, multiplication, and non-negative integer exponents of variables.

## Skew-Hermitian matrix

In linear algebra, a square matrix A with complex entries is said to be skew-Hermitian or antihermitian if its conjugate transpose is the negative of the original matrix; that is, A^* = -A.

## Spectral theorem

In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized (that is, represented as a diagonal matrix in some basis).

## Square matrix

In mathematics, a square matrix is a matrix with the same number of rows and columns.

## Symmetric matrix

In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose.
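A standard fact connecting this entry to the skew-symmetric case: every real square matrix splits uniquely into a symmetric part and a skew-symmetric part, A = \tfrac{1}{2}(A + A^\textsf{T}) + \tfrac{1}{2}(A - A^\textsf{T}). A minimal sketch in plain Python (the helper names are illustrative):

```python
# Split a square matrix into its symmetric and skew-symmetric parts.
def transpose(M):
    return [list(row) for row in zip(*M)]

def sym_skew_parts(A):
    """Return ((A + A^T)/2, (A - A^T)/2); their sum reconstructs A."""
    T = transpose(A)
    n = len(A)
    sym = [[(A[i][j] + T[i][j]) / 2 for j in range(n)] for i in range(n)]
    skew = [[(A[i][j] - T[i][j]) / 2 for j in range(n)] for i in range(n)]
    return sym, skew

A = [[1, 7], [3, 2]]
S, K = sym_skew_parts(A)
print(S)  # [[1.0, 5.0], [5.0, 2.0]]
print(K)  # [[0.0, 2.0], [-2.0, 0.0]]
```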

## Symmetry in mathematics

Symmetry occurs not only in geometry, but also in other branches of mathematics.

## Symplectic matrix

In mathematics, a symplectic matrix is a 2n×2n matrix M with real entries that satisfies the condition M^\textsf{T} \Omega M = \Omega, where M^\textsf{T} denotes the transpose of M and Ω is a fixed 2n×2n nonsingular, skew-symmetric matrix.

## Tangent space

In mathematics, the tangent space of a manifold facilitates the generalization of vectors from affine spaces to general manifolds, since in the latter case one cannot simply subtract two points to obtain a vector that gives the displacement of the one point from the other.

## Trace (linear algebra)

In linear algebra, the trace of an n-by-n square matrix A is defined to be the sum of the elements on the main diagonal (the diagonal from the upper left to the lower right) of A, i.e., tr(A) = a_{11} + a_{22} + \cdots + a_{nn}, where a_{ii} denotes the entry on the ith row and ith column of A. The trace of a matrix is the sum of the (complex) eigenvalues, and it is invariant with respect to a change of basis.
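The defining sum over the main diagonal is a one-liner; a minimal sketch in plain Python (the helper name `trace` is illustrative). Note that a skew-symmetric matrix, having a zero diagonal, always has trace zero:

```python
# Trace of a square matrix: the sum of the main-diagonal entries.
def trace(M):
    return sum(M[i][i] for i in range(len(M)))

print(trace([[1, 2], [3, 4]]))                      # 5
print(trace([[0, 2, -1], [-2, 0, -4], [1, 4, 0]]))  # 0 (skew-symmetric)
```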

## Transpose

In linear algebra, the transpose of a matrix is an operator which flips a matrix over its diagonal, that is it switches the row and column indices of the matrix by producing another matrix denoted as AT (also written A′, Atr, tA or At).

## Unitary matrix

In mathematics, a complex square matrix U is unitary if its conjugate transpose is also its inverse—that is, if U^* U = U U^* = I, where I is the identity matrix.

## Vector space

A vector space (also called a linear space) is a collection of objects called vectors, which may be added together and multiplied ("scaled") by numbers, called scalars.
