Table of Contents
96 relations: Aequationes Mathematicae, Algorithm, Banach space, Basis (linear algebra), Bounded operator, Cauchy–Schwarz inequality, Centering matrix, Characteristic polynomial, Closed graph theorem, Commuting matrices, Complete metric space, Complex number, Condition number, Conjugate transpose, Converse (logic), Definite matrix, Diagonalizable matrix, Dimension (vector space), Direct sum of modules, Domain of a function, Dot product, Dykstra's projection algorithm, Eigenvalue algorithm, Eigenvalues and eigenvectors, Einstein notation, Endomorphism, Euclidean vector, Field (mathematics), Frame (linear algebra), Functional analysis, Gram–Schmidt process, Hahn–Banach theorem, Hessenberg matrix, Hilbert projection theorem, Hilbert space, Householder transformation, Idempotence, Identity function, Identity matrix, If and only if, Image (mathematics), Inner product space, Instrumental variables estimation, Integer, Invariant subspace, Kernel (linear algebra), Lattice (order), Least-squares spectral analysis, Linear algebra, Linear form, …
Aequationes Mathematicae
Aequationes Mathematicae is a mathematical journal.
See Projection (linear algebra) and Aequationes Mathematicae
Algorithm
In mathematics and computer science, an algorithm is a finite sequence of mathematically rigorous instructions, typically used to solve a class of specific problems or to perform a computation.
See Projection (linear algebra) and Algorithm
Banach space
In mathematics, more specifically in functional analysis, a Banach space is a complete normed vector space. Projection (linear algebra) and Banach space are functional analysis.
See Projection (linear algebra) and Banach space
Basis (linear algebra)
In mathematics, a set B of vectors in a vector space V is called a basis (plural: bases) if every element of V may be written in a unique way as a finite linear combination of elements of B. Projection (linear algebra) and basis (linear algebra) are linear algebra.
See Projection (linear algebra) and Basis (linear algebra)
Bounded operator
In functional analysis and operator theory, a bounded linear operator is a linear transformation L: X \to Y between topological vector spaces (TVSs) X and Y that maps bounded subsets of X to bounded subsets of Y. If X and Y are normed vector spaces (a special type of TVS), then L is bounded if and only if there exists some M > 0 such that for all x \in X, \|Lx\|_Y \leq M \|x\|_X. Projection (linear algebra) and bounded operator are linear operators.
See Projection (linear algebra) and Bounded operator
Cauchy–Schwarz inequality
The Cauchy–Schwarz inequality (also called Cauchy–Bunyakovsky–Schwarz inequality) is an upper bound on the inner product between two vectors in an inner product space in terms of the product of the vector norms. Projection (linear algebra) and Cauchy–Schwarz inequality are linear algebra.
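As an illustration (not part of the original entry; NumPy assumed, vectors chosen arbitrarily), a minimal numerical check of the inequality:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 4.0])

# Cauchy–Schwarz: |<u, v>| <= ||u|| * ||v||, with equality only when u and v are parallel
print(abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v))  # True
```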
See Projection (linear algebra) and Cauchy–Schwarz inequality
Centering matrix
In mathematics and multivariate statistics, the centering matrix is a symmetric and idempotent matrix, which when multiplied with a vector has the same effect as subtracting the mean of the components of the vector from every component of that vector.
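A minimal sketch (NumPy assumed; the vector is illustrative) showing that the centering matrix subtracts the mean from every component and is idempotent, i.e. a projection:

```python
import numpy as np

n = 4
H = np.eye(n) - np.ones((n, n)) / n      # centering matrix: symmetric and idempotent

x = np.array([1.0, 2.0, 3.0, 10.0])
print(np.allclose(H @ x, x - x.mean()))  # True: Hx subtracts the mean from each component
print(np.allclose(H @ H, H))             # True: H is idempotent, hence a projection
```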
See Projection (linear algebra) and Centering matrix
Characteristic polynomial
In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. Projection (linear algebra) and characteristic polynomial are linear algebra.
See Projection (linear algebra) and Characteristic polynomial
Closed graph theorem
In mathematics, the closed graph theorem may refer to one of several basic results characterizing continuous functions in terms of their graphs.
See Projection (linear algebra) and Closed graph theorem
Commuting matrices
In linear algebra, two matrices A and B are said to commute if AB = BA.
See Projection (linear algebra) and Commuting matrices
Complete metric space
In mathematical analysis, a metric space M is called complete (or a Cauchy space) if every Cauchy sequence of points in M has a limit that is also in M.
See Projection (linear algebra) and Complete metric space
Complex number
In mathematics, a complex number is an element of a number system that extends the real numbers with a specific element denoted i, called the imaginary unit and satisfying the equation i^2 = -1.
See Projection (linear algebra) and Complex number
Condition number
In numerical analysis, the condition number of a function measures how much the output value of the function can change for a small change in the input argument.
See Projection (linear algebra) and Condition number
Conjugate transpose
In mathematics, the conjugate transpose, also known as the Hermitian transpose, of an m \times n complex matrix \mathbf{A} is an n \times m matrix obtained by transposing \mathbf{A} and applying complex conjugation to each entry (the complex conjugate of a+ib being a-ib, for real numbers a and b). Projection (linear algebra) and conjugate transpose are linear algebra.
See Projection (linear algebra) and Conjugate transpose
Converse (logic)
In logic and mathematics, the converse of a categorical or implicational statement is the result of reversing its two constituent statements.
See Projection (linear algebra) and Converse (logic)
Definite matrix
In mathematics, a symmetric matrix M with real entries is positive-definite if the real number \mathbf{x}^\top M \mathbf{x} is positive for every nonzero real column vector \mathbf{x}, where \mathbf{x}^\top is the row vector transpose of \mathbf{x}. More generally, a Hermitian matrix (that is, a complex matrix equal to its conjugate transpose) is positive-definite if the real number \mathbf{x}^* M \mathbf{x} is positive for every nonzero complex column vector \mathbf{x}, where \mathbf{x}^* denotes the conjugate transpose of \mathbf{x}.
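A minimal sketch (NumPy assumed; the matrix is illustrative) checking positive-definiteness both via the quadratic form and via the equivalent eigenvalue criterion:

```python
import numpy as np

M = np.array([[2.0, -1.0],
              [-1.0, 2.0]])             # symmetric matrix

x = np.random.randn(2)                   # a (generically nonzero) test vector
print(x @ M @ x > 0)                     # quadratic form is positive
print(np.all(np.linalg.eigvalsh(M) > 0)) # equivalent test: all eigenvalues are positive
```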
See Projection (linear algebra) and Definite matrix
Diagonalizable matrix
In linear algebra, a square matrix A is called diagonalizable or non-defective if it is similar to a diagonal matrix.
See Projection (linear algebra) and Diagonalizable matrix
Dimension (vector space)
In mathematics, the dimension of a vector space V is the cardinality (i.e., the number of vectors) of a basis of V over its base field. Projection (linear algebra) and dimension (vector space) are linear algebra.
See Projection (linear algebra) and Dimension (vector space)
Direct sum of modules
In abstract algebra, the direct sum is a construction which combines several modules into a new, larger module. Projection (linear algebra) and direct sum of modules are linear algebra.
See Projection (linear algebra) and Direct sum of modules
Domain of a function
In mathematics, the domain of a function is the set of inputs accepted by the function.
See Projection (linear algebra) and Domain of a function
Dot product
In mathematics, the dot product or scalar product is an algebraic operation that takes two equal-length sequences of numbers (usually coordinate vectors) and returns a single number. (The term scalar product means literally "product with a scalar as a result".)
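Tying this to the article's subject, a minimal sketch (NumPy assumed; the vectors are illustrative) that uses the dot product to project one vector onto the line spanned by another:

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])

dp = a @ b                         # dot product: 3*1 + 4*0 = 3
proj_a_on_b = (dp / (b @ b)) * b   # orthogonal projection of a onto the line spanned by b
print(dp, proj_a_on_b)             # 3.0 [3. 0.]
```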
See Projection (linear algebra) and Dot product
Dykstra's projection algorithm
Dykstra's algorithm is a method that computes a point in the intersection of convex sets, and is a variant of the alternating projection method (also called the projections onto convex sets method).
See Projection (linear algebra) and Dykstra's projection algorithm
Eigenvalue algorithm
In numerical analysis, one of the most important problems is designing efficient and stable algorithms for finding the eigenvalues of a matrix.
See Projection (linear algebra) and Eigenvalue algorithm
Eigenvalues and eigenvectors
In linear algebra, an eigenvector or characteristic vector is a vector that has its direction unchanged by a given linear transformation. Projection (linear algebra) and Eigenvalues and eigenvectors are linear algebra.
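A minimal sketch (NumPy assumed; the matrix is illustrative) of the eigenvector equation for a projection matrix, whose eigenvalues can only be 0 or 1:

```python
import numpy as np

# Orthogonal projection onto the x-y plane in R^3
P = np.diag([1.0, 1.0, 0.0])

vals, vecs = np.linalg.eig(P)
print(np.sort(vals))                                       # [0. 1. 1.]: a projection's eigenvalues are only 0 and 1
print(np.allclose(P @ vecs[:, 0], vals[0] * vecs[:, 0]))   # eigenvector equation P v = lambda v
```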
See Projection (linear algebra) and Eigenvalues and eigenvectors
Einstein notation
In mathematics, especially the usage of linear algebra in mathematical physics and differential geometry, Einstein notation (also known as the Einstein summation convention or Einstein summation notation) is a notational convention that implies summation over a set of indexed terms in a formula, thus achieving brevity.
See Projection (linear algebra) and Einstein notation
Endomorphism
In mathematics, an endomorphism is a morphism from a mathematical object to itself.
See Projection (linear algebra) and Endomorphism
Euclidean vector
In mathematics, physics, and engineering, a Euclidean vector or simply a vector (sometimes called a geometric vector or spatial vector) is a geometric object that has magnitude (or length) and direction. Projection (linear algebra) and Euclidean vector are linear algebra.
See Projection (linear algebra) and Euclidean vector
Field (mathematics)
In mathematics, a field is a set on which addition, subtraction, multiplication, and division are defined and behave as the corresponding operations on rational and real numbers.
See Projection (linear algebra) and Field (mathematics)
Frame (linear algebra)
In linear algebra, a frame of an inner product space is a generalization of a basis of a vector space to sets that may be linearly dependent. Projection (linear algebra) and frame (linear algebra) are linear algebra.
See Projection (linear algebra) and Frame (linear algebra)
Functional analysis
Functional analysis is a branch of mathematical analysis, the core of which is formed by the study of vector spaces endowed with some kind of limit-related structure (for example, inner product, norm, or topology) and the linear functions defined on these spaces and suitably respecting these structures.
See Projection (linear algebra) and Functional analysis
Gram–Schmidt process
In mathematics, particularly linear algebra and numerical analysis, the Gram–Schmidt process or Gram-Schmidt algorithm is a way of finding a set of two or more vectors that are perpendicular to each other. Projection (linear algebra) and Gram–Schmidt process are functional analysis and linear algebra.
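A minimal sketch of the classical Gram–Schmidt iteration (NumPy assumed; gram_schmidt is an illustrative helper, not a library function):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram–Schmidt: orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v - sum((v @ q) * q for q in basis)  # remove components along previous vectors
        basis.append(w / np.linalg.norm(w))      # normalize the remainder
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])
print(np.allclose(Q @ Q.T, np.eye(2)))  # True: the rows are orthonormal
```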
See Projection (linear algebra) and Gram–Schmidt process
Hahn–Banach theorem
The Hahn–Banach theorem is a central tool in functional analysis. Projection (linear algebra) and Hahn–Banach theorem are linear algebra.
See Projection (linear algebra) and Hahn–Banach theorem
Hessenberg matrix
In linear algebra, a Hessenberg matrix is a special kind of square matrix, one that is "almost" triangular.
See Projection (linear algebra) and Hessenberg matrix
Hilbert projection theorem
In mathematics, the Hilbert projection theorem is a famous result of convex analysis that says that for every vector x in a Hilbert space H and every nonempty closed convex C \subseteq H, there exists a unique vector m \in C for which \|c - x\| is minimized over the vectors c \in C; that is, such that \|m - x\| \leq \|c - x\| for every c \in C.
See Projection (linear algebra) and Hilbert projection theorem
Hilbert space
In mathematics, Hilbert spaces (named after David Hilbert) allow the methods of linear algebra and calculus to be generalized from (finite-dimensional) Euclidean vector spaces to spaces that may be infinite-dimensional. Projection (linear algebra) and Hilbert space are functional analysis and linear algebra.
See Projection (linear algebra) and Hilbert space
Householder transformation
In linear algebra, a Householder transformation (also known as a Householder reflection or elementary reflector) is a linear transformation that describes a reflection about a plane or hyperplane containing the origin.
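A minimal sketch (NumPy assumed; the vector is illustrative) constructing a Householder reflector from a unit normal and checking its defining properties:

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
v = v / np.linalg.norm(v)              # unit normal of the reflecting hyperplane
H = np.eye(3) - 2.0 * np.outer(v, v)   # Householder reflector

print(np.allclose(H @ v, -v))          # the normal itself is flipped
print(np.allclose(H @ H, np.eye(3)))   # a reflection is its own inverse
```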
See Projection (linear algebra) and Householder transformation
Idempotence
Idempotence is the property of certain operations in mathematics and computer science whereby they can be applied multiple times without changing the result beyond the initial application.
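In this article's context, idempotence is the defining property of a projection: applying it twice has the same effect as applying it once. A minimal sketch (NumPy assumed; the vector is illustrative):

```python
import numpy as np

a = np.array([[2.0], [1.0]])
P = a @ a.T / (a.T @ a)       # orthogonal projection onto the line spanned by a

print(np.allclose(P @ P, P))  # True: P is idempotent
```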
See Projection (linear algebra) and Idempotence
Identity function
In mathematics, an identity function, also called an identity relation, identity map or identity transformation, is a function that always returns the value that was used as its argument, unchanged.
See Projection (linear algebra) and Identity function
Identity matrix
In linear algebra, the identity matrix of size n is the n\times n square matrix with ones on the main diagonal and zeros elsewhere.
See Projection (linear algebra) and Identity matrix
If and only if
In logic and related fields such as mathematics and philosophy, "if and only if" (often shortened as "iff") is paraphrased by the biconditional, a logical connective between statements.
See Projection (linear algebra) and If and only if
Image (mathematics)
In mathematics, for a function f: X \to Y, the image of an input value x is the single output value produced by f when passed x. The preimage of an output value y is the set of input values that produce y. More generally, evaluating f at each element of a given subset A of its domain X produces a set, called the "image of A under (or through) f".
See Projection (linear algebra) and Image (mathematics)
Inner product space
In mathematics, an inner product space (or, rarely, a Hausdorff pre-Hilbert space) is a real vector space or a complex vector space with an operation called an inner product.
See Projection (linear algebra) and Inner product space
Instrumental variables estimation
In statistics, econometrics, epidemiology and related disciplines, the method of instrumental variables (IV) is used to estimate causal relationships when controlled experiments are not feasible or when a treatment is not successfully delivered to every unit in a randomized experiment.
See Projection (linear algebra) and Instrumental variables estimation
Integer
An integer is the number zero (0), a positive natural number (1, 2, 3,...), or the negation of a positive natural number (−1, −2, −3,...). The negations or additive inverses of the positive natural numbers are referred to as negative integers.
See Projection (linear algebra) and Integer
Invariant subspace
In mathematics, an invariant subspace of a linear mapping T : V → V, i.e. from some vector space V to itself, is a subspace W of V that is preserved by T. More generally, an invariant subspace for a collection of linear mappings is a subspace preserved by each mapping individually. Projection (linear algebra) and invariant subspace are linear algebra.
See Projection (linear algebra) and Invariant subspace
Kernel (linear algebra)
In mathematics, the kernel of a linear map, also known as the null space or nullspace, is the part of the domain which is mapped to the zero vector of the co-domain; the kernel is always a linear subspace of the domain. Projection (linear algebra) and kernel (linear algebra) are functional analysis and linear algebra.
See Projection (linear algebra) and Kernel (linear algebra)
Lattice (order)
A lattice is an abstract structure studied in the mathematical subdisciplines of order theory and abstract algebra.
See Projection (linear algebra) and Lattice (order)
Least-squares spectral analysis
Least-squares spectral analysis (LSSA) is a method of estimating a frequency spectrum based on a least-squares fit of sinusoids to data samples, similar to Fourier analysis. Projection (linear algebra) and least-squares spectral analysis are linear algebra.
See Projection (linear algebra) and Least-squares spectral analysis
Linear algebra
Linear algebra is the branch of mathematics concerning linear equations such as a_1 x_1 + \cdots + a_n x_n = b, linear maps such as (x_1, \ldots, x_n) \mapsto a_1 x_1 + \cdots + a_n x_n, and their representations in vector spaces and through matrices.
See Projection (linear algebra) and Linear algebra
Linear form
In mathematics, a linear form (also known as a linear functional, a one-form, or a covector) is a linear map from a vector space to its field of scalars (often, the real numbers or the complex numbers). (In some texts the roles are reversed and vectors are defined as linear maps from covectors to scalars.) Projection (linear algebra) and linear form are functional analysis, linear algebra and linear operators.
See Projection (linear algebra) and Linear form
Linear map
In mathematics, and more specifically in linear algebra, a linear map (also called a linear mapping, linear transformation, vector space homomorphism, or in some contexts linear function) is a mapping V \to W between two vector spaces that preserves the operations of vector addition and scalar multiplication. Projection (linear algebra) and linear map are linear operators.
See Projection (linear algebra) and Linear map
Linear regression
In statistics, linear regression is a statistical model which estimates the linear relationship between a scalar response and one or more explanatory variables (also known as dependent and independent variables).
See Projection (linear algebra) and Linear regression
Linear subspace
In mathematics, and more specifically in linear algebra, a linear subspace or vector subspace is a vector space that is a subset of some larger vector space. (The term linear subspace is sometimes used for referring to flats and affine subspaces.) Projection (linear algebra) and linear subspace are functional analysis and linear algebra.
See Projection (linear algebra) and Linear subspace
Machine learning
Machine learning (ML) is a field of study in artificial intelligence concerned with the development and study of statistical algorithms that can learn from data and generalize to unseen data and thus perform tasks without explicit instructions.
See Projection (linear algebra) and Machine learning
Matrix (mathematics)
In mathematics, a matrix (matrices) is a rectangular array or table of numbers, symbols, or expressions, with elements or entries arranged in rows and columns, which is used to represent a mathematical object or property of such an object.
See Projection (linear algebra) and Matrix (mathematics)
Matrix multiplication
In mathematics, particularly in linear algebra, matrix multiplication is a binary operation that produces a matrix from two matrices.
See Projection (linear algebra) and Matrix multiplication
Matrix norm
In the field of mathematics, norms are defined for elements within a vector space. Projection (linear algebra) and Matrix norm are linear algebra.
See Projection (linear algebra) and Matrix norm
Measure (mathematics)
In mathematics, the concept of a measure is a generalization and formalization of geometrical measures (length, area, volume) and other common notions, such as magnitude, mass, and probability of events.
See Projection (linear algebra) and Measure (mathematics)
Minimal polynomial (linear algebra)
In linear algebra, the minimal polynomial of an n \times n matrix A over a field F is the monic polynomial P over F of least degree such that P(A) = 0.
See Projection (linear algebra) and Minimal polynomial (linear algebra)
Moore–Penrose inverse
In mathematics, and in particular linear algebra, the Moore–Penrose inverse of a matrix, often called the pseudoinverse, is the most widely known generalization of the inverse matrix.
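A minimal sketch (NumPy assumed; the matrix and right-hand side are illustrative) computing a pseudoinverse and checking that A A⁺ is an orthogonal projection onto the column space of A:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])

A_pinv = np.linalg.pinv(A)                        # Moore–Penrose pseudoinverse
x = A_pinv @ b                                    # least-squares solution of A x ≈ b
print(np.allclose(A @ A_pinv @ A, A))             # defining property A A⁺ A = A
print(np.allclose(A @ A_pinv, (A @ A_pinv).T))    # A A⁺ is symmetric and idempotent: an orthogonal projection
```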
See Projection (linear algebra) and Moore–Penrose inverse
Normed vector space
In mathematics, a normed vector space or normed space is a vector space over the real or complex numbers on which a norm is defined.
See Projection (linear algebra) and Normed vector space
Oblique projection
Oblique projection is a simple type of technical drawing of graphical projection used for producing two-dimensional (2D) images of three-dimensional (3D) objects.
See Projection (linear algebra) and Oblique projection
Open and closed maps
In mathematics, more specifically in topology, an open map is a function between two topological spaces that maps open sets to open sets.
See Projection (linear algebra) and Open and closed maps
Open set
In mathematics, an open set is a generalization of an open interval in the real line.
See Projection (linear algebra) and Open set
Operator algebra
In functional analysis, a branch of mathematics, an operator algebra is an algebra of continuous linear operators on a topological vector space, with the multiplication given by the composition of mappings. Projection (linear algebra) and operator algebra are functional analysis.
See Projection (linear algebra) and Operator algebra
Operator K-theory
In mathematics, operator K-theory is a noncommutative analogue of topological K-theory for Banach algebras with most applications used for C*-algebras.
See Projection (linear algebra) and Operator K-theory
Ordinary least squares
In statistics, ordinary least squares (OLS) is a type of linear least squares method for choosing the unknown parameters of a linear regression model by the principle of least squares: minimizing the sum of the squared differences between the observed values of the dependent variable and the values predicted by the linear function of the explanatory variables.
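The connection to projections: the OLS fitted values are the orthogonal projection of the response onto the column space of the design matrix. A minimal sketch (NumPy assumed; the data are synthetic and illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])   # design matrix with intercept
y = 2.0 + 3.0 * X[:, 1] + rng.normal(scale=0.1, size=20)  # noisy linear response

beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS estimates
H = X @ np.linalg.inv(X.T @ X) @ X.T            # "hat" matrix: orthogonal projection onto col(X)
print(np.allclose(H @ y, X @ beta))             # fitted values = projection of y
print(np.allclose(H @ H, H))                    # idempotent, as a projection must be
```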
See Projection (linear algebra) and Ordinary least squares
Orthogonal complement
In the mathematical fields of linear algebra and functional analysis, the orthogonal complement of a subspace W of a vector space V equipped with a bilinear form B is the set W^\perp of all vectors in V that are orthogonal to every vector in W. Informally, it is called the perp, short for perpendicular complement. Projection (linear algebra) and orthogonal complement are functional analysis and linear algebra.
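A minimal sketch (NumPy assumed; the subspace is illustrative) computing a basis of the orthogonal complement of a column space from the singular value decomposition:

```python
import numpy as np

W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])            # columns span the x-y plane in R^3

U, s, Vt = np.linalg.svd(W)
rank = np.sum(s > 1e-12)
complement = U[:, rank:]              # basis of the orthogonal complement (here the z-axis, up to sign)
print(np.allclose(W.T @ complement, 0))  # True: every complement vector is orthogonal to col(W)
```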
See Projection (linear algebra) and Orthogonal complement
Orthogonality
In mathematics, orthogonality is the generalization of the geometric notion of perpendicularity.
See Projection (linear algebra) and Orthogonality
Orthogonalization
In linear algebra, orthogonalization is the process of finding a set of orthogonal vectors that span a particular subspace. Projection (linear algebra) and orthogonalization are linear algebra.
See Projection (linear algebra) and Orthogonalization
Orthonormal basis
In mathematics, particularly linear algebra, an orthonormal basis for an inner product space V with finite dimension is a basis for V whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other. Projection (linear algebra) and orthonormal basis are functional analysis and linear algebra.
See Projection (linear algebra) and Orthonormal basis
Outer product
In linear algebra, the outer product of two coordinate vectors is the matrix whose entries are all products of an element in the first vector with an element in the second vector.
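A minimal sketch (NumPy assumed; the vectors are illustrative); note that for a unit vector q, the outer product q qᵀ is exactly the orthogonal projection onto the line spanned by q:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])
print(np.outer(u, v))           # 2x3 matrix with entries u[i] * v[j]

q = np.array([1.0, 1.0]) / np.sqrt(2)   # unit vector
P = np.outer(q, q)                      # rank-one orthogonal projection onto span(q)
print(np.allclose(P @ P, P))            # True
```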
See Projection (linear algebra) and Outer product
Partial isometry
In mathematical functional analysis a partial isometry is a linear map between Hilbert spaces such that it is an isometry on the orthogonal complement of its kernel.
See Projection (linear algebra) and Partial isometry
Pavel Grinfeld
Pavel Grinfeld (also known as Greenfield) is an American mathematician and associate professor of Applied Mathematics at Drexel University working on problems in moving surfaces in applied mathematics (particularly calculus of variations), geometry, physics, and engineering.
See Projection (linear algebra) and Pavel Grinfeld
Perpendicular distance
In geometry, the perpendicular distance between two objects is the distance from one to the other, measured along a line that is perpendicular to one or both.
See Projection (linear algebra) and Perpendicular distance
Point (geometry)
In geometry, a point is an abstract idealization of an exact position, without size, in physical space, or its generalization to other kinds of mathematical spaces.
See Projection (linear algebra) and Point (geometry)
Positive definiteness
In mathematics, positive definiteness is a property of any object to which a bilinear form or a sesquilinear form may be naturally associated, which is positive-definite.
See Projection (linear algebra) and Positive definiteness
QR decomposition
In linear algebra, a QR decomposition, also known as a QR factorization or QU factorization, is a decomposition of a matrix A into a product A = QR of an orthogonal matrix Q and an upper triangular matrix R.
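A minimal sketch (NumPy assumed; the matrix is illustrative) computing a reduced QR factorization and verifying its defining properties:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

Q, R = np.linalg.qr(A)                  # reduced QR: Q is 3x2 with orthonormal columns, R is 2x2 upper triangular
print(np.allclose(Q @ R, A))            # A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))  # columns of Q are orthonormal
```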
See Projection (linear algebra) and QR decomposition
Rank (linear algebra)
In linear algebra, the rank of a matrix is the dimension of the vector space generated (or spanned) by its columns. Projection (linear algebra) and rank (linear algebra) are linear algebra.
See Projection (linear algebra) and Rank (linear algebra)
Real number
In mathematics, a real number is a number that can be used to measure a continuous one-dimensional quantity such as a distance, duration or temperature.
See Projection (linear algebra) and Real number
Riemannian geometry
Riemannian geometry is the branch of differential geometry that studies Riemannian manifolds, defined as smooth manifolds with a Riemannian metric (an inner product on the tangent space at each point that varies smoothly from point to point).
See Projection (linear algebra) and Riemannian geometry
Riemannian submersion
In differential geometry, a branch of mathematics, a Riemannian submersion is a submersion from one Riemannian manifold to another that respects the metrics, meaning that it is an orthogonal projection on tangent spaces.
See Projection (linear algebra) and Riemannian submersion
Self-adjoint operator
In mathematics, a self-adjoint operator on a complex vector space V with inner product \langle\cdot,\cdot\rangle is a linear map A (from V to itself) that is its own adjoint. Projection (linear algebra) and self-adjoint operator are linear operators.
See Projection (linear algebra) and Self-adjoint operator
Semisimple algebra
In ring theory, a branch of mathematics, a semisimple algebra is an associative artinian algebra over a field which has trivial Jacobson radical (only the zero element of the algebra is in the Jacobson radical).
See Projection (linear algebra) and Semisimple algebra
Singular value decomposition
In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix into a rotation, followed by a rescaling followed by another rotation. Projection (linear algebra) and singular value decomposition are functional analysis and linear algebra.
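A minimal sketch (NumPy assumed; the matrix is illustrative) computing a thin SVD and verifying the reconstruction A = U Σ Vᵀ:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.allclose(U @ np.diag(s) @ Vt, A))  # A = U Sigma V^T
print(s)                                    # singular values, in decreasing order
```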
See Projection (linear algebra) and Singular value decomposition
Spectrum (functional analysis)
In mathematics, particularly in functional analysis, the spectrum of a bounded linear operator (or, more generally, an unbounded linear operator) is a generalisation of the set of eigenvalues of a matrix.
See Projection (linear algebra) and Spectrum (functional analysis)
Spherical trigonometry
Spherical trigonometry is the branch of spherical geometry that deals with the metrical relationships between the sides and angles of spherical triangles, traditionally expressed using trigonometric functions.
See Projection (linear algebra) and Spherical trigonometry
Square matrix
In mathematics, a square matrix is a matrix with the same number of rows and columns.
See Projection (linear algebra) and Square matrix
Subspace topology
In topology and related areas of mathematics, a subspace of a topological space X is a subset S of X which is equipped with a topology induced from that of X called the subspace topology (or the relative topology, or the induced topology, or the trace topology).
See Projection (linear algebra) and Subspace topology
Surjective function
In mathematics, a surjective function (also known as surjection, or onto function) is a function f such that, for every element y of the function's codomain, there exists at least one element x in the function's domain such that f(x) = y.
See Projection (linear algebra) and Surjective function
Transpose
In linear algebra, the transpose of a matrix is an operator which flips a matrix over its diagonal; that is, it switches the row and column indices of the matrix by producing another matrix, often denoted by A^T (among other notations). Projection (linear algebra) and transpose are linear algebra.
See Projection (linear algebra) and Transpose
Unit vector
In mathematics, a unit vector in a normed vector space is a vector (often a spatial vector) of length 1. Projection (linear algebra) and unit vector are linear algebra.
See Projection (linear algebra) and Unit vector
Vector space
In mathematics and physics, a vector space (also called a linear space) is a set whose elements, often called vectors, can be added together and multiplied ("scaled") by numbers called scalars.
See Projection (linear algebra) and Vector space
Von Neumann algebra
In mathematics, a von Neumann algebra or W*-algebra is a *-algebra of bounded operators on a Hilbert space that is closed in the weak operator topology and contains the identity operator.
See Projection (linear algebra) and Von Neumann algebra
Zero matrix
In mathematics, particularly linear algebra, a zero matrix or null matrix is a matrix all of whose entries are zero.
See Projection (linear algebra) and Zero matrix
3D projection
A 3D projection (or graphical projection) is a design technique used to display a three-dimensional (3D) object on a two-dimensional (2D) surface. Projection (linear algebra) and 3D projection are linear algebra.
See Projection (linear algebra) and 3D projection
Also known as Linear projection, Orthogonal projection, Orthogonal projection operator, Orthogonal projections, Orthogonal projector, Projection (functional analysis), Projection operator, Projection operators, Projector (linear algebra), Projector operator.