In linear algebra, a column vector with m elements is an m × 1 matrix[1] consisting of a single column of m entries. Similarly, a row vector is a 1 × n matrix, consisting of a single row of n entries. For example,

\[\boldsymbol{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}\]

is a column vector and

\[\boldsymbol{a} = \begin{bmatrix} a_1 & a_2 & \dots & a_n \end{bmatrix}\]

is a row vector.
(Throughout this article, boldface is used for both row and column vectors.)
The transpose (indicated by T) of any row vector is a column vector, and the transpose of any column vector is a row vector:

\[\begin{bmatrix} x_1 & x_2 & \dots & x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix} \quad\text{and}\quad \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 & x_2 & \dots & x_m \end{bmatrix}.\]

Taking the transpose twice returns the original (row or column) vector:

\[\left(\boldsymbol{x}^{\rm T}\right)^{\rm T} = \boldsymbol{x}.\]
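The transpose relations above can be sketched in plain Python, representing a matrix as a list of rows, a row vector as a 1 × n matrix, and a column vector as an n × 1 matrix (the list representation and helper name are illustrative, not from the article):

```python
def transpose(m):
    """Transpose a matrix given as a list of rows, using zip to swap axes."""
    return [list(col) for col in zip(*m)]

row = [[1, 2, 3]]        # a 1 x 3 row vector
col = transpose(row)     # the 3 x 1 column vector [[1], [2], [3]]

# Transposing twice returns the original vector: (x^T)^T = x.
assert transpose(transpose(row)) == row
assert transpose(transpose(col)) == col
```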
The set of all row vectors with n entries in a given field (such as the real numbers) forms an n-dimensional vector space; similarly, the set of all column vectors with m entries forms an m-dimensional vector space.
The space of row vectors with n entries can be regarded as the dual space of the space of column vectors with n entries, since any linear functional on the space of column vectors can be represented as the left-multiplication of a unique row vector.
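The identification of row vectors with linear functionals can be sketched as follows: a row vector f determines the functional x ↦ fx, the 1 × n by n × 1 matrix product (a minimal illustration; the function names are assumptions of this sketch):

```python
def functional(f):
    """Return the linear functional x -> f x given by left-multiplication
    by the row vector f (a 1 x n matrix times an n x 1 column vector)."""
    return lambda x: sum(fi * xi for fi, xi in zip(f, x))

phi = functional([2, 0, -1])   # row vector representing the functional
assert phi([1, 1, 1]) == 1     # 2*1 + 0*1 + (-1)*1 = 1
assert phi([3, 0, 0]) == 6     # linear in each coordinate
```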
Notation
To simplify writing column vectors in-line with other text, sometimes they are written as row vectors with the transpose operation applied to them,

\[\boldsymbol{x} = \begin{bmatrix} x_1 & x_2 & \dots & x_m \end{bmatrix}^{\rm T}\]

or

\[\boldsymbol{x} = \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix}^{\rm T}.\]
Some authors also use the convention of writing both column vectors and row vectors as rows, but separating row vector elements with commas and column vector elements with semicolons (see alternative notation 2 in the table below).
| | Row vector | Column vector |
|---|---|---|
| Standard matrix notation (array spaces, no commas, transpose signs) | \(\begin{bmatrix} x_1 & x_2 & \dots & x_n \end{bmatrix}\) | \(\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}\) or \(\begin{bmatrix} x_1 & x_2 & \dots & x_n \end{bmatrix}^{\rm T}\) |
| Alternative notation 1 (commas, transpose signs) | \(\begin{bmatrix} x_1, x_2, \dots, x_n \end{bmatrix}\) | \(\begin{bmatrix} x_1, x_2, \dots, x_n \end{bmatrix}^{\rm T}\) |
| Alternative notation 2 (commas and semicolons, no transpose signs) | \(\begin{bmatrix} x_1, x_2, \dots, x_n \end{bmatrix}\) | \(\begin{bmatrix} x_1; x_2; \dots; x_n \end{bmatrix}\) |
Operations
In matrix multiplication, each entry of the product is obtained by multiplying a row vector of one matrix by a column vector of the other: the (i, j) entry of AB is the product of row i of A with column j of B.
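This row-by-column rule can be sketched in plain Python with matrices as nested lists (the helper name is illustrative):

```python
def matmul(A, B):
    """Each (i, j) entry of A B is the product of row i of A with column j of B."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
assert matmul(A, B) == [[19, 22], [43, 50]]
```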
The dot product of two column vectors a, b, considered as elements of a coordinate space, is equal to the matrix product of the transpose of a with b,

\[\boldsymbol{a} \cdot \boldsymbol{b} = \boldsymbol{a}^{\rm T} \boldsymbol{b} = \begin{bmatrix} a_1 & \dots & a_n \end{bmatrix} \begin{bmatrix} b_1 \\ \vdots \\ b_n \end{bmatrix} = a_1 b_1 + \dots + a_n b_n.\]

By the symmetry of the dot product, the dot product of two column vectors a, b is also equal to the matrix product of the transpose of b with a,

\[\boldsymbol{b} \cdot \boldsymbol{a} = \boldsymbol{b}^{\rm T} \boldsymbol{a} = \begin{bmatrix} b_1 & \dots & b_n \end{bmatrix} \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} = a_1 b_1 + \dots + a_n b_n.\]
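The identity aᵀb = bᵀa can be checked with a short sketch (flat lists stand in for column vectors; names are illustrative):

```python
def dot(a, b):
    """Dot product of two column vectors, i.e. the 1 x 1 matrix product a^T b."""
    return sum(ai * bi for ai, bi in zip(a, b))

a, b = [1, 3, -5], [4, -2, -1]
assert dot(a, b) == 1*4 + 3*(-2) + (-5)*(-1)   # = 3
assert dot(a, b) == dot(b, a)                  # symmetry: a^T b = b^T a
```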
The matrix product of a column and a row vector gives the outer product of two vectors a, b, an example of the more general tensor product. The matrix product of the column vector representation of a and the row vector representation of b gives the components of their dyadic product,

\[\boldsymbol{a} \otimes \boldsymbol{b} = \boldsymbol{a} \boldsymbol{b}^{\rm T} = \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix} \begin{bmatrix} b_1 & b_2 & b_3 \end{bmatrix} = \begin{bmatrix} a_1 b_1 & a_1 b_2 & a_1 b_3 \\ a_2 b_1 & a_2 b_2 & a_2 b_3 \\ a_3 b_1 & a_3 b_2 & a_3 b_3 \end{bmatrix},\]

which is the transpose of the matrix product of the column vector representation of b and the row vector representation of a,

\[\boldsymbol{b} \otimes \boldsymbol{a} = \boldsymbol{b} \boldsymbol{a}^{\rm T} = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} \begin{bmatrix} a_1 & a_2 & a_3 \end{bmatrix} = \begin{bmatrix} b_1 a_1 & b_1 a_2 & b_1 a_3 \\ b_2 a_1 & b_2 a_2 & b_2 a_3 \\ b_3 a_1 & b_3 a_2 & b_3 a_3 \end{bmatrix}.\]
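The outer product and its transpose relation can be sketched with nested lists (helper names are illustrative):

```python
def outer(a, b):
    """Outer product a b^T of column vectors a (length m) and b (length n),
    returned as an m x n matrix: entry (i, j) is a_i * b_j."""
    return [[ai * bj for bj in b] for ai in a]

def transpose(m):
    return [list(col) for col in zip(*m)]

a, b = [1, 2], [3, 4, 5]
assert outer(a, b) == [[3, 4, 5], [6, 8, 10]]
# a b^T is the transpose of b a^T.
assert outer(a, b) == transpose(outer(b, a))
```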
An n × n matrix M can represent a linear map and act on row and column vectors as the linear map's transformation matrix. For a row vector v, the product vM is another row vector p:

\[\boldsymbol{v} M = \boldsymbol{p}.\]

Another n × n matrix Q can act on p,

\[\boldsymbol{p} Q = \boldsymbol{t}.\]

Then one can write t = pQ = vMQ, so the matrix product transformation MQ maps v directly to t. Continuing with row vectors, matrix transformations further reconfiguring n-space can be applied to the right of previous outputs.

When a column vector is transformed to another column vector under an n × n matrix action, the operation occurs to the left,

\[\boldsymbol{p}^{\rm T} = M \boldsymbol{v}^{\rm T}, \quad \boldsymbol{t}^{\rm T} = Q \boldsymbol{p}^{\rm T},\]

leading to the algebraic expression \(Q M \boldsymbol{v}^{\rm T}\) for the composed output from the \(\boldsymbol{v}^{\rm T}\) input. The matrix transformations mount up to the left in this use of a column vector for input to matrix transformation.
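The two composition conventions can be compared in a short sketch: with row vectors, transformations chain on the right (t = vMQ); with column vectors, they chain on the left (tᵀ = QMvᵀ). The helper names and the example matrices are illustrative:

```python
def matvec(M, v):
    """Column-vector convention: compute M v."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in M]

def vecmat(v, M):
    """Row-vector convention: compute v M."""
    return [sum(vi * M[i][j] for i, vi in enumerate(v))
            for j in range(len(M[0]))]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

M = [[0, 1], [1, 0]]   # swap the two coordinates
Q = [[2, 0], [0, 3]]   # scale the axes
v = [5, 7]

# Row-vector pipeline: (v M) Q equals v (M Q), so M Q maps v directly to t.
assert vecmat(vecmat(v, M), Q) == vecmat(v, matmul(M, Q))
# Column-vector pipeline: Q (M v^T) equals (Q M) v^T; transformations mount up on the left.
assert matvec(Q, matvec(M, v)) == matvec(matmul(Q, M), v)
```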
Notes
- ^ Artin, Michael (1991). Algebra. Englewood Cliffs, NJ: Prentice-Hall. p. 2. ISBN 0-13-004763-5.