Branch of mathematics concerned with the behavior of linear equations. It started out as the study of systems of linear equations, but a more abstract view (via vector spaces), which preserves the same properties and generalizes better, is more commonly used today.

A vector is an element of a vector space $V$ over a field $F$; concretely, it can be written as a tuple of elements from the field, e.g. $\mathbf{v} = (v_1, v_2, \dots, v_n)$ with $v_1, \dots, v_n \in F$, satisfying all the vector axioms. A scalar is any element from the same field $F$. For example, $a \in F$ and $b \in F$ are scalars. The vector axioms are defined below:

Vector Spaces

A vector space is a set $V$ of elements called vectors over a field $F$, with a binary operation on $V$ called vector addition ($\mathbf{u} + \mathbf{v}$), and another operation combining a scalar from $F$ with a vector from $V$ called scalar multiplication ($a\mathbf{v}$), such that the following vector axioms are satisfied:

  • $(V, +)$ is an abelian group under vector addition:
    • Associativity of addition — For all $\mathbf{u}, \mathbf{v}, \mathbf{w} \in V$, $\mathbf{u} + (\mathbf{v} + \mathbf{w}) = (\mathbf{u} + \mathbf{v}) + \mathbf{w}$.
    • Identity element of addition — For all $\mathbf{v} \in V$, $\mathbf{v} + \mathbf{0} = \mathbf{v}$, where $\mathbf{0} \in V$ is called the zero vector.
    • Inverse elements of addition — For all $\mathbf{v} \in V$, there exists $-\mathbf{v} \in V$ such that $\mathbf{v} + (-\mathbf{v}) = \mathbf{0}$, where $\mathbf{0}$ is the zero vector.
    • Commutativity of addition — For all $\mathbf{u}, \mathbf{v} \in V$, $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$.
  • Scalar multiplication distributes over vector addition — For $a \in F$ and $\mathbf{u}, \mathbf{v} \in V$, $a(\mathbf{u} + \mathbf{v}) = a\mathbf{u} + a\mathbf{v}$.
  • Scalar multiplication distributes over field addition — For $a, b \in F$ and $\mathbf{v} \in V$, $(a + b)\mathbf{v} = a\mathbf{v} + b\mathbf{v}$.
  • Scalar multiplication and field multiplication are compatible — For $a, b \in F$ and $\mathbf{v} \in V$, $a(b\mathbf{v}) = (ab)\mathbf{v}$.
  • Identity element of scalar multiplication — For all $\mathbf{v} \in V$, $1\mathbf{v} = \mathbf{v}$, where $1$ is the multiplicative identity of $F$.
  • E.g. the Euclidean space $\mathbb{R}^n$ is an $n$-dimensional vector space over $\mathbb{R}$.
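A minimal numerical sketch (Python with NumPy, not part of the original notes; the vectors and scalars are arbitrary) that spot-checks the axioms above for $\mathbb{R}^3$:

```python
import numpy as np

# Arbitrary vectors in R^3 and scalars from the field R (illustrative values only).
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([7.0, 8.0, 9.0])
a, b = 2.0, -3.0
zero = np.zeros(3)

# Abelian group axioms for vector addition.
assert np.allclose(u + (v + w), (u + v) + w)    # associativity
assert np.allclose(v + zero, v)                 # additive identity (zero vector)
assert np.allclose(v + (-v), zero)              # additive inverse
assert np.allclose(u + v, v + u)                # commutativity

# Scalar multiplication axioms.
assert np.allclose(a * (u + v), a * u + a * v)  # distributes over vector addition
assert np.allclose((a + b) * v, a * v + b * v)  # distributes over field addition
assert np.allclose(a * (b * v), (a * b) * v)    # compatible with field multiplication
assert np.allclose(1.0 * v, v)                  # multiplicative identity of the field
```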

Vector spaces can be visualized as an n-dimensional geometric space where vectors are arrows or points in that space, and vector operations transform the vectors (and the space itself). Scalar multiplication scales the vector arrows, while vector addition follows the parallelogram rule.

geometric analogue of vector addition and multiplication

Basis vectors

  • Span — The set of all vectors that can be written as linear combinations of a given set of (basis) vectors; i.e. the entire space those vectors can reach.

two vectors spanning the entire R2 vector space

  • Linearly independent — Two or more vectors that cannot be written as a linear combination of the others are linearly independent. That is, each vector points in a direction not covered by the span of the others, so every vector in the set adds a new dimension to the span.

three vectors spanning R2 vs three vectors spanning R3

  • In a vector space $V$, a set of vectors $B$ is called a basis if every element of $V$ can be written as a linear combination of the vectors in $B$. Simply put, basis vectors must span the entire space $V$, and $B$ must therefore be a set of linearly independent vectors. Since basis vectors span the entire vector space, they can act as a coordinate system via which all vectors in that space can be represented (see the sketch below the figure).
  • E.g. $\{(1, 0), (0, 1)\}$ is the standard basis for the space $\mathbb{R}^2$: any vector $(x, y)$ can be written as $x(1, 0) + y(0, 1)$.
  • As evident, basis vectors must form a linearly independent spanning set in order to describe an entire vector space.

same point represented using different basis vectors
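As a sketch of the idea above (Python/NumPy; the alternative basis is an arbitrary illustrative choice, not from the original notes), the same point can be expressed in two different bases:

```python
import numpy as np

# The point expressed in the standard basis {(1, 0), (0, 1)} of R^2.
p_standard = np.array([3.0, 2.0])

# An alternative (linearly independent) basis, chosen arbitrarily for illustration.
b1 = np.array([1.0, 1.0])
b2 = np.array([-1.0, 1.0])
B = np.column_stack([b1, b2])       # change-of-basis matrix: columns are the new basis vectors

# Coordinates of the same point with respect to the new basis: solve B @ c = p.
c = np.linalg.solve(B, p_standard)
print(c)                            # coordinates of the point in the {b1, b2} basis

# Reconstructing the point from the new coordinates recovers the original point.
assert np.allclose(c[0] * b1 + c[1] * b2, p_standard)
```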

Linear Transformations

  • Linear map — A mapping $T$ between two vector spaces such that the mapping preserves vector addition and scalar multiplication. Also called a linear transformation in the context of linear algebra.
    • Vector addition: $T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v})$, where $\mathbf{u}, \mathbf{v}$ are vectors.
    • Scalar multiplication: $T(c\mathbf{v}) = cT(\mathbf{v})$, where $\mathbf{v}$ is a vector and $c$ is a scalar in a field.
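A quick numerical check of both conditions for a matrix map $T(\mathbf{v}) = A\mathbf{v}$ (Python/NumPy; the matrix, vectors, and scalar are arbitrary illustrative values):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])          # an arbitrary matrix representing the map T(v) = A @ v
u = np.array([1.0, -2.0])
v = np.array([4.0, 0.5])
c = 5.0

assert np.allclose(A @ (u + v), A @ u + A @ v)  # preserves vector addition
assert np.allclose(A @ (c * v), c * (A @ v))    # preserves scalar multiplication
```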

Matrices

  • A way to represent linear transformations.
  • Each row of the matrix specifies which linear combination of the input vector's components produces the corresponding entry of the output vector (or matrix). Each column of the input matrix corresponds to a column of the output matrix. That is, $(A\mathbf{x})_i = \sum_j A_{ij}\, x_j$.
  • Can be composed with one another to create a resulting product matrix. Operations are applied from right to left. E.g. the composition $AB\mathbf{v}$ implies first multiplying $\mathbf{v}$ with matrix $B$ and then multiplying the product with $A$. Matrix multiplication is associative: $(AB)C = A(BC)$. It is not commutative however — for matrices $A$ and $B$, in general $AB \neq BA$ (see the sketch after this list).
  • Non-square matrices transform vectors (or matrices) to a higher or lower dimensional space — an $m \times n$ matrix maps $n$-dimensional inputs to $m$-dimensional outputs, depending also on its rank.
  • Geometrically, linear transformations preserve the origin (the zero vector is unaffected) and parallel lines (all vectors are scaled linearly). A linear transformation can be visualized as reshaping the geometric space: each input coordinate picks out a column of the transformation matrix, and the output is the corresponding linear combination of those column vectors.
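A small sketch of composition, associativity, non-commutativity, and a non-square map (Python/NumPy; the matrices are arbitrary illustrative choices):

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])         # rotation by 90 degrees (illustrative)
B = np.array([[2.0, 0.0],
              [0.0, 1.0]])          # scaling along the x-axis (illustrative)
v = np.array([1.0, 1.0])

# The composition A @ B applies B first and A second (right to left).
assert np.allclose((A @ B) @ v, A @ (B @ v))    # matrix multiplication is associative
assert not np.allclose(A @ B, B @ A)            # but generally not commutative

# A non-square matrix maps between spaces of different dimension: here R^3 -> R^2.
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
print(M @ np.array([1.0, 1.0, 1.0]))            # a 2-dimensional output
```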

graphic to show matrix multiplication geometrically

Matrix Properties

  • Rank — The dimension of the vector space spanned by the column (or row) vectors of a matrix. The row rank of a matrix is always equal to its column rank.
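For instance, a matrix whose third column is the sum of the first two spans only a 2-dimensional space (Python/NumPy; hypothetical example matrix):

```python
import numpy as np

# The third column equals the sum of the first two, so only two columns are independent.
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
print(np.linalg.matrix_rank(M))     # 2: the columns span a 2-dimensional subspace of R^3
print(np.linalg.matrix_rank(M.T))   # 2: row rank equals column rank
```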

example of linearly dependent vectors in a matrix

  • Determinant — The ‘scaling factor’ of a linear transformation (a matrix), from a geometric perspective. A matrix transforms the space so that the parallelotope (n-dimensional parallelogram/parallelepiped) spanned by any set of vectors is scaled by some factor — this (signed) factor is the determinant. A zero determinant implies a reduction of rank. Since higher dimensional information is condensed into lower dimensions, information is lost and it is impossible to undo the transformation — i.e. a matrix with zero determinant is non-invertible; it does not have an inverse.
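A short sketch contrasting a non-zero determinant with a zero one (Python/NumPy; arbitrary example matrices):

```python
import numpy as np

# A diagonal matrix that stretches x by 2 and y by 3 scales every area by 6.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
print(np.linalg.det(A))             # 6.0: the unit square maps to a rectangle of area 6

# A rank-deficient matrix collapses the plane onto a line: determinant 0, no inverse.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(B))             # 0.0 (up to floating point error)
# np.linalg.inv(B) would raise numpy.linalg.LinAlgError: the transformation cannot be undone
```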

geometric analogue of determinant

  • Inverse matrices — The linear transformation which undoes another linear transformation. That is, for a matrix $A$, its inverse $A^{-1}$ will undo the transformation of $A$. In essence, applying $A^{-1}A$ is the same as applying the identity transform: $A^{-1}A = I$, where $I$ is the identity matrix (a numerical sketch follows after this list).
  • Identity matrix — Matrix with $1$s on the main diagonal and zeros elsewhere. Applying the identity transformation on a vector or matrix returns the same vector or matrix — it is analogous to an identity element.
  • Transpose — The matrix obtained after reflecting the elements of a matrix along its main diagonal. The transpose of a matrix $A$ (or a vector) is represented as $A^{\mathsf{T}}$.
  • Null space — The vector space which gets mapped to the zero vector by some given linear transform $A$. That is, the null space is $\{\mathbf{v} : A\mathbf{v} = \mathbf{0}\}$.
  • Cross product — The 3-dimensional vector that is orthogonal to two other 3-dimensional vectors, scaled by the area of the parallelogram formed by those two vectors. That is, $\mathbf{u} \times \mathbf{v} = \|\mathbf{u}\|\,\|\mathbf{v}\|\sin(\theta)\,\mathbf{n}$, where $\theta$ is the angle between the two vectors $\mathbf{u}$ and $\mathbf{v}$, and $\mathbf{n}$ is the unit vector perpendicular to both vectors. Cross products are anti-commutative: $\mathbf{u} \times \mathbf{v} = -(\mathbf{v} \times \mathbf{u})$. Lie algebra defines the generalized cross product in higher dimensions.
  • Dot product — The sum of the element-wise products of two vectors. Geometrically, the dot product represents the projection of one vector onto another, scaled by the product of the magnitudes of both vectors. In a way, it measures the angle between two vectors — the dot product is analogous to the cosine function in higher dimensions (but also scaled by the product of the magnitudes of the two vectors). It can be written as $\mathbf{u} \cdot \mathbf{v} = \sum_i u_i v_i$ or $\mathbf{u} \cdot \mathbf{v} = \|\mathbf{u}\|\,\|\mathbf{v}\|\cos(\theta)$ (see the sketch below the figure).
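A short sketch for the inverse, identity, transpose, and null space items above (Python/NumPy; the matrices are arbitrary examples, and the null space is recovered via the SVD rather than a dedicated routine):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)
assert np.allclose(A_inv @ A, np.eye(2))     # A^{-1} A = I, the identity matrix
assert np.allclose(A.T[0, 1], A[1, 0])       # transpose swaps row and column indices

# Null space of a rank-deficient matrix: every vector it sends to the zero vector.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
_, s, Vt = np.linalg.svd(B)
null_basis = Vt[s < 1e-10]                   # rows of V^T whose singular values are (near) zero
assert np.allclose(B @ null_basis.T, 0.0)    # each null-space basis vector maps to the zero vector
```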

geometric visualization of the dot product
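A numerical sketch of the dot and cross products for two arbitrary 3-dimensional vectors (Python/NumPy):

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])

# Dot product: |u||v|cos(theta). Here theta = 45 degrees, so 1 * sqrt(2) * cos(45) = 1.
print(np.dot(u, v))                          # 1.0
print(u @ v)                                 # the same value written as a matrix product u^T v

# Cross product: orthogonal to both inputs, with length equal to the parallelogram area.
w = np.cross(u, v)
print(w)                                     # [0. 0. 1.]
assert np.isclose(np.dot(w, u), 0.0) and np.isclose(np.dot(w, v), 0.0)
assert np.allclose(np.cross(v, u), -w)       # anti-commutative: v x u = -(u x v)
```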

  • Duality — A dot product can alternatively be defined as the matrix product $\mathbf{u} \cdot \mathbf{v} = \mathbf{u}^{\mathsf{T}}\mathbf{v}$ (a row vector applied to a column vector) because of duality. The dual of a vector is a linear transformation that maps vectors to scalars. E.g. in physics, velocity can be represented as a vector in a vector space $V$, and its dual can act as a transformation that returns a scalar displacement.
  • Eigenvectors — Some linear transformations may not change (or may only reverse) the direction of certain vectors, changing only their magnitude. That is, for a non-identity transformation matrix $A$, there may exist non-zero vectors $\mathbf{v}$ such that $A\mathbf{v} = \lambda\mathbf{v}$, where $\lambda$ is a scalar. Then the vector $\mathbf{v}$ is an eigenvector of the matrix $A$ and the scalar $\lambda$ is one of its eigenvalues.
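A sketch of eigenvectors and eigenvalues for an arbitrary symmetric example matrix (Python/NumPy):

```python
import numpy as np

# This matrix stretches the direction (1, 1) by 3 and leaves the direction (1, -1) unchanged.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)   # columns of `eigenvectors` are the eigenvectors
print(eigenvalues)                             # eigenvalues 3 and 1 (order may vary)

# Each eigenvector is only scaled by A, never rotated: A v = lambda v.
for lam, vec in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ vec, lam * vec)
```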

Affine Transformation

Affine transformations are similar to linear transformations, but do not necessarily preserve the zero vector. Geometrically, affine transformations preserve parallel lines (since vectors are still scaled linearly) but do not preserve the origin (zero vector) — the transformation acts as a linear transformation followed by a ‘translation’, i.e. $f(\mathbf{v}) = A\mathbf{v} + \mathbf{b}$.
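A sketch of an affine map $f(\mathbf{v}) = A\mathbf{v} + \mathbf{b}$ (Python/NumPy; the linear part and translation are arbitrary illustrative values), showing that the origin moves while parallel directions are preserved:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])         # the linear part: rotation by 90 degrees (illustrative)
b = np.array([2.0, 1.0])            # the translation part

def affine(v):
    return A @ v + b                # affine map: linear transformation followed by a translation

# The zero vector is not preserved: the origin is moved to b.
print(affine(np.zeros(2)))          # [2. 1.]

# Parallel lines stay parallel: displacement vectors are transformed by the linear part alone.
p, q, d = np.array([0.0, 0.0]), np.array([5.0, 5.0]), np.array([1.0, 2.0])
assert np.allclose(affine(p + d) - affine(p), affine(q + d) - affine(q))
```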


This page is a part of Mathematics.