Linear Algebra

Cards (37)

  • norm of a vector = ||x|| = sqrt(x . x)
  • A vector x is a linear combination of a collection of vectors x1, x2, ..., xm if there exist real numbers a1, a2, ..., am such that x = a1x1 + a2x2 + ... + amxm
  • A collection of vectors is linearly independent if none of them is a linear combination of the others
  • Intuition for linear independence: you cannot make a menu consisting of French Fries out of a Big Mac (independence), but you can make a menu consisting of 2 Big Macs and French Fries out of solo menus (a linear combination of linearly independent vectors)
  • A collection of linearly independent vectors is a basis of Rn if every vector in Rn is a linear combination of these vectors
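  The independence and basis cards above can be checked numerically. A minimal sketch in NumPy (the example vectors are my own): stack the vectors as columns of a matrix and compare its rank to the number of vectors.

    ```python
    import numpy as np

    # Columns are x1, x2, x3 in R^3 (illustrative examples, not from the cards).
    X = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0]])

    # rank == number of vectors  -> linearly independent
    # rank == n as well          -> they form a basis of R^n
    rank = np.linalg.matrix_rank(X)
    print(rank == 3)  # True: three independent vectors form a basis of R^3
    ```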
  • In a row echelon matrix, the leading nonzero entry of each row lies strictly to the right of the one in the row above, so all entries below the leading entries are zero
  • inner product = scalar product = dot product
  • the matrix inverse is only defined for square matrices
  • if an inverse of a matrix exists, it is unique
  • whenever the r.h.s. of all equations in a linear system is zero, the system is homogeneous; it always has the trivial solution x = 0, so it has either exactly one solution or infinitely many
  • For a square system A . x = b, invertibility of A is equivalent to having exactly one solution
  • non-invertibility is equivalent to having no solutions or infinitely many
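  A quick numerical illustration of the invertibility cards (the matrix and right-hand side are my own examples): a nonzero determinant guarantees exactly one solution.

    ```python
    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([3.0, 5.0])

    # det A = 5 != 0, so A is invertible and A . x = b has exactly one solution
    assert abs(np.linalg.det(A)) > 1e-12
    x = np.linalg.solve(A, b)
    print(np.allclose(A @ x, b))  # True
    ```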
  • Gauss-Jordan elimination operations
    swapping rows, adding k times one row to another, multiplying a row by a nonzero scalar
  • Gauss-Jordan elimination
    create a row echelon matrix using the allowed operations. If the resulting matrix is full rank, the system has a unique solution
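  The elimination cards can be sketched in code. This is a minimal Gauss-Jordan implementation of my own (not a library routine), using exactly the three allowed row operations on the augmented matrix [A | b]; it assumes A is square and invertible.

    ```python
    import numpy as np

    def gauss_jordan(A, b):
        # Work on the augmented matrix [A | b]
        M = np.column_stack([A.astype(float), b.astype(float)])
        n = M.shape[0]
        for col in range(n):
            # Operation 1: swap rows (partial pivoting for numerical stability)
            pivot = col + np.argmax(np.abs(M[col:, col]))
            M[[col, pivot]] = M[[pivot, col]]
            # Operation 2: multiply a row by a nonzero scalar
            M[col] /= M[col, col]
            # Operation 3: add k times one row to another
            for row in range(n):
                if row != col:
                    M[row] -= M[row, col] * M[col]
        # Left block is now the identity; last column is the solution
        return M[:, -1]

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    b = np.array([5.0, 6.0])
    print(np.allclose(gauss_jordan(A, b), np.linalg.solve(A, b)))  # True
    ```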
  • the determinant of a triangular matrix is the product of its diagonal entries
  • the linear system A . x = b, with n equations and n unknowns, has a unique solution if and only if det A is nonzero
  • an n x m matrix A is full rank if rank A = min {n, m}
  • A system of linear equations A . x = b has solutions if and only if the rank of the matrix of coefficients A is equal to the rank of the augmented matrix (A,b)
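  The rank condition in the card above (the Rouché-Capelli test) is easy to check numerically; a sketch with my own example matrix, where the second row is twice the first:

    ```python
    import numpy as np

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])   # rank 1
    b_consistent   = np.array([3.0, 6.0])
    b_inconsistent = np.array([3.0, 7.0])

    def has_solution(A, b):
        # Solutions exist iff rank(A) == rank of the augmented matrix (A, b)
        return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack([A, b]))

    print(has_solution(A, b_consistent))    # True  (infinitely many solutions)
    print(has_solution(A, b_inconsistent))  # False (no solution)
    ```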
  • trace of a square matrix is the sum of the elements of its diagonal
  • the span of a collection of n-dimensional vectors is the set of all their linear combinations
  • span [x1, x2, ..., xm]

    {x: x = a1x1 + a2x2 + ... + amxm}
  • a basis is a minimal collection of vectors which spans Rn
  • a collection of n-dimensional vectors x1, x2, ..., xm spans Rn if and only if rank X = n, where X is the matrix with x1, ..., xm as its columns
  • Vector b is a linear combination of vectors x1, x2, ..., xm (i.e. belongs to their span, or is spanned by these vectors) if and only if the linear system X . y = b has at least one solution, where the columns of X are x1, ..., xm
  • linear independence is equivalent to: a1x1 + ... + amxm = 0 implies a1 = ... = am = 0; equivalently, the homogeneous system X . a = 0 has only the trivial solution
  • a set of n vectors in Rn is a basis if and only if they are linearly independent
  • a linearly independent collection of vectors is a basis if adding any other vector makes the collection dependent
  • every set of independent vectors can be extended to a basis
  • an eigenvalue and an eigenvector of an n-square matrix A are a scalar lambda and an n-dimensional vector x such that A . x = lambda x
  • the eigenvalues of A are the roots of the characteristic polynomial, i.e. the determinant of A - lambda . I
  • det A = product of eigenvalues
  • tr(A) = sum of eigenvalues
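  The two identities above (det = product of eigenvalues, trace = sum of eigenvalues) can be verified on a small example of my own:

    ```python
    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])  # eigenvalues 5 and 2
    eigvals = np.linalg.eigvals(A)

    print(np.isclose(np.prod(eigvals), np.linalg.det(A)))  # True: det A = product
    print(np.isclose(np.sum(eigvals), np.trace(A)))        # True: tr(A) = sum
    ```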
  • det A < 0 (2 x 2 matrix)

    the eigenvalues are real and have opposite signs
  • det A > 0 (2 x 2 matrix with real eigenvalues)

    the eigenvalues have the same sign
  • If an n-square matrix has n real and distinct eigenvalues, the corresponding eigenvectors are independent and span Rn
  • Sylvester criterion for checking whether a symmetric matrix A is negative definite:

    the corner (leading principal) minors alternate in sign starting negative, i.e. (-1)^k det Ak > 0 for k = 1, ..., n; for negative semi-definiteness the corresponding weak inequalities must hold for all principal minors, not only the corner ones
  • A symmetric matrix A is negative semi-definite if and only if all its eigenvalues are non-positive
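  The eigenvalue test in the last card gives a simple numerical check; a sketch assuming a symmetric input (the helper name and example matrix are my own):

    ```python
    import numpy as np

    def is_negative_semidefinite(A, tol=1e-10):
        # eigvalsh is for symmetric/Hermitian matrices and returns real eigenvalues
        eigvals = np.linalg.eigvalsh(A)
        # negative semi-definite iff every eigenvalue <= 0 (up to tolerance)
        return bool(np.all(eigvals <= tol))

    A = np.array([[-2.0, 1.0],
                  [1.0, -2.0]])  # eigenvalues -1 and -3
    print(is_negative_semidefinite(A))  # True
    ```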