• Adjugate matrix: The transpose of the matrix of cofactors of a matrix.
  • Algebraic expression: a combination of numbers, variables, and operators.
  • Automorphism: An endomorphism that has an inverse.
  • Basis: A set of linearly independent vectors that span a vector space.
  • Characteristic equation: The equation obtained by equating the characteristic polynomial to zero.
  • Characteristic polynomial: The polynomial given by the determinant of the matrix A - λI (example after the list).
  • Cholesky decomposition: A factorization of a positive definite matrix into the product of a lower triangular matrix and its conjugate transpose (example after the list).
  • Coefficient: a number that multiplies a variable in a term. For example, in the term “3x”, the coefficient is 3.
  • Cofactor matrix: A matrix whose entries are the minors of a given matrix (the determinants of the submatrices obtained by deleting one row and one column), with alternating signs.
  • Column space: The set of all linear combinations of the columns of a matrix.
  • Commuting matrices: Two matrices A and B whose products in either order are equal, i.e. AB = BA.
  • Conjugate transpose: The transpose of a matrix with every entry replaced by its complex conjugate.
  • Constant: a term with no variable, such as “5” or “-2”.
  • Cross product: a vector product of two vectors in three-dimensional space (example after the list).
  • Cyclic matrix: A matrix that has a cyclic vector, i.e. whose powers applied to some vector generate the whole space.
  • Cyclic vector: For a given matrix A, a vector v such that v, Av, A^2 v, … span the whole space.
  • Derivative: the rate of change of a function at a particular point.
  • Determinant: A scalar value that can be computed from a square matrix; in linear algebra it represents the factor by which the corresponding linear transformation scales volumes (example after the list).
  • Diagonal matrix: A matrix in which the entries outside the main diagonal are all zero.
  • Diagonalizable matrix: A square matrix that can be transformed into a diagonal matrix through a similarity transformation.
  • Differentiation: the process of finding the derivative of a function.
  • Dot product: a scalar product of two vectors (example after the list).
  • Eigenvalue problem: The problem of finding the scalars (eigenvalues) and non-zero vectors (eigenvectors) that satisfy Av = λv for a given linear operator or matrix A.
  • Eigenvalue: A scalar λ that satisfies the equation Av = λv for a given square matrix A and some non-zero vector v (example after the list).
  • Eigenvector: A non-zero vector v that satisfies the equation Av = λv for a given square matrix A and some scalar λ.
  • Endomorphism: A function from a vector space to itself that preserves the operations of the vector space.
  • Equation: a statement asserting the equality of two expressions, often written using the symbol “=”.
  • Exponent: the power to which a number is raised.
  • Factor: a number or expression that divides into another number or expression without leaving a remainder.
  • Hermitian matrix: A square matrix that is equal to its conjugate transpose.
  • Homomorphism: A function between two algebraic structures that preserves the operations of the structures.
  • Identity: An equation that is always true, such as x + 0 = x.
  • Inner product space: A vector space with an inner product defined on it.
  • Inner product: A mathematical operation that combines two vectors to form a scalar, used in inner product spaces.
  • Integral: The limit of sums of the values of a function over an interval; informally, the area under the function's graph.
  • Inverse matrix: a matrix that, when multiplied by the original matrix, results in the identity matrix.
  • Inverse operation: an operation that undoes the effect of another operation. For example, addition is the inverse of subtraction, and multiplication is the inverse of division.
  • Invertible matrix: A square matrix that has an inverse.
  • Isomorphism: A one-to-one correspondence between two mathematical structures that preserves the operations of the structures.
  • Jordan form: A canonical form for a matrix in which the matrix is transformed into a block-diagonal form, with each block corresponding to a single eigenvalue.
  • Like terms: terms that have the same variable raised to the same power.
  • Linear dependence: A set of vectors is said to be linearly dependent if at least one of them can be represented as a linear combination of the others.
  • Linear equation: an equation in which the highest power of the variable is 1. Examples: “3x + 2 = 0” and “y = -5x + 3”
  • Linear independence: A set of vectors is said to be linearly independent if none of them can be represented as a linear combination of the others.
  • Linear operator: A linear map from a vector space to itself (or to another vector space); it preserves vector addition and scalar multiplication.
  • Linear transformation: a function between vector spaces that preserves vector addition and scalar multiplication.
  • Logarithm: the exponent to which a base must be raised to produce a given value.
  • LU decomposition: A factorization of a matrix into the product of a lower triangular matrix and an upper triangular matrix (example after the list).
  • Matrix inverse: If a matrix A is invertible, there exists a matrix A^(-1) such that A A^(-1) = A^(-1) A = I, where I is the identity matrix (example after the list).
  • Matrix: a rectangular array of numbers or symbols arranged in rows and columns, used in linear algebra and related fields.
  • Negative definite matrix: A Hermitian (or real symmetric) matrix whose eigenvalues are all negative.
  • Negative semi-definite matrix: A Hermitian (or real symmetric) matrix whose eigenvalues are all non-positive.
  • Non-singular matrix: A square matrix that has an inverse.
  • Norm: a scalar value that represents the magnitude of a vector.
  • Normal matrix: A matrix that commutes with its conjugate transpose.
  • Null space: The set of all vectors that are mapped to the zero vector by a linear operator.
  • Orthogonal complement: The set of all vectors in a vector space that are orthogonal to a given subspace.
  • Orthogonal matrix: A square matrix whose columns and rows are mutually orthonormal and whose determinant is equal to 1 or -1.
  • Orthogonal projection: The projection of a vector onto a subspace such that the difference between the vector and its projection is orthogonal to that subspace (example after the list).
  • Orthogonal: Two vectors are said to be orthogonal if their dot product is equal to zero.
  • Orthonormal basis: A basis of a vector space consisting of mutually orthonormal vectors.
  • Orthonormal matrix: Another name for an orthogonal matrix, i.e. a square matrix whose columns and rows are mutually orthonormal.
  • Orthonormal set: A set of vectors that are mutually orthonormal.
  • Orthonormal: A set of vectors is said to be orthonormal if the vectors are mutually orthogonal and each has a norm equal to one.
  • Positive definite matrix: A Hermitian (or real symmetric) matrix whose eigenvalues are all positive.
  • Positive semi-definite matrix: A Hermitian (or real symmetric) matrix whose eigenvalues are all non-negative.
  • Power: a mathematical operation that calculates the value of a number raised to a given exponent.
  • Pseudoinverse: A generalization of the inverse matrix, used when a matrix is not invertible (example after the list).
  • QR decomposition: A factorization of a matrix into the product of an orthogonal matrix and an upper triangular matrix (example after the list).
  • Quadratic equation: an equation in which the highest power of the variable is 2. Example: “x^2 + 3x - 4 = 0”.
  • Rank: The number of linearly independent rows or columns in a matrix; equivalently, the dimension of the vector space spanned by its rows or columns.
  • Rank-nullity theorem: For any matrix A, the rank of A plus the dimension of the null space of A equals the number of columns of A (example after the list).
  • Row space: The set of all linear combinations of the rows of a matrix.
  • Scalar product: a type of mathematical operation that combines two vectors to form a scalar.
  • Scalar: a single number, as opposed to a vector or matrix; it can represent the magnitude of a quantity.
  • Similar matrices: Two matrices A and B related by a similarity transformation, B = P^(-1) A P for some invertible matrix P.
  • Simplest form: a simplified version of an expression with no negative exponents and no fractions in the denominator.
  • Simplifying: the process of making an expression or equation simpler or easier to understand or solve.
  • Singular matrix: A square matrix that does not have an inverse.
  • Skew-Hermitian matrix: A square matrix that is equal to the negative of its conjugate transpose.
  • Skew-symmetric matrix: A square matrix that is equal to the negative of its transpose.
  • Solution: the value or values of the variable(s) that make an equation true.
  • Spectral norm: The largest singular value of a matrix, also known as the induced norm or operator norm; it equals the largest absolute eigenvalue only for normal matrices (example after the list).
  • Spectral radius: The maximum absolute value of the eigenvalues of a matrix.
  • Spectral theorem: A theorem that states that any normal matrix is unitarily diagonalizable.
  • Symmetric matrix: A square matrix that is equal to its transpose.
  • Synthetic Division: a method to divide a polynomial by a binomial of the form x - c (example after the list).
  • System of equations: a set of equations with multiple variables that are solved simultaneously.
  • Term: a mathematical expression that may be combined with other terms to form a larger expression or equation.
  • Trace: The sum of the diagonal entries of a square matrix.
  • Transpose: The operation of flipping a matrix along its main diagonal, interchanging its rows and columns.
  • Triangular matrix: A matrix that is either upper triangular or lower triangular, meaning all entries below or above the main diagonal are zero.
  • Unitary matrix: A square matrix whose inverse is equal to its conjugate transpose.
  • Variable: a letter or symbol used to represent an unknown value in an equation.
  • Vector product: a type of mathematical operation that combines two vectors to form another vector.
  • Vector: an element of a vector space that can be represented by a set of coordinates.
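
The code sketches below expand on several entries above; each is cross-referenced from its entry. They are minimal illustrations, not reference implementations: they assume NumPy (and SciPy where imported), and every matrix and vector is a small hand-picked example rather than anything implied by the definitions. First, the characteristic polynomial and characteristic equation: `np.poly` returns the coefficients of det(A - λI) for a square matrix, and the roots of that polynomial are the eigenvalues.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of det(A - lambda*I) = lambda^2 - 4*lambda + 3, highest degree first.
coeffs = np.poly(A)
print(coeffs)            # [ 1. -4.  3.]

# Setting the characteristic polynomial to zero gives the characteristic
# equation; its roots are the eigenvalues of A.
print(np.roots(coeffs))  # [3. 1.]
```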
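
Cholesky decomposition, sketched with `np.linalg.cholesky` on a hand-picked positive definite matrix: the lower triangular factor times its conjugate transpose recovers the original matrix.

```python
import numpy as np

# A symmetric positive definite matrix (both eigenvalues are positive).
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

L = np.linalg.cholesky(A)              # lower triangular factor
assert np.allclose(L @ L.T.conj(), A)  # L times its conjugate transpose is A
print(L)
```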
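
The determinant as a scaling factor: the matrix below stretches the x-axis by 3 and the y-axis by 2, so it scales areas by 6, which is exactly its determinant.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])

print(np.linalg.det(A))  # 6.0 (up to floating point): areas are scaled by 6
```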
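
Dot and cross products on standard basis vectors: the dot product of orthogonal vectors is zero, and the cross product is a vector perpendicular to both inputs.

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])

print(np.dot(u, v))    # 0.0 -> u and v are orthogonal
print(np.cross(u, v))  # [0. 0. 1.] -> perpendicular to both u and v
```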
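
Eigenvalues and eigenvectors via `np.linalg.eig`, checking the defining relation Av = λv for each eigenpair.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` is an eigenvector for the matching eigenvalue.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)   # the defining relation Av = lambda*v

print(eigenvalues)  # [3. 1.]
```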
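
The matrix inverse via `np.linalg.inv`, with a check of the defining property A A^(-1) = A^(-1) A = I.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # invertible: det(A) = -2, not 0

A_inv = np.linalg.inv(A)
I = np.eye(2)
assert np.allclose(A @ A_inv, I) and np.allclose(A_inv @ A, I)
```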
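
LU decomposition via `scipy.linalg.lu`, which also returns a permutation matrix P from the row pivoting it performs, so that A = P L U.

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])

P, L, U = lu(A)   # P permutation, L lower triangular, U upper triangular
assert np.allclose(P @ L @ U, A)
```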
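
Orthogonal projection of a vector onto the line spanned by a unit vector; the residual is orthogonal to that line, which is what makes the projection "orthogonal".

```python
import numpy as np

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])    # unit vector spanning the target line

proj = np.dot(v, u) * u     # projection of v onto span(u): [3. 0.]
residual = v - proj         # [0. 4.]
assert np.isclose(np.dot(residual, u), 0.0)   # residual is orthogonal to u
```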
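
The pseudoinverse on a singular matrix, where `np.linalg.inv` would fail; the Moore-Penrose condition A A+ A = A still holds.

```python
import numpy as np

# Singular: the second row is twice the first, so no true inverse exists.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

A_pinv = np.linalg.pinv(A)
assert np.allclose(A @ A_pinv @ A, A)   # Moore-Penrose condition
```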
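
QR decomposition with `np.linalg.qr`: Q has orthonormal columns and R is upper triangular.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

Q, R = np.linalg.qr(A)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(2))   # columns of Q are orthonormal
```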
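
The rank-nullity theorem checked numerically: the rank plus the nullity (the dimension of the null space, here via `scipy.linalg.null_space`) equals the number of columns.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])     # rank 1: the rows are proportional

rank = np.linalg.matrix_rank(A)     # 1
nullity = null_space(A).shape[1]    # 2
assert rank + nullity == A.shape[1] # 1 + 2 == 3 columns
```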
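
Spectral norm versus spectral radius on a non-normal matrix, where the two differ: `np.linalg.norm(A, 2)` returns the largest singular value, while the spectral radius is the largest absolute eigenvalue.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # non-normal: A @ A.T != A.T @ A

spectral_norm = np.linalg.norm(A, 2)               # largest singular value: 1.0
spectral_radius = max(abs(np.linalg.eigvals(A)))   # largest |eigenvalue|: 0.0
print(spectral_norm, spectral_radius)
```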
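
Finally, a small hand-rolled synthetic division routine (the function name and interface are this sketch's own, not a standard library API): it divides a polynomial, given highest-degree coefficient first, by x - c.

```python
def synthetic_division(coeffs, c):
    """Divide a polynomial by (x - c); coeffs are highest degree first.

    Returns (quotient coefficients, remainder).
    """
    table = [coeffs[0]]
    for a in coeffs[1:]:
        table.append(a + c * table[-1])   # bring down, multiply, add
    return table[:-1], table[-1]

# (x^2 + 3x - 4) / (x - 1): quotient x + 4, remainder 0, since 1 is a root.
quotient, remainder = synthetic_division([1, 3, -4], 1)
print(quotient, remainder)  # [1, 4] 0
```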