Matrix Algebra

Addition and Subtraction

Involving Scalars

Subtraction of a scalar proceeds much as it does for ordinary numbers, with the exception that we apply the operation only to the elements of the principal diagonal, i.e.,

`bb"M"-lambda=[[1,2],[3,4]]-lambda=[[1-lambda,2],[3,4-lambda]]`

for example,

`bb"M"-1=[[1,2],[3,4]]-1=[[1-1,2],[3,4-1]]=[[0,2],[3,3]]`

Addition of a scalar proceeds similarly,

`bb"M"+lambda=[[1,2],[3,4]]+lambda=[[1+lambda,2],[3,4+lambda]]`

for example,

`bb"M"+1=[[1,2],[3,4]]+1=[[1+1,2],[3,4+1]]=[[2,2],[3,5]]`

Involving Matrices

When subtracting one matrix from another, we work element by element, subtracting each element of one matrix from the corresponding element of the other, i.e.,

`bb"N"-bb"M"=[[5,7],[9,11]]-[[1,2],[3,4]]=[[5-1,7-2],[9-3,11-4]]=[[4,5],[6,7]]`

Addition of matrices proceeds similarly,

`bb"N"+bb"M"=[[5,7],[9,11]]+[[1,2],[3,4]]=[[5+1,7+2],[9+3,11+4]]=[[6,9],[12,15]]`

Multiplication

Multiplication by a scalar

Matrices behave like regular numbers when we multiply them by a scalar: we simply multiply each matrix element by the scalar,

`lambda bb"M"=bb"M"lambda=lambda [[1,2],[3,4]]=[[lambda,2lambda],[3lambda,4lambda]]`

Multiplication by a Matrix

Multiplication by a matrix is a different beast. Matrix multiplication is not, in general, commutative: the order of the factors is significant, and `bb"M"bb"N"` and `bb"N"bb"M"` will usually give different answers.
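A short NumPy sketch illustrating this, using the matrices `bb"M"` and `bb"N"` from above (the `@` operator performs matrix multiplication; NumPy is assumed):

```python
import numpy as np

M = np.array([[1, 2],
              [3, 4]])
N = np.array([[5, 7],
              [9, 11]])

# The @ operator performs matrix multiplication; the order of the factors matters.
print(M @ N)   # [[23 29] [51 65]]
print(N @ M)   # [[26 38] [42 62]]  -- a different result
```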

Powers

For a square matrix, raising the matrix to a positive integer power can be evaluated as a repeated product of the matrix with itself, for example,

`bb"M"^3=bb"M"*bb"M"*bb"M"`

However, because the inner dimensions do not match (an `m xx n` matrix cannot be multiplied by another `m xx n` matrix unless `m = n`), it is not possible to raise a rectangular matrix to any integer power greater than 1.
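A sketch of this in NumPy, where `numpy.linalg.matrix_power` performs the same repeated product for square matrices (NumPy is assumed):

```python
import numpy as np

M = np.array([[1, 2],
              [3, 4]])

# M cubed as an explicit chain of products...
chain = M @ M @ M
# ...or directly via matrix_power (square matrices only).
cubed = np.linalg.matrix_power(M, 3)

print(np.array_equal(chain, cubed))   # True
print(cubed)                          # [[ 37  54] [ 81 118]]
```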


The Identity Matrix

The identity matrix is the matrix equivalent of the scalar value 1. In group theory, the identity is the 'do nothing' operation: applied to a matrix of matching dimensions, it returns that matrix unchanged. The identity matrix has all zero elements, except those on the principal diagonal, which are all equal to 1; for example, the 2x2 identity matrix is,

`bb"I"_2=[[1,0],[0,1]]`

The Inverse

The inverse of a matrix is one which, when multiplied by the original matrix, gives the identity matrix, analogous to the scalar expression `x*x^(-1)=1`. Following this formulation we have,

`bb"M"*bb"M"^(-1)=I`

The Transpose

The transpose of a matrix swaps the rows for the columns; in the simplest case of a column vector, the matrix is just rotated onto its side,

`bb"M"=[[1],[2],[3]]`

Writing `bb"M"^"T"` for the transpose, we have,

`bb"M"^"T"=[[1,2,3]]`

For a larger matrix, we reflect the matrix horizontally and then rotate it anti-clockwise, or equivalently, we reflect the matrix across the principal diagonal.

More formally, the transpose swaps the row and column indices of each element of a matrix: `bb"A"_(i,j)^"T"=bb"A"_(j,i)`. Hence we can see why the elements along the principal diagonal, for which the two indices are equal (i.e., `bb"A"_(1,1)`, `bb"A"_(2,2)` and `bb"A"_(3,3)`), remain unchanged by the transpose.
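In NumPy the transpose is available as the `.T` attribute; a short sketch covering both the column-vector case above and a larger (here 3x2) matrix, which is an assumed example rather than one from the text:

```python
import numpy as np

M = np.array([[1],
              [2],
              [3]])

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])

# .T swaps rows for columns.
print(M.T)   # [[1 2 3]]
print(A.T)   # [[1 3 5] [2 4 6]]
```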

The Determinant