We have seen how to compute the determinant of a matrix, and the somewhat surprising fact that we can expand about any row or column to make this computation. In this largely theoretical section, we will state and prove several more intriguing properties of determinants. Our main goals are the two results in Theorem SMZD and Theorem DRMM; along the way we will see how the value of a determinant lets us decide whether a square matrix is singular or nonsingular.

## Determinants and Row Operations

We start easy with a straightforward theorem whose proof presages the style of subsequent proofs in this subsection.

Theorem DZRC (Determinant with Zero Row or Column) Suppose that $A$ is a square matrix with a row where every entry is zero, or a column where every entry is zero. Then $\detname{A}=0$.
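To see why this is immediate, consider a small example of our own and expand about the zero row: every term in the expansion carries a factor of zero.

```latex
A = \begin{bmatrix} 1 & 2 \\ 0 & 0 \end{bmatrix},
\qquad
\detname{A} = (0)\,C_{2,1} + (0)\,C_{2,2} = 0
```

Here $C_{2,1}$ and $C_{2,2}$ are the cofactors of the second row; their values are irrelevant, since each is multiplied by zero.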

Theorem DRCS (Determinant for Row or Column Swap) Suppose that $A$ is a square matrix. Let $B$ be the square matrix obtained from $A$ by interchanging the location of two rows, or interchanging the location of two columns. Then $\detname{B}=-\detname{A}$.
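Readers who like to experiment can check Theorem DRCS numerically. Here is a minimal Python check on a $2\times 2$ matrix of our own choosing, using the $2\times 2$ formula $ad-bc$ (Theorem DMST); the helper name `det2` is our own.

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[3, 1],
     [4, 2]]
B = [A[1], A[0]]     # B: the two rows of A interchanged

print(det2(A))       # 3*2 - 1*4 = 2
print(det2(B))       # the swap negates the determinant: -2
```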

So Theorem DRCS tells us the effect of the first row operation (Definition RO) on the determinant of a matrix. Here's the effect of the second row operation.

Theorem DRCM (Determinant for Row or Column Multiples) Suppose that $A$ is a square matrix. Let $B$ be the square matrix obtained from $A$ by multiplying a single row by the scalar $\alpha$, or by multiplying a single column by the scalar $\alpha$. Then $\detname{B}=\alpha\detname{A}$.
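The same kind of numerical spot-check works for Theorem DRCM; again the matrix and helper are our own illustration, not part of the theorem.

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

alpha = 5
A = [[3, 1],
     [4, 2]]
B = [[alpha * x for x in A[0]], A[1]]   # first row of A scaled by alpha

print(det2(A))   # 2
print(det2(B))   # alpha * det(A) = 10
```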

Let's go for understanding the effect of all three row operations. But first we need an intermediate result, though it is an easy one.

Theorem DERC (Determinant with Equal Rows or Columns) Suppose that $A$ is a square matrix with two equal rows, or two equal columns. Then $\detname{A}=0$.
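One way to see why this must hold: swapping the two equal rows leaves the matrix unchanged, yet Theorem DRCS says the determinant is negated, so $\detname{A}=-\detname{A}$, forcing $\detname{A}=0$. A quick check on a matrix of our own:

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[3, 1],
     [3, 1]]    # two equal rows

print(det2(A))  # 3*1 - 1*3 = 0
```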

Now we can explain the effect of the third row operation. Here we go.

Theorem DRCMA (Determinant for Row or Column Multiples and Addition) Suppose that $A$ is a square matrix. Let $B$ be the square matrix obtained from $A$ by multiplying a row by the scalar $\alpha$ and then adding it to another row, or by multiplying a column by the scalar $\alpha$ and then adding it to another column. Then $\detname{B}=\detname{A}$.
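Once more, a small numerical check (our own example) makes Theorem DRCMA concrete:

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

alpha = 2
A = [[3, 1],
     [4, 2]]
# add alpha times row 1 to row 2
B = [A[0], [A[1][j] + alpha * A[0][j] for j in range(2)]]

print(det2(A))   # 2
print(det2(B))   # unchanged: 2
```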

Is this what you expected? We could argue that the third row operation is the most popular, and yet it has no effect whatsoever on the determinant of a matrix! We can exploit this, along with our understanding of the other two row operations, to provide another approach to computing a determinant. We'll explain this in the context of an example.

Example DRO: Determinant by row operations.
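The strategy behind computing a determinant by row operations can also be sketched in code: reduce the matrix with row operations while tracking the cumulative effect of each operation on the determinant. This is a minimal sketch with our own naming, not an excerpt from Example DRO.

```python
def det_by_row_ops(A):
    """Determinant via row reduction, tracking how each row operation
    changes the determinant (Theorems DRCS, DRCM, DRCMA)."""
    A = [row[:] for row in A]          # work on a copy
    n = len(A)
    det = 1
    for k in range(n):
        # locate a nonzero pivot in column k, at or below the diagonal
        pivot_row = next((i for i in range(k, n) if A[i][k] != 0), None)
        if pivot_row is None:
            return 0                   # no pivot available: determinant is zero
        if pivot_row != k:
            A[k], A[pivot_row] = A[pivot_row], A[k]
            det = -det                 # Theorem DRCS: a row swap negates det
        pivot = A[k][k]
        det *= pivot                   # Theorem DRCM: scaling row k by 1/pivot
        A[k] = [x / pivot for x in A[k]]   # divides the determinant by pivot
        for i in range(k + 1, n):
            factor = A[i][k]           # Theorem DRCMA: no effect on det
            A[i] = [A[i][j] - factor * A[k][j] for j in range(n)]
    return det                         # remaining matrix has determinant 1

print(det_by_row_ops([[1, 2], [3, 4]]))    # -2.0
print(det_by_row_ops([[0, 1, 2],
                      [1, 0, 3],
                      [4, -3, 8]]))        # -2.0
```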

## Determinants, Row Operations, Elementary Matrices

As a final preparation for our two most important theorems about determinants, we prove a handful of facts about the interplay of row operations, matrix multiplication, and elementary matrices with regard to the determinant. But first, a simple, but crucial, fact about the identity matrix.

Theorem DIM (Determinant of the Identity Matrix) For every $n\geq 1$, $\detname{I_n}=1$.

Theorem DEM (Determinants of Elementary Matrices) For the three possible versions of an elementary matrix (Definition ELEM) we have the determinants,

1. $\detname{\elemswap{i}{j}}=-1$
2. $\detname{\elemmult{\alpha}{i}}=\alpha$
3. $\detname{\elemadd{\alpha}{i}{j}}=1$
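These three values are easy to confirm in the $2\times 2$ case. Below, the three matrices are the size-$2$ elementary matrices (our own concrete instances), and `det2` is the $2\times 2$ formula from Theorem DMST.

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

alpha = 7
E_swap = [[0, 1], [1, 0]]        # interchange rows 1 and 2
E_mult = [[alpha, 0], [0, 1]]    # multiply row 1 by alpha
E_add  = [[1, 0], [alpha, 1]]    # add alpha times row 1 to row 2

print(det2(E_swap))  # -1
print(det2(E_mult))  # alpha = 7
print(det2(E_add))   # 1
```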

Theorem DEMMM (Determinants, Elementary Matrices, Matrix Multiplication) Suppose that $A$ is a square matrix of size $n$ and $E$ is any elementary matrix of size $n$. Then \begin{equation*} \detname{EA}=\detname{E}\detname{A} \end{equation*}
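A numerical spot-check of Theorem DEMMM, with a row-swap elementary matrix and a matrix of our own (the helpers `det2` and `matmul2` are our own names):

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

E = [[0, 1], [1, 0]]     # elementary matrix: swap rows 1 and 2
A = [[3, 1], [4, 2]]

print(det2(matmul2(E, A)))   # det(EA) = -2
print(det2(E) * det2(A))     # (-1) * 2 = -2
```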

## Determinants, Nonsingular Matrices, Matrix Multiplication

If you asked someone with substantial experience working with matrices about the value of the determinant, they'd be likely to quote the following theorem as the first thing to come to mind.

Theorem SMZD (Singular Matrices have Zero Determinants) Let $A$ be a square matrix. Then $A$ is singular if and only if $\detname{A}=0$.
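For $2\times 2$ matrices the theorem is easy to watch in action. In this small example of our own, `S` is singular (its second row is twice its first, so its rows are linearly dependent) while `N` is nonsingular:

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

S = [[1, 2],
     [2, 4]]    # second row is twice the first: singular
N = [[1, 2],
     [3, 4]]    # rows are not multiples of each other: nonsingular

print(det2(S))  # 0
print(det2(N))  # -2, nonzero
```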

For the case of $2\times 2$ matrices you might compare the application of Theorem SMZD with the combination of the results stated in Theorem DMST and Theorem TTMI.

Example ZNDAB: Zero and nonzero determinant, Archetypes A and B.

Since Theorem SMZD is an equivalence (Proof Technique E) we can expand on our growing list of equivalences about nonsingular matrices. The addition of the condition $\detname{A}\neq 0$ is one of the best motivations for learning about determinants.

Theorem NME7 (Nonsingular Matrix Equivalences, Round 7) Suppose that $A$ is a square matrix of size $n$. The following are equivalent.

1. $A$ is nonsingular.
2. $A$ row-reduces to the identity matrix.
3. The null space of $A$ contains only the zero vector, $\nsp{A}=\set{\zerovector}$.
4. The linear system $\linearsystem{A}{\vect{b}}$ has a unique solution for every possible choice of $\vect{b}$.
5. The columns of $A$ are a linearly independent set.
6. $A$ is invertible.
7. The column space of $A$ is $\complex{n}$, $\csp{A}=\complex{n}$.
8. The columns of $A$ are a basis for $\complex{n}$.
9. The rank of $A$ is $n$, $\rank{A}=n$.
10. The nullity of $A$ is zero, $\nullity{A}=0$.
11. The determinant of $A$ is nonzero, $\detname{A}\neq 0$.

Computationally, row-reducing a matrix is the most efficient way to determine if a matrix is nonsingular, though the effect of using division in a computer can lead to round-off errors that confuse small quantities with critical zero quantities. Conceptually, the determinant may seem the most efficient way to determine if a matrix is nonsingular. The definition of a determinant uses just addition, subtraction and multiplication, so division is never a problem. And the final test is easy: is the determinant zero or not? However, the number of operations involved in computing a determinant by the definition very quickly becomes so excessive as to be impractical.
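To make the cost comparison concrete, here is a small sketch of our own that counts the multiplications used by a full cofactor expansion (each of the $n$ terms costs one multiplication plus a full expansion of an $(n-1)\times(n-1)$ minor) against the roughly $n^3/3$ multiplications of row reduction:

```python
def cofactor_mults(n):
    """Multiplications in a full cofactor expansion of an n x n
    determinant: n terms, each one multiplication by an
    (n-1) x (n-1) determinant computed the same way."""
    if n == 1:
        return 0
    return n * (cofactor_mults(n - 1) + 1)

for n in (2, 5, 10, 15):
    # row reduction needs roughly n**3 / 3 multiplications
    print(n, cofactor_mults(n), n**3 // 3)
```

The factorial growth of the first column against the cubic growth of the second shows why no one computes large determinants from the definition.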

Now for the *coup de grâce*. We will generalize Theorem DEMMM to the case of any two square matrices. You may recall thinking that matrix multiplication was defined in a needlessly complicated manner. For sure, the definition of a determinant seems even stranger. (Though Theorem SMZD might be forcing you to reconsider.) Read the statement of the next theorem and contemplate how nicely matrix multiplication and determinants play with each other.

Theorem DRMM (Determinant Respects Matrix Multiplication) Suppose that $A$ and $B$ are square matrices of the same size. Then $\detname{AB}=\detname{A}\detname{B}$.
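One last numerical spot-check, on two $2\times 2$ matrices of our own choosing (helpers `det2` and `matmul2` are our own names):

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[3, 1], [4, 2]]
B = [[1, 2], [0, 5]]

print(det2(matmul2(A, B)))   # det(AB) = 10
print(det2(A) * det2(B))     # 2 * 5 = 10
```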

It is amazing that matrix multiplication and the determinant interact this way. Might it also be true that $\detname{A+B}=\detname{A}+\detname{B}$? (See exercise PDM.M30.)
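Before turning to the exercise, a single numerical experiment of our own already suggests how to approach the question:

```python
# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c (Theorem DMST)
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[1, 0], [0, 0]]
B = [[0, 0], [0, 1]]
S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]   # A + B

print(det2(A) + det2(B))   # 0 + 0 = 0
print(det2(S))             # A + B is the identity matrix, so 1
```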