# Sylvester Identity

The Sylvester identity states that for any two matrices $A$ and $B$ such that both products $AB$ and $BA$ exist (i.e., $B$ is of the same size as $A^{\mathsf{T}}$, and therefore both matrices $AB$ and $BA$ are square), we have

$$\det(I + AB) = \det(I + BA).$$

In this equation $I$ denotes identity matrices (also called unit matrices), possibly of different sizes; in this article we will use the notation $I$ without a reference to size. The reader may refer to [Henderson and Searle] for the history of the identity (see eq. (6) there).
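As a sanity check, the identity can be verified numerically for a concrete rectangular pair: $A$ of size $2 \times 3$ and $B$ of size $3 \times 2$. The matrices and the helper functions below (`matmul`, `identity`, `add`, `det`) are ad hoc illustrative choices, a minimal dependency-free sketch:

```python
# Minimal sketch: verify det(I + AB) == det(I + BA) for a rectangular pair.
# Matrices and helper names are ad hoc examples, not from any library.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def identity(n):
    return [[int(i == j) for j in range(n)] for i in range(n)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(len(X[0]))] for i in range(len(X))]

def det(M):
    # Laplace expansion along the first row; fine for small matrices.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
B = [[1, 0],
     [0, 1],
     [1, 1]]             # 3 x 2

lhs = det(add(identity(2), matmul(A, B)))  # det(I + AB), a 2 x 2 determinant
rhs = det(add(identity(3), matmul(B, A)))  # det(I + BA), a 3 x 3 determinant
assert lhs == rhs  # → both equal 10 for this pair
```

Note that the two determinants live in different dimensions (here $2 \times 2$ versus $3 \times 3$), which is what makes the identity non-obvious.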

# Proof by Partitioned Matrices

Perhaps the shortest proof of the identity is as follows. The matrix identity

$$\begin{pmatrix} I & 0 \\ B & I \end{pmatrix} \begin{pmatrix} I & -A \\ 0 & I + BA \end{pmatrix} = \begin{pmatrix} I + AB & -A \\ 0 & I \end{pmatrix} \begin{pmatrix} I & 0 \\ B & I \end{pmatrix}$$

can be checked by direct calculation dealing with block matrices: both sides equal $\begin{pmatrix} I & -A \\ B & I \end{pmatrix}$. It remains to take the determinants of the right- and left-hand sides; the triangular factor $\begin{pmatrix} I & 0 \\ B & I \end{pmatrix}$ has determinant $1$, so $\det(I + BA) = \det(I + AB)$.

# Proof by Eigenvalues

Although the proof based on partitioned matrices is concise and easy to check, the intuition behind it is not so easy to grasp. In what remains we present a lengthier but arguably more intuitive proof done in the operator theory fashion promoted by [Axler].

We derive the identity from the following statement of independent interest.

**Theorem 1.** Matrices $AB$ and $BA$ have the same non-zero eigenvalues with the same multiplicities.

The "non-zero" clause cannot be dropped. Indeed, let $AB$ and $BA$ be of different sizes (this happens when $A$ and $B$ are not square). One of the matrices clearly has more eigenvalues than the other. We will show that the "extra" eigenvalues are zeroes.

It is easy to see that the non-zero eigenvalues of $AB$ and $BA$ are the same. Indeed, by definition, if $\lambda \neq 0$ is an eigenvalue of $AB$, then

$$ABx = \lambda x$$

for some non-zero vector $x$. Multiplying this by $B$ on the left we get $BA(Bx) = \lambda(Bx)$, where $Bx \neq 0$ (for if $Bx = 0$, then $ABx = 0$, and the first equation implies that $\lambda = 0$).

It is a bit more difficult to show that the multiplicities are the same. We will rely on the "geometric" definition of multiplicity from [Axler, Chapter 8 (p. 171)]. A vector $x$ is a generalised eigenvector corresponding to an eigenvalue $\lambda$ of a square matrix (or an operator) $T$ if

$$(T - \lambda I)^k x = 0$$

for some positive integer $k$. It is easy to see that the generalised eigenvectors for the same $\lambda$ form a subspace; the dimension of this subspace is the multiplicity of the eigenvalue $\lambda$.
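The eigenvalue-transfer argument can be made concrete with a small example (the matrices and helpers below are illustrative choices, not from the text): $A$ and $B$ are chosen so that $AB$ is diagonal with eigenvalues $2$ and $3$, and we check that multiplying an eigenvector of $AB$ by $B$ produces an eigenvector of $BA$ for the same eigenvalue, while the "extra" eigenvalue of the larger matrix $BA$ is $0$:

```python
# Illustrative check: if ABx = lam*x with x != 0 and lam != 0,
# then BA(Bx) = lam*(Bx). Matrices are ad hoc examples.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

A = [[2, 0, 0],
     [0, 3, 0]]          # 2 x 3
B = [[1, 0],
     [0, 1],
     [0, 0]]             # 3 x 2

AB = matmul(A, B)        # [[2, 0], [0, 3]]: eigenvalues 2 and 3
BA = matmul(B, A)        # 3 x 3: eigenvalues 2, 3 and an "extra" 0

lam, x = 2, [1, 0]                          # ABx = 2x
assert matvec(AB, x) == [lam * c for c in x]
Bx = matvec(B, x)                           # the transferred eigenvector
assert matvec(BA, Bx) == [lam * c for c in Bx]
assert matvec(BA, [0, 0, 1]) == [0, 0, 0]   # the extra eigenvalue of BA is 0
```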

One can show that it is sufficient to consider powers $k \le n$, where $n$ is the dimension of the space where $T$ operates (see [Axler, Corollary 8.7, p. 166]). Therefore the space of generalised eigenvectors can be defined as the null space of the operator $(T - \lambda I)^n$; however, we will not need this definition.

The reader who is more familiar with the "algebraic" definition of the multiplicity as that of a root of the characteristic polynomial may want to recall the Jordan form of a matrix in order to see that the definitions are equivalent.
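A $2 \times 2$ Jordan block illustrates the definitions (the matrix below is an illustrative choice): for $T = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$ the ordinary eigenspace of $\lambda = 2$ is one-dimensional, yet $(T - 2I)^2 = 0$, so every vector is a generalised eigenvector and the multiplicity of $2$ is $2$, matching the algebraic multiplicity from the characteristic polynomial $(\lambda - 2)^2$:

```python
# A 2x2 Jordan block: ordinary vs generalised eigenvectors for lam = 2.
# The matrix T is an ad hoc illustrative example.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

T = [[2, 1],
     [0, 2]]
N = [[0, 1],
     [0, 0]]             # N = T - 2I

x = [0, 1]
assert matvec(N, x) == [1, 0]          # x is NOT an ordinary eigenvector
N2 = matmul(N, N)
assert N2 == [[0, 0], [0, 0]]          # (T - 2I)^2 = 0
assert matvec(N2, x) == [0, 0]         # ... so x IS a generalised eigenvector
```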

**Proof:** The idea of the proof is suggested by the argument above. Let $x_1, \ldots, x_m$ be linearly independent generalised eigenvectors of $AB$ corresponding to the same non-zero eigenvalue $\lambda$. We will show that $Bx_1, \ldots, Bx_m$ are generalised eigenvectors of $BA$ and are linearly independent.

Let us start with the eigenvector property. For every $i$ there is a positive integer $k$ such that $(AB - \lambda I)^k x_i = 0$. Let us open up the brackets. The binomial identity implies that

$$(AB - \lambda I)^k = \sum_{j=0}^{k} \binom{k}{j} (-\lambda)^{k-j} (AB)^j.$$

Multiplying this equation by $B$ on the left and observing that $B(AB)^j = (BA)^j B$ yields

$$(BA - \lambda I)^k Bx_i = 0.$$

Now suppose that $Bx_1, \ldots, Bx_m$ are linearly dependent, so there are coefficients $c_1, \ldots, c_m$, not all zero, such that $\sum_i c_i Bx_i = 0$. The vector $y = \sum_i c_i x_i$ is not zero because $x_1, \ldots, x_m$ were assumed to be independent, but $By = 0$.

Multiplying the last equation by $A$ on the left yields $ABy = 0$, and subtracting $\lambda y$ from both sides yields $(AB - \lambda I)y = -\lambda y$. Thus $y$ is an eigenvector of $AB - \lambda I$ with a non-zero eigenvalue $-\lambda$. On the other hand, $y$ is a generalised eigenvector of $AB$ (as a linear combination of the $x_i$), and applying $AB - \lambda I$ to it repeatedly should turn it to zero (after finitely many iterations). This is a contradiction.
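The algebraic heart of the proof is the intertwining relation $B(AB)^j = (BA)^j B$, used when multiplying the expanded binomial by $B$ on the left. A quick numeric spot check, with ad hoc matrices and helper functions:

```python
# Spot check of the intertwining relation B (AB)^j == (BA)^j B for small j.
# The matrices A, B and the helpers are ad hoc illustrative choices.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matpow(M, j):
    R = [[int(a == b) for b in range(len(M))] for a in range(len(M))]
    for _ in range(j):
        R = matmul(R, M)
    return R

A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
B = [[1, 0],
     [0, 1],
     [1, 1]]             # 3 x 2

AB, BA = matmul(A, B), matmul(B, A)
for j in range(4):
    assert matmul(B, matpow(AB, j)) == matmul(matpow(BA, j), B)
```

The relation itself follows by induction: $B(AB)^{j+1} = (BA)\,B(AB)^j$, since $B(AB) = (BA)B$.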

We can now prove the Sylvester identity. We will rely on the representation (taken by [Axler] to be the definition) of the determinant as the product of (complex) eigenvalues counting multiplicities. The eigenvalues of $I + AB$ equal the eigenvalues of $AB$ plus $1$. All non-zero eigenvalues of $AB$ equal the non-zero eigenvalues of $BA$, while zero eigenvalues are unimportant, as they contribute $1$s as eigenvalues of $I + AB$ and $I + BA$. The products of eigenvalues are therefore equal.
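The counting argument can be traced on a concrete example (the matrices below are ad hoc illustrative choices): $AB = \operatorname{diag}(2, 3)$ has eigenvalues $\{2, 3\}$, while the larger matrix $BA$ has eigenvalues $\{2, 3, 0\}$; the extra zero contributes a factor $1 + 0 = 1$, so both products equal $(1+2)(1+3) = 12$:

```python
# The final counting argument on a concrete example: eigenvalues of AB are
# {2, 3}; eigenvalues of BA are {2, 3, 0}. Ad hoc matrices and helper.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 0, 0],
     [0, 3, 0]]          # 2 x 3
B = [[1, 0],
     [0, 1],
     [0, 0]]             # 3 x 2

assert matmul(A, B) == [[2, 0], [0, 3]]                  # AB = diag(2, 3)
assert matmul(B, A) == [[2, 0, 0], [0, 3, 0], [0, 0, 0]] # BA = diag(2, 3, 0)

eigs_AB = [2, 3]          # read off the diagonal of AB
eigs_BA = [2, 3, 0]       # diagonal of BA: same non-zero eigenvalues, plus 0

prod_AB = 1
for lam in eigs_AB:
    prod_AB *= 1 + lam    # det(I + AB) as a product of (1 + eigenvalue)
prod_BA = 1
for lam in eigs_BA:
    prod_BA *= 1 + lam    # the eigenvalue 0 contributes the factor 1

assert prod_AB == prod_BA == 12
```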

# References

[Henderson and Searle] H. V. Henderson and S. R. Searle. On deriving the inverse of a sum of matrices. SIAM Review, 23(1), 1981.

[Axler] S. Axler. Linear Algebra Done Right. Springer, 2nd edition, 1997.