# Diagonalization and Jordan Form I

This post discusses the existence of the Jordan form by decomposing a vector space into a direct sum of generalized eigenspaces.

In this post we will work over algebraically closed fields.

The diagonalization problem asks whether a linear transformation $T \colon V \to V$ admits an eigenbasis. Equivalently, it asks when a matrix $A$ is similar to a diagonal matrix, i.e. $A = PDP^{-1}$ for some invertible matrix $P$ and a diagonal matrix $D$.

Let the characteristic polynomial of $T$ be $p(x) = \prod_{i=1}^k (x - \lambda_i)^{m_i}$, where $\lambda_1, \dots, \lambda_k$ are the distinct **eigenvalues** of $T$ and $m_1, \dots, m_k$ are the respective **algebraic multiplicities**.

**Theorem 1.** Let $T \colon V \to V$ be an endomorphism of a finite-dimensional vector space $V$ (over an algebraically closed field). Let its characteristic polynomial be $p(x) = \prod_{i=1}^k (x - \lambda_i)^{m_i}$. Then $V = \bigoplus_{i=1}^k \ker (T - \lambda_i I)^{m_i}$.

**Proof**

**1. The sum part**

Notice that the polynomials $q_i(x) = \prod_{j \neq i} (x - \lambda_j)^{m_j}$, $i = 1, \dots, k$, are relatively prime. Therefore we can find polynomials $f_1, \dots, f_k$ such that

$$f_1(x) q_1(x) + \cdots + f_k(x) q_k(x) = 1. \qquad (1)$$

For any vector $v$, notice that $f_i(T) q_i(T) v \in \ker (T - \lambda_i I)^{m_i}$, since $(T - \lambda_i I)^{m_i} f_i(T) q_i(T) v = f_i(T) p(T) v = 0$ by the Cayley–Hamilton theorem. Analogously we have the other inclusions. Therefore, by applying (1) to $v$, we have

$$v = f_1(T) q_1(T) v + \cdots + f_k(T) q_k(T) v \in \sum_{i=1}^k \ker (T - \lambda_i I)^{m_i}. \qquad (2)$$

This gives the required decomposition.
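As a quick sanity check (not part of the original argument), the decomposition can be verified numerically. The matrix below is a made-up example with characteristic polynomial $(x-1)^2(x-2)$, so Theorem 1 predicts $V = \ker(A - I)^2 \oplus \ker(A - 2I)$:

```python
# Hypothetical example: char poly (x - 1)^2 (x - 2), so m_1 = 2, m_2 = 1.
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 2]])
I = sp.eye(3)

g1 = ((A - 1 * I) ** 2).nullspace()  # basis of ker (A - I)^2
g2 = ((A - 2 * I) ** 1).nullspace()  # basis of ker (A - 2I)

# Stacking the two bases side by side gives a basis of the whole space,
# confirming the direct sum decomposition.
B = sp.Matrix.hstack(*(g1 + g2))
print(len(g1), len(g2), B.rank())  # 2 1 3
```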

**2. The direct part**

WLOG suppose that

$$v_1 + v_2 + \cdots + v_r = 0, \qquad (3)$$

where $v_j \in \ker (T - \lambda_j I)^{m_j}$ and each $v_j \neq 0$, is the shortest possible such relation. Applying $(T - \lambda_1 I)^{m_1}$ on both sides gives

$$(T - \lambda_1 I)^{m_1} v_2 + \cdots + (T - \lambda_1 I)^{m_1} v_r = 0. \qquad (4)$$

Since the $\lambda_j$ are all distinct, $T - \lambda_1 I$ is injective on $\ker (T - \lambda_j I)^{m_j}$ for $j \neq 1$: an eigenvector for $\lambda_1$ lying in that kernel would be annihilated by $(\lambda_1 - \lambda_j)^{m_j} \neq 0$, hence zero. Therefore every term in (4) is nonzero, so (4) eliminates $v_1$ while keeping the other terms nonzero, giving a strictly shorter relation. This contradicts the minimality assumption, so the sum must be direct.

**Corollary 1.** $T$ is diagonalizable if and only if for each $i$, $\ker (T - \lambda_i I)^{m_i} = \ker (T - \lambda_i I)$.

This is because being able to find an eigenbasis means exactly that $V = \bigoplus_{i=1}^k \ker (T - \lambda_i I)$.
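To make the criterion concrete, here is a small SymPy check (my own illustrative example, not from the post): for the non-diagonalizable matrix below, the algebraic multiplicity of the eigenvalue $2$ is $2$, while $\dim \ker(A - 2I) = 1$.

```python
import sympy as sp

# Classic non-diagonalizable example: a single Jordan block of size 2.
A = sp.Matrix([[2, 1],
               [0, 2]])

alg = A.eigenvals()[2]                      # algebraic multiplicity of lambda = 2
geo = len((A - 2 * sp.eye(2)).nullspace())  # geometric multiplicity of lambda = 2

print(alg, geo, A.is_diagonalizable())  # 2 1 False
```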

**Proposition 2.** $\dim \ker (T - \lambda_i I)^{m_i} = m_i$.

This will become clear using the Jordan form.

**Corollary 2.** $T$ is diagonalizable if and only if for each $i$, $\dim \ker (T - \lambda_i I) = m_i$.

$\dim \ker (T - \lambda_i I)$ is usually called the **geometric multiplicity** of the eigenvalue $\lambda_i$. Thus the above corollary may be rephrased as

**Corollary 2′.** $T$ is diagonalizable if and only if for each eigenvalue, the algebraic multiplicity coincides with the geometric multiplicity.

Proposition 2 also gives the following quick corollary.

**Corollary 3.** The algebraic multiplicity is always no less than the geometric multiplicity.

So the diagonalization problem boils down to whether $\dim \ker (T - \lambda_i I) = m_i$ for each $i$. This may fail in many situations. The natural question is then: can we choose a nice basis from each $\ker (T - \lambda_i I)^{m_i}$ such that, when $T$ is diagonalizable, the selected basis is an eigenbasis?

One answer to this question is the Jordan form. The Jordan canonical form picks a basis such that the matrix consists of **Jordan blocks** of the form

$$J = \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix}.$$

**Theorem 2 (Existence of Jordan form).** Let $T \colon V \to V$ be an endomorphism of a finite-dimensional vector space $V$ (over an algebraically closed field). Let its characteristic polynomial be $p(x) = \prod_{i=1}^k (x - \lambda_i)^{m_i}$. Then there exists a basis of $V$ so that the matrix representation of $T$ is a direct sum of Jordan blocks. Furthermore, the number of Jordan blocks and the size of each of them is an invariant of $T$.

**Proof**

Since we have already shown $V = \bigoplus_{i=1}^k \ker (T - \lambda_i I)^{m_i}$, it suffices to choose a nice basis in each $V_i := \ker (T - \lambda_i I)^{m_i}$.

Fix an eigenvalue $\lambda_i$ and let $N = T - \lambda_i I$. Then $V_i$ is a $T$-invariant (equivalently, $N$-invariant) subspace. Therefore, we may regard $V_i$ as a $k[x]$-module, with $x$ acting as $N$. Since $k[x]$ is a principal ideal domain, we may invoke the structure theorem for finitely generated modules over a PID to see that

$$V_i = W_1 \oplus W_2 \oplus \cdots \oplus W_s,$$

where each $W_j$ is a cyclic subspace.

Take $W_1$ as an example. Let $w$ be its generator, and suppose that $n$ is the smallest number such that $N^n w = 0$. Then it is immediate that $\{N^{n-1} w, \dots, N w, w\}$ forms a basis for $W_1$. It is then routine to show that with respect to this basis, $T$ takes exactly the form of a Jordan block.
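A small illustration of this step (my own example, not from the post): take a hypothetical nilpotent $N = T - \lambda I$ on a cyclic subspace. The chain $N^{n-1} w, \dots, N w, w$ generated by a cyclic vector $w$ is a basis in which $N$ acts as the nilpotent part of a Jordan block.

```python
import sympy as sp

# N plays the role of T - lambda*I restricted to a cyclic subspace W_1.
N = sp.Matrix([[0, 1, 0],
               [0, 0, 1],
               [0, 0, 0]])
w = sp.Matrix([0, 0, 1])   # generator: N**2 * w != 0 but N**3 * w == 0

# The chain N^2 w, N w, w, in this order, makes N act as a shift.
B = sp.Matrix.hstack(N**2 * w, N * w, w)
print(B.rank())            # 3: the chain is a basis of the cyclic subspace
print(B.inv() * N * B)     # the nilpotent part of a 3x3 Jordan block
```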

Similarly, one can choose bases for the other $W_j$ to get a required basis of $V_i$ for each eigenvalue $\lambda_i$. Put them together and we have found a basis with the desired matrix representation.

Notice that the number of Jordan blocks is determined by the number of distinct eigenvalues and the number of cyclic modules in each decomposition. The sizes of the Jordan blocks are determined by the invariants associated to the decomposition into cyclic modules. The uniqueness thus follows.
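In practice one rarely runs this proof by hand; for instance, SymPy's `Matrix.jordan_form` computes the decomposition directly. A sketch with a made-up matrix built from a known Jordan structure (one block of size 2 at eigenvalue 2, one of size 1 at eigenvalue 3):

```python
import sympy as sp

J_true = sp.Matrix([[2, 1, 0],
                    [0, 2, 0],
                    [0, 0, 3]])
Q = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [1, 0, 1]])      # an arbitrary invertible change of basis
A = Q * J_true * Q.inv()

P, J = A.jordan_form()          # recovers A = P * J * P**-1
print(sorted(J[i, i] for i in range(3)))  # eigenvalues on the diagonal: [2, 2, 3]
```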

Although the above proof does not help with any actual computation, it has some theoretical implications, because the Jordan form is easier to manipulate.

**Proposition 3 (Minimal Polynomial).** For each eigenvalue $\lambda_i$, let $n_i$ be the size of the largest Jordan block corresponding to $\lambda_i$. Then the minimal polynomial for $T$ is

$$\mu(x) = \prod_{i=1}^k (x - \lambda_i)^{n_i}.$$
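A quick check of Proposition 3 (my own illustrative example): for the matrix below, $\lambda = 2$ has Jordan blocks of sizes $2$ and $1$, so the minimal polynomial is $(x-2)^2$ even though the characteristic polynomial is $(x-2)^3$.

```python
import sympy as sp

# Jordan blocks at lambda = 2 of sizes 2 and 1 => largest block n = 2.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 2]])
I = sp.eye(3)

print((A - 2 * I) == sp.zeros(3, 3))       # False: (x - 2) does not annihilate A
print((A - 2 * I) ** 2 == sp.zeros(3, 3))  # True: (x - 2)^2 annihilates A
```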

**Remark**

- In fact one can apply the structure theorem for finitely generated modules over a PID to $V$ itself to get the Jordan form directly, skipping the proof of Theorem 1. To me, a constructive proof is better though.
- The algebraically closed field condition can be replaced by “the characteristic polynomial splits”, because all we used is that eigenvalues exist.

I recall struggling some time back trying to understand the proof of the Jordan normal form from a pure linear algebra textbook; it was only when, later on, I re-learned it from Lang’s Algebra using the structure theorem that it actually made much more sense.

Then again, I also think that where the Jordan form is most often used (e.g. in the theory of Lie algebras), the main idea is not so much the form itself, but the fact that one gets to represent a matrix as a sum of a semisimple and a nilpotent element, each of which is a polynomial in the initial matrix.

I have the same experience. I don't remember the first proof of the Jordan form I saw; I can only remember this one using the structure theorem of f.g. modules. This theorem gives such a clean explanation for both the Jordan form and the rational canonical form.

By the way, how did you manage to find this blog?

I found this page using the “Tag surfer” feature on WordPress, on which your post on sheaves came up (since “algebraic geometry” is a tag I had selected), after which I looked at the blog in general.