# [Linear Algebra] Spectrum

## Answer

This is an elementary, if somewhat inelegant, approach. Begin with the following lemma:

**Lemma**

Suppose $A_1,\dots,A_n$ are linear operators on $V$. If $v\neq0$ is a vector such that $A_1 A_2\cdots A_nv = 0$, then there is a non-zero $w\in V$ and an index $i$ with $A_iw=0$.

**Proof**

The result follows by induction on $n$; for $n=1$ it is clearly true. For the inductive step, if $A_1(A_2\cdots A_nv)=0$, then either $A_2\cdots A_nv\neq 0$, in which case $i=1$ and $w=A_2\cdots A_nv$ give $A_iw=0$ and we are finished, or $A_2\cdots A_nv=0$, to which we may apply the induction hypothesis.
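A tiny concrete instance of the lemma with $n=2$ may help (the matrices below are illustrative choices, not part of the argument): $A_1A_2v=0$ for a non-zero $v$, and indeed $w=A_2v\neq 0$ satisfies $A_1w=0$, so $i=1$ works.

```python
def apply(A, v):
    """Apply a 2x2 matrix (nested lists) to a vector (tuple)."""
    return (A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1])

A1 = [[0, 0], [0, 1]]   # kills any vector of the form (x, 0)
A2 = [[1, 0], [0, 0]]   # projects onto the first coordinate
v = (1.0, 1.0)          # non-zero, and A1 A2 v = 0

w = apply(A2, v)        # (1, 0) is non-zero, so the first case of the proof applies
print(apply(A1, w))     # (0.0, 0.0): A1 w = 0
```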

We obtain the following corollary:

**Corollary**

For $T\in L(V)$, if there is a $v\in V$ with $v\neq0$ such that $\prod_{i=1}^n (T-\lambda_i)v=0$, then there is an $i$ and a $w\neq0$ with $(T-\lambda_i)w=0$, i.e. $\lambda_i$ is an eigenvalue of $T$.

Now let $p$ be a polynomial and $\xi$ an eigenvalue of $p(T)$ with eigenvector $v$ (i.e. $\xi\in\mathrm{Spec}(p(T))$). Since the base field $K$ is algebraically closed, you may decompose

$$p(x)-\xi = c\prod_i (x-\lambda_i)$$

Then

$$0=(p(T)-\xi)v=c \prod_i (T-\lambda_i)v$$

and by the corollary there is an $i$ such that $\lambda_i$ is an eigenvalue of $T$. But

$$p(\lambda_i) = (p(\lambda_i)-\xi)+\xi =c\prod_j(\lambda_i-\lambda_j) + \xi = \xi$$

since the product contains the vanishing factor $\lambda_i-\lambda_i=0$. This gives $\xi\in p(\mathrm{Spec}(T))$, i.e. the inclusion $\mathrm{Spec}(p(T))\subseteq p(\mathrm{Spec}(T))$.
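The factorization step can be checked numerically; in this sketch $T$, $p$, and $\xi$ are all illustrative choices. With $T=\begin{pmatrix}2&1\\0&5\end{pmatrix}$ (so $\mathrm{Spec}(T)=\{2,5\}$), $p(x)=x^2-3x+1$, and $\xi=p(2)=-1\in\mathrm{Spec}(p(T))$, factoring $p(x)-\xi=x^2-3x+2$ should produce at least one root that is an eigenvalue of $T$:

```python
import math

xi = -1.0
a, b, c = 1.0, -3.0, 1.0 - xi        # coefficients of p(x) - xi = x^2 - 3x + 2
disc = b * b - 4 * a * c
roots = [(-b + math.sqrt(disc)) / (2 * a),
         (-b - math.sqrt(disc)) / (2 * a)]   # the lambda_i in the factorization
spec_T = {2.0, 5.0}                  # eigenvalues of the upper-triangular T
witnesses = [lam for lam in roots if lam in spec_T]
print(roots, witnesses)              # roots are 2 and 1; lambda = 2 is in Spec(T)
```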

The other inclusion is far more direct:

Suppose $\lambda$ is an eigenvalue of $T$ and $v$ is an eigenvector for this eigenvalue. For any $k\in\Bbb N$ you then have
$$T^kv= T^{k-1}(Tv)=T^{k-1}(\lambda v) = \lambda T^{k-1}v,$$
and by induction over $k$ you get $T^k v = \lambda^k v$. Now for a polynomial $p(x)= \sum_{k=0}^n a_k x^k$ you get:

$$p(T)v =\sum_{k=0}^n a_k T^k v = \sum_{k=0}^n a_k \lambda^k v = p(\lambda)v,$$
and it follows that $p(\lambda)$ is an eigenvalue of $p(T)$. This is the inclusion $p(\mathrm{Spec}(T))\subseteq \mathrm{Spec}(p(T))$.
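This direction can also be verified on a small example (all concrete numbers below are illustrative choices): for the upper-triangular $T=\begin{pmatrix}2&1\\0&5\end{pmatrix}$, the vector $v=(1,0)$ is an eigenvector for $\lambda=2$, and $p(T)v$ should equal $p(\lambda)v$ for $p(x)=x^2-3x+1$.

```python
def matmul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_poly(coeffs, T):
    """Evaluate p(T) for p given by coeffs = [a_0, ..., a_n] via Horner's scheme."""
    result = [[0.0, 0.0], [0.0, 0.0]]
    for a in reversed(coeffs):
        result = matmul(result, T)
        result[0][0] += a          # add a * I
        result[1][1] += a
    return result

T = [[2.0, 1.0], [0.0, 5.0]]       # Spec(T) = {2, 5}
p = [1.0, -3.0, 1.0]               # p(x) = x^2 - 3x + 1
pT = mat_poly(p, T)                # p(T) = [[-1, 4], [0, 11]]
v = (1.0, 0.0)                     # eigenvector of T for lambda = 2
pTv = (pT[0][0] * v[0] + pT[0][1] * v[1],
       pT[1][0] * v[0] + pT[1][1] * v[1])
p_lam = 2.0**2 - 3 * 2.0 + 1       # p(2) = -1
print(pTv, (p_lam * v[0], p_lam * v[1]))   # both are (-1.0, 0.0)
```

Note that the diagonal entries of $p(T)$ are $p(2)=-1$ and $p(5)=11$, exactly $p$ applied to the diagonal of $T$, as the theorem predicts for triangular matrices.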



Just an addendum: our definition of the spectrum of $T$ is the set $\mathrm{Spec}(T)$ consisting of all eigenvalues of $T$.

Do you have access to the Jordan decomposition of a matrix?

Hey, we do! What would an answer using it look like?

Suppose $T = D + N$ where $D$ is diagonal in some basis, $N$ is nilpotent, and $DN = ND$ (the eigenvalues of $T$ are then the values on the diagonal of $D$). Then for any polynomial $p$ you have $p(T) = p(D) + NX$, where $X$ is some sum of products of the $D$s and $N$s. But then $NX$ is again nilpotent (strictly upper triangular in that basis), since $(NX)^n = N^n X^n$ by commutativity; $p(D)$ is diagonal and both matrices commute, so the eigenvalues of $p(D) + NX$ are the same as the eigenvalues of $p(D)$, but this is just ....

.... $p$ applied to the eigenvalues of $D$, i.e. $p$ applied to the eigenvalues of $T$.

The decomposition $T = D + N$ is the Jordan decomposition, and it works for any linear map on a finite-dimensional vector space over an algebraically closed field.
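The Jordan argument can be sketched on a single $2\times2$ Jordan block (the block size and the polynomial $p(x)=x^2$ are illustrative choices): with $T = D + N$, $D = 3I$, and $N$ the nilpotent shift, $p(T)$ is $p(D)$ plus a nilpotent matrix, so its eigenvalues are those of $p(D)$, i.e. $p$ applied to the eigenvalues of $T$.

```python
def matmul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

T = [[3.0, 1.0], [0.0, 3.0]]     # T = D + N with D = 3I, N = [[0,1],[0,0]]
pT = matmul(T, T)                # p(T) = T^2 = [[9, 6], [0, 9]]
pD = [[9.0, 0.0], [0.0, 9.0]]    # p(D) = 3^2 * I
nil = [[pT[i][j] - pD[i][j] for j in range(2)] for i in range(2)]
print(pT, matmul(nil, nil))      # the leftover part [[0,6],[0,0]] squares to zero
```

The diagonal of $p(T)$ is $(9, 9) = (p(3), p(3))$, matching $p$ applied to the eigenvalues of $T$.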