# [Linear Algebra] Spectrum

## Answer

This is an elementary, if somewhat inelegant, approach. We begin with the following lemma.

**Lemma.**

Suppose $A_1,\dots,A_n$ are linear operators on $V$. If $v\neq0$ is a vector such that $A_1 A_2\cdots A_n v = 0$, then there is a non-zero $w\in V$ and an index $i$ with $A_i w=0$.

**Proof.**

The result follows by induction on $n$; for $n=1$ it is clearly true (take $w=v$). For the inductive step, suppose $A_1(A_2\cdots A_n v)=0$. Either $A_2\cdots A_n v\neq 0$, in which case $i=1$ and $w=A_2\cdots A_n v$ satisfy $A_i w=0$ and we are finished, or $A_2\cdots A_n v=0$, to which we may apply the induction hypothesis.

We obtain the following corollary.

**Corollary.**

For $T\in L(V)$: if there is a $v\in V$ with $v\neq0$ such that $\prod_{i=1}^n (T-\lambda_i)v=0$, then there are an $i$ and a $w\neq0$ with $(T-\lambda_i)w=0$, i.e. $\lambda_i$ is an eigenvalue of $T$.
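As a quick sanity check of the corollary, here is a minimal Python sketch; the matrix `T` and the helper names are illustrative choices, not from the answer. For the triangular matrix below with eigenvalues $2$ and $3$, the product $(T-2I)(T-3I)$ is the zero matrix (Cayley–Hamilton), so it annihilates every nonzero vector, and the corollary then says one of the $\lambda_i$ must be an eigenvalue, as indeed both are.

```python
# Illustrative 2x2 sanity check of the corollary; T, matmul, shift are
# names chosen for this sketch, not from the answer above.
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def shift(A, lam):
    """Compute A - lam*I for a 2x2 matrix A."""
    return [[A[i][j] - (lam if i == j else 0) for j in range(2)]
            for i in range(2)]

T = [[2, 1], [0, 3]]  # upper triangular, so its eigenvalues are 2 and 3

# (T - 2I)(T - 3I) is the zero matrix by Cayley-Hamilton, so it kills
# every nonzero v; the corollary then says 2 or 3 must be an eigenvalue.
print(matmul(shift(T, 2), shift(T, 3)))  # [[0, 0], [0, 0]]
```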

Now let $p$ be a polynomial and $\xi$ an eigenvalue of $p(T)$ with eigenvector $v$ (i.e. $\xi\in\mathrm{Spec}(p(T))$). Since the base field $K$ is algebraically closed, you may factor

$$p(x)-\xi = c\prod_i (x-\lambda_i)$$

Then

$$0=(p(T)-\xi)v=c \prod_i (T-\lambda_i)v$$

so by the corollary there is an $i$ such that $\lambda_i$ is an eigenvalue of $T$. But

$$p(\lambda_i) = (p(\lambda_i)-\xi)+\xi = c\prod_j(\lambda_i-\lambda_j) + \xi = \xi$$

(the product vanishes because its $j=i$ factor is zero), which gives $\xi\in p(\mathrm{Spec}(T))$, i.e. the inclusion $\mathrm{Spec}(p(T))\subseteq p(\mathrm{Spec}(T))$.
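The identity can also be checked numerically on a small example. Below is a minimal Python sketch, with an illustrative triangular matrix and the polynomial $p(x) = x^2 + 1$ chosen for this sketch (not taken from the answer): a triangular matrix's eigenvalues are its diagonal entries, and $p(T)$ is again triangular, so both sides of $\mathrm{Spec}(p(T)) = p(\mathrm{Spec}(T))$ can be read off directly.

```python
# Illustrative check of Spec(p(T)) = p(Spec(T)); the matrix and the
# polynomial p(x) = x^2 + 1 are example choices, not from the answer.
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def p_of_matrix(A):
    """Apply p(x) = x^2 + 1 to a 2x2 matrix: A @ A + I."""
    A2 = matmul(A, A)
    return [[A2[i][j] + (1 if i == j else 0) for j in range(2)]
            for i in range(2)]

T = [[2, 1], [0, 3]]          # triangular: Spec(T) = {2, 3}
pT = p_of_matrix(T)           # also triangular, so read off its diagonal
spec_pT = {pT[0][0], pT[1][1]}
p_spec_T = {x * x + 1 for x in (2, 3)}
print(spec_pT == p_spec_T)    # True: {5, 10} on both sides
```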

The other inclusion is far more direct:

Suppose $\lambda$ is an eigenvalue of $T$ and $v$ is an eigenvector for this eigenvalue. For any $k\in\Bbb N$ you then have

$$T^kv= T^{k-1}(Tv)=T^{k-1}(\lambda v) = \lambda T^{k-1}v,$$

and by induction over $k$ you get $T^k v = \lambda^k v$. Now for a polynomial $p(x)= \sum_{k=0}^n a_k\, x^k$ you get:

$$p(T)v =\sum_{k=0}^n a_k T^k v = \sum_{k=0}^n a_k \lambda^k v = p(\lambda)v,$$

and it follows that $p(\lambda)$ is an eigenvalue of $p(T)$. This is the inclusion $p(\mathrm{Spec}(T))\subseteq \mathrm{Spec}(p(T))$.
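This direction is easy to see on a concrete instance. A short Python sketch, where the matrix, the eigenvector, and $p(x) = x^2 + 1$ are illustrative choices: with $Tv = 2v$, computing $p(T)v$ directly gives $p(2)\,v = 5v$.

```python
# Illustrative check that p(T)v = p(lambda)v; T, v, and p(x) = x^2 + 1
# are example choices, not from the answer above.
def matvec(A, x):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

T = [[2, 1], [0, 3]]
v = [1, 0]                    # eigenvector of T for the eigenvalue 2

# p(x) = x^2 + 1, so p(T)v = T(Tv) + v
pTv = [a + b for a, b in zip(matvec(T, matvec(T, v)), v)]
print(pTv)  # [5, 0], i.e. p(2) * v = 5 * v
```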



Just an addendum: our definition of the spectrum of $T$ is the set $\mathrm{Spec}(T)$ consisting of all eigenvalues of $T$.

Do you have access to the Jordan decomposition of a matrix?

Hey, we do! What would an answer using it look like?

Suppose $T = D + N$ where $D$ is diagonal in some basis, $N$ is nilpotent, and $DN = ND$ (the eigenvalues of $T$ are then the values on the diagonal of $D$). Then for any polynomial $p$ you have $p(T) = p(D) + NX$, where $X$ is some sum and product of the $D$s and the $N$s. But then $NX$ is again nilpotent (strictly upper triangular in that basis), since $(NX)^n = N^n X^n$ by commutativity, and $p(D)$ is diagonal and the two matrices commute, so the eigenvalues of $p(D) + NX$ are the same as the eigenvalues of $p(D)$, but this is just ....

.... $p$ applied to the eigenvalues of $D$, i.e. $p$ applied to the eigenvalues of $T$.

The decomposition $T = D + N$ is the Jordan decomposition, and it works for any linear map on a finite-dimensional vector space over an algebraically closed field.
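The Jordan argument can likewise be seen on a single Jordan block. A minimal Python sketch, where $D$, $N$, and $p(x)=x^2$ are illustrative choices: $p(T)$ comes out as the diagonal part $p(D)$ plus a nilpotent remainder, so its only eigenvalue is $p(2)=4$.

```python
# Illustrative Jordan-decomposition check; D, N, and p(x) = x^2 are
# example choices, not from the comments above.
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

D = [[2, 0], [0, 2]]          # diagonal part: eigenvalue 2
N = [[0, 1], [0, 0]]          # nilpotent part, commutes with D
T = [[D[i][j] + N[i][j] for j in range(2)] for i in range(2)]  # Jordan block

pT = matmul(T, T)             # p(x) = x^2 applied to T
print(pT)  # [[4, 4], [0, 4]] = p(D) plus the nilpotent [[0, 4], [0, 0]]
# p(T) is triangular with p(2) = 4 on the diagonal: Spec(p(T)) = {p(2)}
```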