# [Linear Algebra] Diagonalizable operator and Spectrum

See attached image. Exercise from my Advanced Linear Algebra class.

Working from Hoffman/Kunze *Linear Algebra* and Roman *Advanced Linear Algebra*.

Any questions about notation, just let me know.

Just an addendum: our definition of the spectrum of $T$ is the set $\mathrm{Spec}(T)$ consisting of all eigenvalues of $T$.

## Answer

First let's show that if such a map exists, it is necessarily unique. To do this, note that every function $\mathrm{Spec}(T)\to K$ can be realised by a polynomial, because $\mathrm{Spec}(T)$ is a finite set. Specifically, if

$$f:\mathrm{Spec}(T)\to K$$

is a function then for all $x\in\mathrm{Spec}(T)$ you have:

$$f(x) = \sum_{y\in \mathrm{Spec}(T)} f(y)\ \frac{\prod\limits_{z\in\mathrm{Spec}(T),\, z\neq y} ( x-z)}{\prod\limits_{z\in\mathrm{Spec}(T),\, z\neq y}(y-z)}$$

where the expression on the right is a polynomial in $x$ (this is the so-called Lagrange interpolant of $f$ on the finite set $\mathrm{Spec}(T)$).
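As a quick sanity check (my own illustration, not part of the original exercise), the interpolation formula above can be verified numerically; the set `spec` and the function `f` below are made-up examples standing in for $\mathrm{Spec}(T)$ and $f$:

```python
# Sketch: the Lagrange interpolant of f on a finite set reproduces f exactly.
from fractions import Fraction

def lagrange_interpolant(points, f):
    """Return p with p(x) = sum_y f(y) * prod_{z != y} (x - z)/(y - z)."""
    def p(x):
        total = Fraction(0)
        for y in points:
            term = Fraction(f(y))
            for z in points:
                if z != y:
                    term *= Fraction(x - z, y - z)  # exact rational arithmetic
            total += term
        return total
    return p

spec = [1, 2, 5]          # hypothetical Spec(T)
f = lambda x: x * x + 3   # an arbitrary function on Spec(T)
p = lagrange_interpolant(spec, f)
assert all(p(x) == f(x) for x in spec)  # p agrees with f on Spec(T)
```

Since $f$ here happens to be a polynomial of degree $2$ and we interpolate on $3$ points, the interpolant coincides with it everywhere, not just on `spec`.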

This means that if $\varphi: K^{\mathrm{Spec}(T)}\to L(V)$ is a linear map that is also multiplicative (i.e. satisfies property (iii)), then for any function $f\in K^{\mathrm{Spec}(T)}$ we may choose a polynomial $p(x)=\sum_{k=0}^n a_k x^k$ with $p(x)=f(x)$ for all $x\in\mathrm{Spec}(T)$, and then:

$$\varphi(f) = \varphi(x\mapsto \sum_{k=0}^n a_k x^k) = \sum_{k=0}^n a_k \varphi(x\mapsto x^k)=a_0\varphi(x\mapsto 1)+\sum_{k=1}^na_k\ \varphi(x\mapsto x)^k$$

so $\varphi(f)$ is uniquely determined once you know the image of the constant map $x\mapsto 1$ (prescribed by (ii)) and the image of the identity map $x\mapsto x$ (prescribed by (i)).

So there can be at most one linear map satisfying properties (i), (ii), (iii).

Now let's show that such a map exists by explicitly defining one. Since $T$ is diagonalisable, there is a basis $v_1,\dots,v_n$ consisting of eigenvectors; let $\lambda_i$ denote the eigenvalue corresponding to $v_i$. We define:

$$ \varphi: K^{\mathrm{Spec}(T)}\to L(V), \qquad \varphi(f) = \Big[\sum_i a_i \ v_i \mapsto \sum_i a_i f(\lambda_i) v_i\Big]$$

in other words, $\varphi(f)$ is, in the basis $v_1,\dots,v_n$, given by the diagonal matrix with $f(\lambda_i)$ on its diagonal and all other entries $0$.

You then necessarily have that $\varphi(x\mapsto x)$ is the matrix with $\lambda_i$ on the diagonal (i.e. the matrix of $T$), so $\varphi(x\mapsto x) = T$, which satisfies (i). Further, $\varphi(x\mapsto 1)$ is the matrix with $1$ on the diagonal, so it is the identity, satisfying (ii). Finally, $\varphi(f\cdot g)$ has $f(\lambda_i)\cdot g(\lambda_i)$ on the diagonal, which is the product of the matrix with $f(\lambda_i)$ on the diagonal and the matrix with $g(\lambda_i)$ on the diagonal, i.e. $\varphi(f\cdot g) = \varphi(f)\cdot \varphi(g)$, so (iii) is satisfied.

This map is also clearly linear: $\varphi(f+g)$ is the matrix with $f(\lambda_i)+g(\lambda_i)$ on the diagonal, which is the sum of the matrix with $f(\lambda_i)$ on the diagonal and the matrix with $g(\lambda_i)$ on the diagonal, i.e. $\varphi(f)+\varphi(g)$. Similarly, for any $k\in K$ you have $\varphi(k f)=k\varphi(f)$.
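The three properties can also be checked numerically for a concrete diagonalisable $T$. In the sketch below (my own illustration; the eigenvalues in `eigs` are hypothetical), $T$ is written directly in its eigenbasis, so `phi(f)` is just the diagonal matrix $\mathrm{diag}(f(\lambda_1),\dots,f(\lambda_n))$:

```python
# Sketch: realise phi(f) as diag(f(lambda_1), ..., f(lambda_n)) in an
# eigenbasis of T and check properties (i)-(iii).
import numpy as np

eigs = np.array([1.0, 2.0, 5.0])   # hypothetical eigenvalues lambda_i
T = np.diag(eigs)                  # T, expressed in its eigenbasis

def phi(f):
    # f acts entrywise on the eigenvalues
    return np.diag(f(eigs))

f = lambda x: x**2 + 3
g = lambda x: 2*x - 1

assert np.allclose(phi(lambda x: x), T)                        # (i)   identity map -> T
assert np.allclose(phi(lambda x: np.ones_like(x)), np.eye(3))  # (ii)  constant 1 -> id
assert np.allclose(phi(lambda x: f(x) * g(x)), phi(f) @ phi(g))  # (iii) multiplicative
```

The same check works for a non-diagonal $T$ after conjugating by the change-of-basis matrix, since all three properties are basis-independent.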

Now they also want us to check that $\varphi$ is injective and that its image is $\{ p(T)\mid p(x)\in K[x]\}$. We have already seen that for any function $f:\mathrm{Spec}(T)\to K$ there is a polynomial $p(x)=\sum_{k=0}^n a_k x^k$ with $f(x)=p(x)$ on $\mathrm{Spec}(T)$; hence $\varphi(f)=\varphi(p) =a_0\cdot 1 +\sum_{k=1}^n a_k T^k = p(T)$, so the image is contained in $\{p(T)\mid p\in K[x]\}$. Conversely, every $p(T)$ arises this way: take $f$ to be the restriction of $p$ to $\mathrm{Spec}(T)$. So we have characterised the image.
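This identification of $\varphi(f)$ with $p(T)$, for $p$ the Lagrange interpolant of $f$, can be verified numerically as well. The sketch below (again my own illustration, with hypothetical eigenvalues) evaluates the interpolation formula at the matrix $T$ instead of at a scalar:

```python
# Sketch: evaluating the Lagrange interpolant of f at T reproduces
# phi(f) = diag(f(lambda_1), ..., f(lambda_n)).
import numpy as np

eigs = np.array([1.0, 2.0, 5.0])   # hypothetical eigenvalues lambda_i
T = np.diag(eigs)                  # T in its eigenbasis
f = lambda x: x**2 + 3

# p(T) = sum_y f(y) * prod_{z != y} (T - z*I) / (y - z)
p_T = np.zeros_like(T)
for y in eigs:
    term = f(y) * np.eye(3)
    for z in eigs:
        if z != y:
            term = term @ (T - z * np.eye(3)) / (y - z)
    p_T = p_T + term

assert np.allclose(p_T, np.diag(f(eigs)))   # p(T) agrees with phi(f)
```

Each factor $(T - z\,\mathrm{id})$ kills the eigenspace of $z$, so the $y$-th summand acts as $f(y)$ on the eigenspace of $y$ and as $0$ elsewhere, exactly as in the scalar formula.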

To see that $\varphi$ is injective, let $f:\mathrm{Spec}(T)\to K$ be a function with $\varphi(f)=0$. Then $\varphi(f)v=0$ for all vectors $v\in V$; in particular, for each of the basis elements $v_i$ from above you have

$$\varphi(f)v_i = f(\lambda_i)v_i \overset{!}{=} 0$$

and then $f(\lambda_i)=0$ for all $i$. This means $f$ is the constant function $0$ on all of $\mathrm{Spec}(T)$, so $\ker(\varphi) = \{0\}$ and $\varphi$ is injective.

