Conjugate / Transpose - Matrix
I would appreciate a detailed solution. I know it isn't a hard exercise, but I tend to make mistakes and not see them, and I honestly fear losing points. Therefore I am asking you for help.
Thank you very much.
Answer
1. We will use the following two properties of the transpose:
(i) Linearity: For all $r\in \Bbb R$, $x,y\in M_n(\Bbb R)$ one has $(rx+y)^t= rx^t+y^t$.
(ii) Anti-multiplicativity: For all $x,y\in M_n(\Bbb R)$ one has $(xy)^t=y^tx^t$.
We use (ii) to get:
(iii) For all $x\in M_n(\Bbb R)$, $k\in\Bbb N$ one has $(x^k)^t= (x^t)^k$.
As a proof we use induction on $k$. It is obviously true for $k=0$ (recall that $x^0$ is the identity matrix). For the induction step, assume it holds for some $k$; then:
$$(x^{k+1})^t = (x^k\cdot x)^t \overset{(ii)}= x^t(x^k)^t \overset{IH}= x^t (x^t)^k = (x^{t})^{k+1}$$
Now let $f[x]=\sum_{k=0}^N f_k\ x^k$ be a polynomial. By using (i) and (iii) we get:
$$f[a]^t = \left(\sum_{k=0}^Nf_k \ a^k\right)^t \overset{(i)}= \sum_{k=0}^N f_k\ (a^k)^t \overset{(iii)}= \sum_{k=0}^N f_k\ (a^t)^k = f[a^t]$$
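If you want to double-check this identity numerically before writing it up, here is a small sketch in Python/NumPy (the helper `poly_eval` and the sample coefficients are my own choices, not part of the exercise) comparing $f[a]^t$ with $f[a^t]$ for a random matrix:

```python
# Sanity check of part 1: f[a]^t == f[a^t] for a random matrix a.
import numpy as np

def poly_eval(coeffs, a):
    """Evaluate f[a] = sum_k coeffs[k] * a^k, with a^0 the identity matrix."""
    result = np.zeros_like(a)
    power = np.eye(a.shape[0])
    for c in coeffs:
        result = result + c * power
        power = power @ a
    return result

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4))
f_coeffs = [2.0, -1.0, 0.5, 3.0]        # f[x] = 2 - x + 0.5 x^2 + 3 x^3

lhs = poly_eval(f_coeffs, a).T          # f[a]^t
rhs = poly_eval(f_coeffs, a.T)          # f[a^t]
print(np.allclose(lhs, rhs))            # True
```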
2. The proof is essentially the same. Conjugation by an invertible $c\in M_n(\Bbb R)$ is:
(i) Linear: For all $r\in \Bbb R$, $x,y\in M_n(\Bbb R)$ one has $c(rx+y)c^{-1}= r\,cxc^{-1} + cyc^{-1}$.
(ii) Multiplicative: For all $x,y\in M_n(\Bbb R)$ one has $c(xy)c^{-1}= (cxc^{-1})(cyc^{-1})$.
Then by doing the same induction proof as before you get:
(iii) For all $x\in M_n(\Bbb R), k\in\Bbb N$ one has $c(x^k)c^{-1}=(cxc^{-1})^k$.
So for a polynomial $f[x]$ you have:
$$cf[a]c^{-1} = c\left(\sum_{k=0}^Nf_k\ a^k\right)c^{-1}\overset{(i)}= \sum_{k=0}^N f_k \ (ca^kc^{-1}) \overset{(iii)}= \sum_{k=0}^N f_k\ (cac^{-1})^k = f[cac^{-1}]$$
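The same kind of numerical spot-check works here (reusing the `poly_eval` helper from the sketch in part 1; the random matrices are my own choice, and a matrix drawn this way is invertible with probability 1):

```python
# Sanity check of part 2: c f[a] c^{-1} == f[c a c^{-1}].
# (assumes numpy imported and poly_eval defined as in the part 1 sketch)
rng = np.random.default_rng(1)
a = rng.standard_normal((4, 4))
c = rng.standard_normal((4, 4))              # generically invertible
c_inv = np.linalg.inv(c)
f_coeffs = [1.0, 2.0, -3.0]                  # f[x] = 1 + 2x - 3x^2

lhs = c @ poly_eval(f_coeffs, a) @ c_inv     # c f[a] c^{-1}
rhs = poly_eval(f_coeffs, c @ a @ c_inv)     # f[c a c^{-1}]
print(np.allclose(lhs, rhs))                 # True
```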
3. Here we use the result from 2. Recall that the minimal polynomial $M_x$ of $x\in M_n(\Bbb R)$ is the monic polynomial of minimal degree such that $M_x[x]=0$.
We will show that $M_a(cac^{-1})=0=M_{cac^{-1}}(a)$ and that $\deg(M_a)=\deg(M_{cac^{-1}})$. This implies that $M_a$ is a monic polynomial with $M_a(cac^{-1})=0$ whose degree is minimal among all polynomials $f$ satisfying $f(cac^{-1})=0$, i.e. that $M_a$ is the minimal polynomial of $cac^{-1}$, meaning
$$M_a=M_{cac^{-1}}$$
Indeed, one has:
$$M_a(cac^{-1}) \overset{2.}= cM_a(a)c^{-1}=c\cdot 0\cdot c^{-1}=0$$
This also implies $\deg(M_a)≥\deg(M_{cac^{-1}})$ as $M_{cac^{-1}}$ has minimal degree among all polynomials annihilating $cac^{-1}$. Similarly:
$$M_{cac^{-1}}(a) = c^{-1}\left(c\,M_{cac^{-1}}(a)\,c^{-1}\right)c\overset{2.}= c^{-1}\,M_{cac^{-1}} (cac^{-1})\,c = c^{-1}\cdot 0\cdot c =0$$
and one gets in the same fashion $\deg(M_{cac^{-1}})≥\deg(M_a)$.
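Finally, a concrete illustration of 3. (again reusing `poly_eval`; the example matrix $a=\operatorname{diag}(1,1,2)$, whose minimal polynomial is $(x-1)(x-2)=x^2-3x+2$, is my own choice): $M_a$ annihilates both $a$ and $cac^{-1}$, in line with $M_a = M_{cac^{-1}}$.

```python
# Illustration of part 3: M_a also annihilates c a c^{-1}.
# (assumes numpy imported and poly_eval defined as in the part 1 sketch)
rng = np.random.default_rng(2)
a = np.diag([1.0, 1.0, 2.0])              # minimal polynomial (x-1)(x-2)
c = rng.standard_normal((3, 3))           # generically invertible
b = c @ a @ np.linalg.inv(c)              # b = c a c^{-1}

m_a = [2.0, -3.0, 1.0]                    # M_a[x] = 2 - 3x + x^2
print(np.allclose(poly_eval(m_a, a), 0))  # True: M_a(a) = 0
print(np.allclose(poly_eval(m_a, b), 0))  # True: M_a(c a c^{-1}) = 0
```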