Find eigenvalues and eigenvectors of the matrix $$\begin{pmatrix} 1 & 6 & 0 \\ 0 & 2 & 1 \\ 0 & 1 & 2 \end{pmatrix}. $$
Please show work.
Answer
(i) To find the eigenvalues we solve
\[\det(A-\lambda I)=0.\]
Hence
\[\begin{vmatrix} 1-\lambda & 6 & 0 \\ 0 & 2-\lambda & 1 \\ 0 & 1 & 2-\lambda \end{vmatrix}=(1-\lambda) \begin{vmatrix} 2-\lambda & 1 \\ 1& 2-\lambda \end{vmatrix} \]
\[=(1-\lambda)((2-\lambda)(2-\lambda)-1)=(1-\lambda)(\lambda^2-4\lambda+3)\]
\[=(1-\lambda)(1-\lambda)(3-\lambda)=0\]
\[\Rightarrow \lambda_1=\lambda_2=1, \lambda_3=3.\]
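As a quick numerical sanity check (a sketch using NumPy, not part of the hand computation), one can confirm these eigenvalues:

```python
import numpy as np

# The matrix from the question
A = np.array([[1, 6, 0],
              [0, 2, 1],
              [0, 1, 2]], dtype=float)

# Expected eigenvalues: 1 (with multiplicity 2) and 3
print(np.sort(np.linalg.eigvals(A)))  # approximately [1. 1. 3.]
```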
(ii) Next we find the eigenvectors for $\lambda_1=\lambda_2=1$.
\[(A-I)v_1=0 \Rightarrow\]
\[\begin{pmatrix} 0 & 6 & 0 & 0 \\ 0 & 1 & 1 &0 \\ 0 & 1 & 1 & 0 \end{pmatrix} \]
$-R_2+R_3 \rightarrow R_3$ gives
\[\begin{pmatrix} 0 & 6 & 0 & 0 \\ 0 & 1 & 1 &0 \\ 0 & 0 & 0 & 0 \end{pmatrix}. \]
Hence
\[6y=0 \quad\text{and}\quad y+z=0 \Rightarrow y=z=0.\]
$x$ is a free variable, and we can take $x=1$. So there is only one linearly independent eigenvector
\[v= \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}. \]
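A similar quick check (a sketch, assuming the same matrix $A$ as above) confirms that $Av = 1\cdot v$:

```python
import numpy as np

A = np.array([[1, 6, 0],
              [0, 2, 1],
              [0, 1, 2]], dtype=float)
v = np.array([1, 0, 0], dtype=float)

# v is an eigenvector for λ = 1 precisely when A v = 1 · v
print(np.allclose(A @ v, 1 * v))  # True
```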
(iii) We can similarly compute the eigenvector for $\lambda_3=3$.
\[(A-3I)v_3=0 \Rightarrow\]
\[\begin{pmatrix} -2 & 6 & 0 & 0 \\ 0 & -1 & 1 &0 \\ 0 & 1 & -1 & 0 \end{pmatrix} \]
$R_2+R_3 \rightarrow R_3$ gives
\[\begin{pmatrix} -2 & 6 & 0 & 0 \\ 0 & -1 & 1 &0 \\ 0 & 0 & 0 & 0 \end{pmatrix}. \]
Hence
\[x=3y \quad\text{and}\quad y=z.\]
$z$ is a free variable, and we can take $z=1$, so $y=1$ and $x=3$. This gives the eigenvector
\[v_3= \begin{pmatrix} 3\\ 1 \\ 1\end{pmatrix}. \]
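And the same kind of check for $\lambda_3=3$, verifying $Av_3 = 3v_3$:

```python
import numpy as np

A = np.array([[1, 6, 0],
              [0, 2, 1],
              [0, 1, 2]], dtype=float)
v3 = np.array([3, 1, 1], dtype=float)

# v3 is an eigenvector for λ = 3 precisely when A v3 = 3 · v3
print(np.allclose(A @ v3, 3 * v3))  # True
```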
