**1.** The orthogonal projection onto the orthogonal complement of $\vec{n}$ is given by $\pi(\vec{v}) = \vec{v} - \langle \vec{v}, \vec{n} \rangle \vec{n}$, where $\vec{n} = (x, y, z)^T$ is a unit vector (so $\langle \vec{n}, \vec{n} \rangle = 1$), and the corresponding matrix is \[ \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} - \begin{pmatrix} x^2 & xy & xz \\ xy & y^2 & yz \\ xz & yz & z^2 \end{pmatrix} = \begin{pmatrix} 1- x^2 & -xy & -xz \\ -xy & 1-y^2 & -yz \\ -xz & -yz & 1-z^2 \end{pmatrix} \]
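As a quick numerical sanity check (a numpy sketch; the particular normal vector is an arbitrary choice of ours), the matrix $I - \vec{n}\vec{n}^T$ should kill $\vec{n}$, be idempotent, and be symmetric, as any orthogonal projection must be:

```python
import numpy as np

# An arbitrary normal, normalised so that P = I - n n^T is the
# projection onto the plane orthogonal to n.
n = np.array([1.0, 2.0, 2.0])
n /= np.linalg.norm(n)

P = np.eye(3) - np.outer(n, n)

print(np.allclose(P @ n, 0))   # P kills the normal direction -> True
print(np.allclose(P @ P, P))   # idempotent -> True
print(np.allclose(P, P.T))     # symmetric -> True
```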

**a-b. **These two statements are false. Take for example the matrix \[ A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \end{pmatrix}. \] This is the rotation by $\pi$ around $[1,0,0]^T$. We have $A = A^T = A^{-1}$ and $\det(A) = 1$, but $A$ is not the identity matrix, and its eigenvalues $1, -1, -1$ are real but not all equal to $1$.
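The counterexample can be verified directly (a numpy sketch):

```python
import numpy as np

# The counterexample above: rotation by pi around the x-axis.
A = np.diag([1.0, -1.0, -1.0])

print(np.allclose(A, A.T))                # A = A^T -> True
print(np.allclose(A @ A, np.eye(3)))      # A = A^{-1} -> True
print(np.isclose(np.linalg.det(A), 1.0))  # det A = 1 -> True
print(np.allclose(A, np.eye(3)))          # but A is not I -> False
print(np.linalg.eigvals(A))               # eigenvalues 1, -1, -1
```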

We can, however, say that $A$ must have at least one eigenvalue equal to $1$. Indeed, the eigenvalues of $A$ and $A^T$ are the same, and the eigenvalues of $A^{-1}$ are the inverses of those of $A$, so if the eigenvalues are $a, b, c$, then $\{a, b, c\} = \{a^{-1}, b^{-1}, c^{-1}\}$ as sets. If each of them is equal to its own inverse, then each must be $1$ or $-1$; since $abc = 1$, they can't all be $-1$, so one of them must equal $1$. Otherwise, assume without loss of generality that $c = b^{-1}$; then $abc = abb^{-1} = a$, but $abc = 1$, so $a = 1$.

**c. **This follows from the previous point: since $\det A = 1$, we have $abc = 1$. We showed that we may assume $a = 1$, so $bc = 1$ as well. If $b$ and $c$ are real, each is its own inverse, hence $\pm 1$, and both have modulus $1$. Otherwise $b$ and $c$ must be complex conjugates, so $c = \overline{b}$ and therefore $b \overline{b} = |b|^2 = 1$.
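Both eigenvalue facts can be checked numerically on a generic rotation (a numpy sketch; the rotation is an arbitrary composition of rotations about the $z$- and $x$-axes, and the angles are our choice):

```python
import numpy as np

# A generic rotation, built by composing rotations about z and x
# (any product of rotations is again a rotation).
t, s = 0.4, 0.7
Rz = np.array([[np.cos(t), -np.sin(t), 0],
               [np.sin(t),  np.cos(t), 0],
               [0, 0, 1]])
Rx = np.array([[1, 0, 0],
               [0, np.cos(s), -np.sin(s)],
               [0, np.sin(s),  np.cos(s)]])
A = Rx @ Rz

ev = np.linalg.eigvals(A)
print(np.isclose(ev, 1).sum() >= 1)  # at least one eigenvalue is 1 -> True
print(np.allclose(np.abs(ev), 1))    # all eigenvalues have modulus 1 -> True
print(np.isclose(np.prod(ev), 1))    # abc = det A = 1 -> True
```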

**d. **There is no question here, right?

**e. **Let $h \in H$. By definition, $\langle h, u \rangle = 0$. We have \[ \langle Ah, u \rangle = \langle h, A^T u \rangle = \langle h, A^{-1} u \rangle = \langle h, u \rangle = 0 \] (since $Au = u$, then $A^{-1}u = u$ as well), so $Ah \in \langle u \rangle^\perp = H$.

Note that there is a mistake in the text: it writes $\{u\}$ where it should write $\langle u \rangle$ (the linear subspace spanned by $u$).
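The invariance of $H$ can also be checked numerically (a numpy sketch; the axis $u$, the angle, and Rodrigues' rotation formula used to build $A$ are choices of ours, not part of the exercise):

```python
import numpy as np

# A rotation about a known unit axis u, built with Rodrigues' formula
# A = I + sin(t) K + (1 - cos(t)) K^2, where K v = u x v.
u = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
t = 0.9
K = np.array([[0, -u[2], u[1]],
              [u[2], 0, -u[0]],
              [-u[1], u[0], 0]])
A = np.eye(3) + np.sin(t) * K + (1 - np.cos(t)) * (K @ K)

h = np.array([1.0, -1.0, 0.0])     # h . u = 0, so h lies in H
print(np.isclose(h @ u, 0))        # True
print(np.allclose(A @ u, u))       # the axis is fixed: Au = u -> True
print(np.isclose((A @ h) @ u, 0))  # Ah is still orthogonal to u -> True
```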

**f. **Since $A$ is a real matrix, the same argument as in e. applies; just replace $H$ with $K$. The text contains the same mistake here as well.

**g. **We have \[ \lambda (w_1 + i w_2) = (\cos \theta + i \sin \theta) (w_1 + i w_2) = (\cos \theta w_1 - \sin \theta w_2) + i (\sin \theta w_1 + \cos \theta w_2) \] which is exactly the result of the matrix multiplication \[ \begin{pmatrix} \cos \theta & - \sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} \begin{pmatrix} w_1 \\ w_2 \end{pmatrix} \] as desired.
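A numerical illustration of this computation (a numpy sketch with an arbitrary rotation): splitting a non-real eigenvector into its real and imaginary parts recovers exactly the $2 \times 2$ rotation block above:

```python
import numpy as np

# A rotation with a non-real eigenvalue lambda = cos(t) + i sin(t).
t = 0.9
A = np.array([[np.cos(t), -np.sin(t), 0],
              [np.sin(t),  np.cos(t), 0],
              [0, 0, 1]])

lams, vecs = np.linalg.eig(A)
k = np.argmax(np.abs(lams.imag))   # pick a non-real eigenvalue
lam, w = lams[k], vecs[:, k]
w1, w2 = w.real, w.imag            # w = w1 + i w2
c, s = lam.real, lam.imag          # cos(theta), sin(theta)

# A acts on (w1, w2) by the 2x2 block [[c, -s], [s, c]]:
print(np.allclose(A @ w1, c * w1 - s * w2))  # True
print(np.allclose(A @ w2, s * w1 + c * w2))  # True
```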

**2.** **a. **The columns of the matrix are given by $\vec{u} \times \vec{i}$, $\vec{u} \times \vec{j}$, and $\vec{u} \times \vec{k}$. Let us compute the first one, taking $\vec{v} = \vec{i} = [1,0,0]^T$. The three $2 \times 2$ minors are $0, u_3, u_2$ respectively, so with the cofactor signs $\vec{u} \times \vec{i} = (0, -u_3, u_2)$. By the same argument, $\vec{u} \times \vec{j} = (u_3, 0, -u_1)$ and $\vec{u} \times \vec{k} = (-u_2, u_1, 0)$, so the corresponding matrix is \[ \begin{pmatrix} 0 & u_3 & -u_2 \\ - u_3 & 0 & u_1 \\ u_2 & -u_1 & 0 \end{pmatrix} \]
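A quick check of the matrix just derived (a numpy sketch with arbitrary vectors): since this matrix carries the opposite sign to the conventional $[u]_\times$, applying it to $v$ gives what numpy computes as `np.cross(v, u)`:

```python
import numpy as np

def cross_matrix(u):
    """The matrix derived in a. Note it is the negative of the
    conventional cross-product matrix, so it computes v x u in
    numpy's convention."""
    u1, u2, u3 = u
    return np.array([[0,   u3, -u2],
                     [-u3,  0,  u1],
                     [u2, -u1,   0]])

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 4.0, 0.5])

print(np.allclose(cross_matrix(u) @ v, np.cross(v, u)))  # True
print(np.allclose(cross_matrix(u).T, -cross_matrix(u)))  # antisymmetric -> True
```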

Note that the formula as given above has the opposite sign with respect to the conventional formula for $u \times v$, which is usually written with $u$ in the second row and $v$ in the third row; see for example

https://en.wikipedia.org/wiki/Cross_product

**b. **Let $[u]$ denote the matrix corresponding to the cross product on the left by $u$, that is, $[u]v = u \times v$. Then \[ w \times (u \times v) = w \times ([u]v) = [w][u]v \] but \[ (w \times u) \times v = ([w]u) \times v = [[w]u]v \] and $[w][u] \neq [[w]u]$ in general: for example, if $w = u$ then $w \times u = 0$, so $[[w]u] = 0$, while $[w][u] = [u]^2 \neq 0$, since $[u]$ has rank $2$ for each $u \neq 0$ and so $[u]^2$ has rank at least $1$ (in particular, it is not identically $0$).
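A concrete numerical instance of the failure of associativity, following the argument above with $w = u$ (a numpy sketch with arbitrary vectors; numpy's conventional sign is used, which does not affect the point):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 4.0, 0.5])

lhs = np.cross(u, np.cross(u, v))  # u x (u x v)
rhs = np.cross(np.cross(u, u), v)  # (u x u) x v = 0 x v = 0

print(np.allclose(rhs, 0))         # True: u x u = 0 kills everything
print(np.allclose(lhs, 0))         # False: u x (u x v) is nonzero
print(np.allclose(lhs, rhs))       # False: the product is not associative
```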

**c. **By writing $w = w_1 \vec{i} + w_2 \vec{j} + w_3 \vec{k}$ and recalling that $\vec{i}, \vec{j}, \vec{k}$ are orthonormal, we have \[ \langle w, u \times v \rangle = w_1 \cdot (-1)^{1+1} \cdot \det \begin{pmatrix} v_2 & v_3 \\ u_2 & u_3 \end{pmatrix} + w_2 \cdot (-1)^{1+2} \cdot \det \begin{pmatrix} v_1 & v_3 \\ u_1 & u_3 \end{pmatrix} + w_3 \cdot (-1)^{1+3} \cdot \det \begin{pmatrix} v_1 & v_2 \\ u_1 & u_2 \end{pmatrix} \] which is exactly the formula for the determinant of the given matrix computed with the cofactor expansion.

Now $u \perp u \times v$ and $v \perp u \times v$ follow immediately, as $\langle u, u \times v \rangle$ is the determinant of a matrix with two identical rows, which is $0$, and the same goes for $\langle v, u \times v \rangle$.
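Both the determinant identity and the two orthogonality relations can be checked numerically (a numpy sketch with arbitrary vectors; the rows are ordered $(w, v, u)$ to match the sign convention used throughout this exercise):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 4.0, 0.5])
w = np.array([2.0, -0.5, 1.0])

uxv = np.cross(v, u)  # "u x v" in the text's sign convention

# <w, u x v> equals the determinant with rows w, v, u ...
print(np.isclose(w @ uxv, np.linalg.det(np.array([w, v, u]))))  # True
# ... and repeating a row gives 0, hence the orthogonality relations:
print(np.isclose(u @ uxv, 0))  # True
print(np.isclose(v @ uxv, 0))  # True
```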