Rank, Range, Critical Values, Preimage, and Integral of Differential Forms
Consider the mapping $F: \R^4\rightarrow\R^2,$
where $F_1:=x_1y_1+x_2y_2$ and $F_2:=x_1y_2-x_2y_1$.
(a) Find the points in $\R^4$ where the rank of $dF=(y_1dx_1+x_1dy_1+y_2dx_2+x_2dy_2,\ y_2dx_1-x_2dy_1-y_1dx_2+x_1dy_2)$ is less than or equal to 1.
(Note: I calculated $dF$ and I believe the answer is only $(0,0,0,0)$.)
(b) What is the range of $F$, and what are the critical values (the image of the points in (a))?
(Note: I believe the range is all of $\R^2$ and, assuming my answer to part (a) is correct, the only critical value is $(0,0)$.)
(c) Show that the preimage $F^{-1}(p)$ of the point $p=(0,1)\in\R^2$ is a smooth connected surface $S$ in $\R^4$ and compute the restriction of the form
$$dx_1\wedge dy_1 + dx_2\wedge dy_2$$
to $S$.
(d) Compute the integral
$$\oint_\gamma x_1dy_1+x_2dy_2$$
where $\gamma:=\lbrace (\sin t,\sin t,\cos t,\cos t)\mid t\in[0,2\pi]\rbrace$.
(e) What is the preimage $F^{-1}(p)$ of the point $p=(0,0)$?
Answer
I'll write this in a slightly different way: view $\mathbb{R}^4$ as $M_{2}(\mathbb{R})$, the set of $2\times 2$ matrices, by sending $(x_1, x_2, y_1, y_2)$ to the matrix $\begin{pmatrix} x_1 & y_1 \\ x_2 & y_2 \end{pmatrix}$. Then the map $F: M_2(\mathbb{R}) \to \mathbb{R}^2$ is given by $F(M) = (a(M), \det(M))$, where $a(M)$ is the dot product of the columns of $M$. This simplifies the arguments.
a.) The Jacobian of $a$ (ordering the coordinates as $(x_1, x_2, y_1, y_2)$) is just $\begin{pmatrix} y_1 & y_2 & x_1 & x_2 \end{pmatrix}$, and the Jacobian of $\det(M)$ is $\begin{pmatrix} y_2 & -y_1 & -x_2 & x_1 \end{pmatrix}$. Thus the Jacobian of $F$ is simply $\begin{pmatrix} y_1 & y_2 & x_1 & x_2 \\ y_2 & -y_1 & -x_2 & x_1\end{pmatrix}$. The block on the left-hand side, $\begin{pmatrix} y_1 & y_2 \\ y_2 & -y_1 \end{pmatrix}$, has determinant $-(y_1^2 + y_2^2)$, and the block on the right-hand side analogously has determinant $x_1^2 + x_2^2$. If either of these is nonzero the matrix has rank 2, so the Jacobian has rank less than 2 only when both quantities are zero. Since the various $x_i, y_i$ are real, their squares are nonnegative and $y_1^2 + y_2^2 = x_1^2 + x_2^2 = 0$ forces every coordinate to vanish, so this is only possible when $M = 0$, as you had already worked out.
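If you want to sanity-check the rank computation numerically, here is a short Python sketch (my own addition, not part of the argument; the function name `jacobian` is mine):

```python
import numpy as np

def jacobian(x1, x2, y1, y2):
    # Jacobian of F = (x1*y1 + x2*y2, x1*y2 - x2*y1) w.r.t. (x1, x2, y1, y2)
    return np.array([[y1, y2, x1, x2],
                     [y2, -y1, -x2, x1]], dtype=float)

rng = np.random.default_rng(0)
for _ in range(1000):
    p = rng.normal(size=4)
    # full rank at (almost surely nonzero) random points
    assert np.linalg.matrix_rank(jacobian(*p)) == 2

# the rank drops only at the origin, where the Jacobian is the zero matrix
assert np.linalg.matrix_rank(jacobian(0, 0, 0, 0)) == 0
print("rank checks pass")
```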
b.) Surjectivity follows by noting that if $M(a, b) = \begin{pmatrix} 1 & a \\ 0 & b \end{pmatrix}$ then $F(M(a, b)) = (a, b)$, which can clearly take any value in $\mathbb{R}^2$. As you say, $(0, 0) = F(0, 0, 0, 0)$ is clearly the only critical value.
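A two-line check of this surjectivity witness (my own addition; note that $M(a,b)$ corresponds to $(x_1, x_2, y_1, y_2) = (1, 0, a, b)$ under the identification above):

```python
def F(x1, x2, y1, y2):
    # F in coordinates: (dot product of the columns, determinant)
    return (x1*y1 + x2*y2, x1*y2 - x2*y1)

# M(a, b) = [[1, a], [0, b]] corresponds to (x1, x2, y1, y2) = (1, 0, a, b)
assert F(1, 0, 2.5, -3.0) == (2.5, -3.0)
assert F(0, 0, 0, 0) == (0, 0)  # the image of the single critical point
print("F(M(a, b)) = (a, b) on sample inputs")
```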
c.) Because $(0, 1)$ is a regular value of $F$, the preimage is a smooth $2$-dimensional manifold by the implicit function theorem, or alternatively the preimage (regular value) theorem.
One can see this explicitly on the level of matrices: $F(M) = (0, 1)$ if and only if $\det(M) = 1$ and $a(M) = 0$, i.e. if and only if $M$ is a matrix of unit determinant with orthogonal columns. Such matrices are exactly those of the form $$\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \cdot \begin{pmatrix} a & 0 \\ 0 & a^{-1} \end{pmatrix}, \qquad \theta \in [0, 2\pi),\ a \in \mathbb{R}_{> 0}.$$ Indeed, orthogonality lets us write the columns as $a u_1$ and $b u_2$ with $u_1, u_2$ orthonormal; unit determinant then forces $ab \cdot \det(u_1 \mid u_2) = 1$, so the product of the column lengths is $1$ and $(u_1 \mid u_2)$ is a rotation rather than a reflection; and every $2\times 2$ rotation matrix is $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ for some $\theta$, from which we easily derive the above form. We can take $a$ to be positive because $\begin{pmatrix} \cos(\theta + \pi) & -\sin(\theta + \pi) \\ \sin(\theta + \pi) & \cos(\theta + \pi) \end{pmatrix} = -\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ for any $\theta$ (taking $\theta + \pi$ modulo $2\pi$), which makes negative values of $a$ unnecessary. This parametrization also shows that $F^{-1}(0, 1)$ is connected: we have exhibited a continuous surjection from the connected set $[0, 2\pi) \times \mathbb{R}_{> 0}$ onto $F^{-1}(0, 1)$.
In the above coordinates $$x_1 = a\cos\theta, \quad x_2 = a\sin\theta, \quad y_1 = -\frac{\sin\theta}{a}, \quad y_2 = \frac{\cos\theta}{a},$$ thus differentiating we get that:
$$dx_1 = d(a\cos\theta) = -a\sin\theta\, d\theta + \cos\theta\, da$$
$$dx_2 = a\cos\theta\, d\theta + \sin\theta\, da$$
and
$$dy_1 = d\!\left(-\frac{\sin\theta}{a}\right) = -\frac{1}{a}\left(\cos\theta\, d\theta - \frac{\sin\theta}{a}\, da\right)$$
$$dy_2 = -\frac{1}{a}\left(\sin\theta\, d\theta + \frac{\cos\theta}{a}\, da\right).$$
Taking wedges we see that $$dx_1 \wedge dy_1 = -\frac{1}{a}\left(\sin^2\theta\, d\theta \wedge da + \cos^2\theta\, da \wedge d\theta\right)$$
whereas
$$dx_2 \wedge dy_2 = -\frac{1}{a}\left(\cos^2\theta\, d\theta \wedge da + \sin^2\theta\, da \wedge d\theta\right).$$
Adding up we get $dx_1 \wedge dy_1 + dx_2 \wedge dy_2 = -\frac{1}{a}\left(d\theta \wedge da + da \wedge d\theta\right) = 0$ on $S$.
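If you'd like to double-check both facts numerically (that the parametrization lands in $F^{-1}(0,1)$ and that the restricted $2$-form vanishes), here is a short sketch; it is my own addition, the names `phi` and `pullback_coeff` are mine, and the partial derivatives are approximated by finite differences:

```python
import numpy as np

def phi(theta, a):
    # parametrization of S = F^{-1}(0, 1), returning (x1, x2, y1, y2)
    return np.array([a*np.cos(theta), a*np.sin(theta),
                     -np.sin(theta)/a, np.cos(theta)/a])

def F(p):
    x1, x2, y1, y2 = p
    return np.array([x1*y1 + x2*y2, x1*y2 - x2*y1])

def pullback_coeff(theta, a, h=1e-6):
    # coefficient of dtheta ^ da in the pullback of dx1^dy1 + dx2^dy2,
    # computed by central finite differences of phi
    d_th = (phi(theta + h, a) - phi(theta - h, a)) / (2*h)
    d_a = (phi(theta, a + h) - phi(theta, a - h)) / (2*h)
    x1t, x2t, y1t, y2t = d_th
    x1a, x2a, y1a, y2a = d_a
    return (x1t*y1a - x1a*y1t) + (x2t*y2a - x2a*y2t)

rng = np.random.default_rng(1)
for _ in range(100):
    theta, a = rng.uniform(0, 2*np.pi), rng.uniform(0.5, 3.0)
    assert np.allclose(F(phi(theta, a)), [0.0, 1.0])  # phi lands on S
    assert abs(pullback_coeff(theta, a)) < 1e-6       # the 2-form vanishes on S
print("parametrization and vanishing of the restricted 2-form confirmed")
```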
d.) Make the substitution $(x_1, y_1, x_2, y_2) = \gamma(t)$ and one gets that the integral is equal to $\int_{0}^{2\pi} \sin t \, d(\sin t) + \cos t \, d(\cos t) = \int_0^{2\pi} (\sin t\cos t - \cos t\sin t)\, dt = 0$, which is predictable since the form in question is exact when restricted to $\lbrace (x, x, y, y)\rbrace \subset \mathbb{R}^4$.
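The line integral is also easy to confirm numerically; a minimal sketch (my own addition, using the coordinate ordering $(x_1, y_1, x_2, y_2) = \gamma(t)$ as above and the trapezoid rule):

```python
import numpy as np

t = np.linspace(0.0, 2*np.pi, 10001)
x1, y1, x2, y2 = np.sin(t), np.sin(t), np.cos(t), np.cos(t)
# integrand of x1 dy1 + x2 dy2 along gamma: dy1/dt = cos t, dy2/dt = -sin t
integrand = x1*np.cos(t) + x2*(-np.sin(t))
integral = np.sum(0.5*(integrand[:-1] + integrand[1:])*np.diff(t))  # trapezoid rule
assert abs(integral) < 1e-9
print("integral over gamma:", integral)
```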
e.) Using our previous coordinates, $F^{-1}(0,0)$ consists of those $2\times 2$ matrices with determinant zero and orthogonal columns. However, if a $2\times 2$ matrix has determinant zero then its columns are multiples of each other, and thus can never be orthogonal unless one of them is zero. Thus $F^{-1}(0, 0)$ is the set of $2\times 2$ matrices with either $x_1 = x_2 = 0$ or $y_1 = y_2 = 0$ (or both), which is the union of two perpendicular planes in $4$-dimensional space, meeting only at the origin.
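A quick coordinate-level spot check of this description (my own addition; `F` is just the map written out in the $(x_1, x_2, y_1, y_2)$ variables):

```python
def F(x1, x2, y1, y2):
    # F in coordinates: (dot product of the columns, determinant)
    return (x1*y1 + x2*y2, x1*y2 - x2*y1)

# points on either plane map to (0, 0)...
assert F(0, 0, 3.0, -1.0) == (0, 0)   # plane x1 = x2 = 0
assert F(2.0, 5.0, 0, 0) == (0, 0)    # plane y1 = y2 = 0
# ...while a generic point off both planes does not
assert F(1.0, 0, 0, 1.0) == (0, 1)
print("spot checks for part (e) pass")
```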

Looking back on it, I suppose you were meant to use Stokes' theorem and part c.) for part d.), but it seems a bit silly to do so since the integral is already quite simple.

Also, if you're curious and know a bit about complex numbers: if $z = x_1 + ix_2$ and $w = y_2 + iy_1$, then $zw = (x_1y_2 - x_2y_1) + i(x_1y_1 + x_2y_2)$, so after identifying the target $\mathbb{R}^2$ with $\mathbb{C}$ via $(F_1, F_2) \mapsto F_2 + iF_1$, the map $F$ is better interpreted as a map $F: \mathbb{C}^2 \to \mathbb{C}$ given simply by $F(z, w) = zw$. This explains a lot about what is going on in the geometric parts of this problem, but I figured you'd be more likely to be familiar with linear algebra.
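The complex-multiplication identity is easy to verify on random inputs (my own sanity check, using Python's built-in `complex` type):

```python
import random

# check z*w = (x1*y2 - x2*y1) + i*(x1*y1 + x2*y2) for z = x1 + i*x2, w = y2 + i*y1
random.seed(0)
for _ in range(100):
    x1, x2, y1, y2 = (random.uniform(-5, 5) for _ in range(4))
    z, w = complex(x1, x2), complex(y2, y1)
    assert abs((z*w).real - (x1*y2 - x2*y1)) < 1e-9
    assert abs((z*w).imag - (x1*y1 + x2*y2)) < 1e-9
print("complex identity checked on 100 random points")
```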

Thanks for the answer! I will look through it tomorrow.