Finding Probability Density Function of a Standard Brownian motion: Conditioning for two different cases

I attached a PDF of the problem. It is mostly algebra-heavy. I made some mistakes on part (a), and my approach to part (b) was incorrect, leaving a mess of unnecessary algebra all over.

I would greatly appreciate if both parts a and b were solved with details and reasoning, especially for the steps that involve Stochastic Calculus arguments. Please let me know if further detail must be provided. Thank you!

  • I believe there is a typo in the problem. If t1 < t2, it only makes sense to ask what is the distribution of W_t2 given W_t1. Please double check this.

  • Hi Rage, I believe you are correct. To be sure, I asked my professor about whether or not it is a typo on his end. I will update as soon as I get a response. Thank you!

Answer

Part (a):

In general, given two random variables $X, Y$ with joint density $f(x, y)$, the conditional density $g_y (x)$ of $X$ given $Y = y$ is given by

$$g_y (x) := \frac{f(x, y)}{h(y)}.$$
where $h$ is the probability density function of $Y$.

(See, for instance, Baldi's Stochastic Calculus, Chapter 4.)

Thus for Part (a), it suffices to compute the density of $W_{t_2}$, as well as the joint density of $W_{t_1}$ and $W_{t_2}$.

Recall that Brownian motion increments $W_t - W_s$ are independent and normally distributed, with mean $0$ and variance $t - s$. 

Since by definition $W_0 = 0$ a.s., we can write $W_{t_2} = W_{t_2} - W_0$, and hence $W_{t_2}$ is normally distributed with mean $0$ and variance $t_2$. Thus it has density

$$f_{W_{t_2}} (y) = \frac{1}{ \sqrt {2\pi t_2}} e^{-\frac{y^2}{2 t_2}}.$$
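As a quick sanity check, one can build Brownian paths as cumulative sums of independent increments and verify that the terminal value has mean $0$ and variance $t_2$. This is only a Monte Carlo sketch (Python with numpy assumed available; the horizon, step count, and sample size are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
t2 = 3.0          # illustrative horizon t_2
steps = 300       # path discretization
n_paths = 100_000
dt = t2 / steps

# Build Brownian paths as sums of independent N(0, dt) increments;
# the terminal value W_{t_2} should then be N(0, t_2).
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, steps))
w_t2 = increments.sum(axis=1)

print(w_t2.mean())  # close to 0
print(w_t2.var())   # close to t_2 = 3
```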

Now we compute the joint density. Write $W_{t_2} = W_{t_1} + (W_{t_2} - W_{t_1}) = W_{t_1} + Z$, where $Z := W_{t_2} - W_{t_1}$.

By definition of Brownian motion, $W_{t_1}$ and $Z$ are independent, and thus have joint density 

$$f_{W_{t_1}, Z} (x, z) = f_{W_{t_1}} (x) f_{Z} (z).$$

By the change of variables rule for random variables (see, for instance https://faculty.math.illinois.edu/~r-ash/Stat/StatLec1-5.pdf),

$$f_{W_{t_1}, W_{t_2}} (x, y) = f_{W_{t_1}, W_{t_1} + Z} (x, y) = J(u) f_{W_{t_1}, Z} (x, y - x),$$
where $J(u)$ is the determinant of the Jacobian of the map $u: \mathbb R^2 \to \mathbb R^2$ given by

$$u(x, y) = (x, y - x).$$

A routine calculation shows that $J(u) = 1$ identically, and so
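The "routine calculation" can be checked symbolically; the Jacobian matrix of $u$ is triangular with $1$s on the diagonal. A small sketch using sympy (symbol names are just placeholders):

```python
import sympy as sp

x, y = sp.symbols('x y')

# u(x, y) = (x, y - x); its Jacobian matrix is lower triangular with 1s
# on the diagonal, so the determinant is identically 1.
u = sp.Matrix([x, y - x])
J = u.jacobian(sp.Matrix([x, y]))
print(J.det())  # 1
```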

$$f_{W_{t_1}, W_{t_2}} (x, y) = f_{W_{t_1}, Z} (x, y - x)$$

$$ = f_{W_{t_1}} (x) f_{Z} (y - x)$$

$$ = \left (  \frac{1}{ \sqrt {2\pi t_1}} e^{-\frac{x^2}{2 t_1}} \right ) \left ( \frac{1}{ \sqrt {2\pi (t_2 - t_1)}} e^{-\frac{(y - x)^2}{2 (t_2 - t_1)}} \right)$$

where in the third equality we recall that $Z = W_{t_2} - W_{t_1}$ is normally distributed with mean $0$ and variance $t_2 - t_1$.
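Since $(W_{t_1}, W_{t_2})$ is jointly Gaussian with $\operatorname{Cov}(W_s, W_t) = \min(s, t)$, the product form above can be cross-checked against the bivariate normal density with covariance matrix $\begin{pmatrix} t_1 & t_1 \\ t_1 & t_2 \end{pmatrix}$. A sketch in Python (numpy and scipy assumed available; the times $t_1 = 1$, $t_2 = 3$ and the test point are arbitrary):

```python
import numpy as np
from scipy.stats import multivariate_normal

t1, t2 = 1.0, 3.0  # illustrative times, t1 < t2

def joint_density(x, y):
    """Product form derived above: f_{W_{t1}}(x) * f_Z(y - x)."""
    fx = np.exp(-x**2 / (2 * t1)) / np.sqrt(2 * np.pi * t1)
    fz = np.exp(-(y - x)**2 / (2 * (t2 - t1))) / np.sqrt(2 * np.pi * (t2 - t1))
    return fx * fz

# (W_{t1}, W_{t2}) is jointly Gaussian with Cov(W_s, W_t) = min(s, t).
mvn = multivariate_normal(mean=[0, 0], cov=[[t1, t1], [t1, t2]])

pt = (0.4, -1.2)
print(abs(joint_density(*pt) - mvn.pdf(pt)) < 1e-12)  # True
```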

Thus finally we obtain the conditional density $g_y$ of $W_{t_1}$ given $W_{t_2} = y$ as

$$g_y (x) = \frac{f_{W_{t_1}, W_{t_2}} (x, y)}{f_{W_{t_2}} (y)}$$

$$= \frac{ \left ( \frac{1}{\sqrt {2\pi t_1}} e^{-\frac{x^2}{2 t_1}} \right ) \left ( \frac{1}{ \sqrt {2\pi (t_2 - t_1)}} e^{-\frac{(y - x)^2}{2 (t_2 - t_1)}} \right)}{\frac{1}{ \sqrt {2\pi t_2} } e^{-\frac{y^2}{2 t_2}}}$$

$$= \sqrt {\frac{t_2}{ 2 \pi t_1(t_2 - t_1) }} \exp\left (\frac{y^2}{2 t_2} - \frac{x^2}{2 t_1} - \frac{(y - x)^2}{2 (t_2 - t_1)} \right ).$$
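Completing the square in the exponent shows that this conditional law is Gaussian with mean $\frac{t_1}{t_2} y$ and variance $\frac{t_1 (t_2 - t_1)}{t_2}$. A rough Monte Carlo sketch of this (Python with numpy assumed available; the times, conditioning value, window width, and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
t1, t2, y = 1.0, 3.0, 0.8   # illustrative times and conditioning value
n = 2_000_000

# Sample (W_{t1}, W_{t2}) via independent increments.
w1 = rng.normal(0.0, np.sqrt(t1), size=n)
w2 = w1 + rng.normal(0.0, np.sqrt(t2 - t1), size=n)

# Condition on W_{t2} ~ y by keeping samples in a thin window around y.
mask = np.abs(w2 - y) < 0.02
cond = w1[mask]

print(cond.mean())  # close to y * t1 / t2
print(cond.var())   # close to t1 * (t2 - t1) / t2
```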

Part (b):

The formula at the start of the answer to Part (a) generalizes. Indeed, given three random variables $X, Y, Z$ with joint density $f(x, y, z)$, the conditional density $g_{x, z} (y)$ of $Y$ given $X = x$ and $Z = z$ is given by

$$g_{x, z} (y) = \frac{f(x, y, z)}{h(x, z)},$$

where $h(x, z)$ denotes the joint density of $X, Z$.

Let us write $U := W_{t_2} - W_{t_1}$, and $V := W_{t_3} - W_{t_2}.$

Then $W_{t_1}, U, V$ are independent, and we have

$$W_{t_2} = W_{t_1} + U,$$

$$W_{t_3} = W_{t_1} + U + V.$$

Once again by change of variables, we have

$$f_{W_{t_1}, W_{t_2}, W_{t_3}} (x, y, z)$$

$$= f_{W_{t_1}, W_{t_1} + U, W_{t_1} + U + V} (x, y, z)$$

$$ = J(q) f_{W_{t_1}, U, V} (x, y - x, z - y),$$

where $J(q)$ denotes the determinant of the Jacobian of the map $q: \mathbb R^3  \to \mathbb R^3$ given by

$$q(x, y, z) = (x, y - x, z - y).$$

Again $J(q) = 1$ identically, and so we have

$$f_{W_{t_1}, W_{t_2}, W_{t_3}} (x, y, z)$$

$$ = f_{W_{t_1}, U, V} (x, y - x, z - y)$$

$$ = \left (\frac{1}{\sqrt {2\pi t_1}} e^{-\frac{x^2}{2 t_1}} \right ) \left (\frac{1}{ \sqrt {2\pi (t_2 - t_1)}} e^{-\frac{(y - x)^2}{2 (t_2 - t_1)}} \right ) \left (\frac{1}{ \sqrt {2\pi(t_3 - t_2)}} e^{-\frac{(z - y)^2}{2 (t_3 - t_2)}} \right ).$$

On the other hand, we have already seen from Part (a) that

$$f_{W_{t_1}, W_{t_3}} (x, z) = \left ( \frac{1}{\sqrt {2\pi t_1}} e^{-\frac{x^2}{2 t_1}} \right ) \left ( \frac{1}{\sqrt {2\pi (t_3 - t_1) }} e^{-\frac{(z - x)^2}{2 (t_3 - t_1)}} \right).$$

Thus, we finally obtain the conditional density $g$ of $W_{t_2}$ given $W_{t_1} = x_1$ and $W_{t_3} = x_3$ as

$$g(y) = \frac{\left (\frac{1}{ \sqrt {2\pi t_1}} e^{-\frac{x_1^2}{2 t_1}} \right ) \left (\frac{1}{\sqrt {2\pi (t_2 - t_1)}} e^{-\frac{(y - x_1)^2}{2 (t_2 - t_1)}} \right ) \left (\frac{1}{\sqrt {2\pi (t_3 - t_2) }} e^{-\frac{(x_3 - y)^2}{2 (t_3 - t_2)}} \right )}{\left ( \frac{1}{ \sqrt {2\pi t_1}} e^{-\frac{x_1^2}{2 t_1}} \right ) \left ( \frac{1}{ \sqrt {2\pi (t_3 - t_1)}} e^{-\frac{(x_3 - x_1)^2}{2 (t_3 - t_1)}} \right)}$$

$$ = \sqrt {\frac{t_3 - t_1}{2\pi(t_3 - t_2)(t_2 - t_1)}} \exp\left (\frac{(x_3 - x_1)^2}{2 (t_3 - t_1)} - \frac{(y - x_1)^2}{2 (t_2 - t_1)} -  \frac{(x_3 - y)^2}{2 (t_3 - t_2)} \right ).$$
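Completing the square here recovers the familiar Brownian bridge law: conditionally on $W_{t_1} = x_1$ and $W_{t_3} = x_3$, $W_{t_2}$ is Gaussian with mean $x_1 + \frac{t_2 - t_1}{t_3 - t_1}(x_3 - x_1)$ (linear interpolation) and variance $\frac{(t_2 - t_1)(t_3 - t_2)}{t_3 - t_1}$. A rough Monte Carlo sketch (Python with numpy assumed available; the times, conditioning values, window widths, and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
t1, t2, t3 = 1.0, 2.0, 4.0   # illustrative times t1 < t2 < t3
x1, x3 = 0.5, -1.0           # conditioning values
n = 3_000_000

# Sample (W_{t1}, W_{t2}, W_{t3}) via independent increments.
w1 = rng.normal(0.0, np.sqrt(t1), size=n)
w2 = w1 + rng.normal(0.0, np.sqrt(t2 - t1), size=n)
w3 = w2 + rng.normal(0.0, np.sqrt(t3 - t2), size=n)

# Keep paths with W_{t1} ~ x1 and W_{t3} ~ x3 (thin windows).
mask = (np.abs(w1 - x1) < 0.05) & (np.abs(w3 - x3) < 0.05)
cond = w2[mask]

print(cond.mean())  # close to x1 + (t2 - t1)/(t3 - t1) * (x3 - x1)
print(cond.var())   # close to (t2 - t1)*(t3 - t2)/(t3 - t1)
```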

  • Please let me know if there are any algebra errors, though I believe the overall structure of the proof is correct.

  • Hello Bromster. Thank you for the response. Referring to the t's in the denominator with "1/sqrt(2pi)" , can you explain why the t's aren't under the square root? Should they be, or does it matter?

  • Oh, of course you are right. The variance is $t$ so the standard deviation should be $\sqrt t$. Changing to reflect this.

  • Thanks for the tip!

The answer is accepted.