Poisson process question
Let $N_1(t)$ and $N_2(t)$ be independent Poisson processes with rates $\lambda_1=1$ and $\lambda_2=3$ respectively. Let $\left \{ T_n^{[1]} \right \} $ and $\left \{ T_n^{[2]} \right \} (n\ge0)$ denote the arrival times of the processes $N_1$ and $N_2$ respectively. Define for $t\geq0$
$$S(t)=\max\left \{ T_{N_1(t)}^{[1]},T_{N_2(t)}^{[2]} \right \}. $$ Compute $E[S(t)]$ (express your answer as an infinite series).
Answer
I'm pretty sure I've interpreted your question correctly: say, for definiteness, there are two sources of meteors arriving continuously at rates $\lambda_1, \lambda_2$. In plain English, the question asks for the expected value of the last time $\mathrm{Last}_{1,2}(t) =: S(t)$ at which you've seen a meteor from either source by time $t$ (arguably a more interesting statistic in real life is the expected value of $t - \mathrm{Last}_{1,2}(t)$, but never mind).
It is annoying to compute anything about a statistic like $T^{[1]}_{N_1(t)}$, so let's do everything conditionally: $$ E[S(t)] = \sum_{n = 0}^\infty\sum_{m = 0}^\infty E[S(t)|N_1(t) = n, N_2(t) = m] P(N_1(t) = n, N_2(t) = m)$$
Now we have an ugly infinite sum, but both factors of each term are very computable. Let's simplify: $$E[S(t)\mid N_1(t) = n, N_2(t) = m] = E[\max(T_n^{[1]}, T_m^{[2]})\mid N_1(t) = n, N_2(t) = m].$$

To think about this in a clearer way, let's focus on just one Poisson process, $N_1$. Conditioned on $N_1(t) = n$, we know there are $n$ arrival times $T_1, \dots, T_n$ in the interval $[0, t]$. Let's look at the distribution of the first one to start, by computing the conditional CDF: $$P(T_1 \leq s \mid N_1(t) = n) = 1 - P(N_1(s) = 0 \mid N_1(t) = n) = 1 - \frac{P(N_1(s) = 0 \cap N_1(t) = n)}{P(N_1(t) = n)}.$$ Recalling that the pmf of the number of events in a Poisson process is, by definition, $P(N(t) = n) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}$, and that by stationary, independent increments the same formula holds when any interval of length $t$ replaces $[0, t]$, this becomes $$1 - \frac{P(N_1(s) = 0)\,P(N_1((s, t]) = n)}{P(N_1(t) = n)} = 1 - \frac{e^{-\lambda s} \cdot \frac{(\lambda(t - s))^n e^{-\lambda(t - s)}}{n!}}{\frac{(\lambda t)^n e^{-\lambda t}}{n!}} = 1 - \frac{(t - s)^n}{t^n}.$$ This is exactly the CDF of the minimum of $n$ i.i.d. Uniform$(0, t)$ random variables, and repeating the computation for the later arrivals shows that, conditioned on $N_1(t) = n$, the arrival times are distributed as the order statistics of $n$ i.i.d. Uniform$(0, t)$ variables. Thus $P(\max(T_1, \dots, T_n) \leq s) = \frac{s^n}{t^n}$, completely independent of $\lambda$ (when tacitly conditioned on $n$).

Since the two processes are independent, $S(t)$ conditioned on the counts is the maximum of $n + m$ i.i.d. Uniform$(0, t)$ variables, so using $E[X] = \int_0^t P(X \geq s)\,ds$ for a random variable $X$ taking values in $[0, t]$: $$E[S(t) \mid N_1(t) = n, N_2(t) = m] = \int_{0}^t \left(1 - P(T^{[1]}_n \leq s,\ T^{[2]}_m \leq s)\right)ds = \int_{0}^t \left(1 - \frac{s^{n + m}}{t^{n + m}}\right) ds$$ which gives us the very beautiful result $$= \frac{t(n + m)}{n + m + 1}.$$

On the other hand, we know by the definition of a Poisson process and independence that $$P(N_1(t) = n, N_2(t) = m) = \frac{(\lambda_1 t)^ne^{-\lambda_1 t}}{n!} \cdot \frac{(\lambda_2 t)^me^{-\lambda_2 t}}{m!}$$ more legibly: $$ = \frac{\lambda_1^n\lambda_2^m\, t^{n + m}\, e^{-(\lambda_1 + \lambda_2) t}}{n!\, m!}.$$
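Before assembling the full sum, here is a quick sanity check of that conditional formula: a minimal Monte Carlo sketch (the name `sim_S_given_counts` and all parameters are mine, not part of the question). It conditions on the counts by brute-force rejection, so it does not assume the uniform-order-statistics claim anywhere, and compares the empirical mean of the max against $\frac{t(n+m)}{n+m+1}$.

```python
import random

def sim_S_given_counts(t, lam1, lam2, n, m, trials=20_000):
    """Estimate E[S(t) | N1(t)=n, N2(t)=m] by rejection sampling:
    simulate both processes from exponential gaps and keep only
    runs whose counts on [0, t] match the conditioning event."""
    def arrivals(lam):
        out, s = [], random.expovariate(lam)
        while s <= t:
            out.append(s)
            s += random.expovariate(lam)
        return out

    total, kept = 0.0, 0
    while kept < trials:
        a1, a2 = arrivals(lam1), arrivals(lam2)
        if len(a1) == n and len(a2) == m:
            total += max(a1[-1] if a1 else 0.0, a2[-1] if a2 else 0.0)
            kept += 1
    return total / kept

t, n, m = 1.0, 2, 3
print(sim_S_given_counts(t, 1.0, 3.0, n, m))  # Monte Carlo, ~0.833
print(t * (n + m) / (n + m + 1))              # t(n+m)/(n+m+1) = 5/6
```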
Putting this all together gives:
$$E[S(t)] = \left(\sum_{n = 0}^\infty \sum_{m = 0}^\infty \frac{(n + m)\,\lambda_1^n \lambda_2^m}{(n + m + 1)\cdot n!\cdot m!}\, t^{n + m + 1} \right)e^{-(\lambda_1 + \lambda_2)t}$$
Letting $N = n + m$, we get the more pleasing: $$\left(\sum_{N = 0}^\infty \sum_{n = 0}^N \frac{N\,\lambda_1^n \lambda_2^{N - n}}{(N + 1)\cdot n!\cdot (N - n)!}\, t^{N + 1} \right)e^{-(\lambda_1 + \lambda_2)t}$$
Finally, as $\lambda_1 = 1$ and $\lambda_2 = 3$, we get:
$$= \left(\sum_{N = 0}^\infty \sum_{n = 0}^N \frac{N\cdot 3^{N - n}}{(N + 1)\cdot n!\cdot (N - n)!} t^{N + 1} \right)e^{-4t}$$
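As a numeric check on the series, here is a short sketch (the name `E_S_series` and the truncation level `N_max` are my own choices; the tail terms decay factorially, so a modest cutoff suffices) that evaluates the truncated double sum:

```python
import math

def E_S_series(t, lam1=1.0, lam2=3.0, N_max=80):
    """Evaluate the double series for E[S(t)], truncated at N = N_max.
    The N-th term is O((lam*t)^N / N!), so the tail dies off factorially."""
    total = 0.0
    for N in range(N_max + 1):
        inner = sum(
            lam1**n * lam2**(N - n) / (math.factorial(n) * math.factorial(N - n))
            for n in range(N + 1)
        )
        total += N / (N + 1) * inner * t ** (N + 1)
    return total * math.exp(-(lam1 + lam2) * t)

print(E_S_series(1.0))  # ~0.7546 at t = 1 with lam1 = 1, lam2 = 3
```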
It might interest you that this simplifies exactly: the inner sum is $\frac{(\lambda_1 + \lambda_2)^N}{N!}$ by the binomial theorem, and summing over $N$ then gives $$E[S(t)] = t - \frac{1 - e^{-(\lambda_1 + \lambda_2)t}}{\lambda_1 + \lambda_2} = t - \frac{1}{\lambda_1 + \lambda_2} + o(1) \quad (t \to \infty),$$ very beautifully, as one might expect.
Also, you can make this much easier on yourself if you know that the superposition of two independent Poisson processes is a Poisson process with rate $\lambda_1 + \lambda_2$: then $S(t)$ is simply the last arrival time of the merged process before $t$, and one can compute with it directly. This is the quickest route to the closed form above.
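To illustrate that shortcut, one more sketch under the same assumptions (the name `E_S_merged` is mine): simulate the merged rate-$(\lambda_1 + \lambda_2)$ process directly, take the last arrival before $t$ (or $0$ if there is none), and compare against the closed form above.

```python
import math
import random

def E_S_merged(t, lam1=1.0, lam2=3.0, trials=200_000):
    """Estimate E[S(t)] from the merged process: the superposition of
    independent Poisson processes is Poisson with rate lam1 + lam2,
    and S(t) is simply its last arrival time before t (0 if none)."""
    lam = lam1 + lam2
    total = 0.0
    for _ in range(trials):
        s, last = random.expovariate(lam), 0.0
        while s <= t:
            last = s
            s += random.expovariate(lam)
        total += last
    return total / trials

t, lam = 1.0, 4.0
print(E_S_merged(t))                       # Monte Carlo, ~0.7546
print(t - (1 - math.exp(-lam * t)) / lam)  # closed form, ~0.7546
```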