Random process in probability theory
A compound Poisson process is a continuous-time stochastic process with jumps. The jumps arrive randomly according to a Poisson process, and the size of the jumps is also random, with a specified probability distribution. To be precise, a compound Poisson process, parameterised by a rate $\lambda > 0$ and jump size distribution G, is a process $\{\,Y(t) : t \geq 0\,\}$ given by

$$Y(t)=\sum _{i=1}^{N(t)}D_{i}$$

where $\{\,N(t) : t \geq 0\,\}$ is the counting variable of a Poisson process with rate $\lambda$, and $\{\,D_{i} : i \geq 1\,\}$ are independent and identically distributed random variables with distribution function G, which are also independent of $\{\,N(t) : t \geq 0\,\}$.

When the $D_{i}$ are non-negative integer-valued random variables, the compound Poisson process is known as a stuttering Poisson process.
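The definition translates directly into simulation: draw $N(t)$ from a Poisson distribution with mean $\lambda t$, then sum that many i.i.d. jumps. A minimal sketch in Python (the function name and the choice of exponential jump sizes are illustrative, not part of the definition):

```python
import numpy as np

def compound_poisson_sample(lam, t, jump_sampler, rng):
    """Draw one sample of Y(t) = sum_{i=1}^{N(t)} D_i.

    N(t) is Poisson with mean lam * t; the D_i are i.i.d. draws from
    jump_sampler, generated independently of N(t).
    """
    n = rng.poisson(lam * t)        # number of jumps N(t) by time t
    jumps = jump_sampler(n, rng)    # i.i.d. jump sizes D_1, ..., D_n
    return jumps.sum()

# Example: rate lambda = 2, exponential jump sizes with mean 1.5.
rng = np.random.default_rng(0)
y = compound_poisson_sample(2.0, 3.0,
                            lambda n, r: r.exponential(1.5, size=n), rng)
```

Note that when no jumps have arrived by time $t$, the empty sum correctly yields $Y(t) = 0$.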
Properties of the compound Poisson process
The expected value of a compound Poisson process can be calculated using a result known as Wald's equation as:
$$\operatorname {E} (Y(t))=\operatorname {E} (D_{1}+\cdots +D_{N(t)})=\operatorname {E} (N(t))\operatorname {E} (D_{1})=\operatorname {E} (N(t))\operatorname {E} (D)=\lambda t\operatorname {E} (D).$$
Making similar use of the law of total variance, the variance can be calculated as:
$$\begin{aligned}\operatorname {var} (Y(t))&=\operatorname {E} (\operatorname {var} (Y(t)\mid N(t)))+\operatorname {var} (\operatorname {E} (Y(t)\mid N(t)))\\&=\operatorname {E} (N(t)\operatorname {var} (D))+\operatorname {var} (N(t)\operatorname {E} (D))\\&=\operatorname {var} (D)\operatorname {E} (N(t))+\operatorname {E} (D)^{2}\operatorname {var} (N(t))\\&=\operatorname {var} (D)\lambda t+\operatorname {E} (D)^{2}\lambda t\\&=\lambda t(\operatorname {var} (D)+\operatorname {E} (D)^{2})\\&=\lambda t\operatorname {E} (D^{2}).\end{aligned}$$
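Both moment formulas can be checked by Monte Carlo. The sketch below (parameter values are arbitrary) uses exponential jumps with mean $m$, so that $\operatorname{E}(D) = m$ and $\operatorname{E}(D^2) = 2m^2$, and compares the sample mean and variance of $Y(t)$ against $\lambda t\operatorname{E}(D)$ and $\lambda t\operatorname{E}(D^2)$:

```python
import numpy as np

# Monte Carlo check of E(Y(t)) = lam*t*E(D) and var(Y(t)) = lam*t*E(D^2),
# with exponential jump sizes of mean m: E(D) = m, E(D^2) = 2*m**2.
rng = np.random.default_rng(1)
lam, t, m, n_paths = 2.0, 3.0, 1.5, 50_000

counts = rng.poisson(lam * t, size=n_paths)    # N(t) for each replicate
all_jumps = rng.exponential(m, size=counts.sum())

# Partial sums split the pooled jumps back into per-replicate totals.
csum = np.concatenate(([0.0], np.cumsum(all_jumps)))
ends = np.cumsum(counts)
y = csum[ends] - csum[ends - counts]           # Y(t) for each replicate

print(y.mean())   # close to lam * t * m        = 9.0
print(y.var())    # close to lam * t * 2 * m**2 = 27.0
```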
Lastly, the moment generating function can be obtained via the law of total probability. Assuming for concreteness that the jump sizes are integer-valued (the final formula holds for general jump distributions), conditioning on the number of jumps gives

$$\Pr(Y(t)=i)=\sum _{n}\Pr(Y(t)=i\mid N(t)=n)\Pr(N(t)=n)$$

and hence

$$\begin{aligned}\operatorname {E} (e^{sY})&=\sum _{i}e^{si}\Pr(Y(t)=i)\\&=\sum _{i}e^{si}\sum _{n}\Pr(Y(t)=i\mid N(t)=n)\Pr(N(t)=n)\\&=\sum _{n}\Pr(N(t)=n)\sum _{i}e^{si}\Pr(Y(t)=i\mid N(t)=n)\\&=\sum _{n}\Pr(N(t)=n)\sum _{i}e^{si}\Pr(D_{1}+D_{2}+\cdots +D_{n}=i)\\&=\sum _{n}\Pr(N(t)=n)M_{D}(s)^{n}\\&=\sum _{n}\Pr(N(t)=n)e^{n\ln(M_{D}(s))}\\&=M_{N(t)}(\ln(M_{D}(s)))\\&=e^{\lambda t\left(M_{D}(s)-1\right)}.\end{aligned}$$
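The closing identity $\operatorname{E}(e^{sY})=e^{\lambda t(M_{D}(s)-1)}$ can likewise be checked empirically. A sketch with Bernoulli($p$) jumps, for which $M_{D}(s) = 1 - p + pe^{s}$ (the parameter values are arbitrary):

```python
import numpy as np

# Empirical check of E[exp(s*Y(t))] = exp(lam*t*(M_D(s) - 1))
# with Bernoulli(p) jumps, whose MGF is M_D(s) = 1 - p + p*exp(s).
rng = np.random.default_rng(2)
lam, t, p, s = 2.0, 3.0, 0.3, 0.5

counts = rng.poisson(lam * t, size=100_000)   # N(t) per replicate
y = rng.binomial(counts, p)                   # sum of N(t) Bernoulli(p) jumps

empirical = np.exp(s * y).mean()
theoretical = np.exp(lam * t * ((1 - p + p * np.exp(s)) - 1))
print(empirical, theoretical)  # the two values should agree closely
```

Bounded jumps are used here so that the empirical average of $e^{sY}$ has small variance; with heavy-tailed jumps the MGF may not even exist for $s > 0$.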
Exponentiation of measures
Let N, Y, and D be as above. Let μ be the probability measure according to which D is distributed, i.e.

$$\mu (A)=\Pr(D\in A).$$

Let $\delta _{0}$ be the trivial probability distribution putting all of the mass at zero. Then the probability distribution of Y(t) is the measure

$$\exp(\lambda t(\mu -\delta _{0}))$$

where the exponential $\exp(\nu )$ of a finite measure $\nu$ on Borel subsets of the real line is defined by

$$\exp(\nu )=\sum _{n=0}^{\infty }{\nu ^{*n} \over n!}$$

and

$$\nu ^{*n}=\underbrace {\nu *\cdots *\nu } _{n{\text{ factors}}}$$

is the n-fold convolution of measures (with $\nu ^{*0}=\delta _{0}$), and the series converges weakly.
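For an integer-valued jump distribution μ, this exponential can be evaluated numerically by truncating the series and computing the convolution powers directly; expanding $\exp(\lambda t(\mu - \delta_0)) = e^{-\lambda t}\exp(\lambda t\mu)$ (convolution with $\delta_0$ is the identity) gives the prefactor below. A sketch, with illustrative function name and parameters:

```python
import numpy as np
from math import exp, factorial

def compound_poisson_pmf(lam_t, mu, n_max, size):
    """PMF of Y(t) on {0, ..., size-1}, where mu[k] = Pr(D = k).

    Evaluates e^{-lam_t} * sum_{n <= n_max} lam_t**n / n! * mu^{*n},
    truncating the exponential series after n_max + 1 terms.
    """
    pmf = np.zeros(size)
    conv = np.zeros(size)
    conv[0] = 1.0                             # mu^{*0} = delta_0
    for n in range(n_max + 1):
        pmf += exp(-lam_t) * lam_t**n / factorial(n) * conv
        conv = np.convolve(conv, mu)[:size]   # next convolution power
    return pmf

mu = np.array([0.0, 0.5, 0.5])   # jumps of size 1 or 2, equally likely
pmf = compound_poisson_pmf(lam_t=2.0, mu=mu, n_max=60, size=80)
```

Since $\Pr(D=0)=0$ in this example, only the $n=0$ term contributes mass at zero, so `pmf[0]` equals $e^{-\lambda t}$, and the mean of the resulting distribution recovers $\lambda t\operatorname{E}(D)$.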