Applying a linear operator to a Gaussian Process results in a Gaussian
Process: Proof
In this paper, it is stated without proof or citation that
"Differentiation is a linear operation, so the derivative of a Gaussian
process remains a Gaussian process". Intuitively, this seems reasonable:
a linear combination of Gaussian random variables is also Gaussian,
and this is just an extension to the case where, instead of a vector-valued
random variable, we have a random variable defined on a function space.
However, I cannot find a source with a proof, and the details of a proof
elude me.
Proof Outline: Let $x(t)\sim \mathcal{GP}(m(t),k(t, t^\prime))$ be a
Gaussian process with mean function $m(t)$ and covariance function $k(t,
t^\prime)$, and $\mathcal{L}$ a linear operator. For any vector
$T=(t_1,...,t_n)$, let $x_T=(x(t_1),...,x(t_n))$. Then $x_T\sim
\mathcal{N}(m_T,k_{T,T})$. Now consider the stochastic process
$u(t)=\mathcal{L}x(t)$. Since a Gaussian process is, by definition, a process
whose finite-dimensional distributions are all Gaussian, it suffices to show
that the finite-dimensional distributions of $u(t)$ are Gaussian. But
translating the action of the linear operator on $x(t)$ to the
finite-dimensional case is giving me trouble.
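In the finite-dimensional setting the claim is straightforward: if $x_T\sim \mathcal{N}(m_T, K)$ and $A$ is any matrix, then $Ax_T\sim \mathcal{N}(Am_T, AKA^T)$. The following sketch (my own choice of grid, kernel, and step size, not from the paper) discretizes differentiation as a forward-difference matrix $A$ and checks that the pushed-forward mean and covariance approach the limiting answers:

```python
import numpy as np

# Finite-dimensional analogue: a linear map A sends N(m, K) to N(A m, A K A^T).
# Here A is a forward-difference matrix, a discrete stand-in for d/dt.
# Grid, mean, and kernel below are illustrative choices.

h = 1e-3
t = np.arange(0.0, 1.0, h)                       # fine grid T = (t_1, ..., t_n)
m = np.sin(t)                                    # mean function m(t) = sin(t)
K = np.exp(-0.5 * (t[:, None] - t[None, :])**2)  # RBF kernel k(t, t')

n = len(t)
A = (np.eye(n, k=1) - np.eye(n)) / h             # forward difference (x(t+h)-x(t))/h
A = A[:-1]                                       # drop last row (no t+h on the grid)

m_u = A @ m          # mean of u = A x, should approximate m'(t) = cos(t)
K_u = A @ K @ A.T    # covariance of u, should approximate the mixed derivative of k

# For the RBF kernel, the mixed second derivative at t = t' equals 1.
print(m_u[0], np.cos(t[0]))
print(K_u[0, 0])
```

Every difference quotient $(x(t+h)-x(t))/h$ is itself a matrix applied to the vector $(x(t), x(t+h))$, so each is Gaussian; the open question is only whether Gaussianity survives the limit $h\to 0$.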
In the case of differentiation, we have
$u(t)=\mathcal{L}x(t)=\frac{dx}{dt}=\lim_{h\rightarrow 0}\frac{x(t+h)-x(t)}{h}$.
For all $h>0$, this random variable is normal, and by interchanging
expectation and the limit, we have
$$ m_u(t)=E\left(\lim_{h\rightarrow 0}\frac{x(t+h)-x(t)}{h}\right)=\lim_{h\rightarrow 0}E\left(\frac{x(t+h)-x(t)}{h}\right)=\lim_{h\rightarrow 0}\frac{m(t+h)-m(t)}{h}=m^\prime(t) $$
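As a quick numerical sanity check of $m_u(t)=m^\prime(t)$, take a concrete mean function (my own choice, $m(t)=\sin t$) and compare the difference quotient against the known derivative:

```python
import numpy as np

# Check that (m(t+h) - m(t))/h approaches m'(t) for m(t) = sin(t),
# whose derivative is cos(t). The point t and step h are arbitrary choices.
m = np.sin
t, h = 0.7, 1e-6
fd = (m(t + h) - m(t)) / h   # difference quotient of the mean function
print(fd, np.cos(t))          # should agree to several decimal places
```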
Of course, we need to verify when this interchange is appropriate.
Similarly, we can intuit that the covariance function of $u(t)$ has the form
$$ k_u(t,t^\prime)=\frac{\partial^2}{\partial t\,\partial t^\prime}k(t,t^\prime) $$
but I am having a hard time making the leap from finite approximations to
the infinite dimensional case.
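The finite approximation behind that formula can at least be checked numerically. For the RBF kernel $k(t,t^\prime)=e^{-(t-t^\prime)^2/2}$ (my own choice of kernel), one can compute $\partial^2 k/\partial t\,\partial t^\prime=(1-(t-t^\prime)^2)\,e^{-(t-t^\prime)^2/2}$ by hand and compare it to the mixed second difference of $k$, which is exactly the covariance of the difference quotients:

```python
import numpy as np

# The covariance of (x(t+h)-x(t))/h and (x(t'+h)-x(t'))/h is the mixed
# second difference of k; it should approach d²k/(dt dt') as h -> 0.
k = lambda t, tp: np.exp(-0.5 * (t - tp)**2)

def d2k_exact(t, tp):
    # d²k/(dt dt') for the RBF kernel, computed by hand: (1 - (t-t')²) k(t,t')
    r = t - tp
    return (1.0 - r**2) * np.exp(-0.5 * r**2)

def d2k_fd(t, tp, h=1e-4):
    # (k(t+h,t'+h) - k(t+h,t') - k(t,t'+h) + k(t,t')) / h²
    return (k(t + h, tp + h) - k(t + h, tp) - k(t, tp + h) + k(t, tp)) / h**2

print(d2k_fd(0.3, 1.1), d2k_exact(0.3, 1.1))   # should agree closely
```

This confirms the finite-dimensional covariances converge; the remaining gap is showing the limiting process is itself Gaussian.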