# The Hierarchy of Convergences

## Almost-sure vs in Probability

Theorem 4.5.1
A. If the sequence $\left\{Z_n\right\}_{n \geq 1}$ of complex random variables converges almost surely to some complex random variable $Z$, it also converges in probability to the same random variable $Z$.

B. If the sequence of complex random variables $\left\{X_n\right\}_{n \geq 1}$ converges in probability to the complex random variable $X$, one can find a sequence of integers $\left\{n_k\right\}_{k \geq 1}$, strictly increasing, such that $\left\{X_{n_k}\right\}_{k \geq 1}$ converges almost surely to $X$.
(B says, in other words: From a sequence converging in probability, one can extract a subsequence converging almost surely.)
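The gap between the two notions in part A (and the need for a subsequence in part B) can be seen in a small simulation. The "typewriter" sequence below is a standard counterexample, not taken from the text: dyadic blocks of shrinking length sweep across $[0,1)$, so $P(X_n = 1) \to 0$ (convergence in probability to $0$), yet every point $u$ is hit once per generation, so $X_n(u) = 1$ infinitely often and there is no almost-sure convergence; along the subsequence $n_k = 2^k$ the blocks shrink to $[0, 2^{-k})$ and convergence is almost sure.

```python
import random

def typewriter(n, u):
    # X_n(u): indicator that u falls in the n-th dyadic "typewriter" block.
    # For n in [2**j, 2**(j+1)), the block is [(n - 2**j)/2**j, (n - 2**j + 1)/2**j).
    j = n.bit_length() - 1          # largest j with 2**j <= n
    k = n - 2**j
    return 1 if k / 2**j <= u < (k + 1) / 2**j else 0

random.seed(0)
samples = [random.random() for _ in range(10_000)]

# In probability: P(X_n = 1) = 2**(-j) -> 0 as n grows.
p_hat = sum(typewriter(4096, u) for u in samples) / len(samples)

# Not almost surely: every fixed u is covered exactly once per
# generation j, so X_n(u) = 1 infinitely often.
u = samples[0]
hits = sum(typewriter(n, u) for n in range(1, 2**12))  # generations j = 0..11

# Along the subsequence n_k = 2**k the block is [0, 2**-k), which
# eventually misses any fixed u > 0: almost-sure convergence to 0.
tail = [typewriter(2**k, u) for k in range(5, 12)]
print(p_hat, hits, tail)
```

The sample point `u` sees exactly one hit in each of the 12 generations scanned, even though the probability of a hit at any single large index is tiny.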
Proof. A. Suppose almost-sure convergence. By Theorem 4.1.3, for all $\varepsilon>0$,
$$P\left(\left|Z_n-Z\right| \geq \varepsilon \text { i.o. }\right)=0$$
that is
$$P\left(\cap_{n \geq 1} \cup_{k=n}^{\infty}\left(\left|Z_k-Z\right| \geq \varepsilon\right)\right)=0$$
or (sequential continuity of probability)
$$\lim_{n \uparrow \infty} P\left(\cup_{k=n}^{\infty}\left(\left|Z_k-Z\right| \geq \varepsilon\right)\right)=0$$
which in turn implies that
$$\lim_{n \uparrow \infty} P\left(\left|Z_n-Z\right| \geq \varepsilon\right)=0$$

B. By definition of convergence in probability, for all $\varepsilon>0$,
$$\lim_{n \uparrow \infty} P\left(\left|X_n-X\right| \geq \varepsilon\right)=0$$
Therefore one can find $n_1$ such that $P\left(\left|X_{n_1}-X\right| \geq 1\right) \leq \frac{1}{2}$. Then one can find $n_2>n_1$ such that $P\left(\left|X_{n_2}-X\right| \geq \frac{1}{2}\right) \leq\left(\frac{1}{2}\right)^2$, and so on, obtaining a strictly increasing sequence of integers $n_k$ $(k \geq 1)$ such that
$$P\left(\left|X_{n_k}-X\right| \geq \frac{1}{k}\right) \leq\left(\frac{1}{2}\right)^k$$
It then follows from Theorem 4.1.2 that
$$\lim_{k \uparrow \infty} X_{n_k}=X \quad \text{a.s.}$$
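The greedy construction in the proof can be carried out explicitly when the tail probabilities are known in closed form. As an illustration (an assumption for the demo, not from the text), take independent $X_n \sim \text{Bernoulli}(1/n)$ and $X = 0$, so that $P(|X_n - X| \geq \varepsilon) = 1/n$ for any $\varepsilon \in (0, 1]$:

```python
from fractions import Fraction

def tail_prob(n):
    # P(|X_n - X| >= eps) for the illustrative sequence X_n ~ Bernoulli(1/n),
    # X = 0, and any eps in (0, 1]: the event is simply {X_n = 1}.
    return Fraction(1, n)

# Greedy construction from the proof: pick n_k strictly increasing with
# P(|X_{n_k} - X| >= 1/k) <= (1/2)**k.
indices, n = [], 0
for k in range(1, 11):
    n += 1
    while tail_prob(n) > Fraction(1, 2**k):
        n += 1
    indices.append(n)

# The chosen tail probabilities are summable, which is exactly what
# Borel-Cantelli (Theorem 4.1.2) needs for a.s. convergence along n_k.
total = sum(tail_prob(nk) for nk in indices)
print(indices, total)
```

Here the construction yields $n_k = 2^k$, and the bound $\sum_k (1/2)^k < \infty$ is what makes the Borel–Cantelli argument go through.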

## The Rank of Convergence in Distribution

We now compare convergence in distribution to the other types of convergence. Convergence in distribution is weaker than almost-sure convergence:

Theorem 4.5.4 If the sequence $\left\{X_n\right\}_{n \geq 1}$ of random vectors of $\mathbb{R}^d$ converges almost surely to some random vector $X$, it also converges in distribution to the same vector $X$.
Proof. By dominated convergence, for all $u \in \mathbb{R}^d$,
$$\lim_{n \uparrow \infty} \mathrm{E}\left[\mathrm{e}^{i\left\langle u, X_n\right\rangle}\right]=\mathrm{E}\left[\mathrm{e}^{i\langle u, X\rangle}\right]$$ which implies, by Theorem 4.4.6, that $\left\{X_n\right\}_{n \geq 1}$ converges in distribution to $X$.
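The pointwise convergence of characteristic functions in this proof can be checked exactly in a toy case (an illustrative assumption, not from the text): take $X \sim N(0,1)$ and $X_n = (1 + 1/n) X$, which converges surely to $X$, and recall that the characteristic function of $N(0, s^2)$ is $\mathrm{e}^{-s^2 u^2 / 2}$:

```python
import math

def phi_n(n, u):
    # Characteristic function of X_n = (1 + 1/n) X with X ~ N(0, 1):
    # X_n ~ N(0, s**2) with s = 1 + 1/n, so phi(u) = exp(-s**2 * u**2 / 2).
    s = 1 + 1 / n
    return math.exp(-(s * u) ** 2 / 2)

u = 1.7                                  # an arbitrary test point
phi_limit = math.exp(-u**2 / 2)          # characteristic function of X at u
gaps = [abs(phi_n(n, u) - phi_limit) for n in (1, 10, 100, 1000)]
print(gaps)                              # monotonically shrinking gap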
In fact, convergence in distribution is even weaker than convergence in probability.
Theorem 4.5.5 If the sequence $\left\{X_n\right\}_{n \geq 1}$ of random vectors of $\mathbb{R}^d$ converges in probability to some random vector $X$, it also converges in distribution to $X$.

Proof. If this were not the case, one could find a function $f \in C_b\left(\mathbb{R}^d\right)$ such that $\mathrm{E}\left[f\left(X_n\right)\right]$ does not converge to $\mathrm{E}[f(X)]$. In particular, there would exist a subsequence $n_k$ and some $\varepsilon>0$ such that $\left|\mathrm{E}\left[f\left(X_{n_k}\right)\right]-\mathrm{E}[f(X)]\right| \geq \varepsilon$ for all $k$. As $\left\{X_{n_k}\right\}_{k \geq 1}$ converges in probability to $X$, one can extract from it a subsequence $\left\{X_{n_{k_{\ell}}}\right\}_{\ell \geq 1}$ converging almost surely to $X$. In particular, since $f$ is bounded and continuous, $\lim_{\ell \uparrow \infty} \mathrm{E}\left[f\left(X_{n_{k_{\ell}}}\right)\right]=\mathrm{E}[f(X)]$ by dominated convergence, a contradiction.

Combining Theorems 4.5.3 and 4.5.5, we have that convergence in distribution is weaker than convergence in the quadratic mean:

Theorem 4.5.6 If the sequence of real random variables $\left\{Z_n\right\}_{n \geq 1}$ converges in quadratic mean to some random variable $Z$, it also converges in distribution to the same random variable $Z$.
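The step from quadratic-mean convergence to convergence in probability (Theorem 4.5.3) rests on Chebyshev's inequality, $P\left(\left|Z_n-Z\right| \geq \varepsilon\right) \leq \mathrm{E}\left[\left|Z_n-Z\right|^2\right] / \varepsilon^2$. A Monte Carlo sketch under an assumed toy model (the model is an assumption for the demo, not from the text):

```python
import random
random.seed(1)

# Toy model: Z_n - Z = G/n with G ~ N(0, 1), so
# E[|Z_n - Z|**2] = 1/n**2 -> 0, i.e. quadratic-mean convergence.
N = 50_000

def chebyshev_check(n, eps=0.1):
    # Empirical frequency of {|Z_n - Z| >= eps} vs the Chebyshev bound.
    exceed = sum(abs(random.gauss(0, 1) / n) >= eps for _ in range(N))
    bound = (1 / n**2) / eps**2   # E[|Z_n - Z|**2] / eps**2
    return exceed / N, bound

freq, bound = chebyshev_check(50)
print(freq, bound)  # empirical exceedance frequency vs Chebyshev bound
```

The bound itself tends to $0$ as $n \uparrow \infty$ for each fixed $\varepsilon$, which is exactly convergence in probability; Theorem 4.5.5 then upgrades that to convergence in distribution.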
