The original proof of the Leighton-Shor theorem amounts basically to performing by hand a kind of generic chaining in this highly non-trivial case, an incredible tour de force. ${ }^{23}$ A first attempt was made in [92] to relate (an important consequence of) the Leighton-Shor theorem to general methods for bounding stochastic processes, but it runs into technical complications. Coffman and Shor [26] then introduced the use of Fourier transforms and brought to light the role of ellipsoids, after which it became clear that the structure of these ellipsoids plays a central part in these matching results, a point of view systematically expounded in [114].

Chapter 17 is a continuation of the present chapter. The more difficult material it contains is presented later for fear of scaring readers at this early stage. A notable feature of the result presented there is that ellipsoids do not suffice, a considerable source of complication. The material of Appendix A is closely related to the Leighton-Shor theorem.

The original results of [3] are proved using an interesting technique called the transportation method. A version of this method, which avoids many of the technical difficulties of the original approach, is presented in [134]. With the notation of Theorem 4.5.1, it is proved in [134] (as a stronger form of this fact) that with probability $\geq 9 / 10$ one has
$$\inf _\pi \frac{1}{N} \sum_{i \leq N} \exp \left(\frac{N d\left(X_i, Y_{\pi(i)}\right)^2}{L \log N}\right) \leq 2 . \tag{4.131}$$
Since $\exp x \geq x$, (4.131) implies that $\sum_{i \leq N} d\left(X_i, Y_{\pi(i)}\right)^2 \leq L \log N$ and hence, by the Cauchy-Schwarz inequality, $\sum_{i \leq N} d\left(X_i, Y_{\pi(i)}\right) \leq L \sqrt{N \log N}$. Moreover, (4.131) also implies $\max _{i \leq N} d\left(X_i, Y_{\pi(i)}\right) \leq L \log N / \sqrt{N}$. This unfortunately fails to bring a positive answer to Question 4.9.1.
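For completeness, the two deductions can be spelled out as follows (with the usual convention that $L$ denotes a universal constant whose value may change from one occurrence to the next). Applying $\exp x \geq x$ to each term of the sum in (4.131) yields $\sum_{i \leq N} N d\left(X_i, Y_{\pi(i)}\right)^2 /(L \log N) \leq 2 N$, i.e. $\sum_{i \leq N} d\left(X_i, Y_{\pi(i)}\right)^2 \leq L \log N$, and then the Cauchy-Schwarz inequality gives
$$\sum_{i \leq N} d\left(X_i, Y_{\pi(i)}\right) \leq \sqrt{N}\Big(\sum_{i \leq N} d\left(X_i, Y_{\pi(i)}\right)^2\Big)^{1 / 2} \leq L \sqrt{N \log N} .$$
On the other hand, since each term of the sum in (4.131) is at most the whole sum, for each $i \leq N$ we have $\exp \left(N d\left(X_i, Y_{\pi(i)}\right)^2 /(L \log N)\right) \leq 2 N$, so that (for $N \geq 2$)
$$d\left(X_i, Y_{\pi(i)}\right)^2 \leq \frac{L \log N \log (2 N)}{N} \leq \frac{L(\log N)^2}{N},$$
which is the bound on the maximum.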

For results about matching for unbounded distributions, see the work of J. Yukich [146] as well as the nonstandard results of [133].

## p-Stable Processes as Conditionally Gaussian Processes

Consider a number $0<p \leq 2$. A r.v. $X$ is called $p$-stable if for some $\sigma=\sigma_p(X) \geq 0$ its characteristic function satisfies
$$\forall \lambda \in \mathbb{R}, \quad \mathrm{E} \exp (i \lambda X)=\exp \left(-\sigma^p|\lambda|^p / 2\right), \tag{5.1}$$
so that for $p=2$ we recover the centered Gaussian distribution of variance $\sigma^2$. For $p>2$, no r.v. satisfies (5.1). The case $p=2$ is the Gaussian case, which we now understand very well, so from now on we assume $p<2$. Despite the formal similarity, this is very different from the Gaussian case. It can be shown that
$$\lim _{s \rightarrow \infty} s^p \mathrm{P}(|X| \geq s)=c_p \sigma_p(X)^p$$
where $c_p>0$ depends on $p$ only. Thus $X$ does not have moments of order $p$, but it has moments of order $q$ for $q<p$. We refer the reader to [53] for a proof of this and for general background on $p$-stable processes.
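This tail behavior is easy to observe numerically. The sketch below is not from the reference; it draws symmetric $p$-stable samples with the classical Chambers-Mallows-Stuck representation (normalized so that the characteristic function is $\exp(-|\lambda|^p)$; conventions for $\sigma_p$ differ by a constant factor, which does not affect the check) and verifies that doubling the threshold $s$ divides the tail probability $\mathrm{P}(|X| \geq s)$ by roughly $2^p$, as the power-law tail predicts:

```python
import math
import random

def sym_stable(p, rng):
    """One draw of a symmetric p-stable r.v. via the Chambers-Mallows-Stuck
    representation, normalized so that E exp(i*lam*X) = exp(-|lam|^p)."""
    u = rng.uniform(-math.pi / 2, math.pi / 2)  # uniform angle
    w = rng.expovariate(1.0)                    # unit exponential, independent of u
    return (math.sin(p * u) / math.cos(u) ** (1 / p)
            * (math.cos(u - p * u) / w) ** ((1 - p) / p))

rng = random.Random(0)
p = 1.5
xs = [abs(sym_stable(p, rng)) for _ in range(200_000)]

# Empirical tails: since P(|X| >= s) ~ c s^{-p}, doubling s should divide
# the tail probability by about 2**p = 2.83 for p = 1.5.
s = 10.0
tail_s = sum(x >= s for x in xs) / len(xs)
tail_2s = sum(x >= 2 * s for x in xs) / len(xs)
print(tail_s / tail_2s)  # close to 2**p
```

The same experiment with $q$-th empirical moments illustrates the second assertion: averages of $|X|^q$ stabilize for $q<p$ but keep growing with the sample size for $q \geq p$.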

A process $\left(X_t\right)_{t \in T}$ is called $p$-stable if, for every family $\left(\alpha_t\right)_{t \in T}$ for which only finitely many of the numbers $\alpha_t$ are not 0, the r.v. $\sum_t \alpha_t X_t$ is $p$-stable. We can then define a (quasi-)distance $d$ on $T$ by
$$d(s, t)=\sigma_p\left(X_s-X_t\right) . \tag{5.2}$$
When $p>1$, a $p$-stable r.v. is integrable, and $\mathrm{E}|X|$ is proportional to $\sigma_p(X)$. Thus one can also define an equivalent distance by $d^{\prime}(s, t)=\mathrm{E}\left|X_s-X_t\right|$.

A typical example of a $p$-stable process is given by $X_t=\sum_{i \leq n} t_i Y_i$ where $t=\left(t_i\right)_{i \leq n}$ and $\left(Y_i\right)_{i \leq n}$ are independent $p$-stable r.v.s. It can in fact be shown that this example is generic, in the sense that "each $p$-stable process (with a finite index set) can be arbitrarily well approximated by a process of this type". Assuming further that $\sigma_p\left(Y_i\right)=1$ for each $i$, (5.2) implies that the distance induced by the process is then the $\ell^p$ distance, $d(s, t)=|s-t|_p$.
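The stability property behind this computation is that $\sum_{i \leq n} t_i Y_i$ has the same distribution as $|t|_p Y_1$ when the $Y_i$ are independent standard symmetric $p$-stable r.v.s. A numerical sketch (again not from the reference, and using the Chambers-Mallows-Stuck sampler) can check this by comparing the empirical medians of $\left|\sum_i t_i Y_i\right|$ and of $|t|_p\left|Y_1\right|$ for an arbitrary coefficient vector $t$:

```python
import math
import random

def sym_stable(p, rng):
    """Symmetric p-stable draw (Chambers-Mallows-Stuck), standard scale."""
    u = rng.uniform(-math.pi / 2, math.pi / 2)
    w = rng.expovariate(1.0)
    return (math.sin(p * u) / math.cos(u) ** (1 / p)
            * (math.cos(u - p * u) / w) ** ((1 - p) / p))

def median(v):
    v = sorted(v)
    return v[len(v) // 2]

rng = random.Random(1)
p, n_samples = 1.5, 100_000
t = [1.0, -2.0, 0.5, 3.0]                      # arbitrary coefficient vector
lp = sum(abs(ti) ** p for ti in t) ** (1 / p)  # |t|_p

# X_t = sum_i t_i Y_i should again be p-stable with sigma_p(X_t) = |t|_p,
# i.e. distributed like |t|_p times a single standard draw.
m_sum = median([abs(sum(ti * sym_stable(p, rng) for ti in t))
                for _ in range(n_samples)])
m_scaled = median([lp * abs(sym_stable(p, rng)) for _ in range(n_samples)])
print(m_sum / m_scaled)  # close to 1
```

Medians are used rather than means because for $p<2$ the variance (and for $p \leq 1$ even the mean) does not exist.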
