## Probability Distribution on Metric Space

Definition 5.2.1. Distribution on a complete metric space. Suppose $(S, d)$ is a complete metric space. Let $n \geq 1$ be arbitrary. Recall the function $h_n \equiv 1 \wedge\left(1+n-d\left(\cdot, x_{\circ}\right)\right)_{+} \in C_{ub}(S, d)$, where $x_{\circ} \in S$ is an arbitrary but fixed reference point. Note that the function $h_n$ has bounded support. Hence $h_n \in C(S, d)$ if $(S, d)$ is locally compact.
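On the concrete metric space $(\mathbb{R}, |\cdot|)$ with reference point $x_{\circ} = 0$, the function $h_n$ can be evaluated directly. The following sketch (names are illustrative, not from the text) shows that $h_n$ equals $1$ on the closed ball of radius $n$ about $x_{\circ}$, vanishes outside radius $n+1$, and interpolates linearly in between, so that $h_n \uparrow 1$ pointwise:

```python
def h(x, n, x0=0.0, d=lambda a, b: abs(a - b)):
    """h_n(x) = 1 ∧ (1 + n − d(x, x0))₊ on a metric space (S, d).

    Equals 1 on the closed ball of radius n about x0, vanishes outside
    radius n + 1, and is linear in between; in particular its support
    is bounded, as noted in Definition 5.2.1."""
    return min(1.0, max(0.0, 1.0 + n - d(x, x0)))

# h_n ↑ 1 pointwise: for fixed x, h(x, n) is nondecreasing in n and
# eventually equals 1 once n ≥ d(x, x0).
values = [h(5.0, n) for n in range(1, 8)]
```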

Let $J$ be an integration on $\left(S, C_{u b}(S, d)\right)$, in the sense of Definition 4.3.1. Suppose $J h_n \uparrow 1$ as $n \rightarrow \infty$. Then the integration $J$ is called a probability distribution, or simply a distribution, on $(S, d)$. We will let $\widehat{J}(S, d)$ denote the set of distributions on the complete metric space $(S, d)$.
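As a hedged sanity check of the condition $J h_n \uparrow 1$ (the concrete setup is illustrative, not from the text): take $S = [0, \infty)$ with the usual metric and $x_{\circ} = 0$, and let $J f = E[f(X)]$ for $X \sim \mathrm{Exp}(1)$. A Monte Carlo estimate shows $J h_n$ increasing to $1$:

```python
import math
import random

random.seed(1)
# J f := E[f(X)] for X ~ Exp(1), realized on S = [0, ∞) with d(x, y) = |x − y|.
# Samples drawn by inverting the Exp(1) distribution function.
xs = [-math.log(1.0 - random.random()) for _ in range(100_000)]

def h(x, n):
    # h_n(x) = 1 ∧ (1 + n − d(x, 0))₊
    return min(1.0, max(0.0, 1.0 + n - x))

# J h_n increases toward 1 as n → ∞, so J qualifies as a distribution on (S, d).
Jh = [sum(h(x, n) for x in xs) / len(xs) for n in (1, 2, 4, 8)]
```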

Lemma 5.2.2. Distribution basics. Suppose $(S, d)$ is a complete metric space. Then the following conditions hold:

1. Let $J$ be an arbitrary distribution on $(S, d)$. Let $(S, L, J) \equiv\left(S, \overline{C_{u b}(S)}, J\right)$ be the complete extension of the integration space $\left(S, C_{u b}(S), J\right)$. Then $(S, L, J)$ is a probability space.
2. Suppose the metric space $(S, d)$ is bounded. Let $J$ be an integration on $\left(S, C_{u b}(S)\right)$ such that $J 1=1$. Then the integration $J$ is a distribution on $(S, d)$.
3. Suppose $(S, d)$ is locally compact. Let $J$ be an integration on $(S, C(S, d))$ in the sense of Definition 4.2.1. Suppose $J h_n \uparrow 1$ as $n \rightarrow \infty$. Then $J$ is a distribution on $(S, d)$.

Proof. 1. By Definition 5.2.1, $J h_n \uparrow 1$ as $n \rightarrow \infty$. At the same time, $h_n \uparrow 1$ on $S$. The Monotone Convergence Theorem therefore implies that $1 \in L$ and $J 1=1$. Thus $(S, L, J)$ is a probability space.

## Weak Convergence of Distributions

Recall that if $X$ is an r.v. on a probability space $(\Omega, L, E)$ with values in $S$, then $E_X$ denotes the distribution induced on $S$ by $X$.

Definition 5.3.1. Weak convergence of distributions on a complete metric space. Recall that $\widehat{J}(S, d)$ denotes the set of distributions on a complete metric space $(S, d)$. A sequence $\left(J_n\right)_{n=1,2, \ldots}$ in $\widehat{J}(S, d)$ is said to converge weakly to $J \in \widehat{J}(S, d)$ if $J_n f \rightarrow J f$ for each $f \in C_{ub}(S)$. We then write $J_n \Rightarrow J$. Suppose $X, X_1, X_2, \ldots$ are r.v.'s with values in $S$, not necessarily on the same probability space. The sequence $\left(X_n\right)_{n=1,2, \ldots}$ of r.v.'s is said to converge weakly to the r.v. $X$, or to converge in distribution, if $E_{X_n} \Rightarrow E_X$. We then write $X_n \Rightarrow X$.
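A minimal illustrative example (point masses, not taken from the text): with $J_n = \delta_{1/n}$ and $J = \delta_0$ on $(\mathbb{R}, |\cdot|)$, we have $J_n f = f(1/n) \rightarrow f(0) = J f$ for every $f \in C_{ub}(\mathbb{R})$, hence $J_n \Rightarrow J$:

```python
import math

# A bounded, uniformly continuous test function f ∈ C_ub(R).
f = lambda x: math.exp(-abs(x))

# J_n = point mass at 1/n and J = point mass at 0, so
# J_n f = f(1/n) and J f = f(0).
Jn_f = [f(1.0 / n) for n in (1, 10, 100, 1000)]
J_f = f(0.0)
gaps = [abs(v - J_f) for v in Jn_f]   # |J_n f − J f| shrinks as n grows
```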

Proposition 5.3.2. Convergence in probability implies weak convergence. Let $\left(X_n\right)_{n=0,1, \ldots}$ be a sequence of r.v.'s on the same probability space $(\Omega, L, E)$, with values in a complete metric space $(S, d)$. If $X_n \rightarrow X_0$ in probability, then $X_n \Rightarrow X_0$.

Proof. Suppose $X_n \rightarrow X_0$ in probability. Let $f \in C_{ub}(S)$ be arbitrary, with $|f| \leq c$ for some $c>0$, and with a modulus of continuity $\delta_f$. Let $\varepsilon>0$ be arbitrary. By Definition 5.1.8 of convergence in probability, there exists $p \geq 1$ so large that for each $n \geq p$, there exists an integrable set $B_n$ with $P\left(B_n\right)<\varepsilon$ and
$$B_n^c \subset\left(d\left(X_n, X_0\right)<\delta_f(\varepsilon)\right) \subset\left(\left|f\left(X_n\right)-f\left(X_0\right)\right|<\varepsilon\right).$$
Consider each $n \geq p$. Then
$$\begin{aligned} \left|E f\left(X_n\right)-E f\left(X_0\right)\right| & \leq E\left|f\left(X_n\right)-f\left(X_0\right)\right| 1_{B_n}+E\left|f\left(X_n\right)-f\left(X_0\right)\right| 1_{B_n^c} \\ & \leq 2 c P\left(B_n\right)+\varepsilon<2 c \varepsilon+\varepsilon. \end{aligned}$$
Since $\varepsilon>0$ is arbitrarily small, we conclude that $E f\left(X_n\right) \rightarrow E f\left(X_0\right)$. Equivalently, $E_{X_n} f \rightarrow E_{X_0} f$. Since $f \in C_{ub}(S)$ is arbitrary, we have $E_{X_n} \Rightarrow E_{X_0}$. In other words, $X_n \Rightarrow X_0$.
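The proof's estimate can be checked numerically. In this hedged sketch (the specific random variables are illustrative, not from the text), $X_n = X_0 + 1/n$ on a common sample space, so $d(X_n, X_0) = 1/n \rightarrow 0$ and $X_n \rightarrow X_0$ in probability; $E f(X_n) \rightarrow E f(X_0)$ then follows for a bounded, uniformly continuous $f$:

```python
import random

random.seed(0)
# Sample points of Ω; X_0(ω) = ω is (approximately) uniform on [0, 1].
omega = [random.random() for _ in range(10_000)]

f = lambda x: min(1.0, abs(x))   # f ∈ C_ub(R), bounded by c = 1

def Ef(X):
    """Empirical expectation of f(X) over the sampled ω's."""
    return sum(f(X(w)) for w in omega) / len(omega)

Ef_X0 = Ef(lambda w: w)
# X_n = X_0 + 1/n converges to X_0 in probability (indeed uniformly),
# and |E f(X_n) − E f(X_0)| shrinks accordingly.
gaps = [abs(Ef(lambda w, n=n: w + 1.0 / n) - Ef_X0) for n in (1, 10, 100)]
```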
