# Linear Regression Exam Help (STA621)

## Elliptically Contoured Distributions

Definition 10.4: Johnson (1987, pp. 107-108). A $p \times 1$ random vector $\boldsymbol{X}$ has an elliptically contoured distribution, also called an elliptically symmetric distribution, if $\boldsymbol{X}$ has joint pdf
$$f(z)=k_p|\boldsymbol{\Sigma}|^{-1 / 2} g\left[(z-\boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1}(z-\boldsymbol{\mu})\right]$$
where $k_p > 0$ is a constant and $g$ is a known function. We say $\boldsymbol{X}$ has an elliptically contoured $E C_p(\boldsymbol{\mu}, \boldsymbol{\Sigma}, g)$ distribution.

If $\boldsymbol{X}$ has an elliptically contoured (EC) distribution, then the characteristic function of $\boldsymbol{X}$ is
$$\phi_{\boldsymbol{X}}(\boldsymbol{t})=\exp \left(i \boldsymbol{t}^T \boldsymbol{\mu}\right) \psi\left(\boldsymbol{t}^T \boldsymbol{\Sigma} \boldsymbol{t}\right)$$
for some function $\psi$. If the second moments exist, then
$$E(\boldsymbol{X})=\boldsymbol{\mu}$$
and
$$\operatorname{Cov}(\boldsymbol{X})=c_X \boldsymbol{\Sigma}$$
where
$$c_X = -2 \psi^{\prime}(0).$$
Definition 10.5. The population squared Mahalanobis distance is
$$U \equiv D^2=D^2(\boldsymbol{\mu}, \boldsymbol{\Sigma})=(\boldsymbol{X}-\boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1}(\boldsymbol{X}-\boldsymbol{\mu})$$
For elliptically contoured distributions, $U$ has pdf
$$h(u)=\frac{\pi^{p / 2}}{\Gamma(p / 2)} k_p u^{p / 2-1} g(u) .$$
For $c>0$, an $E C_p(\boldsymbol{\mu}, c \boldsymbol{I}, g)$ distribution is spherical about $\boldsymbol{\mu}$ where $\boldsymbol{I}$ is the $p \times p$ identity matrix. The multivariate normal distribution $N_p(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ has $k_p=(2 \pi)^{-p / 2}, \psi(u)=g(u)=\exp (-u / 2)$, and $h(u)$ is the $\chi_p^2$ density.
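The multivariate normal case above can be checked by simulation: since $h(u)$ is the $\chi_p^2$ density for $N_p(\boldsymbol{\mu}, \boldsymbol{\Sigma})$, the squared Mahalanobis distances of normal draws should behave like a $\chi_p^2$ sample. Below is a minimal sketch, assuming NumPy is available; the specific $\boldsymbol{\mu}$ and $\boldsymbol{\Sigma}$ are arbitrary choices for illustration.

```python
# Sketch: for X ~ N_p(mu, Sigma), the squared Mahalanobis distance
# D^2 = (X - mu)^T Sigma^{-1} (X - mu) should follow chi^2_p, so its
# sample mean and variance should be near p and 2p.
import numpy as np

rng = np.random.default_rng(0)
p, n = 3, 100_000
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((p, p))
Sigma = A @ A.T + p * np.eye(p)          # symmetric positive definite (arbitrary example)

X = rng.multivariate_normal(mu, Sigma, size=n)
centered = X - mu
Sigma_inv = np.linalg.inv(Sigma)
# D2[i] = centered[i] @ Sigma_inv @ centered[i] for every row at once
D2 = np.einsum("ij,jk,ik->i", centered, Sigma_inv, centered)

print(round(D2.mean(), 2))   # should be near E[chi^2_p] = p = 3
print(round(D2.var(), 2))    # should be near Var[chi^2_p] = 2p = 6
```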

The following lemma is useful for proving properties of EC distributions without using the characteristic function (10.6). See Eaton (1986) and Cook (1998, pp. 57, 130).

## Sample Mahalanobis Distances

In the multivariate location and dispersion model, sample Mahalanobis distances play a role similar to that of residuals in multiple linear regression. The observed data $\boldsymbol{X}_i=\boldsymbol{x}_i$ for $i=1, \ldots, n$ is collected in an $n \times p$ matrix $\boldsymbol{W}$ with $n$ rows $\boldsymbol{x}_1^T, \ldots, \boldsymbol{x}_n^T$. Let the $p \times 1$ column vector $T(\boldsymbol{W})$ be a multivariate location estimator, and let the $p \times p$ symmetric positive definite matrix $\boldsymbol{C}(\boldsymbol{W})$ be a dispersion estimator such as the sample covariance matrix.

Definition 10.6. The $i$th squared Mahalanobis distance is
$$D_i^2=D_i^2(T(\boldsymbol{W}), \boldsymbol{C}(\boldsymbol{W}))=\left(\boldsymbol{X}_i-T(\boldsymbol{W})\right)^T \boldsymbol{C}^{-1}(\boldsymbol{W})\left(\boldsymbol{X}_i-T(\boldsymbol{W})\right) \quad (10.12)$$
for each point $\boldsymbol{X}_i$. Notice that $D_i^2$ is a scalar-valued random variable. Notice also that the population squared Mahalanobis distance is
$$D_{\boldsymbol{X}}^2(\boldsymbol{\mu}, \boldsymbol{\Sigma})=(\boldsymbol{X}-\boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1}(\boldsymbol{X}-\boldsymbol{\mu})$$
and that the term $\boldsymbol{\Sigma}^{-1 / 2}(\boldsymbol{X}-\boldsymbol{\mu})$ is the $p$-dimensional analog to the $Z$-score used to transform a univariate $N\left(\mu, \sigma^2\right)$ random variable into an $N(0,1)$ random variable. Hence the sample Mahalanobis distance $D_i=\sqrt{D_i^2}$ is an analog of the absolute value $\left|Z_i\right|$ of the sample $Z$-score $Z_i=\left(X_i-\bar{X}\right) / \hat{\sigma}$. Also notice that the Euclidean distance of $\boldsymbol{x}_i$ from the estimate of center $T(\boldsymbol{W})$ is $D_i\left(T(\boldsymbol{W}), \boldsymbol{I}_p\right)$ where $\boldsymbol{I}_p$ is the $p \times p$ identity matrix.
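Definition 10.6 translates directly into code. The sketch below, assuming NumPy, uses the sample mean as the location estimator $T(\boldsymbol{W})$ and the sample covariance as the dispersion estimator $\boldsymbol{C}(\boldsymbol{W})$; other robust estimators could be substituted.

```python
# Sketch of Definition 10.6: squared sample Mahalanobis distances
# D_i^2 = (x_i - T(W))^T C(W)^{-1} (x_i - T(W)),
# with T(W) = sample mean and C(W) = sample covariance matrix.
import numpy as np

def mahalanobis_sq(W: np.ndarray) -> np.ndarray:
    """Return the n squared Mahalanobis distances for an n x p data matrix W."""
    T = W.mean(axis=0)                  # location estimator T(W), shape (p,)
    C = np.cov(W, rowvar=False)         # dispersion estimator C(W), shape (p, p)
    centered = W - T
    # Solve C z = centered^T rather than forming C^{-1} explicitly (more stable).
    z = np.linalg.solve(C, centered.T).T
    return np.einsum("ij,ij->i", centered, z)

rng = np.random.default_rng(1)
W = rng.standard_normal((500, 4))
D2 = mahalanobis_sq(W)
print(D2.shape)                          # one scalar distance per observation
```

With these classical estimators the distances satisfy the identity $\sum_i D_i^2 = (n-1)p$, which gives a quick sanity check on an implementation.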

Example 10.3. The contours of constant density for the $N_p(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ distribution are hyperellipsoid boundaries of the form $(\boldsymbol{x}-\boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1}(\boldsymbol{x}-\boldsymbol{\mu})=a^2$. An $\alpha$-density region $K_\alpha$ is a set such that $P\left(\boldsymbol{X} \in K_\alpha\right)=\alpha$, and for the $N_p(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ distribution, the regions of highest density are sets of the form
$$\left\{\boldsymbol{x}:(\boldsymbol{x}-\boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1}(\boldsymbol{x}-\boldsymbol{\mu}) \leq \chi_p^2(\alpha)\right\}=\left\{\boldsymbol{x}: D_{\boldsymbol{x}}^2(\boldsymbol{\mu}, \boldsymbol{\Sigma}) \leq \chi_p^2(\alpha)\right\}$$
where $P\left(W \leq \chi_p^2(\alpha)\right)=\alpha$ if $W \sim \chi_p^2$. If the $\boldsymbol{X}_i$ are $n$ iid random vectors each with an $N_p(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ pdf, then a scatterplot of $X_{i, k}$ versus $X_{i, j}$ should be ellipsoidal for $k \neq j$. Similar statements hold if $\boldsymbol{X}$ is $E C_p(\boldsymbol{\mu}, \boldsymbol{\Sigma}, g)$ with continuous decreasing $g$, but the $\alpha$-density region will use a constant $U_\alpha$ obtained from Equation (10.10).
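Example 10.3 can also be verified empirically: the fraction of $N_p(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ draws landing inside the $\alpha$-density region should be close to $\alpha$. A minimal sketch, assuming NumPy and SciPy are available and using an arbitrary bivariate $\boldsymbol{\Sigma}$ for illustration:

```python
# Sketch of Example 10.3: the alpha-density region of N_p(mu, Sigma) is
# {x : D_x^2(mu, Sigma) <= chi^2_p(alpha)}, so about alpha of the draws
# should fall inside it.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
p, n, alpha = 2, 200_000, 0.95
mu = np.zeros(p)
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])          # arbitrary positive definite example

X = rng.multivariate_normal(mu, Sigma, size=n)
centered = X - mu
D2 = np.einsum("ij,jk,ik->i", centered, np.linalg.inv(Sigma), centered)

cutoff = stats.chi2(df=p).ppf(alpha)    # the chi^2_p(alpha) quantile
coverage = np.mean(D2 <= cutoff)
print(round(coverage, 3))               # should be close to alpha = 0.95
```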

