# STA621: Linear Regression

## Graphical Methods for Response Transformations

> If the ratio of largest to smallest value of $y$ is substantial, we usually begin by looking at $\log y$. (Mosteller and Tukey 1977, p. 91)
The applicability of the multiple linear regression model can be expanded by allowing response transformations. An important class of response transformation models introduces an additional unknown transformation parameter $\lambda_o$, such that
$$Y_i=t_{\lambda_o}\left(Z_i\right) \equiv Z_i^{\left(\lambda_o\right)}=E\left(Y_i \mid \boldsymbol{x}_i\right)+e_i=\boldsymbol{x}_i^T \boldsymbol{\beta}+e_i .$$ If $\lambda_o$ were known, then $Y_i=t_{\lambda_o}\left(Z_i\right)$ would follow a multiple linear regression model with $p$ predictors including the constant. Here, $\boldsymbol{\beta}$ is a $p \times 1$ vector of unknown coefficients depending on $\lambda_o$, $\boldsymbol{x}$ is a $p \times 1$ vector of predictors that are assumed to be measured with negligible error, and the errors $e_i$ are assumed to be iid with zero mean.

Definition 3.2. Assume that all of the values of the “response” $Z_i$ are positive. A power transformation has the form $Y=t_\lambda(Z)=Z^\lambda$ for $\lambda \neq 0$ and $Y=t_0(Z)=\log (Z)$ for $\lambda=0$ where
$$\lambda \in \Lambda_L=\{-1,-1 / 2,-1 / 3,0,1 / 3,1 / 2,1\} .$$

Definition 3.3. Assume that all of the values of the “response” $Z_i$ are positive. Then the modified power transformation family
$$t_\lambda\left(Z_i\right) \equiv Z_i^{(\lambda)}=\frac{Z_i^\lambda-1}{\lambda}$$
for $\lambda \neq 0$ and $Z_i^{(0)}=\log \left(Z_i\right)$. Generally $\lambda \in \Lambda$ where $\Lambda$ is some interval such as $[-1,1]$ or a coarse subset such as $\Lambda_L$. This family is a special case of the response transformations considered by Tukey (1957).
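The two transformation families above are straightforward to compute. Below is a minimal sketch of the modified power transformation of Definition 3.3; the function name `t_lambda` and the NumPy implementation are ours, not from the text:

```python
import numpy as np

def t_lambda(z, lam):
    """Modified power transformation (Definition 3.3):
    t_lambda(z) = (z**lam - 1)/lam for lam != 0, and log(z) for lam == 0.
    All values of z must be positive."""
    z = np.asarray(z, dtype=float)
    if np.any(z <= 0):
        raise ValueError("all values of z must be positive")
    if lam == 0:
        return np.log(z)
    return (z**lam - 1.0) / lam
```

Note that as $\lambda \to 0$, $(z^\lambda - 1)/\lambda \to \log(z)$, so the family is continuous in $\lambda$; the `lam == 0` branch is the limiting case rather than a special definition.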

A graphical method for response transformations refits the model using the same fitting method, changing only the “response” from $Z$ to $t_\lambda(Z)$. Compute the “fitted values” $\hat{W}_i$ using $W_i=t_\lambda\left(Z_i\right)$ as the “response.” Then a transformation plot of $\hat{W}_i$ versus $W_i$ is made for each of the seven values of $\lambda \in \Lambda_L$ with the identity line added as a visual aid. Vertical deviations from the identity line are the “residuals” $r_i=W_i-\hat{W}_i$. If the unimodal MLR model is reasonable for $Y=W$ and $\boldsymbol{x}$, then a candidate response transformation $Y=t_{\lambda^*}(Z)$ is reasonable if the plotted points follow the identity line in a roughly evenly populated band. See Definition 2.6. Curvature from the identity line suggests that the candidate response transformation is inappropriate.
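The refitting step can be sketched numerically. This is a minimal illustration, assuming an OLS fit via `np.linalg.lstsq`; the helper name `transformation_fits` and the simulated data are ours. Each stored pair gives the points for one transformation plot:

```python
import numpy as np

LAMBDA_L = (-1, -1/2, -1/3, 0, 1/3, 1/2, 1)   # coarse grid of Definition 3.2

def transformation_fits(X, Z, lambdas=LAMBDA_L):
    """For each candidate lambda, refit OLS with W = t_lambda(Z) as the
    "response"; plotting W_hat versus W with the identity line overlaid
    gives the transformation plot for that lambda."""
    Z = np.asarray(Z, dtype=float)
    n = Z.size
    X1 = np.column_stack([np.ones(n), X])       # include the constant
    fits = {}
    for lam in lambdas:
        W = np.log(Z) if lam == 0 else (Z**lam - 1.0) / lam
        beta, *_ = np.linalg.lstsq(X1, W, rcond=None)
        fits[lam] = (W, X1 @ beta)              # ("response", "fitted values")
    return fits

# Toy data in which the log (lambda = 0) transformation is correct.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))
Z = np.exp(1.0 + x @ np.array([0.5, -0.3]) + 0.1 * rng.normal(size=100))
fits = transformation_fits(x, Z)
```

For the toy data, the $\lambda = 0$ plot of fitted values versus $W$ hugs the identity line in a narrow band, while the other six candidates show curvature or fanning.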

## Main Effects, Interactions, and Indicators

Section 1.4 explains interactions, factors, and indicator variables in an abstract setting when $Y \perp\!\!\!\perp \boldsymbol{x} \mid \boldsymbol{x}^T \boldsymbol{\beta}$, where $\boldsymbol{x}^T \boldsymbol{\beta}$ is the sufficient predictor (SP). MLR is such a model. The Section 1.4 interpretations given in terms of the SP can be given in terms of $E(Y \mid \boldsymbol{x})$ for MLR since $E(Y \mid \boldsymbol{x})=\boldsymbol{x}^T \boldsymbol{\beta}=\mathrm{SP}$ for MLR.

Definition 3.5. Suppose that the explanatory variables have the form $x_2, \ldots, x_k, x_{j j}=x_j^2, x_{i j}=x_i x_j, x_{234}=x_2 x_3 x_4$, et cetera. Then the variables $x_2, \ldots, x_k$ are main effects. A product of two or more different main effects is an interaction. A variable such as $x_2^2$ or $x_7^3$ is a power. An $x_2 x_3$ interaction will sometimes also be denoted as $x_2: x_3$ or $x_2 * x_3$.
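Building such terms is just elementwise arithmetic on the predictor columns. A minimal sketch (the helper name `with_terms` and the particular terms chosen are ours, purely for illustration):

```python
import numpy as np

def with_terms(x2, x3, x7):
    """Design columns for a model containing the main effects x2, x3, x7,
    the power x3^2, and the x2:x3 interaction (Definition 3.5)."""
    return np.column_stack([x2, x3, x7, x3**2, x2 * x3])
```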

Definition 3.6. A factor $W$ is a qualitative random variable. Suppose $W$ has $c$ categories $a_1, \ldots, a_c$. Then the factor is incorporated into the MLR model by using $c-1$ indicator variables $x_{W j}=1$ if $W=a_j$ and $x_{W j}=0$ otherwise, where one of the levels $a_j$ is omitted, e.g., use $j=1, \ldots, c-1$. Each indicator variable has 1 degree of freedom. Hence the $c-1$ indicator variables associated with the factor have $c-1$ degrees of freedom.
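The $c-1$ indicator coding of Definition 3.6 can be sketched as follows; the function name `indicators` and the choice of the first sorted level as the omitted baseline are our assumptions:

```python
import numpy as np

def indicators(w, omit=None):
    """Return c-1 indicator columns for a factor w (Definition 3.6).
    The level `omit` (default: the first sorted level) is the baseline
    and gets no column; row i has a 1 in the column of its level."""
    levels = sorted(set(w))
    if omit is None:
        omit = levels[0]          # assumed convention: drop the first level
    kept = [a for a in levels if a != omit]
    X = np.array([[1 if wi == a else 0 for a in kept] for wi in w])
    return X, kept
```

A factor with $c = 3$ levels thus contributes two columns, matching its $c-1 = 2$ degrees of freedom.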

Rule of thumb 3.3. Suppose that the MLR model contains at least one power or interaction. Then the corresponding main effects that make up the powers and interactions should also be in the MLR model.

Rule of thumb 3.3 suggests that if $x_3^2$ and $x_2 x_7 x_9$ are in the MLR model, then $x_2, x_3, x_7$, and $x_9$ should also be in the MLR model. A quick way to check whether a term like $x_3^2$ is needed in the model is to fit the main effects model and then make a scatterplot matrix of the predictors and the residuals, where the residuals $r$ are on the top row. Then the top row shows plots of $x_k$ versus $r$, and if a plot is parabolic, then $x_k^2$ should be added to the model. Potential predictors $w_j$ could also be added to the scatterplot matrix. If the plot of $w_j$ versus $r$ shows a positive or negative linear trend, add $w_j$ to the model. If the plot is quadratic, add $w_j$ and $w_j^2$ to the model. This technique is for quantitative variables $x_k$ and $w_j$.
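The residual check can be seen numerically on simulated data; this toy example (our construction, not from the text) fits a main-effects-only model when the true mean function contains $x_3^2$:

```python
import numpy as np

# Simulated data whose true mean function contains x3^2.
rng = np.random.default_rng(1)
n = 200
x2, x3 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + x2 + x3 + 2.0 * x3**2 + 0.1 * rng.normal(size=n)

# Fit the main-effects-only model and compute its residuals.
X_main = np.column_stack([np.ones(n), x2, x3])
beta, *_ = np.linalg.lstsq(X_main, y, rcond=None)
r = y - X_main @ beta

# In the scatterplot matrix, the r-versus-x3 panel would look parabolic.
# Numerically, r correlates strongly with x3**2 but not with x2
# (OLS residuals are orthogonal to the fitted columns).
print(np.corrcoef(r, x3**2)[0, 1], np.corrcoef(r, x2)[0, 1])
```

The parabolic pattern in the $r$-versus-$x_3$ panel is the visual signal that $x_3^2$ belongs in the model.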

