# Linear Regression Analysis: Two Important Special Cases

## Two Important Special Cases

When studying a statistical model, it is often useful to try to understand the model that contains a constant but no nontrivial predictors, then try to understand the model with a constant and one nontrivial predictor, then the model with a constant and two nontrivial predictors, and then the general model with many predictors. In this text, most of the models are such that $Y$ is independent of $\boldsymbol{x}$ given $\boldsymbol{x}^T \boldsymbol{\beta}$, written
$$Y \perp\!\!\!\perp \boldsymbol{x} \mid \boldsymbol{x}^T \boldsymbol{\beta} .$$
Then $w_i=\boldsymbol{x}_i^T \hat{\boldsymbol{\beta}}$ is a scalar, and trying to understand the model in terms of $\boldsymbol{x}_i^T \hat{\boldsymbol{\beta}}$ is about as easy as trying to understand the model in terms of one nontrivial predictor. In particular, the response plot of $\boldsymbol{x}_i^T \hat{\boldsymbol{\beta}}$ versus $Y_i$ is essential.
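The reduction from many predictors to the single scalar $w_i=\boldsymbol{x}_i^T \hat{\boldsymbol{\beta}}$ can be sketched in a few lines. This is a minimal illustration with made-up data and coefficient values (not from the text), using ordinary least squares to estimate $\boldsymbol{\beta}$ and computing the horizontal axis of the response plot:

```python
import numpy as np

# Hypothetical dataset: a constant plus two nontrivial predictors.
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -0.5])              # assumed true coefficients
Y = X @ beta + rng.normal(scale=0.3, size=n)   # iid errors

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)  # OLS estimate
w = X @ beta_hat                                  # w_i = x_i^T beta_hat, one scalar per case

# A response plot scatters w (horizontal) against Y (vertical);
# points clustering about the identity line indicate a good fit.
```

However many predictors the model has, the plot of $w_i$ versus $Y_i$ is always two-dimensional, which is why it is as easy to read as a one-predictor scatterplot.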

For MLR, the two main benefits of studying the MLR model with one nontrivial predictor $X$ are that the data can be plotted in a scatterplot of $X_i$ versus $Y_i$ and that the OLS estimators can be computed by hand with the aid of a calculator if $n$ is small.

## Simple Linear Regression

The simple linear regression (SLR) model is
$$Y_i=\beta_1+\beta_2 X_i+e_i=\alpha+\beta X_i+e_i$$
where the $e_i$ are iid with $E\left(e_i\right)=0$ and $\operatorname{VAR}\left(e_i\right)=\sigma^2$ for $i=1, \ldots, n$. The $Y_i$ and $e_i$ are random variables while the $X_i$ are treated as known constants. The parameters $\beta_1, \beta_2$, and $\sigma^2$ are unknown constants that need to be estimated. (If the $X_i$ are random variables, then the model is conditional on the $X_i$'s provided that the errors $e_i$ are independent of the $X_i$. Hence the $X_i$'s are still treated as constants.)

The SLR model is a special case of the MLR model with $p=2, x_{i, 1} \equiv 1$, and $x_{i, 2}=X_i$. The normal SLR model adds the assumption that the $e_i$ are iid $\mathrm{N}\left(0, \sigma^2\right)$. That is, the error distribution is normal with zero mean and constant variance $\sigma^2$. The response variable $Y$ is the variable that you want to predict while the predictor variable $X$ is the variable used to predict the response. For SLR, $E\left(Y_i\right)=\beta_1+\beta_2 X_i$ and the line $E(Y)=\beta_1+\beta_2 X$ is the regression function. $\operatorname{VAR}\left(Y_i\right)=\sigma^2$.
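A short simulation makes the model assumptions concrete. The parameter values below are illustrative assumptions, not taken from the text; the $X_i$ are held fixed while the errors are drawn iid $\mathrm{N}(0, \sigma^2)$:

```python
import random

# Simulate from the normal SLR model Y_i = beta1 + beta2*X_i + e_i,
# e_i iid N(0, sigma^2), with assumed parameter values.
random.seed(0)
beta1, beta2, sigma = 2.0, 0.5, 1.0
n = 1000
X = [i / 100 for i in range(n)]                   # fixed, known constants
e = [random.gauss(0.0, sigma) for _ in range(n)]  # iid normal errors
Y = [beta1 + beta2 * x + err for x, err in zip(X, e)]

# Since E(Y_i) = beta1 + beta2*X_i and VAR(Y_i) = sigma^2, the sample
# mean and variance of the errors should be near 0 and sigma^2.
mean_e = sum(e) / n
var_e = sum(err ** 2 for err in e) / n - mean_e ** 2
```

Note that all of the variability of $Y_i$ about the regression line comes from $e_i$, which is why $\operatorname{VAR}(Y_i)=\sigma^2$ does not depend on $X_i$.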

For SLR, the least squares estimators $\hat{\beta}_1$ and $\hat{\beta}_2$ minimize the least squares criterion $Q\left(\eta_1, \eta_2\right)=\sum_{i=1}^n\left(Y_i-\eta_1-\eta_2 X_i\right)^2$. For fixed $\eta_1$ and $\eta_2$, $Q$ is the sum of the squared vertical deviations from the line $Y=\eta_1+\eta_2 X$. The least squares (OLS) line is $\hat{Y}=\hat{\beta}_1+\hat{\beta}_2 X$ where the slope $$\hat{\beta}_2 \equiv \hat{\beta}=\frac{\sum_{i=1}^n\left(X_i-\bar{X}\right)\left(Y_i-\bar{Y}\right)}{\sum_{i=1}^n\left(X_i-\bar{X}\right)^2}$$
and the intercept $\hat{\beta}_1 \equiv \hat{\alpha}=\bar{Y}-\hat{\beta}_2 \bar{X}$. By the chain rule, $$\frac{\partial Q}{\partial \eta_1}=-2 \sum_{i=1}^n\left(Y_i-\eta_1-\eta_2 X_i\right) .$$
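These closed-form formulas are exactly what makes the OLS estimates computable by hand when $n$ is small. The sketch below implements them directly on a tiny made-up dataset (chosen for illustration) and checks that $\partial Q / \partial \eta_1$ vanishes at the minimizer:

```python
def ols_slr(X, Y):
    """Closed-form OLS estimates (intercept, slope) for simple linear
    regression, following the formulas above."""
    n = len(X)
    xbar, ybar = sum(X) / n, sum(Y) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))
    sxx = sum((x - xbar) ** 2 for x in X)
    b2 = sxy / sxx            # slope beta_hat_2
    b1 = ybar - b2 * xbar     # intercept beta_hat_1
    return b1, b2

# Tiny hand-checkable dataset (made up for illustration):
X = [1.0, 2.0, 3.0, 4.0]
Y = [2.0, 4.0, 5.0, 8.0]
b1, b2 = ols_slr(X, Y)        # b1 = 0.0, b2 = 1.9

# At the minimizer, dQ/deta_1 = -2 * sum(Y_i - b1 - b2*X_i) = 0,
# i.e. the residuals sum to zero.
grad1 = -2 * sum(y - b1 - b2 * x for x, y in zip(X, Y))
```

Working the same numbers with a calculator ($\bar{X}=2.5$, $\bar{Y}=4.75$, $\sum(X_i-\bar{X})(Y_i-\bar{Y})=9.5$, $\sum(X_i-\bar{X})^2=5$) reproduces the slope $9.5/5=1.9$.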
