# Applied Linear Models | STAT501

## Applied Linear Models | RELATED TOPICS

It is appropriate to briefly mention certain topics related to the preceding development that are customarily associated with testing hypotheses. The treatment of these topics will do no more than act as an outline, showing the reader their application to the linear models situation. As with the discussion of distribution functions in Chapter 2, the reader will have to look elsewhere for a complete discussion of these topics.
a. The likelihood ratio test
Tests of linear hypotheses $\mathbf{K}^{\prime} \mathbf{b}=\mathbf{m}$ have been developed from the starting point of the $F$-statistic. This, in turn, can be shown to stem from the likelihood ratio test.

For a sample of $N$ observations $\mathbf{y}$, where $\mathbf{y}$ is $N\left(\mathbf{X b}, \sigma^2 \mathbf{I}\right)$ the likelihood function is
$$L\left(\mathbf{b}, \sigma^2\right)=\left(2 \pi \sigma^2\right)^{-\frac{1}{2} N} \exp \left\{-(\mathbf{y}-\mathbf{X b})^{\prime}(\mathbf{y}-\mathbf{X b}) / 2 \sigma^2\right\} .$$
The likelihood ratio test utilizes two values of $L\left(\mathbf{b}, \sigma^2\right)$ :
(i) $\operatorname{Max}\left(L_w\right)$, the maximum value of $L\left(\mathbf{b}, \sigma^2\right)$ maximized over the complete range of parameters, namely $0<\sigma^2<\infty$, and $-\infty<b_i<\infty$ for all $i$.
(ii) $\operatorname{Max}\left(L_H\right)$, the maximum value of $L\left(\mathbf{b}, \sigma^2\right)$ maximized over the range of parameters limited (restricted or defined) by the hypothesis $H$.
The likelihood ratio is the ratio of these two maxima:
$$L=\frac{\max \left(L_H\right)}{\max \left(L_w\right)} .$$
Each maximum is found in the usual manner: differentiate $L\left(\mathbf{b}, \sigma^2\right)$ with respect to $\sigma^2$ and the elements of $\mathbf{b}$, equate the derivatives to zero, solve the resulting equations for $\mathbf{b}$ and $\sigma^2$, and use these solutions in place of $\mathbf{b}$ and $\sigma^2$ in $L\left(\mathbf{b}, \sigma^2\right)$. In the case of $\max \left(L_H\right)$ the maximization procedure is carried out within the limitations of the hypothesis. We demonstrate for the case of the hypothesis $H: \mathbf{b}=\mathbf{0}$. First, $\partial L\left(\mathbf{b}, \sigma^2\right) / \partial \mathbf{b}=\mathbf{0}$ gives, as we have seen, $\hat{\mathbf{b}}=\left(\mathbf{X}^{\prime} \mathbf{X}\right)^{-1} \mathbf{X}^{\prime} \mathbf{y}$; and $\partial L\left(\mathbf{b}, \sigma^2\right) / \partial \sigma^2=0$ gives $\hat{\sigma}^2=(\mathbf{y}-\mathbf{X} \hat{\mathbf{b}})^{\prime}(\mathbf{y}-\mathbf{X} \hat{\mathbf{b}}) / N$.
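The maximization just described can be sketched numerically. The following is a minimal Python illustration (the design matrix and data are simulated, so all numerical values are illustrative): it computes the unrestricted MLEs $\hat{\mathbf{b}}$ and $\hat{\sigma}^2$, the restricted estimate under $H: \mathbf{b}=\mathbf{0}$, and the likelihood ratio $L=\max(L_H)/\max(L_w)$, which reduces here to $\left(\hat{\sigma}^2 / \hat{\sigma}_H^2\right)^{N/2}$ because each maximized likelihood equals $\left(2\pi\hat{\sigma}^2\right)^{-N/2}e^{-N/2}$.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 50, 3
X = rng.normal(size=(N, p))
b_true = np.array([1.0, -0.5, 2.0])
y = X @ b_true + rng.normal(size=N)

# Unrestricted MLEs: b_hat = (X'X)^{-1} X'y, sigma2_hat = SSE / N
b_hat = np.linalg.solve(X.T @ X, X.T @ y)
sse = float((y - X @ b_hat) @ (y - X @ b_hat))
sigma2_hat = sse / N

# Restricted by H: b = 0, so the only parameter is sigma^2, and sigma2_H = y'y / N
sigma2_H = float(y @ y) / N

def max_loglik(sigma2):
    # log of the maximized likelihood (2*pi*sigma2)^{-N/2} * exp(-N/2)
    return -0.5 * N * (np.log(2 * np.pi * sigma2) + 1.0)

# Likelihood ratio L = max(L_H) / max(L_w) = (sigma2_hat / sigma2_H)^{N/2}
L_ratio = np.exp(max_loglik(sigma2_H) - max_loglik(sigma2_hat))
print(L_ratio, (sigma2_hat / sigma2_H) ** (N / 2))
```

Since the restricted maximization is over a subset of the parameter space, $0 < L \leq 1$, and small values of $L$ are evidence against $H$.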

## Applied Linear Models | Type I and II errors

Under the null hypothesis $H: \mathbf{K}^{\prime} \mathbf{b}=\mathbf{m}$, the statistic $F(H)=\dfrac{Q / s}{\operatorname{SSE} /(N-r)}=\dfrac{(N-r) Q}{s \operatorname{SSE}}$ has the $F_{s, N-r}$ distribution. For a significance test at the $100 \alpha \%$ level the rule of the test is to not reject $H$ whenever $F(H) \leq F_{\alpha, s, N-r}$, the tabulated value of the $F_{s, N-r}$ distribution at the $100 \alpha \%$ point. This means $F_{\alpha, s, N-r}$ is defined as follows: if $u$ is any variable having the $F_{s, N-r}$ distribution then
$$\operatorname{Pr}\left\{u \geq F_{\alpha, s, N-r}\right\}=\alpha .$$
The probability $\alpha$ is the (significance) level of the significance test. An often-used value for it is $0.05$, but there is nothing sacrosanct about this; any value between 0 and 1 can be used for $\alpha$. Other frequently used values are $0.01$ and $0.10$.
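The tabulated value $F_{\alpha, s, N-r}$ is today more often obtained from software than from tables. A minimal sketch, assuming SciPy is available (the values of $\alpha$, $s$, $N$, and $r$ are illustrative):

```python
from scipy.stats import f

alpha, s, N, r = 0.05, 2, 30, 4
# F_{alpha, s, N-r}: the upper 100*alpha% point of the F_{s, N-r} distribution
F_crit = f.ppf(1 - alpha, dfn=s, dfd=N - r)
# Defining property: if u has the F_{s, N-r} distribution, Pr{u >= F_crit} = alpha
tail = f.sf(F_crit, dfn=s, dfd=N - r)
print(F_crit, tail)
```

Here `ppf` is the quantile function and `sf` the survival function $\operatorname{Pr}\{u \geq x\}$, so `tail` recovers $\alpha$ exactly.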

The rule of whether or not to reject the hypothesis $H$ is to reject it whenever $F(H)>F_{\alpha, s, N-r}$ and to not reject it whenever $F(H) \leq F_{\alpha, s, N-r}$. By the nature of the statistic $F(H)$ we know that over repeated sampling, when $H$ is true, $F(H)$ will exceed $F_{\alpha, s, N-r}$ in $100 \alpha \%$ ($5 \%$, say) of samples; and when it does we will reject $H$. Therefore, in situations in which the null hypothesis $H$ is actually true, this rejection will constitute an error of judgment. It is the error known as a Type I error, or rejection error. It consists of wrongly rejecting the null hypothesis $H$ when it is true; the probability of its occurrence is $\alpha$.
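The claim that the rejection rule errs with probability $\alpha$ when $H$ is true can be checked by simulation. The following Python sketch (simulated data; the hypothesis tested is $H: \mathbf{b}=\mathbf{0}$, for which $Q=\hat{\mathbf{b}}^{\prime} \mathbf{X}^{\prime} \mathbf{X} \hat{\mathbf{b}}=\mathbf{y}^{\prime} \mathbf{X}\left(\mathbf{X}^{\prime} \mathbf{X}\right)^{-1} \mathbf{X}^{\prime} \mathbf{y}$ and $s=r$) generates data repeatedly under a true null and records how often $F(H)$ exceeds $F_{\alpha, s, N-r}$:

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(1)
N, p, alpha, reps = 40, 3, 0.05, 20000
X = rng.normal(size=(N, p))
P = X @ np.linalg.solve(X.T @ X, X.T)   # projection matrix onto the column space of X
s, r = p, p                              # for H: b = 0 with full-rank X, s = r = p
F_crit = f.ppf(1 - alpha, dfn=s, dfd=N - r)

rejections = 0
for _ in range(reps):
    y = rng.normal(size=N)               # H true: b = 0, sigma^2 = 1
    Q = float(y @ P @ y)                 # hypothesis sum of squares
    sse = float(y @ y) - Q               # residual sum of squares
    F_H = (Q / s) / (sse / (N - r))
    rejections += F_H > F_crit
rate = rejections / reps
print(rate)   # close to alpha = 0.05
```

The empirical rejection rate settles near $\alpha$, which is exactly the Type I error probability described above.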
