# Projection Matrices and the Column Space

Vector spaces, subspaces, and column spaces should be familiar from linear algebra, but are reviewed below.

Definition 11.1. A set $\mathcal{V} \subset \mathbb{R}^k$ is a vector space if vector addition and scalar multiplication are defined so that, for any vectors $\boldsymbol{x}, \boldsymbol{y}, \boldsymbol{z} \in \mathcal{V}$ and scalars $a$ and $b$:
1) $(\boldsymbol{x}+\boldsymbol{y})+\boldsymbol{z}=\boldsymbol{x}+(\boldsymbol{y}+\boldsymbol{z})$.
2) $\boldsymbol{x}+\boldsymbol{y}=\boldsymbol{y}+\boldsymbol{x}$.
3) There exists $\mathbf{0} \in \mathcal{V}$ such that $\boldsymbol{x}+\mathbf{0}=\boldsymbol{x}=\mathbf{0}+\boldsymbol{x}$.
4) For any $\boldsymbol{x} \in \mathcal{V}$, there exists $\boldsymbol{y}=-\boldsymbol{x}$ such that $\boldsymbol{x}+\boldsymbol{y}=\boldsymbol{y}+\boldsymbol{x}=\mathbf{0}$.
5) $a(\boldsymbol{x}+\boldsymbol{y})=a \boldsymbol{x}+a \boldsymbol{y}$.
6) $(a+b) \boldsymbol{x}=a \boldsymbol{x}+b \boldsymbol{x}$.
7) $(a b) \boldsymbol{x}=a(b \boldsymbol{x})$.
8) $1 \boldsymbol{x}=\boldsymbol{x}$.
Hence for a vector space, addition is associative and commutative, there is an additive identity vector $\mathbf{0}$, there is an additive inverse $-\boldsymbol{x}$ for each $\boldsymbol{x} \in \mathcal{V}$, scalar multiplication is distributive and associative, and 1 is the scalar identity element.
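The eight axioms above can be spot-checked numerically in $\mathbb{R}^3$. This is only a sanity check on particular vectors, not a proof; the vectors and scalars below are illustrative choices, not from the text.

```python
import numpy as np

# Numerical spot-check of the vector space axioms in R^3.
# x, y, z, a, b are arbitrary illustrative choices.
x = np.array([1.0, -2.0, 3.0])
y = np.array([0.5, 4.0, -1.0])
z = np.array([2.0, 0.0, 1.5])
a, b = 2.0, -3.0

assert np.allclose((x + y) + z, x + (y + z))    # 1) addition is associative
assert np.allclose(x + y, y + x)                # 2) addition is commutative
assert np.allclose(x + np.zeros(3), x)          # 3) additive identity 0
assert np.allclose(x + (-x), np.zeros(3))       # 4) additive inverse -x
assert np.allclose(a * (x + y), a * x + a * y)  # 5) distributes over vectors
assert np.allclose((a + b) * x, a * x + b * x)  # 6) distributes over scalars
assert np.allclose((a * b) * x, a * (b * x))    # 7) scaling is associative
assert np.allclose(1 * x, x)                    # 8) scalar identity 1
print("all eight axioms hold for these choices")
```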

Two important vector spaces are $\mathbb{R}^k$ and $\mathcal{V}=\{\mathbf{0}\}$. Showing that a set $\mathcal{M}$ is a subspace is a common method to show that $\mathcal{M}$ is a vector space.

Definition 11.2. Let $\mathcal{M}$ be a nonempty subset of a vector space $\mathcal{V}$. If i) $a \boldsymbol{x} \in \mathcal{M} \forall \boldsymbol{x} \in \mathcal{M}$ and for any scalar $a$, and ii) $\boldsymbol{x}+\boldsymbol{y} \in \mathcal{M} \forall \boldsymbol{x}, \boldsymbol{y} \in \mathcal{M}$, then $\mathcal{M}$ is a vector space known as a subspace.

Definition 11.3. The set of all linear combinations of $\boldsymbol{x}_1, \ldots, \boldsymbol{x}_n$ is the vector space known as $\operatorname{span}\left(\boldsymbol{x}_1, \ldots, \boldsymbol{x}_n\right)=\left\{\boldsymbol{y} \in \mathbb{R}^k: \boldsymbol{y}=\sum_{i=1}^n a_i \boldsymbol{x}_i \text{ for some constants } a_1, \ldots, a_n\right\}$.

Definition 11.4. Let $\boldsymbol{x}_1, \ldots, \boldsymbol{x}_k \in \mathcal{V}$. If $\exists$ scalars $\alpha_1, \ldots, \alpha_k$, not all zero, such that $\sum_{i=1}^k \alpha_i \boldsymbol{x}_i=\mathbf{0}$, then $\boldsymbol{x}_1, \ldots, \boldsymbol{x}_k$ are linearly dependent. If $\sum_{i=1}^k \alpha_i \boldsymbol{x}_i=\mathbf{0}$ only if $\alpha_i=0 \ \forall i=1, \ldots, k$, then $\boldsymbol{x}_1, \ldots, \boldsymbol{x}_k$ are linearly independent. Suppose $\left\{\boldsymbol{x}_1, \ldots, \boldsymbol{x}_k\right\}$ is a linearly independent set and $\mathcal{V}=\operatorname{span}\left(\boldsymbol{x}_1, \ldots, \boldsymbol{x}_k\right)$. Then $\left\{\boldsymbol{x}_1, \ldots, \boldsymbol{x}_k\right\}$ is a linearly independent spanning set for $\mathcal{V}$, known as a basis.
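In practice, linear independence can be tested numerically: $\boldsymbol{x}_1, \ldots, \boldsymbol{x}_k$ are linearly independent exactly when the matrix whose columns are the $\boldsymbol{x}_i$ has rank $k$. A minimal NumPy sketch; the function name and the example vectors are illustrative choices.

```python
import numpy as np

def linearly_independent(vectors):
    """True if the given equal-length vectors are linearly independent,
    i.e. the matrix with these vectors as columns has full column rank."""
    A = np.column_stack(vectors)                  # one column per vector
    return np.linalg.matrix_rank(A) == len(vectors)

# Two independent vectors in R^3, and a third that depends on them.
x1 = np.array([1.0, 0.0, 1.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = x1 + x2                                      # x1 + x2 - x3 = 0

print(linearly_independent([x1, x2]))             # True: a basis for span(x1, x2)
print(linearly_independent([x1, x2, x3]))         # False: linearly dependent
```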

Definition 11.10. Let $\boldsymbol{A}$ be an $n \times n$ matrix and let $\boldsymbol{x} \in \mathbb{R}^n$. Then a quadratic form is $\boldsymbol{x}^T \boldsymbol{A} \boldsymbol{x}=\sum_{i=1}^n \sum_{j=1}^n a_{i j} x_i x_j$, and a linear form is $\boldsymbol{A} \boldsymbol{x}$. Suppose $\boldsymbol{A}$ is a symmetric matrix. Then $\boldsymbol{A}$ is positive definite $(\boldsymbol{A}>0)$ if $\boldsymbol{x}^T \boldsymbol{A} \boldsymbol{x}>0 \ \forall \boldsymbol{x} \neq \mathbf{0}$, and $\boldsymbol{A}$ is positive semidefinite $(\boldsymbol{A} \geq 0)$ if $\boldsymbol{x}^T \boldsymbol{A} \boldsymbol{x} \geq 0 \ \forall \boldsymbol{x}$.
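The double-sum and matrix-product forms of $\boldsymbol{x}^T \boldsymbol{A} \boldsymbol{x}$ agree, and for symmetric $\boldsymbol{A}$ definiteness can be checked through the eigenvalues ($\boldsymbol{A}>0$ iff all eigenvalues are positive). A small sketch; the matrix $\boldsymbol{A}$ and vector $\boldsymbol{x}$ below are illustrative choices.

```python
import numpy as np

# Evaluate a quadratic form two ways and test definiteness via eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])      # symmetric
x = np.array([1.0, -1.0])

# Double-sum definition vs. matrix product: both give x^T A x.
q_sum = sum(A[i, j] * x[i] * x[j] for i in range(2) for j in range(2))
q_mat = x @ A @ x
assert np.isclose(q_sum, q_mat)

# For symmetric A: A > 0 iff every eigenvalue is positive,
# and A >= 0 iff every eigenvalue is nonnegative.
eigvals = np.linalg.eigvalsh(A)
print(q_mat)                    # 3.0
print(np.all(eigvals > 0))      # True: A is positive definite
```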

Notation: The matrix $\boldsymbol{A}$ in a quadratic form $\boldsymbol{x}^T \boldsymbol{A} \boldsymbol{x}$ will be symmetric unless stated otherwise. Suppose $\boldsymbol{B}$ is not symmetric. Since the quadratic form is a scalar, $\boldsymbol{x}^T \boldsymbol{B} \boldsymbol{x}=\left(\boldsymbol{x}^T \boldsymbol{B} \boldsymbol{x}\right)^T=\boldsymbol{x}^T \boldsymbol{B}^T \boldsymbol{x}$, so $\boldsymbol{x}^T \boldsymbol{B} \boldsymbol{x}=\boldsymbol{x}^T\left(\boldsymbol{B}+\boldsymbol{B}^T\right) \boldsymbol{x} / 2$, and the matrix $\boldsymbol{A}=\left(\boldsymbol{B}+\boldsymbol{B}^T\right) / 2$ is symmetric. If $\boldsymbol{A} \geq 0$, then the eigenvalues $\lambda_i$ of $\boldsymbol{A}$ are real and nonnegative. If $\boldsymbol{A} \geq 0$, let $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n \geq 0$. If $\boldsymbol{A}>0$, then $\lambda_n>0$. Some authors say a symmetric $\boldsymbol{A}$ is nonnegative definite if $\boldsymbol{A} \geq 0$, and that $\boldsymbol{A}$ is positive semidefinite if $\boldsymbol{A} \geq 0$ and there exists a nonzero $\boldsymbol{x}$ such that $\boldsymbol{x}^T \boldsymbol{A} \boldsymbol{x}=0$; in that case $\boldsymbol{A}$ is singular.
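The symmetrization step can be verified directly: a nonsymmetric $\boldsymbol{B}$ and its symmetric part $(\boldsymbol{B}+\boldsymbol{B}^T)/2$ yield the same quadratic form. A minimal sketch; the random $\boldsymbol{B}$ and $\boldsymbol{x}$ are illustrative.

```python
import numpy as np

# Check that x^T B x = x^T ((B + B^T)/2) x for a nonsymmetric B.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))             # not symmetric in general
x = rng.standard_normal(4)

A = (B + B.T) / 2                           # symmetric part of B
assert np.allclose(A, A.T)                  # A is symmetric
assert np.isclose(x @ B @ x, x @ A @ x)     # same scalar value
print("quadratic forms agree")
```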

The spectral decomposition theorem is very useful. One application for linear models is defining the square root matrix. See Chapter 4.

Theorem 11.3: Spectral Decomposition Theorem. Let $\boldsymbol{A}$ be an $n \times n$ symmetric matrix with eigenvalue-eigenvector pairs $\left(\lambda_1, \boldsymbol{t}_1\right),\left(\lambda_2, \boldsymbol{t}_2\right), \ldots,\left(\lambda_n, \boldsymbol{t}_n\right)$ where $\boldsymbol{t}_i^T \boldsymbol{t}_i=1$ and $\boldsymbol{t}_i^T \boldsymbol{t}_j=0$ if $i \neq j$ for $i=1, \ldots, n$. Hence $\boldsymbol{A} \boldsymbol{t}_i=\lambda_i \boldsymbol{t}_i$. Then the spectral decomposition of $\boldsymbol{A}$ is $$\boldsymbol{A}=\sum_{i=1}^n \lambda_i \boldsymbol{t}_i \boldsymbol{t}_i^T=\lambda_1 \boldsymbol{t}_1 \boldsymbol{t}_1^T+\cdots+\lambda_n \boldsymbol{t}_n \boldsymbol{t}_n^T .$$
Let $\boldsymbol{T}=\left[\begin{array}{llll}\boldsymbol{t}_1 & \boldsymbol{t}_2 & \cdots & \boldsymbol{t}_n\end{array}\right]$ be the $n \times n$ orthogonal matrix with $i$ th column $\boldsymbol{t}_i$. Then $\boldsymbol{T} \boldsymbol{T}^T=\boldsymbol{T}^T \boldsymbol{T}=\boldsymbol{I}$. Let $\boldsymbol{\Lambda}=\operatorname{diag}\left(\lambda_1, \ldots, \lambda_n\right)$ and let $\boldsymbol{\Lambda}^{1 / 2}=$ $\operatorname{diag}\left(\sqrt{\lambda_1}, \ldots, \sqrt{\lambda_n}\right)$. Then $\boldsymbol{A}=\boldsymbol{T} \boldsymbol{\Lambda} \boldsymbol{T}^T$, and if $\boldsymbol{A} \geq 0$, the square root matrix $\boldsymbol{A}^{1 / 2}=\boldsymbol{T} \boldsymbol{\Lambda}^{1 / 2} \boldsymbol{T}^T$ satisfies $\boldsymbol{A}^{1 / 2} \boldsymbol{A}^{1 / 2}=\boldsymbol{A}$.
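The decomposition $\boldsymbol{A}=\boldsymbol{T}\boldsymbol{\Lambda}\boldsymbol{T}^T$ and the square root matrix can be computed with `numpy.linalg.eigh` (which returns eigenvalues in ascending rather than descending order, but the decomposition is the same). The positive definite matrix below is an illustrative choice.

```python
import numpy as np

# Spectral decomposition of a symmetric matrix, and its square root matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                    # symmetric, positive definite

lam, T = np.linalg.eigh(A)                    # eigenvalues ascending; columns of T orthonormal
assert np.allclose(T @ T.T, np.eye(2))        # T T^T = I (T is orthogonal)
assert np.allclose(A, T @ np.diag(lam) @ T.T) # A = T Lambda T^T

# Equivalent rank-one form: A = sum_i lambda_i t_i t_i^T.
assert np.allclose(A, sum(l * np.outer(t, t) for l, t in zip(lam, T.T)))

# Square root matrix for A >= 0: A^{1/2} = T Lambda^{1/2} T^T.
root = T @ np.diag(np.sqrt(lam)) @ T.T
assert np.allclose(root @ root, A)
print("A^{1/2} A^{1/2} = A verified")
```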
