## THE RANK-NULLITY THEOREM

The Rank-Nullity Theorem is perhaps the most important result in the theory of linear transformations, and it is a powerful tool for solving exercises.
Theorem 5.5.1 Let $L: V \longrightarrow W$ be a linear map. Then
$$\operatorname{dim} V=\operatorname{dim}(\operatorname{Ker} L)+\operatorname{dim}(\operatorname{Im} L) .$$
Proof. Let $\left\{\mathbf{u}_1, \ldots, \mathbf{u}_r\right\}$ be a basis for the subspace $\operatorname{Ker} L$. By Theorem 4.2.1, we can complete it to a basis $\mathcal{B}$ of $V$. Let $$\mathcal{B}=\left\{\mathbf{u}_1, \ldots, \mathbf{u}_r, \mathbf{w}_{r+1}, \ldots, \mathbf{w}_n\right\}.$$

If we prove that $\mathcal{B}_1=\left\{L\left(\mathbf{w}_{r+1}\right), \ldots, L\left(\mathbf{w}_n\right)\right\}$ is a basis for $\operatorname{Im}(L)$, then the theorem is proved, since $\operatorname{dim}(\operatorname{Ker}(L))=r$, $\operatorname{dim}(V)=n$ and $\operatorname{dim}(\operatorname{Im}(L))=n-r$ (the dimension of $\operatorname{Im}(L)$ is the number of vectors in a basis, and $\mathcal{B}_1$ contains $n-r$ vectors).

Certainly $\mathcal{B}_1$ is a system of generators for $\operatorname{Im}(L)$, by Proposition 5.4.4. Now we show that the vectors in $\mathcal{B}_1$ are linearly independent. Let $$\alpha_{r+1} L\left(\mathbf{w}_{r+1}\right)+\cdots+\alpha_n L\left(\mathbf{w}_n\right)=\mathbf{0},$$ with $\alpha_{r+1}, \ldots, \alpha_n \in \mathbb{R}$. We want to show that $\alpha_{r+1}=\cdots=\alpha_n=0$.
We have:
$$\mathbf{0}=\alpha_{r+1} L\left(\mathbf{w}_{r+1}\right)+\cdots+\alpha_n L\left(\mathbf{w}_n\right)=L\left(\alpha_{r+1} \mathbf{w}_{r+1}+\cdots+\alpha_n \mathbf{w}_n\right),$$ and therefore $\mathbf{w}=\alpha_{r+1} \mathbf{w}_{r+1}+\cdots+\alpha_n \mathbf{w}_n$ belongs to the kernel of $L$. Since $\operatorname{Ker}(L)=\left\langle\mathbf{u}_1, \ldots, \mathbf{u}_r\right\rangle$, we can write $\mathbf{w}$ in the form $\mathbf{w}=\alpha_1 \mathbf{u}_1+\cdots+\alpha_r \mathbf{u}_r$, with $\alpha_1, \ldots, \alpha_r \in \mathbb{R}$. Then $$\alpha_{r+1} \mathbf{w}_{r+1}+\cdots+\alpha_n \mathbf{w}_n=\alpha_1 \mathbf{u}_1+\cdots+\alpha_r \mathbf{u}_r,$$ from which it follows that $$\alpha_{r+1} \mathbf{w}_{r+1}+\cdots+\alpha_n \mathbf{w}_n-\left(\alpha_1 \mathbf{u}_1+\cdots+\alpha_r \mathbf{u}_r\right)=\mathbf{0},$$
and since $\mathcal{B}$ is a basis for $V$, its vectors are linearly independent, so $\alpha_1=\cdots=\alpha_n=0$; in particular $\alpha_{r+1}=\cdots=\alpha_n=0$, concluding the proof of the theorem.
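The identity $\operatorname{dim} V=\operatorname{dim}(\operatorname{Ker} L)+\operatorname{dim}(\operatorname{Im} L)$ can be checked numerically for any linear map given by a matrix. A minimal sketch, assuming NumPy and a made-up $3 \times 4$ matrix: the rank counts $\operatorname{dim}(\operatorname{Im} L)$, and the rows of $V^T$ from the full SVD beyond the rank give a basis of $\operatorname{Ker} L$.

```python
import numpy as np

# A hypothetical matrix representing a linear map L: R^4 -> R^3;
# here dim V = 4, the number of columns.
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],   # a multiple of the first row
              [0.0, 1.0, 1.0, 0.0]])

dim_V = A.shape[1]

# Full SVD: the rows of Vt beyond the rank span Ker(L).
U, s, Vt = np.linalg.svd(A)
rank = int(np.count_nonzero(s > 1e-10))   # dim Im(L)
null_basis = Vt[rank:]                    # basis of Ker(L)
nullity = null_basis.shape[0]             # dim Ker(L)

# Sanity checks: A sends every null-space vector to 0,
# and the two dimensions add up to dim V.
assert np.allclose(A @ null_basis.T, 0.0)
assert rank + nullity == dim_V
```

Here the second row is a multiple of the first, so the rank is 2 and the nullity is 2, matching the theorem with $n = 4$.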

## ISOMORPHISM OF VECTOR SPACES

The concept of isomorphism allows us to identify two vector spaces, and then to treat them in the same way when solving linear algebra problems, such as determining whether a set of vectors is linearly independent, or computing the kernel and the image of a linear transformation.

Definition 5.6.1 A linear map $L: V \longrightarrow W$ is said to be an isomorphism if it is invertible, or equivalently if it is both injective and surjective.

Similarly, two vector spaces $V$ and $W$ are called isomorphic if there is an isomorphism $L: V \longrightarrow W$; in this case, we write $V \cong W$.
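In coordinates, an isomorphism corresponds to an invertible matrix, and invertibility can be checked directly: composing the map with its inverse must give the identity. A minimal sketch, assuming NumPy (the matrix below is a made-up example):

```python
import numpy as np

# A hypothetical linear map on R^2 given by a matrix; it is an
# isomorphism exactly when the matrix is invertible.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

A_inv = np.linalg.inv(A)

# Composing the map with its inverse gives the identity map
# (up to floating-point error).
is_isomorphism = np.allclose(A @ A_inv, np.eye(2))
```

Here $\det A = 1 \neq 0$, so the map is an isomorphism of $\mathbb{R}^2$ with itself.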

Example 5.6.2 Consider the linear map $L: \mathbb{R}_2[x] \longrightarrow \mathbb{R}^3$ defined by $L\left(x^2\right)=(1,0,0)$, $L(x)=(0,1,0)$, $L(1)=(0,0,1)$. This linear transformation is invertible. To show this, we can determine the kernel and see that it is the zero subspace, and determine the image and see that it is all of $\mathbb{R}^3$; we leave this as an exercise. Alternatively, we can define the linear transformation $T: \mathbb{R}^3 \longrightarrow \mathbb{R}_2[x]$ such that $T\left(\mathbf{e}_1\right)=x^2$, $T\left(\mathbf{e}_2\right)=x$, $T\left(\mathbf{e}_3\right)=1$ and verify that it is the inverse of $L$ (the student may want to carry out these verifications as an exercise). Therefore $\mathbb{R}_2[x]$ and $\mathbb{R}^3$ are isomorphic. In a sense, it is as if they were the same space: we have created a one-to-one correspondence that associates to each vector in $\mathbb{R}_2[x]$ one and only one vector in $\mathbb{R}^3$, and vice versa, and this correspondence preserves the operations of vector addition and scalar multiplication. In fact, we had already noticed that, once we fix a basis of $\mathbb{R}_2[x]$, each vector is written using three coordinates, just like a vector in $\mathbb{R}^3$. If we fix the canonical basis $\left\{x^2, x, 1\right\}$, the linear map that associates to each polynomial its coordinates is exactly the isomorphism $L: \mathbb{R}_2[x] \longrightarrow \mathbb{R}^3$ described above. Once we write down the coordinates of a polynomial, we can treat it as an element of $\mathbb{R}^3$. For example, to determine whether some polynomials are linearly independent, or to find a basis of the subspace they generate, we use the Gaussian algorithm as described in Chapter 1.
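The identification in Example 5.6.2 can be sketched in code: a polynomial $ax^2+bx+c$ is replaced by its coordinate vector $(a,b,c)$, and linear independence is then checked by computing a rank, as in Chapter 1. A minimal sketch, assuming NumPy (the three polynomials are made-up examples):

```python
import numpy as np

# The isomorphism L of Example 5.6.2: a polynomial a*x^2 + b*x + c
# corresponds to its coordinates (a, b, c) in the basis {x^2, x, 1}.
def coords(a, b, c):
    return np.array([a, b, c], dtype=float)

# Hypothetical test case: are x^2 + 1, x^2 - x, and 2x + 1
# linearly independent in R_2[x]?
polys = np.vstack([coords(1, 0, 1),    # x^2 + 1
                   coords(1, -1, 0),   # x^2 - x
                   coords(0, 2, 1)])   # 2x + 1

# Three vectors in R^3 are independent iff the matrix has rank 3.
independent = np.linalg.matrix_rank(polys) == 3
```

Because $L$ preserves addition and scalar multiplication, the polynomials are independent in $\mathbb{R}_2[x]$ exactly when their coordinate vectors are independent in $\mathbb{R}^3$.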
