# Nonlinear and Kernel PCA

## Nonlinear Principal Component Analysis

As discussed before, the main idea behind NLPCA is that we may be able to find an embedding of the data into a high-dimensional space such that the structure of the embedded data becomes (approximately) linear. To see why this may be possible, consider a set of points $\left(x_1, x_2\right) \in \mathbb{R}^2$ lying on a conic of the form
$$c_1 x_1^2+c_2 x_1 x_2+c_3 x_2^2+c_4=0 .$$
Notice that if we define the map $\phi: \mathbb{R}^2 \rightarrow \mathbb{R}^3$ as
$$\left(z_1, z_2, z_3\right)=\phi\left(x_1, x_2\right)=\left(x_1^2, \sqrt{2} x_1 x_2, x_2^2\right),$$
then the conic in $\mathbb{R}^2$ transforms into the following affine subspace in $\mathbb{R}^3$ :
$$c_1 z_1+\frac{c_2}{\sqrt{2}} z_2+c_3 z_3+c_4=0 .$$

Therefore, instead of learning a nonlinear manifold in $\mathbb{R}^2$, we can simply learn an affine manifold in $\mathbb{R}^3$. This example is illustrated in Figure 4.4.
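The embedding above is easy to verify numerically. The following sketch uses an illustrative ellipse $x_1^2 + 2x_2^2 - 1 = 0$, i.e. coefficients $(c_1, c_2, c_3, c_4) = (1, 0, 2, -1)$ (these particular values are an assumption chosen for the demo), and checks that the embedded points satisfy the corresponding affine equation in $\mathbb{R}^3$:

```python
import numpy as np

# Points on the ellipse x1^2 + 2 x2^2 - 1 = 0, parameterized by angle t.
# Coefficients (c1, c2, c3, c4) = (1, 0, 2, -1) are chosen for illustration.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
x1, x2 = np.cos(t), np.sin(t) / np.sqrt(2)

# Embed each point via phi(x1, x2) = (x1^2, sqrt(2) x1 x2, x2^2).
Z = np.stack([x1**2, np.sqrt(2) * x1 * x2, x2**2], axis=1)  # shape (100, 3)

# The embedded points should satisfy c1 z1 + (c2/sqrt(2)) z2 + c3 z3 + c4 = 0.
residual = 1 * Z[:, 0] + 0 * Z[:, 1] + 2 * Z[:, 2] - 1
print(np.max(np.abs(residual)))  # ~0 up to floating-point error
```

The nonlinear constraint on the original points becomes an exactly linear (affine) constraint on the embedded points, which is what makes PCA applicable in the feature space.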
More generally, we seek a nonlinear transformation (usually an embedding)
$$\begin{aligned} \phi(\cdot): \mathbb{R}^D & \rightarrow \mathbb{R}^M, \\ \boldsymbol{x} & \mapsto \phi(\boldsymbol{x}), \end{aligned}$$
such that the structure of the embedded data $\left\{\phi\left(\boldsymbol{x}_j\right)\right\}_{j=1}^N$ becomes approximately linear. In machine learning, $\phi(\boldsymbol{x}) \in \mathbb{R}^M$ is called the feature of the data point $\boldsymbol{x} \in \mathbb{R}^D$, and the space $\mathbb{R}^M$ is called the feature space.

Let $\bar{\boldsymbol{\phi}}=\frac{1}{N} \sum_{j=1}^N \phi\left(\boldsymbol{x}_j\right)$ be the sample mean in the feature space and define the mean-subtracted (centered) embedded data matrix as
$$\Phi \doteq\left[\phi\left(\boldsymbol{x}_1\right)-\bar{\boldsymbol{\phi}}, \phi\left(\boldsymbol{x}_2\right)-\bar{\boldsymbol{\phi}}, \ldots, \phi\left(\boldsymbol{x}_N\right)-\bar{\boldsymbol{\phi}}\right] \in \mathbb{R}^{M \times N} .$$
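A minimal sketch of how $\Phi$ might be formed in practice, reusing the quadratic embedding from the conic example (the helper name `centered_feature_matrix` and the random data are illustrative, not from the text):

```python
import numpy as np

def centered_feature_matrix(X, phi):
    """Stack the centered features phi(x_j) - phi_bar as columns of Phi (M x N).

    X   : (D, N) data matrix, one sample per column
    phi : map from R^D to R^M, applied column-wise
    """
    F = np.stack([phi(X[:, j]) for j in range(X.shape[1])], axis=1)  # (M, N)
    phi_bar = F.mean(axis=1, keepdims=True)                          # sample mean
    return F - phi_bar

# Example with the quadratic embedding phi(x1, x2) = (x1^2, sqrt(2) x1 x2, x2^2).
phi = lambda x: np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])
X = np.random.default_rng(0).normal(size=(2, 5))
Phi = centered_feature_matrix(X, phi)
print(Phi.shape)                         # (3, 5), i.e. M x N
print(np.allclose(Phi.sum(axis=1), 0))   # columns sum to zero after centering
```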

## NLPCA in a High-dimensional Feature Space

A potential difficulty associated with NLPCA is that the dimension $M$ of the feature space can be very high, so computing the principal components in the feature space may become computationally prohibitive. For instance, if we use a Veronese map of degree $n$, the dimension of the feature space is $M=\binom{n+D-1}{n}$, which grows very quickly with $n$ and $D$. When $M$ exceeds $N$, the eigenvalue decomposition of $\Phi \Phi^{\top} \in \mathbb{R}^{M \times M}$ becomes more costly than that of $\Phi^{\top} \Phi \in \mathbb{R}^{N \times N}$, although the two matrices share the same nonzero eigenvalues.
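The growth of $M$ is easy to tabulate with the standard-library `math.comb`; the specific $(n, D)$ values below are illustrative:

```python
from math import comb

def veronese_dim(n, D):
    """Dimension of the degree-n Veronese feature space for data in R^D:
    M = C(n + D - 1, n), the number of degree-n monomials in D variables."""
    return comb(n + D - 1, n)

print(veronese_dim(2, 3))   # 6 monomials of degree 2 in 3 variables
print(veronese_dim(4, 10))  # 715: already large for modest n and D
```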

This motivates us to examine whether the computation of PCA in the feature space can be reduced to a computation with the lower-dimensional matrix $\Phi^{\top} \Phi$. The answer is yes. The key is to notice that regardless of the dimension of the feature space, every eigenvector $\boldsymbol{u} \in \mathbb{R}^M$ of $\Phi \Phi^{\top}$ associated with a nonzero eigenvalue always lies in the range (column span) of $\Phi$. That is,
$$\Phi \Phi^{\top} \boldsymbol{u}=\lambda \boldsymbol{u} \quad \Longleftrightarrow \quad \boldsymbol{u}=\Phi\left(\lambda^{-1} \Phi^{\top} \boldsymbol{u}\right) \in \operatorname{range}(\Phi) .$$

Thus, if we let $\boldsymbol{w} \doteq \lambda^{-1} \Phi^{\top} \boldsymbol{u} \in \mathbb{R}^N$, we have $\|\boldsymbol{w}\|^2=\lambda^{-2} \boldsymbol{u}^{\top} \Phi \Phi^{\top} \boldsymbol{u}=\lambda^{-1}$. Moreover, since $\Phi^{\top} \Phi \boldsymbol{w}=\lambda^{-1} \Phi^{\top} \Phi \Phi^{\top} \boldsymbol{u}=\Phi^{\top} \boldsymbol{u}=\lambda \boldsymbol{w}$, the vector $\boldsymbol{w}$ is an eigenvector of $\Phi^{\top} \Phi$ with the same eigenvalue $\lambda$. Once such a $\boldsymbol{w}$ has been computed from $\Phi^{\top} \Phi$, we can recover the corresponding $\boldsymbol{u}$ in the feature space as
$$\boldsymbol{u}=\Phi \boldsymbol{w},$$
and compute the $d$ nonlinear principal components of $\boldsymbol{x}$ under the map $\phi(\cdot)$ as
$$y_i \doteq \boldsymbol{u}_i^{\top}(\phi(\boldsymbol{x})-\bar{\boldsymbol{\phi}})=\boldsymbol{w}_i^{\top} \Phi^{\top}(\phi(\boldsymbol{x})-\bar{\boldsymbol{\phi}}) \in \mathbb{R}, \quad i=1, \ldots, d,$$
where $\boldsymbol{w}_i \in \mathbb{R}^N$ is the $i$ th leading eigenvector of $\Phi^{\top} \Phi \in \mathbb{R}^{N \times N}$.
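The whole reduction can be sketched numerically: compute the leading eigenvectors $\boldsymbol{w}_i$ of $\Phi^{\top}\Phi$, rescale them so that $\|\boldsymbol{w}_i\|^2 = \lambda_i^{-1}$, and recover $\boldsymbol{u}_i = \Phi\boldsymbol{w}_i$. The random matrix below is only a stand-in for a centered feature matrix, and the check against the direct eigendecomposition of $\Phi\Phi^{\top}$ assumes distinct leading eigenvalues (true almost surely here):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, d = 50, 10, 3
Phi = rng.normal(size=(M, N))
Phi -= Phi.mean(axis=1, keepdims=True)     # center the features

G = Phi.T @ Phi                            # N x N Gram matrix, cheap when N << M
lam, W = np.linalg.eigh(G)                 # eigenvalues in ascending order
lam = lam[::-1][:d]                        # d leading eigenvalues
W = W[:, ::-1][:, :d]                      # corresponding eigenvectors
W = W / np.sqrt(lam)                       # rescale so ||w_i||^2 = 1/lambda_i

U = Phi @ W                                # u_i = Phi w_i, unit norm in R^M
Y = U.T @ Phi                              # principal components of the data

# U matches the leading eigenvectors of Phi Phi^T (up to sign flips):
U_direct = np.linalg.eigh(Phi @ Phi.T)[1][:, ::-1][:, :d]
print(np.allclose(np.abs(U.T @ U_direct), np.eye(d), atol=1e-6))
```

Note that the scaling $\|\boldsymbol{w}_i\|^2 = \lambda_i^{-1}$ makes each recovered $\boldsymbol{u}_i = \Phi\boldsymbol{w}_i$ a unit vector, since $\|\Phi\boldsymbol{w}_i\|^2 = \boldsymbol{w}_i^{\top}\Phi^{\top}\Phi\boldsymbol{w}_i = \lambda_i\|\boldsymbol{w}_i\|^2 = 1$.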

