# Digital Signal Processing (ECE4522)

## Common Variation between Random Variables

In the last section we took a look at statistical metrics used to describe random variables or discrete number sequences. In this section we will explore how to quantify the dependence of one random variable on the outcome of another using the concepts of covariance and correlation.

The covariance is a measure of the common variation of two random variables $x$ and $y$ about their respective means. The variation from the mean is expressed as $x-E[x]$ and $y-E[y]$. If these variations from the mean tend to have the same sign, then the covariance expression below will average to a positive value, since the product of the variations will usually be positive. If a positive variation of $x$ goes hand in hand with a negative variation of $y$, then the covariance will average to a negative value. A value of zero indicates that there is no common variation about the means, and the two random variables are uncorrelated.
$$c_{xy}=E[(x-E[x])\cdot(y-E[y])]$$

The covariance may be normalized to produce the correlation coefficient $\rho$, which ranges from $-1$ to $+1$. A correlation coefficient of $1.0$ indicates that the outcomes of random variables $x$ and $y$ are the same except for a potential difference in scaling. A correlation coefficient of $-1.0$ indicates the same, except that $x$ and $y$ always have opposite signs.
$$\rho=\frac{c_{x y}}{\sqrt{\operatorname{var}(x) \cdot \operatorname{var}(y)}}$$
If the covariance and thus the correlation coefficient are equal to $0.0$, then the two variables are said to be uncorrelated.
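As a quick numerical sketch of these definitions, the snippet below builds two sequences whose common variation is known by construction (the sequence lengths, scaling factors, and noise level are arbitrary choices for illustration) and computes the covariance and correlation coefficient directly from the formulas above:

```python
import numpy as np

rng = np.random.default_rng(0)

# y is a scaled copy of x plus independent noise, so the two sequences
# share most of their variation about their means.
x = rng.standard_normal(10_000)
y = 3.0 * x + 0.5 * rng.standard_normal(10_000)

# Sample covariance: average product of the variations from the means.
c_xy = np.mean((x - x.mean()) * (y - y.mean()))

# Correlation coefficient: covariance normalized by both variances.
rho = c_xy / np.sqrt(np.var(x) * np.var(y))

print(rho)  # close to +1, since y follows x except for scaling and noise
```

Because $y$ differs from $x$ only by scaling and a small independent noise term, $\rho$ lands just below $1.0$; removing the noise term would drive it to exactly $1.0$.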
### Correlation
The correlation is a measure of the common variation of two random variables $x$ and $y$. The correlation between two random variables $x$ and $y$ is defined as the expected value of their product.
$$r_{x y}=E[x \cdot y]$$
The covariance considered only common variations from the respective mean of each random variable, whereas the correlation measures the common variation from zero.
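The distinction between correlation and covariance shows up clearly when the random variables have nonzero means. In the sketch below (sequence lengths and mean offsets are arbitrary illustrative choices), two independent sequences are offset away from zero: the correlation $r_{xy}=E[x\cdot y]$ picks up the product of the means, while the covariance stays near zero.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent sequences, each shifted to a nonzero mean.
x = rng.standard_normal(10_000) + 2.0
y = rng.standard_normal(10_000) + 3.0

r_xy = np.mean(x * y)                            # correlation E[x*y]
c_xy = np.mean((x - x.mean()) * (y - y.mean()))  # covariance

print(r_xy)  # near 6.0 = E[x]*E[y]: correlation measures variation from zero
print(c_xy)  # near 0.0: independent variables share no variation about their means
```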

## Orthogonal Waveforms

The idea of orthogonal random variables may have been a bit abstract, but orthogonal waveforms are much more intuitive, and it turns out they are extremely important in the realm of communication engineering. Understanding and using these signals has led to a significant increase in transmission throughput in modern communication links. Let's assume that we add two or more information-carrying waveforms – via superposition – into a composite waveform for transmission. If these waveforms are orthogonal, then the information embedded in each may be independently detected at the receiver. The following examples illustrate how the use of orthogonal signal sets accomplishes this feat.
$$\text{Composite}(t)=\text{Info1}(t)\cdot\text{OrthWaveform1}(t)+\text{Info2}(t)\cdot\text{OrthWaveform2}(t)+\ldots$$

As we all remember from our introductory courses in communication systems, early radio systems relied on FM/PM and AM modulation techniques, which embedded information in either the phase or the amplitude of the carrier. With the advent of digital communication components such as dedicated baseband processors and high performance digital-to-analog converters, it became possible to accurately embed information in both the phase and the amplitude – or $I$ and Q components – of an RF carrier. The only way that this is possible is if an RF carrier is in fact composed of two orthogonal signals, which can each carry an independent data stream. The signals in question, whose orthogonality is proven below, are $\cos \left(2 \pi f_0 t\right)$ and $\sin \left(2 \pi f_0 t\right)$.
$$\begin{aligned} E\left[\cos \left(2 \pi f_0 t\right) \cdot \sin \left(2 \pi f_0 t\right)\right] &=E\left[\frac{\sin (0)+\sin \left(4 \pi f_0 t\right)}{2}\right] \\ &=E\left[\frac{\sin \left(4 \pi f_0 t\right)}{2}\right] \\ &=0 \end{aligned}$$
The RF carrier signal is assembled from the two orthogonal waveforms, $\cos \left(2 \pi f_0 t\right)$ and $\sin \left(2 \pi f_0 t\right)$, and the associated IQ data streams riding on them. These modulating data streams are the familiar $I(t)$ and $Q(t)$ signals.
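A minimal numerical sketch of this idea appears below. The carrier frequency, sample rate, and the constant $I$/$Q$ symbol values are arbitrary illustrative choices; the point is that multiplying the composite signal by each carrier and averaging over whole periods recovers the corresponding symbol, because the cross term $E[\cos\cdot\sin]$ averages to zero while $E[\cos^2]=E[\sin^2]=1/2$.

```python
import numpy as np

f0 = 1000.0                               # carrier frequency in Hz (arbitrary)
fs = 100 * f0                             # sample rate, well above the carrier
t = np.arange(0, 50 / f0, 1 / fs)         # exactly 50 carrier periods

# Constant I/Q symbols riding on the two orthogonal carriers.
I, Q = 0.7, -0.4
composite = I * np.cos(2 * np.pi * f0 * t) + Q * np.sin(2 * np.pi * f0 * t)

# Coherent detection: multiply by each carrier and average over whole periods.
# The cross term averages to zero, so each branch sees only its own symbol,
# scaled by E[cos^2] = E[sin^2] = 1/2; the factor of 2 undoes that scaling.
I_hat = 2 * np.mean(composite * np.cos(2 * np.pi * f0 * t))
Q_hat = 2 * np.mean(composite * np.sin(2 * np.pi * f0 * t))

print(I_hat, Q_hat)  # recovers 0.7 and -0.4
```

This is exactly the mechanism that lets $I(t)$ and $Q(t)$ carry independent data streams on a single RF carrier: each detection branch is blind to the orthogonal component.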
