# Combinatorics MATH393

## Superposition of Multiple Object States

Superposing object states onto a common space is a familiar practice in many fields; think, for example, of the theater-level plotting tables that display the geographic distribution of a diversity of objects. The basic concept assumes that the object state spaces are copies of a common state space, $\mathcal{X}$. When the spaces are incommensurate, e.g., when they have different dimensionalities, it is necessary to map object states to points in $\mathcal{X}$ in order to superpose them.

The superposition of the object states defines the multiobject state. The points in the superposition are unlabeled, that is, which point corresponds to the state of which object is unknown. The multiobject state is, therefore, inherently less informative than the list of object-specific states.

The multiobject state space is the grand canonical ensemble (GCE) of all multiobject states; see (B.15) in Appendix B. It is denoted by $\mathcal{E}(\mathcal{X})$. Random variables whose realizations are in the GCE event space are termed finite point processes.
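The idea that a realization of a finite point process is an unordered, unlabeled set of points can be sketched in code. The following is a minimal illustration (not from the source), using a binomial point process — a fixed number of i.i.d. uniform points on $\mathcal{X} = [0, 1]$ — as the simplest example:

```python
# Illustrative sketch (not from the source): one realization of a simple
# finite point process on X = [0, 1] -- a binomial point process with a
# fixed number of i.i.d. uniform points. The realization is an unordered,
# unlabeled set: nothing records which point belongs to which object.
import random

def sample_binomial_process(n, rng):
    """n points placed independently and uniformly on [0, 1]."""
    return frozenset(rng.random() for _ in range(n))

rng = random.Random(0)
realization = sample_binomial_process(4, rng)
print(sorted(realization))  # 4 unordered points in [0, 1]
```

Using a `frozenset` rather than a list makes the lack of labels explicit: the realization carries no object indices.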

Probability distributions for a finite point process are defined over the events in $\mathcal{E}(\mathcal{X})$. The goal of Bayesian analysis is, therefore, to determine the posterior PDF of a finite point process on the GCE, $\mathcal{E}(\mathcal{X})$, conditioned on a set of sensor point measurements in the measurement space $\mathcal{Y}$.

The dimensionality of the GCE is so large that the PDF on this space is represented by one or more summary statistics. The most commonly used statistics are the intensity and the pair correlation functions (see Appendix B). These statistics support the intuition that superposed processes are often good representations of the multiobject state. In contrast, marginalized processes (see Sect. B.9) can conflate the distributional support of well-separated objects.
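One useful property of the intensity as a summary statistic is that it adds under superposition: the intensity of a superposition of independent processes is the sum of the component intensities. A small Monte Carlo sketch (illustrative rates, not from the source) checks this for the total intensity mass, i.e., the expected point count:

```python
# Numeric sketch (illustrative, not from the source): two independent
# homogeneous Poisson processes on [0, 1] with rates 3 and 2 superpose into
# a process of rate 5. We check the mean point count of the superposition.
import math
import random

rng = random.Random(1)

def poisson_count(rate, rng):
    # Knuth's inversion method for a Poisson-distributed count.
    limit, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

trials = 20000
mean = sum(poisson_count(3.0, rng) + poisson_count(2.0, rng)
           for _ in range(trials)) / trials
print(round(mean, 2))  # close to the superposed intensity mass 5.0
```

The pair correlation function, by contrast, carries second-order information that the intensity alone cannot.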

## Superposition with Non-identical Object Models

Recall that the GFL of the JPDA filter for $N$ objects with clutter is, from (3.2),
$$\Psi_k^{\mathrm{JPDA}}\left(h^{1: N}, g\right)=\Psi_k^{\mathrm{c}}(g) \prod_{n=1}^N \Psi_k^{\mathrm{BMD}(n)}\left(h^n, g\right),$$
where $\Psi_k^{\mathrm{c}}(g)$ is the GFL (2.20) of the Poisson clutter process and $\Psi_k^{\mathrm{BMD}(n)}\left(h^n, g\right)$ is the GFL (3.1) of the BMD process of object $n$. It is assumed here that all $N$ objects share the state space $\mathcal{X}$, but the JPDA models are otherwise unchanged. Objects are allowed to differ, so the transition (motion) functions, measurement likelihoods, detection probability functions, and prior PDFs depend on the object index $n$ and the scan index $k$; e.g., the prior PDF is $\mu_{k-1}^n(x)$, $n=1, \ldots, N$.
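The measurement model behind this product-form GFL — independent Bernoulli detections for each object, superposed with Poisson clutter — can be simulated directly. The following sketch is illustrative only; the states, detection probabilities, Gaussian likelihood, and clutter model are assumptions, not the source's:

```python
# Illustrative sketch (not the source's implementation): generating one scan
# under the measurement model behind the product GFL -- N objects with
# object-dependent detection probabilities, plus Poisson clutter, all
# superposed into one unlabeled measurement set. All numbers are assumptions.
import math
import random

rng = random.Random(42)

object_states = [0.2, 0.5, 0.8]      # x^n in X = [0, 1], n = 1..N
detect_prob = [0.9, 0.7, 0.95]       # Pd_k^n, object-dependent
meas_sigma = 0.02                    # assumed Gaussian likelihood p_k(y | x)
clutter_rate = 2.0                   # lambda_k^c; clutter uniform on [0, 1]

scan = []
for x, pd in zip(object_states, detect_prob):
    if rng.random() < pd:            # Bernoulli detection (the BMD part)
        scan.append(rng.gauss(x, meas_sigma))

# Poisson-distributed clutter count (Knuth inversion), uniform locations.
limit, k, p = math.exp(-clutter_rate), 0, 1.0
while True:
    p *= rng.random()
    if p <= limit:
        break
    k += 1
scan.extend(rng.random() for _ in range(k))

rng.shuffle(scan)                    # superposition: origins are not recorded
print(len(scan), [round(y, 3) for y in scan])
```

The final shuffle emphasizes that the scan is an unlabeled set: the filter never sees which measurement came from which object, or from clutter.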

For a general discussion of superposition, see (B.72) of Appendix B. The GFL of the superposed process is found by setting $h^n(x)=h(x), n=1, \ldots, N$, in the GFL of the unsuperposed process. Let
$$\begin{aligned} \Psi_k^{\mathrm{JPDAS}}(h, g) & \equiv \Psi_k^{\mathrm{JPDA}}(h, \ldots, h, g) \\ &=\Psi_k^{\mathrm{c}}(g) \prod_{n=1}^N \Psi_k^{\mathrm{BMD}(n)}(h, g) . \end{aligned}$$
Given the scan measurement set $\mathbf{y}_k=\left\{y_1, y_2, \ldots, y_M\right\}$, $M \geq 1$, and the Dirac delta train (cf. Eq. (2.22)),
$$g_\delta(y)=\sum_{m=1}^M \beta_m \delta_{y_m}(y), \quad y \in \mathcal{Y}, \quad \beta=\left(\beta_1, \beta_2, \ldots, \beta_M\right) \in \mathbb{R}^M,$$
the GFL of the Bayes posterior process is the normalized cross-derivative,
$$\Psi_k^{\mathrm{JPDAS}}\left(h \mid \mathbf{y}_k\right)=\frac{\left.\frac{\mathrm{d}}{\mathrm{d} \beta} \Psi_k^{\mathrm{JPDAS}}(h, \beta)\right|_{\beta=0}}{\left.\frac{\mathrm{d}}{\mathrm{d} \beta} \Psi_k^{\mathrm{JPDAS}}(\mathbf{1}, \beta)\right|_{\beta=0}},$$

where $\Psi_k^{\mathrm{JPDAS}}(h, \beta) \equiv \Psi_k^{\mathrm{JPDAS}}\left(h, g_\delta\right)$ is a secular function, and the notation $\frac{\mathrm{d}}{\mathrm{d} \beta} \equiv \frac{\mathrm{d}^M}{\mathrm{d} \beta_1 \cdots \mathrm{d} \beta_M}$ is used for enhanced readability. Explicitly,
$$\begin{aligned} \Psi_k^{\mathrm{JPDAS}}(h, \beta)=&\exp \left(-\lambda_k^c+\lambda_k^c \sum_{m=1}^M \beta_m p_k^c\left(y_m\right)\right) \\ & \times \prod_{n=1}^N\left[\int_{\mathcal{X}} h(x) \mu_k^{n-}(x)\left(1-P d_k^n(x)+P d_k^n(x) \sum_{m=1}^M \beta_m p_k\left(y_m \mid x\right)\right) \mathrm{d} x\right] . \end{aligned}$$
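The cross-derivative at $\beta = 0$ picks out exactly the terms corresponding to feasible measurement-to-origin associations. A numeric sketch (with assumed, illustrative values for the clutter rate, detection probability, and integrated likelihoods; not from the source) checks this for $N = 1$ object and $M = 2$ measurements, comparing a finite-difference mixed derivative of the secular function against direct enumeration of the association events:

```python
# Numeric sketch (illustrative values, not from the source): the mixed
# derivative of the secular function Psi(1, beta) at beta = 0 equals the
# sum over measurement-to-origin association events. N = 1 object, M = 2.
import math

lam_c = 2.0                 # clutter rate lambda_k^c (assumed)
pc = [0.3, 0.1]             # clutter density p_k^c(y_m) at the two points
Pd = 0.9                    # detection probability (assumed constant in x)
Lk = [0.5, 0.2]             # assumed values of integral mu(x) p_k(y_m | x) dx

def secular(b1, b2):
    """Psi(1, beta): clutter factor times the single object's factor."""
    clutter = math.exp(-lam_c + lam_c * (b1 * pc[0] + b2 * pc[1]))
    obj = (1 - Pd) + Pd * (b1 * Lk[0] + b2 * Lk[1])
    return clutter * obj

# Mixed second derivative at beta = 0 via central differences.
h = 1e-3
cross = (secular(h, h) - secular(h, -h)
         - secular(-h, h) + secular(-h, -h)) / (4 * h * h)

# Direct enumeration of association events: both measurements clutter,
# y1 from the object with y2 clutter, or y2 from the object with y1 clutter
# (one object cannot produce two measurements in one scan).
direct = math.exp(-lam_c) * (
    lam_c * pc[0] * lam_c * pc[1] * (1 - Pd)
    + Pd * Lk[0] * lam_c * pc[1]
    + lam_c * pc[0] * Pd * Lk[1]
)

print(abs(cross - direct) < 1e-6)  # the two computations agree
```

Normalizing this quantity by the same derivative of $\Psi_k^{\mathrm{JPDAS}}(\mathbf{1}, \beta)$ is what turns the enumeration into the Bayes posterior.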

