The landscape of a quantum circuit

This section is a simplified discussion of results in Ref. 5.

Consider the expectation value of $B$ in the state $\vert\psi_N\rangle = U_{N:k+1} U_k(\eta)U_{k-1:1}\vert\psi_0\rangle$ with $U_k(\eta)=e^{-i\Xi\eta/2}$. Given $\Xi^2 =1$, we have $U_k(\eta) = \cos(\frac{\eta}{2})-i\sin(\frac{\eta}{2})\Xi$. Expanding the expectation value,

$$\begin{align}\langle B\rangle(\eta) &= \langle\psi\vert U_k^\dagger(\eta)\,\tilde{B}\,U_k(\eta)\vert\psi\rangle\\&= \cos^2(\tfrac{\eta}{2})\langle\tilde{B}\rangle + \sin^2(\tfrac{\eta}{2})\langle\Xi\tilde{B}\Xi\rangle + i\sin(\tfrac{\eta}{2})\cos(\tfrac{\eta}{2})\langle\Xi\tilde{B}-\tilde{B}\Xi\rangle\\&= \frac{u+v}{2} + \frac{u-v}{2}\cos\eta + \frac{w}{2}\sin\eta.\end{align}$$

Here, in the first line, we used the following shorthands: $\vert\psi\rangle \equiv U_{k-1:1}\vert\psi_0\rangle$ and $\tilde{B}\equiv U_{N:k+1}^\dagger B\, U_{N:k+1}$.

And in the last line, we have introduced $u\equiv\langle\tilde{B}\rangle$, $v\equiv\langle\Xi\tilde{B}\Xi\rangle$ and $w\equiv i\langle\Xi\tilde{B}-\tilde{B}\Xi\rangle$, with all expectations taken on $\vert\psi\rangle$.

Finally, we obtained a sine function: $\langle B\rangle(\eta) = A\sin(\eta+\phi)+C$ for suitable constants $A$, $\phi$ and $C$.

A direct proposition is that three evaluations of $\langle B\rangle(\eta)$ determine the whole landscape along $\eta$, so the exact minimum with respect to this single parameter can be located without gradient descent (Ref. 5).
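This sine-function landscape is easy to check numerically. The sketch below (my own toy setup, not code from Ref. 5; `U_pre` and `U_post` are random stand-ins for $U_{k-1:1}$ and $U_{N:k+1}$) fits $\langle B\rangle(\eta)=c_0+c_1\cos\eta+c_2\sin\eta$ from three evaluations and compares it with direct evaluation:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_unitary(n):
    # QR decomposition of a random complex matrix yields a unitary
    q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
    return q

X = np.array([[0, 1], [1, 0]], dtype=complex)   # plays the role of Xi, with Xi^2 = 1
U_pre = random_unitary(2)                        # stand-in for U_{k-1:1}
U_post = random_unitary(2)                       # stand-in for U_{N:k+1}
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
B = (M + M.conj().T) / 2                         # random Hermitian observable
psi0 = np.array([1, 0], dtype=complex)

def expval(eta):
    U_k = np.cos(eta / 2) * np.eye(2) - 1j * np.sin(eta / 2) * X
    psi = U_post @ U_k @ U_pre @ psi0
    return (psi.conj() @ B @ psi).real

# three evaluations fix the whole landscape c0 + c1*cos(eta) + c2*sin(eta)
e0, ep, em = expval(0.0), expval(np.pi / 2), expval(-np.pi / 2)
c0 = (ep + em) / 2
c2 = (ep - em) / 2
c1 = e0 - c0

for eta in np.linspace(-np.pi, np.pi, 7):
    pred = c0 + c1 * np.cos(eta) + c2 * np.sin(eta)
    assert abs(expval(eta) - pred) < 1e-10
```

The loop at the end confirms that the three-point fit reproduces $\langle B\rangle$ at arbitrary $\eta$.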

For statistic functionals

Next, we describe a new class of differentiable losses that cannot easily be written as the expectation of an observable: the statistic functionals. For simplicity, we consider an arbitrary statistic functional $f(\xset)$, with a sequence of bit strings $\xset\equiv\{x_1,x_2,\ldots, x_r\}$ as its arguments. Let's define the following expectation of this function

$$\begin{equation}\Expect_f(\gammaset)\equiv\expect{f(\xset)}{\{x_i\sim \pshift{\gammav_i}\}_{i=1}^{r}}. \end{equation}$$

Here, $\gammaset=\{\gammav_1, \gammav_2,\ldots,\gammav_r\}$ is the set of offset angles applied to the circuit parameters; each element $\gammav_i$ lives in the same parameter space as $\thetav$ and represents a shift of $\thetav$. Consequently, the generated samples follow the probability distributions $\{\pshift{\gammav_1}, \pshift{\gammav_2},\ldots ,\pshift{\gammav_r}\}$. Writing out the above expectation explicitly, we have

$$\begin{equation}\Expect_f(\gammaset)=\sum\limits_\xset f(\xset)\prod\limits_i \pshift{\gammav_i}(x_i),\end{equation}$$
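As a concrete toy illustration of this sum (the single-bit distributions and the functional `f` below are made-up stand-ins, not a circuit's Born distribution), one can enumerate all bit-string tuples directly:

```python
import itertools
import numpy as np

# toy distributions over single-bit outcomes, stand-ins for p_{theta + gamma_i}
p = [np.array([0.7, 0.3]), np.array([0.2, 0.8])]   # r = 2

def f(xs):
    # example statistic functional: agreement indicator f(x1, x2) = [x1 == x2]
    return float(xs[0] == xs[1])

# E_f = sum over all tuples of f(xs) * prod_i p_i(x_i)
E = sum(f(xs) * np.prod([p[i][x] for i, x in enumerate(xs)])
        for xs in itertools.product([0, 1], repeat=len(p)))
# E = 0.7*0.2 + 0.3*0.8 = 0.38
```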

where index $i$ runs from $1$ to $r$. Its partial derivative with respect to $\thetai$ is

$$\begin{equation}\frac{\partial \Expect_f(\gammaset)}{\partial \thetai}=\sum\limits_\xset f(\xset)\sum\limits_j\frac{\partial \pshift{\gammav_j}(x_j)}{\partial\thetai}\prod\limits_{k\neq j} \pshift{\gammav_k}(x_k).\end{equation}$$

Again, using the gradient of probability, we have

$$\begin{align}\frac{\partial \Expect_f(\gammaset)}{\partial \thetai}&=\frac{1}{2}\sum\limits_{j,s=\pm}s\sum\limits_\xset f(\xset){\pshift{\gammav_j+s\frac{\pi}{2}\ei}(x_j)}\prod\limits_{k\neq j} \pshift{\gammav_k}(x_k)\\&=\frac{1}{2}\sum\limits_{j,s=\pm}s\,\Expect_f(\{\gammav_k+s\delta_{kj}\frac{\pi}{2}\ei\}_{k=1}^{r}).\end{align}$$
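This product-rule-plus-parameter-shift gradient (each term enters with the sign of the shift $s=\pm$) can be sanity-checked on a one-parameter toy distribution $p_\theta(0)=\cos^2(\theta/2)$, $p_\theta(1)=\sin^2(\theta/2)$, for which the parameter-shift rule for probabilities holds exactly; `f` and `E_f` below are toy stand-ins:

```python
import itertools
import numpy as np

def p(theta):
    # single-qubit toy distribution: p(0) = cos^2(theta/2), p(1) = sin^2(theta/2)
    return np.array([np.cos(theta / 2) ** 2, np.sin(theta / 2) ** 2])

def f(xs):
    return float(xs[0] == xs[1])   # toy statistic functional of degree r = 2

def E_f(gammas, theta):
    # E_f({gamma_i}) = sum_x f(x) * prod_i p_{theta + gamma_i}(x_i)
    ps = [p(theta + g) for g in gammas]
    return sum(f(xs) * np.prod([ps[i][x] for i, x in enumerate(xs)])
               for xs in itertools.product([0, 1], repeat=len(ps)))

theta = 0.4
gammas = [0.1, -0.3]

# parameter-shift gradient: (1/2) * sum_{j, s=+-} s * E_f with gamma_j shifted by s*pi/2
grad_ps = 0.0
for j in range(len(gammas)):
    for s in (+1, -1):
        shifted = list(gammas)
        shifted[j] += s * np.pi / 2
        grad_ps += 0.5 * s * E_f(shifted, theta)

# central finite-difference check of dE_f/dtheta
eps = 1e-6
grad_fd = (E_f(gammas, theta + eps) - E_f(gammas, theta - eps)) / (2 * eps)
assert abs(grad_ps - grad_fd) < 1e-6
```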

If $f$ is symmetric in its arguments, $\Expect_f(\mathbf{0})$ becomes a V-statistic (von Mises, 1947). Then, for identical offsets $\gammav_1=\cdots=\gammav_r$, the $r$ terms in the sum over $j$ are equal and the gradient can be further simplified to

$$\begin{align}\frac{\partial \Expect_f(\gammaset)}{\partial \thetai}=\frac{r}{2}\sum\limits_{s=\pm}s\,\Expect_f\left(\{\gammav_1+s\frac{\pi}{2}\ei,\gammav_2,\ldots,\gammav_r\}\right),\end{align}$$

which contains only two terms. This result can be readily verified on the MMD loss, noticing that the expectation of a kernel function is a V-statistic of degree $2$. By repeatedly applying the gradient formula, one can obtain higher-order gradients.
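The collapse of the sum over $j$ into a single shifted term times $r$ can also be checked numerically. The sketch below (same toy distribution and symmetric `f` as assumed above, all stand-ins) compares the full $2r$-term sum with the two-term simplification at identical offsets:

```python
import itertools
import numpy as np

def p(theta):
    return np.array([np.cos(theta / 2) ** 2, np.sin(theta / 2) ** 2])

def f(xs):
    return float(xs[0] == xs[1])   # symmetric in its arguments

def E_f(gammas, theta):
    ps = [p(theta + g) for g in gammas]
    return sum(f(xs) * np.prod([ps[i][x] for i, x in enumerate(xs)])
               for xs in itertools.product([0, 1], repeat=len(ps)))

theta, r = 0.4, 2
gammas = [0.0] * r   # identical offsets: E_f(0) is a V-statistic

# full sum: (1/2) * sum_{j, s} s * E_f with gamma_j shifted by s*pi/2
full = sum(0.5 * s * E_f([g + (s * np.pi / 2 if i == j else 0.0)
                          for i, g in enumerate(gammas)], theta)
           for j in range(r) for s in (+1, -1))

# simplified: (r/2) * sum_s s * E_f with only the first offset shifted
simp = sum(r / 2 * s * E_f([s * np.pi / 2] + [0.0] * (r - 1), theta)
           for s in (+1, -1))
assert abs(full - simp) < 1e-12
```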


  1. J.-G. Liu and L. Wang, arXiv:1804.04168.

  2. J. Li, X. Yang, X. Peng, and C.-P. Sun, Phys. Rev. Lett. 118, 150503 (2017).

  3. E. Farhi and H. Neven, arXiv:1802.06002.

  4. K. Mitarai, M. Negoro, M. Kitagawa, and K. Fujii, arXiv:1803.00745.

  5. K. M. Nakanishi, K. Fujii, and S. Todo, arXiv:1903.12166 (2019).