## Thursday, November 19, 2015

### Proof of a determinantal integration formula

While reading about random matrices, I came across the following formula in a blog post by Terence Tao.

If the kernel $K(x,y)$ is such that
1. $\int\! dx \ K(x,x) = \alpha$
2. $\int\! dy \ K(x,y) K(y,z) = K(x,z)$
then
$$\label{eq:20151118a} \int dx_{n+1} \det_{i,j \le n+1} \left( K(x_i , x_j ) \right) = (\alpha - n) \det_{i,j \le n} \left( K(x_i , x_j ) \right)$$
For simplicity I write $\int$ instead of $\int_{\mathbb{R}}$. This formula is used when calculating $n$-point correlation functions in the Gaussian Unitary Ensemble (GUE). Tao gives a short proof of \eqref{eq:20151118a} based on induction and the Laplace expansion of the determinant. In this post I give a proof using integration over Grassmann variables. I am interested in this alternative proof because I want to compress the calculation of $n$-point functions in the GUE as much as possible.
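Before the proof, here is a quick numerical sanity check of \eqref{eq:20151118a} (not part of the argument). It uses a projection kernel built from the first $m$ orthonormal Legendre polynomials on $[-1,1]$, which satisfies both hypotheses with $\alpha = m$; the names `phi`, `K`, and `det_n` are my own for this sketch.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

m = 3  # rank of the projection kernel, so alpha = m

def phi(k, x):
    # k-th orthonormal Legendre polynomial on [-1, 1]
    c = np.zeros(k + 1)
    c[k] = 1.0
    return np.sqrt((2 * k + 1) / 2) * legval(x, c)

def K(x, y):
    # projection kernel: satisfies hypotheses 1 (alpha = m) and 2
    return sum(phi(k, x) * phi(k, y) for k in range(m))

nodes, weights = leggauss(50)  # exact for the polynomial integrands here

# Hypothesis 1: integral of K(x, x) dx = alpha
alpha = np.dot(weights, K(nodes, nodes))

def det_n(pts):
    # determinant of the kernel matrix at the given points
    return np.linalg.det(np.array([[K(a, b) for b in pts] for a in pts]))

# Check the identity for n = 2 at two fixed points
x1, x2 = 0.3, -0.7
lhs = np.dot(weights, [det_n([x1, x2, x3]) for x3 in nodes])
rhs = (alpha - 2) * det_n([x1, x2])
print(alpha, lhs, rhs)  # alpha is close to 3, and lhs is close to rhs
```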
The proof of \eqref{eq:20151118a} proceeds as follows. I first express the determinant as a Gaussian integral over Grassmann variables:
\begin{equation*} \det_{i,j \le n+1} \left( K(x_i , x_j ) \right) = \int \left( \prod_{i=1} ^ {n+1} d\theta_i d\bar\theta_i \right) \exp \left( \sum_{i,j=1}^{n+1} \bar\theta_i K(x_i, x_j) \theta_j \right) \end{equation*}
The LHS of \eqref{eq:20151118a} is then
\begin{equation*} \int dx_{n+1} \int \left( \prod_{i=1} ^ {n+1} d\theta_i d\bar\theta_i \right) \exp \left( \sum_{i,j=1}^{n} \bar\theta_i K(x_i, x_j) \theta_j + X \right) \end{equation*}
with
\begin{equation*} X = \sum_{i=1}^n \bar\theta_i K(x_i,x_{n+1}) \theta_{n+1} +\sum_{j=1}^n \bar\theta_{n+1} K(x_{n+1},x_j) \theta_j + \bar\theta_{n+1} K(x_{n+1}, x_{n+1} ) \theta_{n+1} \end{equation*}
I now perform the integration over $\theta_{n+1},\bar\theta_{n+1}$. Each of the three terms in $X$ is an even element of the Grassmann algebra, so they commute with one another, and each squares to zero because it contains a factor of $\theta_{n+1}$ or $\bar\theta_{n+1}$. The exponential therefore factorizes as
\begin{equation*} e^X = \Big(1 + \sum_{i=1}^n \bar\theta_i K(x_i,x_{n+1}) \theta_{n+1} \Big) \Big( 1 + \sum_{j=1}^n \bar\theta_{n+1} K(x_{n+1},x_j) \theta_j \Big) \Big( 1 + \bar\theta_{n+1} K(x_{n+1}, x_{n+1} ) \theta_{n+1} \Big) \end{equation*}
Integrating over $\theta_{n+1},\bar\theta_{n+1}$ picks out the coefficient of $\bar\theta_{n+1}\theta_{n+1}$, which is
\begin{equation*} K(x_{n+1}, x_{n+1} ) - \sum_{i,j=1}^n \bar\theta_i K(x_i,x_{n+1}) K(x_{n+1},x_j) \theta_j \end{equation*}
where the minus sign comes from anticommuting $\theta_{n+1}$ past $\bar\theta_{n+1}$. At this point I have
\begin{equation*} LHS = \int dx_{n+1} \int \left( \prod_{i=1} ^ {n} d\theta_i d\bar\theta_i \right) \exp \left( \sum_{i,j=1}^{n} \bar\theta_i K(x_i, x_j) \theta_j \right) \left( K(x_{n+1}, x_{n+1} ) - \sum_{i,j=1}^n \bar\theta_i K(x_i,x_{n+1}) K(x_{n+1},x_j) \theta_j\right) \end{equation*}
Integrating over $x_{n+1}$, using properties 1 and 2 of the kernel, then gives
\begin{equation*} LHS = \int \left( \prod_{i=1} ^ {n} d\theta_i d\bar\theta_i \right) \exp \left( \sum_{i,j=1}^{n} \bar\theta_i K(x_i, x_j) \theta_j \right) \left( \alpha - \sum_{i,j=1}^n \bar\theta_i K(x_i,x_j) \theta_j \right) \end{equation*}
I elaborate a bit on the second term. The Grassmann integral picks out only the terms of degree $n$ in the $\bar\theta$'s and in the $\theta$'s, so against the extra quadratic factor only the order-$(n-1)$ term of the exponential contributes:
\begin{align*} & \int \left( \prod_{i=1} ^ {n} d\theta_i d\bar\theta_i \right) \exp \left( \sum_{i,j=1}^{n} \bar\theta_i K(x_i, x_j) \theta_j \right) \left( \sum_{i,j=1}^n \bar\theta_i K(x_i,x_j) \theta_j \right)\\ & = \int \left( \prod_{i=1} ^ {n} d\theta_i d\bar\theta_i \right) \frac{1}{(n-1)!} \left( \sum_{i,j=1}^n \bar\theta_i K(x_i,x_j) \theta_j \right)^{n-1} \left( \sum_{i,j=1}^n \bar\theta_i K(x_i,x_j) \theta_j \right)\\ & = n \int \left( \prod_{i=1} ^ {n} d\theta_i d\bar\theta_i \right) \frac{1}{n!} \left( \sum_{i,j=1}^n \bar\theta_i K(x_i,x_j) \theta_j \right)^n\\ & = n \int \left( \prod_{i=1} ^ {n} d\theta_i d\bar\theta_i \right) \exp \left( \sum_{i,j=1}^{n} \bar\theta_i K(x_i, x_j) \theta_j \right) \end{align*}
It follows that
\begin{align*} LHS &= \int \left( \prod_{i=1} ^ {n} d\theta_i d\bar\theta_i \right) \exp \left( \sum_{i,j=1}^{n} \bar\theta_i K(x_i, x_j) \theta_j \right) \left( \alpha - n\right)\\ &= (\alpha - n) \det_{i,j \le n} \left( K(x_i , x_j ) \right) \end{align*}
which is precisely \eqref{eq:20151118a}.
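As an aside (this is standard, not part of Tao's argument): iterating \eqref{eq:20151118a} down from $n+1$ points to none, with the empty determinant equal to $1$, gives the normalization used for $n$-point functions of determinantal kernels:
\begin{equation*} \int dx_1 \cdots dx_n \, \det_{i,j \le n} \left( K(x_i , x_j ) \right) = \alpha (\alpha - 1) \cdots (\alpha - n + 1) \end{equation*}
The base case is hypothesis 1 itself, $\int dx_1 \, K(x_1,x_1) = \alpha$, and each further integration removes one point at the cost of a factor $(\alpha - k)$.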