Wednesday, September 16, 2015

Numerical sum over primes

In his Mémoire, Chebyshev uses bounds he obtained on the function \( \theta(x) =\sum_{p\le x} \log p\) to evaluate numerically sums over primes of the form \begin{equation*} \sum_{p =2 }^{\infty} f(p) \end{equation*} I explain his method in this post; to simplify the presentation, I use less sophisticated bounds on \( \theta(x) \) than Chebyshev did. As an application, I evaluate numerically the sum \begin{equation*} \sum_{p =2 }^{\infty} \frac{1}{p \log p} \end{equation*} with an explicit bound on the error.
The method is to compute the sum up to an integer \( q \) and estimate the tail using the bounds on \( \theta \). I thus write \begin{equation*} S = \sum_{p =2 }^{\infty} f(p) =\sum_{p \le q } f(p) + \sum_{q < p }f(p) \end{equation*} Partial summation applied to the tail gives \begin{equation*} S = S_q + R_q \end{equation*} with \begin{equation*} S_q =\sum_{p \le q } f(p) - \frac{\theta(q) f(q)}{\log q} \quad\text{and}\quad R_q = - \int_q^{+\infty} \theta(x) \left( \frac{f(x)}{\log x} \right)^\prime\ dx \end{equation*}
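To recall where this comes from, write \( g(x) = f(x)/\log x \), so that \( f(p) = g(p) \log p \). Partial (Abel) summation over \( q < p \le X \) gives \begin{equation*} \sum_{q < p \le X} f(p) = \theta(X) g(X) - \theta(q) g(q) - \int_q^{X} \theta(x)\, g^\prime(x)\ dx \end{equation*} and letting \( X \to \infty \), assuming \( \theta(X) g(X) \to 0 \) (which holds in the application below, since \( \theta(X) = O(X) \) and \( g(X) = 1/(X \log^2 X) \)), the first term disappears and the expressions for \( S_q \) and \( R_q \) follow.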

Application
As a concrete example, take \begin{equation*} f(x) = \frac{1} { x \log x} \end{equation*} then \begin{equation*} \left( \frac{f(x)}{\log x} \right)^\prime = - \frac{\log x + 2}{x^2 \log^3 x} \end{equation*} and thus \begin{equation*} R_q = \int_q^{+\infty} \theta(x) \frac{\log x + 2}{x^2 \log^3 x}\ dx \end{equation*} I now use the following lower and upper bounds on \( \theta(x) \): for all real \( x \ge 1 \) \begin{equation*} \theta_1(x) \le \theta(x) \le \theta_2(x) \end{equation*} with \begin{align*} \theta_1(x) &= A x - 5 \log x - \frac{12}{5} A x^{1/2} - \frac{5}{2\log 6}\log^2 x \\ \theta_2(x) & = \frac{6}{5} A x + \frac{5}{\log 6} \log^2 x\\ A &= \frac{1}{2} \log 2 +\frac{1}{3} \log 3 + \frac{1}{5} \log 5 -\frac{1}{30} \log 30 \end{align*} These inequalities can be obtained with methods similar to those in this blog post. Since the weight \( \frac{\log x + 2}{x^2 \log^3 x} \) is positive, it follows that \begin{equation*} \int_q^{+\infty} \theta_1(x) \frac{\log x + 2}{x^2 \log^3 x}\ dx \le R_q \le \int_q^{+\infty} \theta_2(x) \frac{\log x + 2}{x^2 \log^3 x}\ dx \end{equation*} For example, if \( q = 1000 \), then \( S_q = 1.472\) (to 3 decimal places). Numerical integration gives \( 0.148 \le R_q \le 0.187\), and thus \( 1.620 \le S \le 1.659\). This is consistent with the exact value \( S \approx 1.637\), which can be found on page 6 of the paper "High precision computation of Hardy-Littlewood constants" by Henri Cohen.

I illustrate the formulas in Mathematica.
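First, a quick sanity check of the derivative computed above, done symbolically (a small sketch; the name f is mine):
  f[x_] := 1/(x Log[x]);
  Simplify[D[f[x]/Log[x], x]]
  (* should simplify to -(Log[x] + 2)/(x^2 Log[x]^3) *)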
The code to calculate \( S_q \)
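One way this could look (a sketch, not the original notebook; the names f, theta, q, and Sq are mine, and theta is computed directly from the primes up to \( x \)):
  f[x_] := 1/(x Log[x]);
  theta[x_] := Total[Log[Select[Range[2, Floor[x]], PrimeQ]]];  (* Chebyshev's theta *)
  q = 1000;
  Sq = N[Sum[f[Prime[k]], {k, 1, PrimePi[q]}] - theta[q] f[q]/Log[q]]
  (* should give approximately 1.472 *)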
The code to bound the error \( R_q \)
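A corresponding sketch for the two integrals bounding \( R_q \) (again with my own names; NIntegrate handles the infinite upper limit):
  A = Log[2]/2 + Log[3]/3 + Log[5]/5 - Log[30]/30;
  theta1[x_] := A x - 5 Log[x] - 12/5 A Sqrt[x] - 5/(2 Log[6]) Log[x]^2;
  theta2[x_] := 6/5 A x + 5/Log[6] Log[x]^2;
  w[x_] := (Log[x] + 2)/(x^2 Log[x]^3);
  {NIntegrate[theta1[x] w[x], {x, 1000, Infinity}],
   NIntegrate[theta2[x] w[x], {x, 1000, Infinity}]}
  (* should give approximately {0.148, 0.187} *)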
The blue area is the approximation; the red line is the exact value.
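A plot of this kind could be produced along the following lines (a sketch that reuses f, theta, theta1, theta2, and w from the sketches above; the reference value 1.637 is the one quoted earlier):
  SqOf[n_] := N[Sum[f[Prime[k]], {k, 1, PrimePi[n]}] - theta[n] f[n]/Log[n]];
  lowerS[n_?NumericQ] := SqOf[n] + NIntegrate[theta1[x] w[x], {x, n, Infinity}];
  upperS[n_?NumericQ] := SqOf[n] + NIntegrate[theta2[x] w[x], {x, n, Infinity}];
  band = {#, lowerS[#], upperS[#]} & /@ Range[100, 2000, 100];
  Show[
   ListPlot[{band[[All, {1, 2}]], band[[All, {1, 3}]]},
    Joined -> True, Filling -> {1 -> {2}}],
   Plot[1.637, {t, 100, 2000}, PlotStyle -> Red]
  ]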
The reason I chose this example is that \begin{equation*} \sum_{p =2 }^{\infty} f(p)\quad\text{(the sum over primes)} \end{equation*} converges, whereas \begin{equation*} \sum_{n =2 }^{\infty} f(n)\quad\text{(the sum over integers)} \end{equation*} diverges. If \( \sum_{n =2 }^{\infty} f(n) \) converges, the tail of the sum over primes can be crudely bounded by the tail of the sum over all integers, which makes the problem easier.
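Indeed, by the integral test \begin{equation*} \sum_{n =2 }^{\infty} \frac{1}{n \log n} \quad\text{diverges, since}\quad \int_2^{X} \frac{dx}{x \log x} = \log \log X - \log \log 2 \to \infty, \end{equation*} while for the sum over primes, \( p_n \sim n \log n \) by the prime number theorem, so \( 1/(p_n \log p_n) \) behaves roughly like \( 1/(n \log^2 n) \), whose sum converges.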

Other posts about Chebyshev's mémoire
  1. How not to prove Bertrand's postulate
  2. Lower and upper bound for Chebyshev's psi function
  3. Variations on a theme by Chebyshev
