A weak form of the prime number theorem

The function $\pi(n)$ counts the number of primes $p$ in the range $1\leq p\leq n$. One of the most spectacular theorems in mathematics is the so-called prime number theorem, which states that

\[ \lim_{n\to\infty}\Bigl(\pi(n)\Bigm/\frac{n}{\ln n}\Bigr) = 1 . \]

This is a strong condition on the asymptotic behaviour of $\pi(n)$ as $n$ increases without bound—unlike some asymptotic results, even the base of the logarithm is significant here.

Last year I worked through a proof of the prime number theorem. Two of the main results used were Euler’s product and Cauchy’s residue theorem—it was breathtaking. It still hasn’t completely “coalesced” in my head, but I periodically revisit it. Today I want to prove a weaker form of the prime number theorem, namely that

\[ \pi(n) = \Theta\Bigl(\frac{n}{\ln n}\Bigr) . \]

This uses asymptotic notation, and says that the growth rate of $\pi(n)$ is eventually within a constant factor of $n/\ln n$.

The proof is by Chebyshev and proceeds by finding the following bounds on the central binomial coefficient $\binom{2n}{n}$:

\begin{align}
2^n \leq \binom{2n}{n} &\leq 4^n \tag{1} \\
n^{\pi(2n)-\pi(n)} \leq \binom{2n}{n} &\leq (2n)^{\pi(2n)} \tag{2}
\end{align}
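
Before proving them, these bounds can be spot-checked numerically. Here is a minimal sketch in Python (the helper names `is_prime` and `prime_pi` are my own, with a naive trial-division prime count standing in for $\pi$):

```python
from math import comb, isqrt

def is_prime(k):
    # Trial division; adequate for the small values checked here.
    return k >= 2 and all(k % d for d in range(2, isqrt(k) + 1))

def prime_pi(n):
    # pi(n): the number of primes p with 1 <= p <= n.
    return sum(is_prime(k) for k in range(2, n + 1))

for n in range(1, 200):
    c = comb(2 * n, n)
    assert 2**n <= c <= 4**n                          # bound (1)
    assert n ** (prime_pi(2 * n) - prime_pi(n)) <= c  # lower bound of (2)
    assert c <= (2 * n) ** prime_pi(2 * n)            # upper bound of (2)
```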

Upper bound of (1)

Using the binomial theorem to expand $(1+1)^{2n}$ gives

\[ 4^n = 2^{2n} = (1+1)^{2n} = \sum_{i=0}^{2n}\binom{2n}{i} \geq \binom{2n}{n} \]

since all the terms in the sum are positive.

Lower bound of (1)

However, note that the central term ($i=n$) in the sum is the largest (to go from term $i$ to term $i+1$ you multiply by $\frac{2n-i}{i+1}$, which is greater than $1$ until $i\geq n$), so we have:

\[ 4^n = 2^{2n} = (1+1)^{2n} = \sum_{i=0}^{2n}\binom{2n}{i} \leq (2n+1)\binom{2n}{n} \]

The required inequality follows after dividing by $2n+1$ and using the fact that $2n+1\leq2^n$ (which holds for $n\geq3$; for $n<3$ the required inequality may be checked directly).

Lower bound of (2)

Examining the prime factorization of $\binom{2n}{n}=(2n)!/(n!)^2$, we see that it contains all the primes $p$ in the range $n<p\leq 2n$, since they appear in the numerator but not in the denominator. There are $\pi(2n)-\pi(n)$ primes in this range, and they are all larger than $n$, so it follows that:

\[ \binom{2n}{n} \geq \prod_{n<p\leq 2n} p \geq n^{\pi(2n)-\pi(n)} \]

Upper bound of (2)

Note that there are $\lfloor n/p\rfloor$ multiples of $p$ in the range $1\leq p\leq n$. We can use this fact to determine the exponent on $p$ in the prime factorization of $n!$. There will be $\lfloor n/p\rfloor$ factors of $p$ in $n!$, but we also need to count the additional $\lfloor n/p^2\rfloor$ factors of $p$ which are contributed by multiples of $p^2$, and in general an additional $\lfloor n/p^k\rfloor$ factors of $p$ contributed by multiples of $p^k$. The exponent of $p$ in the prime factorization of $n!$ is thus

\[ \sum_{k=1}^\infty \biggl\lfloor\frac{n}{p^k}\biggr\rfloor , \]

where every term with $k>\log_p n$ is zero, so the sum is finite. It follows that the exponent of $p$ in the prime factorization of $(2n)!/(n!)^2$ is

\[ \sum_{k=1}^{\lfloor\log_p(2n)\rfloor} \biggl(\biggl\lfloor\frac{2n}{p^k}\biggr\rfloor - 2\biggl\lfloor\frac{n}{p^k}\biggr\rfloor\biggr) . \]

An interesting fact about the function $\lfloor 2x\rfloor-2\lfloor x\rfloor$ is that it is either $0$ or $1$ (it equals $1$ exactly when the fractional part of $x$ is at least $1/2$), so this sum is actually at most $\lfloor\log_p(2n)\rfloor$. Since $\binom{2n}{n}$ only contains primes $p$ in the range $1\leq p\leq 2n$, it follows that:

\[ \binom{2n}{n} \leq \prod_{1\leq p\leq 2n}p^{\log_p(2n)} = \prod_{1\leq p\leq 2n} 2n = (2n)^{\pi(2n)} \]
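
This argument is also easy to verify computationally. The sketch below (helper names are mine; `exponent_in_factorial` implements Legendre's formula from above) reconstructs $\binom{2n}{n}$ from its prime factorization and checks that each prime power in it is at most $2n$:

```python
from math import comb

def exponent_in_factorial(n, p):
    # Legendre's formula: exponent of prime p in n! is sum of floor(n / p^k).
    e, q = 0, p
    while q <= n:
        e += n // q
        q *= p
    return e

def primes_up_to(n):
    # Simple sieve of Eratosthenes.
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [p for p, flag in enumerate(sieve) if flag]

n = 50
product = 1
for p in primes_up_to(2 * n):
    e = exponent_in_factorial(2 * n, p) - 2 * exponent_in_factorial(n, p)
    assert p**e <= 2 * n   # exponent is at most floor(log_p(2n))
    product *= p**e
assert product == comb(2 * n, n)
```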

Using the bounds

Putting both (1) and (2) together and taking logarithms yields the following bounds:

\begin{gather}
n\ln 2 \leq \pi(2n)\ln(2n) \tag{3} \\
(\pi(2n)-\pi(n))\ln n \leq n\ln 4 \tag{4}
\end{gather}

The lower bound on $\pi(n)$

From (3) we have that

\[ \pi(2n) \geq \frac{\ln2}{2}\frac{2n}{\ln(2n)} , \]

which shows the required lower bound for even numbers. It can be slightly modified to work for odd numbers as well (note that $\ln(2n)\leq\ln(2n+1)$):

\[ \pi(2n+1) \geq \pi(2n) \geq \frac{2n+1}{2n+1}\frac{\ln2}{2}\frac{2n}{\ln(2n)} \geq \frac{2n}{2n+1}\frac{\ln2}{2}\frac{2n+1}{\ln(2n+1)} \]

Since $2n/(2n+1)$ is lower bounded by a constant (e.g., $2/3$) for all positive $n$, this shows that $\pi(n)=\Omega(n/\ln n)$.

The upper bound on $\pi(n)$

First we will show the required upper bound for powers of two; that way we will be able to apply (4) not only to $n$ but also recursively to $n/2$, $n/4$, and so on down to $1$. To get a nice telescoping sum we modify (4): add $\pi(n)\ln n$ to both sides, subtract $\pi(n)\ln(n/2)$ from both sides, and simplify the right-hand side using $\ln n-\ln(n/2)=\ln2$. This yields

\[ \pi(2n)\ln n - \pi(n)\ln(n/2) \leq n\ln4 + \pi(n)\ln 2 \leq (3\ln 2)n \tag{5} \]

since $\pi(n)\leq n$ and $\ln4+\ln2=3\ln2$.

We now sum the instances of (5) for $n, n/2, \dotsc, 1$ to get

\[ \sum_{i=0}^{\log_2 n}\biggl(\pi\Bigl(\frac{n}{2^{i-1}}\Bigr)\ln\Bigl(\frac{n}{2^i}\Bigr)-\pi\Bigl(\frac{n}{2^i}\Bigr)\ln\Bigl(\frac{n}{2^{i+1}}\Bigr)\biggr) \leq (3\ln 2)\sum_{i=0}^{\log_2 n}\frac{n}{2^i} . \]

However, by telescoping the left sum is equal to $\pi(2n)\ln n$ and by the formula for the summation of a geometric series the right sum is at most $(3\ln2)2n$. In other words, when $n$ is a power of two we have

\[ \pi(2n) \leq (6\ln2)\frac{n}{\ln n} , \]

and the required bound follows using $\frac{n}{\ln n}<\frac{2n}{\ln(2n)}$, a consequence of the fact that $\frac{n}{\ln n}$ is an increasing function (for $n\geq3$, although the required bound still holds for $n<3$).

As before, a slight modification shows the bound holds for all numbers, not just powers of two. Let $m$ be an arbitrary integer in the range $n\leq m<2n$ where $n$ is a power of two. Then

\[ \pi(m) \leq \pi(2n) \leq (6\ln 2)\frac{n}{\ln n} \leq (6\ln 2)\frac{m}{\ln m} \]

which shows that $\pi(n)=O(n/\ln n)$.
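
Chasing the constants through both halves of the argument gives the explicit sandwich $\frac{\ln2}{3}\,\frac{m}{\ln m}\leq\pi(m)\leq6\ln2\,\frac{m}{\ln m}$ for $m\geq2$ (the lower constant uses $2n/(2n+1)\geq2/3$). A quick numerical sanity check in Python, again with a naive trial-division prime count:

```python
from math import log, isqrt

def is_prime(k):
    # Trial division; adequate for this small range.
    return k >= 2 and all(k % d for d in range(2, isqrt(k) + 1))

def prime_pi(n):
    # pi(n): number of primes p with 1 <= p <= n.
    return sum(is_prime(k) for k in range(2, n + 1))

LOWER, UPPER = log(2) / 3, 6 * log(2)   # ~0.231 and ~4.159
for m in range(2, 1000):
    ratio = prime_pi(m) / (m / log(m))
    assert LOWER <= ratio <= UPPER
```

By the prime number theorem the ratio actually tends to $1$; these constants are just what fall out of this elementary argument.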

Minimal polynomial misconception

Yesterday I had a discussion with a friend about computing minimal polynomials. For example, say you are given algebraic numbers $\alpha$ and $\beta$, each specified by a minimal polynomial, of degrees $n$ and $m$ respectively. How do you compute the minimal polynomial of, say, $\alpha+\beta$?

The method I was taught, which seems to be fairly standard (e.g., see Dummit and Foote Chapter 13.2) is the following:

Multiply $\alpha+\beta$ by $\alpha^i\beta^j$ for each $i=0,1,\dotsc,n-1$ and $j=0,1,\dotsc,m-1$, and use the minimal polynomials of $\alpha$ and $\beta$ to reduce the resulting expressions to linear combinations of $\alpha^i\beta^j$ (again with $0\leq i<n$ and $0\leq j<m$). In other words, what you are doing is computing a matrix $M$ which satisfies

\[ (\alpha+\beta)\left[\alpha^i\beta^j\right]_{i,j} = M\left[\alpha^i\beta^j\right]_{i,j} \]

where $[\alpha^i\beta^j]_{i,j}$ is a column vector containing the $\alpha^i\beta^j$.

From this we see that $\alpha+\beta$ is an eigenvalue of $M$ and therefore is a root of the characteristic polynomial $p_M$ of $M$. If $p_M$ is irreducible then it will be the minimal polynomial of $\alpha+\beta$, but in general the minimal polynomial will divide $p_M$, and so it will be necessary to factor $p_M$.
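
Here is a sketch of the method for $\alpha=\sqrt2$ and $\beta=\sqrt3$ (so $n=m=2$, basis $1,\alpha,\beta,\alpha\beta$); the matrix entries and expected polynomial were worked out by hand for this illustration, and `numpy.poly` recovers the characteristic polynomial, which here is already the minimal polynomial $x^4-10x^2+1$ of $\sqrt2+\sqrt3$:

```python
import numpy as np

# alpha = sqrt(2) (root of x^2 - 2), beta = sqrt(3) (root of x^2 - 3).
# In the basis (1, alpha, beta, alpha*beta), multiplication by alpha + beta
# acts as the matrix M below; e.g. (alpha + beta) * alpha = 2 + alpha*beta.
M = np.array([[0, 1, 1, 0],
              [2, 0, 0, 1],
              [3, 0, 0, 1],
              [0, 3, 2, 0]])

coeffs = np.poly(M)  # monic characteristic polynomial of M
assert np.allclose(coeffs, [1, 0, -10, 0, 1])   # x^4 - 10x^2 + 1

# alpha + beta is an eigenvalue of M, hence a root of this polynomial
root = np.sqrt(2) + np.sqrt(3)
assert abs(np.polyval(coeffs, root)) < 1e-9
```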

However, once $p_M$ has been factored, how does one tell which factor is the required minimal polynomial? The obvious answer is to evaluate each factor at $\alpha+\beta$ and see which one gives zero. “You could do that numerically”, my friend says, and I respond with “…or symbolically”. But then he asks if I will always be able to determine if the symbolic expression is zero or not. Well, I hadn’t thought of that, and I admitted I wasn’t sure, but claimed “anything you can do numerically I can do symbolically!”

I spent several hours yesterday trying to solve that problem, but eventually had to go to bed. This morning, after looking in Cohen I found the following passage:

…it does not make sense to ask which of the irreducible factors $\alpha+\beta$ is a root of, if we do not specify embeddings in $\mathbb{C}$…

Wait, what? I got excited as I realized I had just uncovered a misconception of mine!

Note that if the conjugates of $\alpha$ are $\alpha_i$ then they are all “symbolically identical”: the fields $\mathbb{Q}(\alpha_i)$ are isomorphic to one another. From that, I had assumed that the values of $\alpha_i+\beta_j$ would also be symbolically identical for all conjugates of $\alpha$ and $\beta$. Not true! As a trivial counterexample, if $\alpha$ is a root of $x^3-2$ and $\beta$ is a root of $x^3+2$ then two possible values for $\alpha+\beta$ are $0$ and $\sqrt[3]{2}\sqrt{3}i$, and these have very different algebraic behaviour.
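
The counterexample is easy to confirm numerically (a small sketch using `numpy.roots`, which returns all complex roots of a polynomial):

```python
import numpy as np

alphas = np.roots([1, 0, 0, -2])   # the three roots of x^3 - 2
betas = np.roots([1, 0, 0, 2])     # the three roots of x^3 + 2
sums = [a + b for a in alphas for b in betas]

# Some pairwise sums vanish (when beta = -alpha), others do not:
assert any(abs(s) < 1e-9 for s in sums)
assert any(abs(s - 2**(1/3) * 3**0.5 * 1j) < 1e-9 for s in sums)
```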

So the whole problem of computing the minimal polynomial of $\alpha+\beta$ was not well defined unless you specify which roots $\alpha$, $\beta$ you are talking about—for example, by specifying them numerically.

Leaving my head

The TED website hosts a large collection of inspiring talks on many different topics. The talk on how schools kill creativity is particularly fantastic: both funny and insightful, it is in fact the most-watched TED talk to date. During it Ken Robinson makes the following remark:

There’s something curious about professors in my experience — not all of them, but typically — they live in their heads.

Not to put too fine a point on it, this describes me perfectly. (Although I am a graduate student rather than a professor.)

This behaviour has both its strengths and weaknesses. For example, I have almost never felt lonely in my life, despite being alone a good portion of the time. Perhaps this is because by living in my head I am able to keep myself company. Indeed, my modus operandi when thinking about something could be described as a “conversation” with myself in which I try to make arguments for both sides of an issue.

However, like all human emotions, loneliness undoubtedly exists for a reason. In this case, it seems to be a trigger to make friends, which in turn improves one’s well-being. Without loneliness one has less short-term pain, but also less motivation to seek out ultimately beneficial connections.

In my case, I never gave much thought to the downsides, or even viewed them with a kind of perverse pride; the “cost of doing business” as it were. In short, I didn’t make much of an effort to ameliorate the absent-mindedness; in fact, I thought such traits were essentially unchangeable.

Early this year I decided to make a conscious effort to get out of my head. There were multiple reasons, but one is that almost every job I do involves interacting with other people in some way, so I figure that becoming more empathetic will actually improve the quality of my work.

How does one go about venturing out of their head? Well, I am still learning that. One step I’m taking is to learn—and remember!—the names of people I recognize from somewhere. I haven’t had a 100% success rate, but I’m already much better than I was previously, when I would be likely to forget someone’s name before my conversation with them had finished. Additionally, I will not ignore the fact that yes, I have a body, and to this end will pay more attention to things like exercise, diet, and appearance.

Pokémon Yellow is Turing complete

If you’re anything like me, this will simultaneously shock you, warm your heart, and leave you laughing at its convoluted brilliance.

The video is about 13 minutes long, but the payload comes in the last 30 seconds, where balloons are displayed on screen while music plays in the background. What’s so special about that? The image and music featured do not exist anywhere in the game—they were manually programmed to appear in it by taking advantage of game bugs shown in the first 12 minutes of the video. Assuming you know how to program in the Game Boy’s machine language, you can turn Pokémon into any program you want.

The video itself is somewhat tedious to watch, because setting up the “bootstrap” program which allows one to write arbitrary programs was accomplished by acquiring a specific sequence of items in exactly the right quantities. For example, about two full minutes are spent doing nothing but buying lemonade (which can only be purchased one at a time). In addition, for much of the video it is difficult to determine just what’s going on; one gets the impression that the game itself is similarly confused!

For more detail, see this post by the author. My hat’s off to you, Robert McIntyre.