Sometimes patterns can lead you astray. For example, it’s known that the logarithmic integral

$$\mathrm{li}(x) = \int_0^x \frac{dt}{\ln t}$$

is a good approximation to $\pi(x)$, the number of primes less than or equal to $x$. Numerical evidence suggests that $\mathrm{li}(x)$ is always greater than $\pi(x)$. For example,
$$\mathrm{li}(10^6) \approx 78627.5 \; > \; 78498 = \pi(10^6)$$

and

$$\mathrm{li}(10^9) \approx 50849234.9 \; > \; 50847534 = \pi(10^9).$$
But in 1914, Littlewood heroically showed that in fact, $\mathrm{li}(x) - \pi(x)$ changes sign infinitely many times!

This raised the question: when does $\pi(x)$ first exceed $\mathrm{li}(x)$? In 1933, Littlewood’s student Skewes showed, assuming the Riemann hypothesis, that it must do so for some $x$ less than or equal to

$$10^{10^{10^{34}}}.$$
Later, in 1955, Skewes showed without the Riemann hypothesis that $\pi(x)$ must exceed $\mathrm{li}(x)$ for some $x$ smaller than

$$10^{10^{10^{964}}}.$$
By now this bound has been improved enormously. We now know the two functions cross somewhere near $1.397 \times 10^{316}$, but we don’t know if this is the first crossing!
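The small cases are easy to check by computer. Here is a quick sanity check of my own (not from any of the sources above): a sieve of Eratosthenes for $\pi(x)$, and Simpson’s rule for $\mathrm{li}(x)$. The helper names `prime_pi` and `li` are just my choices.

```python
import math

def prime_pi(n):
    """Count primes <= n with a sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"                     # 0 and 1 are not prime
    for p in range(2, math.isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray((n - p * p) // p + 1)
    return sum(sieve)

def li(x, steps=20000):
    """li(x) = integral of dt/ln t from 0 to x.  Substituting t = e^u turns
    the piece from 2 to x into the integral of e^u/u, which composite
    Simpson's rule handles well; the piece from 0 to 2 is the known
    constant li(2) = 1.04516378...  (a principal value at t = 1)."""
    a, b = math.log(2.0), math.log(x)
    h = (b - a) / steps
    f = lambda u: math.exp(u) / u
    s = f(a) + f(b)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return 1.0451637801174927 + s * h / 3

for x in (10**6, 10**7):
    print(f"x = {x}:  pi(x) = {prime_pi(x)},  li(x) ~ {li(x):.1f}")
```

At $x = 10^6$ this reports $\pi(x) = 78498$ against $\mathrm{li}(x) \approx 78627.5$, and the gap only widens at $10^7$, which is exactly the kind of numerical evidence that made the pattern look so convincing.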

All this math is quite deep. Here is something less deep, but still fun.

You can show that

and so on.

It’s a nice pattern. But this pattern doesn’t go on forever! It lasts a very, very long time… but not forever.

More precisely, the identity

holds when

but not for all. At some point it stops working and never works again. In fact, it definitely fails for all

The explanation

The integrals here are a variant of the Borwein integrals:

$$\int_0^\infty \frac{\sin t}{t} \, dt = \frac{\pi}{2}$$

$$\int_0^\infty \frac{\sin t}{t} \frac{\sin(t/3)}{t/3} \, dt = \frac{\pi}{2}$$

where the pattern continues until

$$\int_0^\infty \frac{\sin t}{t} \frac{\sin(t/3)}{t/3} \cdots \frac{\sin(t/13)}{t/13} \, dt = \frac{\pi}{2}$$

but then fails:

$$\int_0^\infty \frac{\sin t}{t} \frac{\sin(t/3)}{t/3} \cdots \frac{\sin(t/15)}{t/15} \, dt < \frac{\pi}{2}.$$
I never understood this until I read Greg Egan’s explanation, based on the work of Hanspeter Schmid. It’s all about convolution, and Fourier transforms:

Suppose we have a rectangular pulse, centred on the origin, with a height of 1/2 and a half-width of 1. Now, suppose we keep taking moving averages of this function, again and again, with the average computed in a window of half-width 1/3, then 1/5, then 1/7, 1/9, and so on.

There are a couple of features of the original pulse that will persist completely unchanged for the first few stages of this process, but then they will be abruptly lost at some point.

The first feature is that F(0) = 1/2. In the original pulse, the point (0,1/2) lies on a plateau, a perfectly constant segment with a half-width of 1. The process of repeatedly taking the moving average will nibble away at this plateau, shrinking its half-width by the half-width of the averaging window. So, once the sum of the windows’ half-widths exceeds 1, at 1/3+1/5+1/7+…+1/15, F(0) will suddenly fall below 1/2, but up until that step it will remain untouched. In the animation below, the plateau where F(x)=1/2 is marked in red.

The second feature is that F(–1)=F(1)=1/4. In the original pulse, we have a step at –1 and 1, but if we define F here as the average of the left-hand and right-hand limits we get 1/4, and once we apply the first moving average we simply have 1/4 as the function’s value.

In this case, F(–1)=F(1)=1/4 will continue to hold so long as the points (–1,1/4) and (1,1/4) are surrounded by regions where the function has a suitable symmetry: it is equal to an odd function, offset and translated from the origin to these centres. So long as that’s true for a region wider than the averaging window being applied, the average at the centre will be unchanged. The initial half-width of each of these symmetrical slopes is 2 (stretching from the opposite end of the plateau and an equal distance away along the x-axis), and as with the plateau, this is nibbled away each time we take another moving average. And in this case, the feature persists until 1/3+1/5+1/7+…+1/113, which is when the sum first exceeds 2. In the animation, the yellow arrows mark the extent of the symmetrical slopes.

OK, none of this is difficult to understand, but why should we care? Because this is how Hanspeter Schmid explained the infamous Borwein integrals:

∫ sin(t)/t dt = π/2

∫sin(t/3)/(t/3) × sin(t)/t dt = π/2

∫ sin(t/5)/(t/5) × sin(t/3)/(t/3) × sin(t)/t dt = π/2

…

∫ sin(t/13)/(t/13) × … × sin(t/3)/(t/3) × sin(t)/t dt = π/2

But then the pattern is broken:

∫ sin(t/15)/(t/15) × … × sin(t/3)/(t/3) × sin(t)/t dt < π/2

Here these integrals are from t=0 to t=∞. And Schmid came up with an even more persistent pattern of his own:

∫ 2 cos(t) sin(t)/t dt = π/2

∫2 cos(t) sin(t/3)/(t/3) × sin(t)/t dt = π/2

∫2 cos(t) sin(t/5)/(t/5) × sin(t/3)/(t/3) × sin(t)/t dt = π/2

…

∫ 2 cos(t) sin(t/111)/(t/111) × … × sin(t/3)/(t/3) × sin(t)/t dt = π/2

But:

∫ 2 cos(t) sin(t/113)/(t/113) × … × sin(t/3)/(t/3) × sin(t)/t dt < π/2

The first set of integrals, due to Borwein, corresponds to taking the Fourier transforms of our sequence of ever-smoother pulses and then evaluating F(0). The Fourier transform of the sinc function

sinc(w t) = sin(w t)/(w t)

is proportional to a rectangular pulse of half-width w, and the Fourier transform of a product of sinc functions is the convolution of their transforms, which in the case of a rectangular pulse just amounts to taking a moving average.

Schmid’s integrals come from adding a clever twist: the extra factor of 2 cos(t) shifts the integral from the zero-frequency Fourier component to the sum of its components at angular frequencies –1 and 1, and hence the result depends on F(–1)+F(1)=1/2, which as we have seen persists for much longer than F(0)=1/2.

• Hanspeter Schmid, Two curious integrals and a graphic proof, Elem. Math. 69 (2014), 11–17.
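Egan’s moving-average picture can be checked by direct computation. Here is a small sketch of my own (not Egan’s code): I sample the pulse on a grid of spacing 1/45045, chosen so that each window half-width 1/3, …, 1/15 is a whole number of grid steps, and store integer numerators over a common denominator so that every moving average is computed exactly, with no floating-point noise. The grid size and variable names are my own choices.

```python
D = 45045            # grid steps per unit length; 1/D divides 1/3, ..., 1/15
HALF = 100000        # half-span of the grid in steps; the pulse never spreads
                     # past D + (1/3 + ... + 1/15)*D = 91072 steps from 0
n = 2 * HALF + 1
center = HALF        # index of x = 0
edge = center + D    # index of x = 1

den = 4              # common denominator for all sample values
num = [0] * n
for i in range(n):
    d = abs(i - center)
    if d < D:
        num[i] = 2   # f(x) = 1/2 inside the pulse
    elif d == D:
        num[i] = 1   # f(+-1) = 1/4: the mean of the one-sided limits

plateau_ok, symmetry_ok = [], []
for w in (3, 5, 7, 9, 11, 13, 15):
    m = D // w       # window half-width 1/w, measured in grid steps
    P = [0]
    for v in num:
        P.append(P[-1] + v)              # exact integer prefix sums
    # moving average over 2m+1 samples = difference of prefix sums;
    # the division is deferred by multiplying the denominator instead
    num = [P[min(i + m + 1, n)] - P[max(i - m, 0)] for i in range(n)]
    den *= 2 * m + 1
    plateau_ok.append(2 * num[center] == den)    # is F(0) still exactly 1/2?
    symmetry_ok.append(4 * num[edge] == den)     # is F(1) still exactly 1/4?
    print(f"window 1/{w:>2}:  F(0) = 1/2? {plateau_ok[-1]}"
          f"   F(1) = 1/4? {symmetry_ok[-1]}")
```

Running this shows F(0) = 1/2 surviving the windows 1/3 through 1/13 and failing at 1/15, exactly when the half-widths sum past 1, while F(1) = 1/4 survives every step shown. The drop in F(0) at the last step is minuscule (`num[center] / den - 0.5` is on the order of 10⁻¹¹), which is why the broken Borwein integral still agrees with π/2 to ten decimal places.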

I asked Greg if we could generalize these results to give even longer sequences of identities that eventually fail, and he showed me how: you can just take the Borwein integrals and replace the numbers 1/3, 1/5, 1/7, … by some sequence of positive numbers $a_1, a_2, a_3, \dots$

The integral

$$\int_0^\infty \frac{\sin t}{t} \, \frac{\sin(a_1 t)}{a_1 t} \cdots \frac{\sin(a_N t)}{a_N t} \, dt$$

will then equal $\pi/2$ as long as $a_1 + \cdots + a_N \le 1$, but not when this sum exceeds 1. You can see a full explanation on Wikipedia:

• Wikipedia, Borwein integral: general formula.
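The general formula is easy to probe numerically. Below is a rough check of my own, using two illustrative sequences I picked arbitrarily; `borwein` and `sinc` are hypothetical names, and the integral is simply truncated at t = 200, where the tail is negligible at this accuracy.

```python
import math

def sinc(t):
    """sin(t)/t, extended continuously by sinc(0) = 1."""
    return math.sin(t) / t if t else 1.0

def borwein(a, T=200.0, steps=200000):
    """Composite-Simpson estimate of the integral of prod_k sinc(a_k * t)
    for t from 0 to T.  The tail beyond T is O(1/T^2), negligible here."""
    h = T / steps
    f = lambda t: math.prod(sinc(ak * t) for ak in a)
    s = f(0.0) + f(T)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3

print(borwein([1, 0.5, 0.4]), math.pi / 2)  # 0.5 + 0.4 <= 1: agrees with pi/2
print(borwein([1, 0.9, 0.8]))               # 0.9 + 0.8 > 1: falls short
```

The first sequence satisfies $a_1 + a_2 = 0.9 \le 1$, so the integral agrees with $\pi/2$ to the accuracy of the quadrature; the second overshoots the threshold by a wide margin, so the deficit is macroscopic rather than Borwein-tiny, which makes it easy to see in a plain numerical experiment.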

As an example, I chose the integral

which equals $\pi/2$ if and only if

Thus, the identity holds if

However,

so the identity holds if

or

or

On the other hand, the identity fails if

so it fails if

However,

so the identity fails if

or

or

With a little work one could sharpen these estimates considerably, though it would take more work to find the exact value of at which

first fails.
