The Large Hadron Collider is still going through a painful commissioning process—coming online in time for the winter shutdown is probably not what researchers had in mind when they broke it the first time. So, what is a physicist to do when the shiny toys are still being polished? Sit around at the pub and gossip about old experiments, of course.

One such session ended with Jorg Jaeckel, from Durham University, taking a new look at 40-year-old data from a classic electrostatics experiment. He found that this data provides the strongest constraints yet on a particular set of hypothetical particles, proving that some experiments age very gracefully indeed.

What was he looking for? Well, the standard model has a bunch of particles that are ordered into families with like properties. New physics, required to explain cosmological observations and gravity, can't be had by simply bouncing these known particles off each other. Instead, new particles are thought to exist, some of which would interact only very weakly with ordinary matter.

Now, we can only detect these particles through their effect on known particles. So the game goes something like this: use some mathematical framework to posit the existence of a bunch of particles with certain properties. Combine these guessed-at properties with the properties of known particles, and figure out how one influences the other. In looking for, and failing to find, these effects, we can eliminate hypothetical particles and constrain the properties of the remaining contenders. It's the classic scientific process of finding something that isn't obviously wrong by eliminating the things that are.

Jaeckel was playing the eliminate-and-constrain game with particles whose influence on known particles would show up as a polarized vacuum. Confirming that the vacuum is polarized would be evidence for extremely low-energy particles. How can we see this? One way is to look at how the orientation of the electromagnetic field of light changes as it propagates through a vacuum. Another clue would be a violation of Coulomb's inverse-square law.

Coulomb's inverse-square law states that the force of attraction or repulsion between two charges falls off as the square of the distance between them; equivalently, the electric potential around a point charge falls off as the inverse of the distance. This law is pretty basic physics, and it has been tested and re-tested many times, with the last serious test performed 40 years ago using a method designed by Henry Cavendish in the 18th century.
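
For the numerically inclined, here is a minimal sketch of what the law says; the constant is the standard Coulomb constant, but the charges and distances are made up for illustration:

```python
# A minimal numerical sketch of Coulomb's law (charges and distances below
# are illustrative, not from the experiment). The force between two point
# charges falls off as the inverse square of their separation, while the
# potential of a single point charge falls off as the inverse distance.

K = 8.9875517923e9  # Coulomb constant, 1/(4*pi*eps0), in N*m^2/C^2

def coulomb_force(q1, q2, r):
    """Force magnitude (N) between charges q1, q2 (C) separated by r (m)."""
    return K * q1 * q2 / r**2

def point_potential(q, r):
    """Potential (V) of a point charge q (C) at distance r (m)."""
    return K * q / r

q = 1e-6  # two 1-microcoulomb charges
for r in (0.1, 0.2, 0.4):
    print(f"r = {r:.1f} m:  F = {coulomb_force(q, q, r):.4f} N,  "
          f"V = {point_potential(q, r):.1f} V")
# Each doubling of r cuts the force by a factor of 4, the potential by 2.
```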

In fact, because electromagnetism is such a strong force, apparent violations of the inverse-square law do turn up, but they happen in specific circumstances that can be controlled. Once those controls are in place, the experiment just needs to be as sensitive as possible. The usual way to maximize sensitivity is to set one metallic sphere inside another. If you charge the outer sphere to some voltage, the potential outside the sphere falls off as the inverse of the distance, but inside, the potential is constant and the field vanishes, provided the inverse-square law holds exactly. Measuring deviations from zero field in a region shielded from the outside world is much simpler than dealing with a changing field that is also exposed to outside influences.
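
To see why a null result inside the sphere is such a sensitive probe, consider a rough sketch in code. Suppose a light new particle effectively changed the 1/r potential into a Yukawa form, exp(-mu*r)/r, a common stand-in for this kind of new physics (the paper's actual analysis is more involved, and every number below is made up). The potential inside a charged shell then stops being constant:

```python
import math

# Interior potential of a charged spherical shell when the point-charge
# potential is modified from 1/r to a Yukawa form exp(-mu*r)/r.
# Integrating over the shell (radius R, charge Q) gives, for r <= R,
# in units of Q/(4*pi*eps0):
#     V(r) = exp(-mu*R) * sinh(mu*r) / (mu * r * R)
# As mu -> 0 this reduces to the constant 1/R: no field inside, and
# nothing for a Cavendish-type experiment to measure.

def shell_potential(r, R, mu):
    """Potential at radius 0 < r <= R inside a charged shell of radius R,
    for a Yukawa-modified Coulomb law with inverse range mu."""
    if mu == 0.0:
        return 1.0 / R
    return math.exp(-mu * R) * math.sinh(mu * r) / (mu * r * R)

R_outer = 0.5   # outer shell radius (m); illustrative, not the experiment's
r_inner = 0.4   # radius of the inner sphere (m)
mu = 1e-3       # hypothetical inverse range of the modification (1/m)

v_shell = shell_potential(R_outer, R_outer, mu)  # continuous at r = R
v_inner = shell_potential(r_inner, R_outer, mu)
print(f"fractional potential difference: {(v_shell - v_inner) / v_shell:.3e}")
# A pure 1/r law gives exactly zero here; any measured difference puts a
# limit on mu, and through it on the hypothetical particles.
```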

This elegant experimental design is what makes the data so extremely precise. It is 10 to 20 times more sensitive to low-energy particles than "previous" experiments (all of which were actually performed after the last serious Cavendish-type experiment), and it allows the data to eliminate a large range of energy/polarization combinations. The data provides significant new constraints on what particles may or may not be out there, and it came at the price of a few pieces of paper.

Cavendish-type experiments are not going to eliminate the need for the LHC. However, a new, more sensitive version of the Cavendish-type experiment might provide evidence for particles that the LHC is not really designed to seek. What's more, it's a dream come true for a particle physicist on a budget.

Physical Review Letters, 2009, DOI: 10.1103/PhysRevLett.103.080402