I note an interesting short piece by James Hrynyshyn about a local controversy in Nova Scotia over the installation of a tower to provide wireless internet to the area. Leading the opposition is a guy worried about the health effects:

"I think over a period of time it will change the DNA of the garlic because it shakes up the molecules," he said Tuesday. EastLink uses microwave transmission to provide high-speed internet access to rural areas outside its wired network. Levine said he moved to the country to get away from pollution, and he sees the radiation from the towers as another form of pollution. "I view it with dread, fear and panic," he said. "I don't want to grow food under those conditions."

It's tempting to roll your eyes and dismiss it out of hand, and if in fact you did you wouldn't be wrong. But why? It would certainly be better if more people knew a little more about molecules and light, and maybe then the next generation of farmers could work out for themselves what the effect of microwave radiation on crops is likely to be.

To do some estimation, we need to know how light carries its energy. It's relatively accurate to think of light as a stream of particles called photons, each with a specific amount of energy proportional to its frequency. Light also has wave properties such as diffraction and interference that can't be described in terms of "billiard ball" photons, but for a relatively simple task like this one, where we're not worried about wave behavior, we can just keep in the back of our minds that photons do have a wave nature and aren't particles in the classical sense. With that as a caveat, it's completely true that light comes in photons and each photon carries a definite energy. What is that energy per photon? It's this:

E (in eV) = 1240 / λ (in nm)

This will give you the energy of a photon in electron volts given the wavelength in nanometers. Visible light ranges from roughly 750 nanometers in the red to 380 in the blue. In terms of energy, our formula tells us that these photons range from about 1.6 to 3.3 electron volts. This is roughly on the order of the energies involved in electron energy levels, which is the basis of chemistry. Chemical reactions can make visible light (glow sticks, fireflies, etc.) and light can drive chemical reactions (camera film, dyes fading, etc.). Ultraviolet light from the sun has a shorter wavelength - say, 300 nanometers or so for the rays you should protect yourself from at the beach. Those photons carry around 4.1 eV, enough to excite more energetic electron energy levels and even dissociate molecules. In a DNA molecule, the damage and breaks from absorbing those photons can eventually lead to skin cancer and other health problems.
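If you want to check these numbers yourself, here's a minimal sketch of the arithmetic, using the standard shortcut hc ≈ 1240 eV·nm:

```python
def photon_energy_ev(wavelength_nm):
    """Photon energy in electron volts, given wavelength in nanometers.

    Uses the handy shortcut hc ~ 1240 eV*nm.
    """
    return 1240.0 / wavelength_nm

# Red and blue ends of the visible spectrum:
print(photon_energy_ev(750))  # ~1.65 eV
print(photon_energy_ev(380))  # ~3.26 eV

# Ultraviolet at the beach:
print(photon_energy_ev(300))  # ~4.1 eV
```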

So what kind of energies are we looking at in WiFi signals? WiFi operates around 2.4 GHz, the same frequency range as a microwave oven. The wavelength of that light is about 12.5 centimeters, which is about 125 million nanometers. Each photon therefore carries almost exactly 1/100,000 of an electron volt worth of energy. This is nowhere near the ~1 eV and up energies involved in electron energy levels, and so this kind of radiation can't damage DNA. The energy just isn't there. Furthermore, because of the quantum nature of electron energy levels, you can't just stack up 100,000 microwave photons to cause a 1 eV transition. You have to actually have a 1 eV photon. (Technically there is such a thing as a multiphoton transition, but it's a strongly nonlinear process with a probability that's already very low for two-photon transitions and gets exponentially worse as the number of photons increases. 100,000 is out of the question.)
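The same estimate works directly from the frequency, using E = hf with Planck's constant in eV-friendly units:

```python
PLANCK_EV_S = 4.1357e-15  # Planck's constant in eV*s

def photon_energy_from_freq_ev(freq_hz):
    """Photon energy in electron volts, given frequency in hertz (E = h*f)."""
    return PLANCK_EV_S * freq_hz

wifi_photon = photon_energy_from_freq_ev(2.4e9)
print(wifi_photon)        # ~1e-5 eV, i.e. about 1/100,000 of an electron volt
print(1.0 / wifi_photon)  # ~100,000 WiFi photons would be needed per 1 eV
```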

But if microwave photons can't damage DNA, how can they cook food and boil water in your microwave oven? The answer is simple: the energy they have is more than enough to excite the much less energetic rotational and vibrational states of the molecules. This motion produces heat. Heat is just heat though; there's nothing more special about heat produced by a microwave than about heat from an electric heating element, a heat lamp, or a seat warmer. And the total power output of a WiFi transmitter is many orders of magnitude less than a microwave oven's - 1 watt tends to be an upper limit for home and business transmitters, and any person standing around would absorb only a tiny fraction of that tiny fraction. The temperature increase from absorbing WiFi signals isn't measurable, and mathematically speaking it is itself dwarfed by other radio/microwave sources such as cell phones and (depending on your location) broadcast radio and TV.

WiFi just isn't going to hurt you, your DNA, your crops or their DNA, or anything else other than the attention spans of college students when WiFi lets them spend class on Facebook.