I don't have anything all that new to say about last night's Cosmos reboot, and I'm leaving for scenic Madison, WI today to attend DAMOP, so I don't have a great deal of time. Kate did mention something over dinner last night, though, that's a good topic for a quick blog post.

Kate's a big listener of audiobooks and podcasts, including The Naked Scientists podcast, and she mentioned something they said in responding to a question about charging phones and the cost of electricity:

I think my favourite one is a microwave oven. So, the clock on a microwave oven uses more electricity over the course of its lifetime than cooking food does. So, if you want your microwave oven to cost the least amount of money, don’t use it as a clock. Just turn it on when you want to use it.

That struck me as kind of weird, and it's the sort of thing that is amenable to a Fermi problem sort of plausibility argument. So, to Kate's chagrin, I grabbed a napkin and one of SteelyKid's markers, and did some math.

The tricky thing about discussions of electrical usage is always the units-- people get twisted around about what things mean. Most electrical things have ratings in watts, and so people will casually talk about electrical power as if that's the thing consumed. Power isn't a measure of stuff, though; power is a rate-- it's a measure of how much energy gets converted from electrical potential to some other form (light, heat, sound, microwaves) in a given amount of time. When you're talking about electricity used over the lifetime of something, what you need is not the power, but the total energy.

(This is why your electric bill is given in units of "kilowatt-hours." It's mixing units a little-- a kilowatt is 1000 joules per second, and an hour is 3600 seconds-- but a kilowatt-hour has dimensions of energy, and so is the proper measure of consumption.)
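If you want to see the unit-mixing made explicit, here's a two-line sketch of the conversion (nothing fancy, just the definitions above):

```python
# A kilowatt-hour is a power multiplied by a time, so it has units of energy.
kilowatt = 1000.0   # J/s
hour = 3600.0       # s
kwh_in_joules = kilowatt * hour
print(kwh_in_joules)  # 3,600,000 J, i.e. 3.6 megajoules per kilowatt-hour
```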

So, if you want to figure out the energy usage of your microwave compared to the clock in your microwave, what do you do? Well, the microwave itself has a rating in watts-- typically around 1000W for a household device, which is a nice round number to use in a Fermi-problem estimate. For estimation purposes, it's probably reasonable to assume that the microwave draws this full amount of power when it's running on high. But that's a very short time, compared to the clock, which is on 24/7, albeit at a much lower power level.

So, how do you compare the two? Well, one reasonable thing to do is to calculate an average power for the microwave when it's used for cooking. That is, we can calculate the total energy used over some time period, which is really concentrated in a few short bursts, but then divide it by the full duration of that period. Essentially, we pretend that rather than dissipating energy at 1000 J/s for a short time, it's dissipating energy at a much lower rate over a much longer time. Then we can see how that compares to the power usage of the clock.

So, what's the energy used by a microwave? Well, in Chateau Steelypips, our microwave sees a good deal of use in heating up leftovers and frozen vegetables and the like. I don't think it would be unreasonable to say that it's used for cooking about 10 minutes a day, which is a nice round number for Fermi problem purposes.

So, the average daily energy consumed by the Chateau Steelypips microwave running at 1000W for 10min a day is:

$latex E_{cooking} = (1000 J/s)(600 s) = 600,000 J $

There are 86,400 seconds in a day, so this works out to:

$latex P_{avg,cooking} = \frac{600,000J}{86,400s} \approx 7 J/s $

So, the ten minutes a day we spend cooking with our microwave is equivalent to a smaller appliance dissipating energy at the rate of seven watts. To determine whether the claim that the clock uses more energy than the microwave holds up, we need to compare this number to the average power drawn by the clock in 24/7 operation.
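For anyone who'd rather let a computer hold the napkin, the whole estimate fits in a few lines. The clock power here is an assumed placeholder-- the real figure is exactly the unknown in question:

```python
# Napkin check: average power of a 1000 W microwave used 10 min/day,
# compared against a guessed clock draw.
microwave_power = 1000.0     # J/s while cooking (assuming the full rating)
cooking_seconds = 10 * 60    # ten minutes a day
seconds_per_day = 86_400

daily_cooking_energy = microwave_power * cooking_seconds      # 600,000 J
avg_cooking_power = daily_cooking_energy / seconds_per_day    # about 7 W

assumed_clock_power = 2.0  # J/s -- a guess, not a measured value
print(f"average cooking power: {avg_cooking_power:.1f} W")
print(f"clock beats cooking? {assumed_clock_power > avg_cooking_power}")
```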

So, what's the average power of a microwave clock? I have no idea. Seven watts seems awfully high, though. My smart phone (a Moto X) is probably on the short side of a watt when I'm talking on it (using a similar calculation to this), and that's working a lot harder than just keeping an LED clock display going. The label on my bedside alarm clock says 4W, but like the microwave, I expect that's probably only when the screechy alarm is blaring, not the constant draw. If it were dissipating 4W worth of power all the time, I'd expect it to be warm to the touch, and it's not.

So, I'm skeptical that this claim is really true. But this problem also demonstrates the limits of the Fermi problem approach. All I'm really doing with markers on a napkin is setting a reasonable range for the problem. To calculate an actual answer would require a good deal of information that goes beyond the simplifying assumptions I'm making here-- that 1000W might be a maximum rating, not the actual power used; maybe it's really running at 800W. And maybe it's not ten minutes a day of cooking, but six. At which point, the average cooking power has come down by a factor of two, into the same ballpark as my alarm clock. And then maybe we're not typical of microwave owners-- we do have two small kids, which may well mean more microwaving of stuff than the average Briton considered by the Naked Scientists.
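Those what-ifs are easy to sweep through explicitly. Varying the two uncertain inputs-- actual power draw and daily cooking time, using the ranges mentioned above-- shows how wide the Fermi estimate really is:

```python
# Sweep the uncertain inputs to see the range of average cooking powers.
seconds_per_day = 86_400
results = {}
for power_watts in (800, 1000):        # maybe the 1000 W rating overstates the draw
    for minutes_per_day in (6, 10):    # maybe we cook less than I think
        avg = power_watts * minutes_per_day * 60 / seconds_per_day
        results[(power_watts, minutes_per_day)] = avg
        print(f"{power_watts} W, {minutes_per_day} min/day -> {avg:.1f} W average")
```

The low end of that range (800 W for six minutes a day) lands near 3 W-- right in the neighborhood of an alarm-clock label-- while the high end stays near 7 W.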

I still doubt this really holds up, but given the Fermi-problem estimate, all I can really say is that it continues to seem implausible, but it's not totally ridiculous. I doubt it, but it's a close enough thing that it would probably be reasonable to slap a meter on there and check.

Which I'm not going to do, because I'm getting on a plane to Wisconsin in a few hours, but it's the next logical step in the scientific approach.