Transmission Line Losses

By: Aaron Datesman

I enjoyed the comments attached to this recent post about wind-to-hydrogen renewable energy. Some of the discussion involved losses on power transmission lines. I would like to elevate my response to a blog post since, although this is a somewhat dry topic, understanding it is actually rather important.

Partly due to history, and partly due to physics, the US electrical transmission grid is actually rather localized. The electrical energy we use is generated pretty near where we use it (also, pretty much just before we use it). This is a big problem if you propose to harvest energy in one place where it is abundant (for instance, via concentrated solar in Arizona, or via wind turbines in North Dakota) for use someplace else far away.

The historical part of this surprising fact, I think, has to do with how the electric power industry evolved in the early 1900s - generally as municipal utilities, many of them publicly owned. The physics contribution comes mainly from transmission losses.

Now it’s not hard to dredge up the figure that transmission losses account for about 10% of the electrical energy generated in the US - that is, for every 10W of power generated, we lose 1W getting that power to where it’s put to use. Because we use a lot of power, this represents a huge waste, but as an efficiency figure it’s pretty good - that’s 90%.

The more I think about this number, however, the more I hate it. The figure is remarkably deceptive, because it isn’t really a measure of efficiency; it is simply the total system losses divided by the total system generation. This provides no guidance about what the losses would be like if the distribution system were different - for instance, if it incorporated longer transmission lines. A useful figure would normalize to line length and utilized line capacity.

That’s complex-sounding but means something simple. From my desk at work, I can walk to a 150MW natural gas-fired power plant. It provides power to the Laboratory facilities, all of which lie within a radius of about one mile. Because the transmission distance is quite short, the transmission losses in this case are very small (<1%). Let's say that they are 0.5%, or 0.75MW.

Someplace in Nebraska or some other sparsely-populated state, however, there is certainly a small town or some energy-consuming facility which lies a great distance (say ~50 miles) from a power-generating facility. Maybe it uses 100kW of power. Due to the distance, the transmission losses in this case are quite large (I will estimate 35%, which is representative of the calculation coming up).

In aggregate, the example system comprising Argonne National Laboratory plus a factory in Nebraska refining corn syrup suffers transmission losses of 785kW, or 0.52%. America is like this. The transmission efficiency looks high (losses of only 10%) because most of the power we use (normalized to capacity) travels only short distances (normalized to length). This does not mean that line losses per distance are low. Mostly it means that the lines are short - not at all adequate to carry power from North Dakota to someplace useful.
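The aggregate arithmetic can be sketched in a few lines of Python. The plant sizes and loss fractions are the illustrative assumptions from the two examples above, not measured data.

```python
# Example 1: Argonne's on-site plant - short lines, low loss
argonne_power_w = 150e6                    # 150 MW generated
argonne_loss_w = argonne_power_w * 0.005   # 0.5% loss -> 0.75 MW

# Example 2: a remote facility ~50 miles from its generator
remote_power_w = 100e3                     # 100 kW of demand
remote_loss_w = remote_power_w * 0.35      # 35% loss -> 35 kW

# Aggregate: total losses divided by total generation
total_loss_w = argonne_loss_w + remote_loss_w     # 785 kW
total_power_w = argonne_power_w + remote_power_w  # 150.1 MW
loss_fraction = total_loss_w / total_power_w

print(f"Total loss: {total_loss_w/1e3:.0f} kW ({loss_fraction:.2%})")
```

The aggregate loss fraction comes out near 0.5% even though one of the two loads loses 35% of its power in transmission - the short, heavily loaded line dominates the average.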

So, this is a nice, simple explanation, but is it true? To test it, this site has a useful model of an electrical distribution network. There is a little bit of math ahead, which I only include because I think some readers will like it.

Following the diagram and the text below it, Ohio Edison operates 5757MW of electrical power generation (let’s call it 6000MW). The source voltage is 18kV; this gives 300,000A of current in the generating facilities (rounding for convenience).

The step-up transformer's ratio, from the 18kV generator side to the 350kV transmission line, is 350/18, which we'll call 20. This means that the current in the transmission line is 300,000/20, or 15,000A. The current is stepped down in this manner in order to minimize transmission losses, but the losses cannot be eliminated.

In order to calculate the transmission losses, we’ll assume that the transmission cable used for this application has a resistance of around 0.1 Ohms per km of length. This is at the low end of the range (so the loss estimate will be low) according to a couple of spec sheets I looked at.

This yields (P=I^2R) 22.5MW of lost power per km of length, or about 0.4% per km. That's 0.6% per mile, or 10% of the total energy lost over a transmission distance of just 18 miles. After 115 miles, only 50% of the power remains.
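Here is the whole Ohio Edison estimate as a quick Python sketch, using the rounded numbers above (the 0.1 Ohm/km resistance is the low-end spec-sheet value, so the loss estimate is conservative).

```python
gen_power_w = 6000e6      # ~6000 MW generation (rounded up from 5757 MW)
line_voltage_v = 350e3    # 350 kV transmission line; generators run at 18 kV

# Generator current, rounded for convenience (6000 MW / 18 kV ~ 333,000 A)
gen_current_a = 300e3
turns_ratio = 20                              # 350/18, rounded
line_current_a = gen_current_a / turns_ratio  # 15,000 A on the line

# Ohmic loss per km of cable: P = I^2 * R
r_per_km = 0.1                                   # Ohms per km
loss_per_km_w = line_current_a**2 * r_per_km     # 22.5 MW per km
loss_fraction_per_km = loss_per_km_w / gen_power_w     # ~0.375% per km
loss_fraction_per_mile = loss_fraction_per_km * 1.609  # ~0.6% per mile

print(f"{loss_per_km_w/1e6:.1f} MW lost per km, "
      f"{loss_fraction_per_mile:.2%} of generation per mile")
```

Note that the fractional loss depends on the square of the line current, which is why stepping the voltage up by a factor of 20 matters so much: it cuts I^2R losses by a factor of 400.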

This is the basis of my opinion that you really do lose a lot in the transmission lines, which has serious implications for how one might design distribution networks incorporating renewable energy sources.

Oh, OOPS: I wrote this post late last night, and managed to convince myself of something that isn't true. The conclusion that the lost power per unit of length is 0.6% per mile of length is correct. I then took that number and compounded it (like interest) to find the total loss, like this: 0.994^N=0.9, where N is the length in miles, to find the distance corresponding to 10% loss.

Actually, it's simpler. Since the resistance increases at the same rate as the length (0.2Ohms for 2km, etc.), the loss just scales as length as well. So the benchmark comparisons actually are 17mi for 10% loss and 83mi for 50% loss. The overall point remains correct, but the specific numbers were wrong. I apologize for the error.
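The corrected, linear scaling is simple enough to check directly: since resistance grows in proportion to length while the line current stays fixed, the loss fraction just scales with length.

```python
# Loss fraction per mile, from the I^2*R estimate earlier in the post
loss_per_mile = 0.006   # ~0.6% of generation lost per mile of line

# Linear scaling: distance at which a given fraction is lost
miles_for_10pct = 0.10 / loss_per_mile   # ~17 miles
miles_for_50pct = 0.50 / loss_per_mile   # ~83 miles

print(f"10% lost after ~{miles_for_10pct:.0f} mi, "
      f"50% lost after ~{miles_for_50pct:.0f} mi")
```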

(What led me to over-think this question is the answer you get using this method to calculate the loss over, say, 300mi: 180%. Of course this is impossible - you can't lose more than you started with - from which I concluded that the physics actually should involve constant bites in percentage terms from a diminishing total. This is incorrect. When the line losses start to get pretty large, I think a slightly more complex analysis of the distribution network must be invoked. Since this doesn't alter the conclusion, which remains correct, I'll omit further explanation.)

— Aaron Datesman



Posted at March 5, 2010 01:12 AM

