The U.S. military is rushing to deploy solar-energy equipment on the battlefields of Afghanistan, according to Elizabeth Rosenthal’s story in the Oct. 5 New York Times, and, though the article doesn’t say so, this could prove to be a huge boon—even a turning point—for the fate of renewable energy here at home.

There are two main motives for the military’s push, neither having anything to do with green consciousness. First, transporting fossil fuel to the landlocked front lines is hideously expensive: The Army and Marines pay only $1 a gallon for the fuel itself but up to $400 a gallon for the truck convoys that move it through Pakistan and up the Khyber Pass. Second, security along these roads is tenuous. Last week, militants blew up one such convoy; in the past three months, six Marines escorting the convoys have been killed.

One Marine unit is now setting up portable solar panels, solar tent shields, solar-powered rechargers, and other energy-saving gear, with other units soon to follow. Meanwhile, the Navy last year deployed its first hybrid-electric amphibious ship, which can run on electric motors rather than its fuel-burning gas turbines—saving 900,000 gallons of fuel on its maiden voyage from Mississippi to San Diego. The Navy is also experimenting with fuel made from algae. And the Air Force is planning to convert its entire fleet of airplanes to run on a blend of jet fuel and biofuel.

This is all interesting in its own right, but the decisive impact may be on the civilian energy economy.

In the last half-century, many of the United States’ great technological breakthroughs have been made possible because of the demand created by large-scale government projects—which, in this country, has mainly meant military and space projects.

For example, the microchip—the building block of the digital revolution—was introduced by Texas Instruments at the March 1959 radio engineers’ trade show. But it took off as a viable commercial product only after President John F. Kennedy pledged to send a manned spacecraft to the moon and after he and his defense secretary, Robert McNamara, funded the Minuteman II intercontinental ballistic missile.

The microchips made these programs possible. Conventional transistors would have been too big, heavy, and hot to fit inside those rockets’ nose cones and power their guidance systems; and simply wiring together all the circuitry by hand would have been prohibitively expensive.

But, more to the point, those programs made the microchips possible, too. NASA’s rockets and the Air Force’s Minuteman missiles created a demand for the chips that otherwise did not exist. The large-scale production yielded economies of scale, which lowered the chips’ price, to the point where manufacturers could order them for commercial goods, which boosted production and thus lowered costs still further, and on and on the cycle continued.

In 1961, when Kennedy announced the manned space program and the Minuteman missile, a single microchip cost $32. By 1971, the cost had plunged to $1.25. (By 2000, it dropped to under a nickel.) Without that initial spur of demand from the government, the chip might have vanished, along with who knows how many other technological wonders that never took off because they cost too much—and the world today would be incalculably different.

The same can be said of data-processing computers. In 1950, only 20 computers existed in the entire United States; most of them were being used by the military, mainly the nuclear weapons laboratories. Even by 1954, only one private company, General Electric, had ordered a computer—the UNIVAC, or Universal Automatic Computer, a room-sized monstrosity, powered by 8,000 vacuum tubes, built by the long-defunct Eckert-Mauchly Computer Corp.

At the end of the ’50s, IBM built the 1401, the first computer designed for commercial enterprises. It took up 34 square feet of floor space, weighed one and a half tons, and, if ordered along with the 1403 printer, cost $78,000 (the equivalent of nearly half a million dollars today) to buy, or $1,450 a month to rent. It was practically palm-sized and dirt-cheap compared with the UNIVAC, but still, few commercial companies could afford one.

The 1401 didn’t take off until it received multiple orders from the U.S. government—mainly from the Social Security Administration, the Selective Service System, and the Peace Corps. Higher demand spurred production, which lowered costs, which boosted demand, which spurred further production, and so forth, until we have today’s sub-$1,000 laptop (with thousands of times more processing power than the behemoths of yore).

NASA’s space program—itself the product of military rockets and the microchip—has produced countless spinoffs, ranging from communications satellites, diagnostic digital-imaging systems, and municipal water-purification systems to scratch-resistant lenses, virtual-reality software, and shock-absorbent running shoes.

In this same way, the military’s demand for renewable-energy technologies today could create the conditions for a wide commercial market in the years ahead.

Right now, there are companies that manufacture rooftop solar-power systems. But the units retail for $20,000, and it would take the average household about 20 years of forgone monthly electricity bills to recoup that cost—which is to say, it’s unaffordable.
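The payback arithmetic above can be sketched in a few lines. The $20,000 figure is from the article; the roughly $85 average monthly electric bill is an assumed round number chosen to illustrate how a ~20-year payback falls out:

```python
# Back-of-the-envelope payback estimate for a rooftop solar system.
# system_cost comes from the article; monthly_bill is a hypothetical
# average household electric bill, assumed here for illustration.
system_cost = 20_000   # dollars, installed
monthly_bill = 85      # dollars saved per month once bills vanish

payback_years = system_cost / (monthly_bill * 12)
print(f"Payback period: {payback_years:.1f} years")  # roughly 20 years
```

If economies of scale halved the installed cost, the same arithmetic would halve the payback period—which is the cycle the article describes.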

However, if the military’s demand boosts production, which yields economies of scale, which lowers the price, and on and on—as happened with the microchip, the computer, and other commercial spinoffs—then the cost-benefit ratio would come down to the point where some people will buy a unit for their home, which will spur still more production and perhaps bring in other companies to compete for market share, which will lower prices further … and all of a sudden, solar power becomes not just an environmentally and strategically desirable option but also an affordable one—which, in terms of creating a mass market for a product, is the only thing that matters.

Two other factors increase the chances that the military’s renewable-energy projects might have commercial spinoffs.

First, as with the microchip and the computer, these projects are adapting products that private companies have already developed and built. In other words, the military is bypassing its normal procurement process, with its bureaucratic hassles and excessive “requirements,” which have resulted in the unwieldy designs and exorbitant costs of so many U.S. weapons systems.

Second, Congress is more likely to fund these projects precisely because they’re related to the national defense. The United States has an elaborate nationwide highway system today because, back in 1956, President Dwight Eisenhower sold the program to Congress by calling it the National Interstate and Defense Highways Act (note the word “Defense”). The Army, Eisenhower said, would need solid highways to move troops or evacuate citizens in the event of a foreign invasion or a nuclear war.

Similarly, after the Soviet Union launched the Sputnik satellite in 1957, state governments across the United States spent scads of money to create, or improve, high-school science and math programs in order to “catch up” with the Russians. (This impulse wasn’t limited to science and math. At the high school I attended in Kansas, money was even appropriated to buy books for a course on the modern novel. The course was still around in the early 1970s, and thus was I exposed at an early age to Conrad, Crane, Hawthorne, and Hemingway.)

Congress today has little appetite for spending billions of dollars on solar power generators or biofuel labs under the rubric of energy independence or “going green.” But to serve the war mission, and especially to protect the troops, no sum is too lavish—and that’s why the road to going green, and to achieving energy independence, might very well be paved through the fighting fields and villages of Afghanistan.
