The electric grid was designed as a one-way highway, with power cascading out from big power plants to cities and towns at the end of the line. But because of changes to how we consume and generate electricity, managing power flows on the grid is becoming more complex—and will be more so in the future.

As more variable renewable energy comes online, grid operators need new ways to maintain the steady balance between supply and demand to ensure reliable service. One emerging path of research is to overlay power lines with hardware controllers and software to optimize how power flows through the network. The goal is to transform the grid from a fixed hub-and-spoke architecture to a flexible, two-way network more like the Internet.

To illustrate how tricky running the electric grid can be, Boston University researcher Pablo Ruiz pulls up an online heat map that shows real-time pricing on the transmission grid from Ohio to Washington, D.C. In one part of Virginia there’s a bright orange patch, indicating that prices are over $200 per megawatt-hour. A few counties over, the map is a cool blue and the price for energy is only $30 per megawatt-hour.

The wide price discrepancy reflects congestion. Because of some temporary glitch—perhaps an overloaded transmission line or substation—there’s a constraint in the system, Ruiz explains. Grid operators will need to call on more expensive local generators to match the power supply to the demand in that one region of Virginia.
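A toy calculation shows how a binding line limit splits prices. Everything below is hypothetical—two stylized zones and made-up costs chosen to echo the $30 and $200 figures on the map—not actual market data or how a real market clears:

```python
# Hypothetical two-zone example: a cheap generator in zone A ($30/MWh)
# and an expensive one in zone B ($200/MWh), joined by one transmission
# line with limited capacity. All numbers are illustrative.

def dispatch(demand_b_mw, line_limit_mw, cheap_cost=30.0, expensive_cost=200.0):
    """Serve zone B's demand at least cost, honoring the line limit."""
    from_cheap = min(demand_b_mw, line_limit_mw)   # imported over the line
    from_local = demand_b_mw - from_cheap          # expensive local backup
    # Marginal price in B = cost of serving one more megawatt there.
    price_b = cheap_cost if from_local == 0 else expensive_cost
    return from_cheap, from_local, price_b

# Uncongested: the line carries everything, so zone B pays $30/MWh.
print(dispatch(demand_b_mw=100, line_limit_mw=150))  # (100, 0, 30.0)
# Congested: 40 MW must come from the $200/MWh local unit.
print(dispatch(demand_b_mw=100, line_limit_mw=60))   # (60, 40, 200.0)
```

Once the line is full, the next megawatt in zone B can only come from the expensive local unit, which is why neighboring counties can see prices nearly an order of magnitude apart.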

Wind and solar generators make the picture more challenging. Wind farms in Ohio could produce an unexpected burst of power, but if there isn’t an available line to transport that electricity, the turbines’ output must be dialed down, or “curtailed” in industry jargon. In other words, clean energy with no fuel cost goes unused. “Under most conditions, regional grid operators cannot dispatch the lowest-cost resource because of transmission congestion,” Ruiz says. “The transmission system often becomes the limiting factor.”

Grid operators are addressing bottlenecks by installing beefier transmission lines and improving the forecasts of wind and solar farms. There is also hardware that can route power along alternate pathways. But researchers like Ruiz, with funding from ARPA-E’s Green Electricity Network Integration program, are exploring whether software and a new generation of power-flow controllers can achieve some of the same benefits cheaper and faster.

The team led by Ruiz has written algorithms that analyze power flows on the transmission grid and identify less-congested routes, much the way a car navigation program proposes back roads when there’s heavy traffic on the main highway. Grid operators could then open and close circuit breakers to redirect power so that the most cost-effective energy sources can be dispatched. In simulations with the grid operator PJM, Ruiz estimates the project’s Topology Control Algorithms software could save $100 million a year in congestion-related costs and reduce wind curtailments by roughly 50 percent.
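The navigation analogy can be sketched in a few lines of code. This is an illustrative toy, not Ruiz’s software: it treats transmission lines like road segments and uses Dijkstra’s algorithm to pick the route with the least total loading, whereas real power divides itself across all paths according to the laws of physics, so the production algorithms instead search over breaker configurations. The node names and loading figures are made up:

```python
import heapq

def least_congested_path(lines, source, sink):
    """Dijkstra's algorithm where each edge's weight is its loading
    fraction, so the 'shortest' path is the least-congested route.
    lines: {node: [(neighbor, loading_fraction), ...]}"""
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == sink:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, loading in lines.get(node, []):
            nd = d + loading
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    path, node = [sink], sink       # walk predecessors back to the source
    while node != source:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Hypothetical network: the direct OH-PA corridor is 90 percent loaded,
# so the algorithm detours through WV instead.
grid = {
    "OH": [("PA", 0.9), ("WV", 0.3)],
    "PA": [("VA", 0.2)],
    "WV": [("VA", 0.4)],
}
print(least_congested_path(grid, "OH", "VA"))  # ['OH', 'WV', 'VA']
```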

Clever algorithms can make running the local distribution grid more efficient as well. California Institute of Technology professor Steven Low is designing the underpinnings of a software system that utilities would use to optimize the operation of the local circuits that deliver electricity to homes and businesses.

Existing methods for controlling circuits, such as tapping capacitor banks, may not be sufficient in neighborhoods with a large number of photovoltaic solar panels or electric vehicles, Low says. Passing cloud cover in solar-heavy areas causes a drop in line voltage, and power-hungry electric vehicles can strain power lines’ capacity if consumers all plug in at once. Homes also increasingly have two-way thermostats that can adjust settings during hours of peak demand.

In a project with Southern California Edison, Low is developing methods for utilities to manage their networks as more distributed technologies take hold. For example, the software would help a grid operator decide when to slow the charge rate of an electric vehicle (EV) to ease congestion or when to tap smart inverters on solar panels to boost line voltage. “If you make a change to charge an EV at this time, it affects the entire power circuit, according to the laws of physics,” Low says. “How do you coordinate all those devices? You can formulate that mathematically as an optimal power flow.”
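A minimal sketch of the kind of coordination Low describes, assuming a single feeder with a fixed capacity limit. The function name and every number here are hypothetical, and a real system would solve a full optimal power flow over the whole circuit rather than this simple proportional scaling:

```python
# Toy EV-charging coordination on one feeder (illustrative numbers only):
# scale back requested charge rates so total load stays under the limit.

def coordinate_charging(requests_kw, base_load_kw, feeder_limit_kw):
    """Return per-EV charge rates that fit within the feeder's headroom."""
    headroom = feeder_limit_kw - base_load_kw
    total_request = sum(requests_kw)
    if total_request <= headroom:
        return list(requests_kw)          # no congestion: full rates for all
    scale = headroom / total_request      # simple proportional curtailment
    return [r * scale for r in requests_kw]

# Three EVs each requesting 7.2 kW on a feeder with 15 kW of headroom:
# the 21.6 kW of requests is scaled down to 5 kW apiece.
rates = coordinate_charging([7.2, 7.2, 7.2], base_load_kw=85.0,
                            feeder_limit_kw=100.0)
print([round(r, 2) for r in rates])  # [5.0, 5.0, 5.0]
```

The optimal power flow Low refers to generalizes this idea: instead of one feeder limit, it minimizes a cost over every controllable device subject to the voltage and capacity constraints of the entire circuit.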

New hardware can make the grid more flexible as well. The Tennessee Valley Authority, for instance, has installed about 100 power-flow routers—meter-and-a-half-long metal boxes fitted on transmission lines—from start-up Smart Wire Grid to monitor line activity and shift power to underused lines. Other companies, including Varentec and Gridco Systems, are also making dynamic power-routing devices that aim to be cheaper and more reliable than existing gear.

Everyone agrees that the grid needs to move away from its current centralized model, says Santiago Grijalva, director of the Power Systems Engineering Center at the National Renewable Energy Laboratory. Instead of having utilities and grid operators decide when to call in new generators or curtail electricity use, he envisions a time when many customers will generate their own electricity and inject power into the grid at specific times.

A research group led by Grijalva is devising what he calls an “electricity operating system” that would allow consumers to schedule their energy usage in order to provide services to utilities, such as firming up line voltage or balancing the grid’s frequency. A building with rooftop solar panels and a battery, for instance, could store energy and sell it into the wholesale energy market at peak hours to earn money. “Distributed intelligence is the paradigm for the future grid,” Grijalva says. “It’s not something you can avoid. The common user is becoming much more aware of their energy.”
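The battery example reduces to simple arithmetic. This sketch uses made-up hourly prices and assumes a perfectly efficient battery completing one full cycle; it works here only because the cheapest hour happens to precede the priciest one, and a real scheduler would enforce that ordering along with charge-rate and efficiency constraints:

```python
# Toy energy arbitrage for a building battery (hypothetical prices):
# buy in the cheapest hour, sell in the most expensive one.

def arbitrage_profit(prices_per_mwh, battery_mwh):
    """Profit in dollars from one buy-low, sell-high cycle."""
    buy = min(prices_per_mwh)
    sell = max(prices_per_mwh)
    return battery_mwh * (sell - buy)

hourly_prices = [28, 25, 24, 30, 45, 90, 160, 75]  # illustrative $/MWh
print(arbitrage_profit(hourly_prices, battery_mwh=2))  # 272
```

Even this crude arithmetic shows why peak-hour price spikes make storage attractive: the $136/MWh spread, not the average price, is what the battery earns.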

Better power flow control through software is just one piece of the puzzle when it comes to preparing the grid for high levels of variable renewable energy, says J. Charles Smith, the executive director of the Utility Variable-Generation Integration Group. But because computing power continues to grow at an exponential rate, it’s a promising avenue for research, he says. “The raw horsepower you need in a computer to run the load flows and contingency analysis to make decisions for the next minute in operation is huge,” Smith notes. “The tremendous advances taking place in computers are one of the reasons that these things can now be considered.”