Have you ever wanted a mobile app that ties your location to crime statistics, government environmental and health data, and weather and solar flare data to calculate the hourly probability of a zombie apocalypse? While that may not be exactly what the White House has in mind, it’s the sort of mobile mash-up that a new Federal IT policy could make much easier to build. The Obama administration has put another twist on “open government”: open, as in open API.

On May 23, the White House issued a directive that requires all agencies to establish programming interfaces for internal and external developers to use, and make “applicable Government information open and machine-readable by default.” As part of an effort to push government toward a cloud-computing future, the White House is encouraging agencies to make their data more developer friendly, and to create a shared platform for providing mobile access to data for both citizens and government employees. And they have 12 months to start delivering.

The goal of the new policy, called the Digital Government Strategy, is to jump-start the government’s three-year-old open data initiative, draw more private developer interest, and encourage the development of mobile applications that connect citizens and government employees more effectively with data that has long been public but nearly inaccessible.

Federal CIO Steven VanRoekel hopes the move will spawn an explosion of commercial application development. “Treating the government as an open platform in this way encourages innovation,” he wrote in a White House blog post. “Just look at how the government’s release of GPS and weather data fueled billion dollar industries. It also makes government more efficient and able to adapt to inevitable changes in technology.”

This isn’t the first attempt by the Obama administration to create an app ecosystem around government data. In many ways, the new initiative is an attempt to correct the failings of the government’s first “open data” effort, Data.gov. Launched in 2009, Data.gov was conceived as a clearinghouse for government data sets published in open formats.

Then-Federal CIO Vivek Kundra hoped Data.gov would seed thousands of commercial and citizen apps, in addition to shining some much-needed light on the inner workings of government. In an interview with Fortune Magazine’s Geoff Colvin in July 2011, Kundra pointed to some of the early successes of Data.gov—such as a mobile app that tells parents whether the crib they’re about to buy has been recalled, Microsoft Bing’s use of Medicare/Medicaid data to rate hospitals, and a web app that combines FAA statistics with traveler “tweets” to help airline customers make decisions about which airline and flight to pick and when to leave for the airport.

But despite these modest successes, Data.gov has had significant problems. Most of the first wave of data posted to Data.gov was in “open” formats, such as comma-separated values (CSV), but ones that required the data to be downloaded in bulk before it could be processed and used. And because it was bulk-exported, many data sets suffered from quality problems, including data that was stale before it was even posted.
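The friction is easy to see in practice: every consumer of a bulk CSV export has to re-download and re-parse the whole file before answering even a simple question. A minimal Python sketch, using made-up inspection figures in place of a real Data.gov export:

```python
import csv
import io

# Hypothetical excerpt from a bulk-exported Data.gov CSV file; a real
# workflow would first download the entire archive from an agency page.
RAW_CSV = """state,year,inspections,violations
CA,2010,1523,210
TX,2010,1104,187
NY,2010,998,95
"""

def violation_rates(raw):
    """Compute violations per inspection from a downloaded CSV export."""
    reader = csv.DictReader(io.StringIO(raw))
    return {
        row["state"]: int(row["violations"]) / int(row["inspections"])
        for row in reader
    }

rates = violation_rates(RAW_CSV)
print(rates["CA"])  # violations per inspection for California
```

Everything here happens client-side, after the fact: any staleness or quality problem baked into the export travels with it to every downstream application.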

Researchers at Rensselaer Polytechnic Institute paved the way toward making the data more useful by converting some data sets into “semantic web” content. Using the Resource Description Framework (RDF), JSON, and the SPARQL query language, a team with RPI’s Tetherless World Constellation research initiative created a framework that allowed different data sets from Data.gov to be linked together into web applications. Since then, the government has published more of its data in RDF format. But that data is still based on bulk-published archives, and the conversion hasn’t solved the underlying data quality problem.
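The linking idea behind RPI’s work can be illustrated without a full RDF stack: in linked data, records in different data sets refer to the same resource by a shared URI, so they can be joined into one view. A stdlib-only Python sketch, with fabricated county URIs and figures standing in for real government data sets:

```python
# Two hypothetical Data.gov-style data sets that identify records by a
# shared URI, the way linked RDF resources do. All values are invented.
AIR_QUALITY = {
    "http://example.gov/id/county/06037": {"county": "Los Angeles", "pm25": 12.1},
    "http://example.gov/id/county/17031": {"county": "Cook", "pm25": 10.4},
}
ASTHMA_RATES = {
    "http://example.gov/id/county/06037": {"asthma_per_1000": 88},
    "http://example.gov/id/county/17031": {"asthma_per_1000": 91},
}

def link(*datasets):
    """Merge records from several data sets that use shared URIs as keys."""
    merged = {}
    for ds in datasets:
        for uri, fields in ds.items():
            merged.setdefault(uri, {}).update(fields)
    return merged

linked = link(AIR_QUALITY, ASTHMA_RATES)
print(linked["http://example.gov/id/county/06037"])
```

An RDF store would express the same join declaratively with a SPARQL query over shared URIs; the dictionary merge above is just the smallest runnable stand-in for that idea.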

The new presidential directive aims to change all this. First, it aims to decouple information from applications. “Rather than thinking primarily about the final presentation—publishing web pages, mobile applications, or brochures,” VanRoekel wrote in the Digital Government Strategy document, government agencies need to take an “information-centric” approach, “ensuring our data and content are accurate, available and secure. We need to treat all content as data, turning any unstructured content into structured data, then ensure all structured data are associated with valid metadata.” The data would then be accessed by all applications through a common set of Web APIs.
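What an “information-centric” resource might look like on the wire is easy to sketch: a structured record wrapped with its metadata, independent of any one presentation. The field names and endpoint below are invented for illustration, not taken from any published API:

```python
import json
from datetime import date

def to_api_resource(record, source, updated):
    """Wrap a structured record with the metadata an API would attach,
    decoupled from any particular presentation (web page, app, brochure)."""
    return {
        "data": record,
        "metadata": {
            "source": source,
            "updated": updated,
            "format": "application/json",
        },
    }

resource = to_api_resource(
    {"crib_model": "XYZ-100", "recalled": True},  # invented recall record
    source="http://example.gov/api/recalls",       # hypothetical endpoint
    updated=str(date(2012, 5, 23)),
)
print(json.dumps(resource, indent=2))
```

The point of the envelope is that a web page, a mobile app, and a third-party mash-up all consume the same data and metadata, rather than each scraping its own presentation of it.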

And at the center of the strategy is a transformation of Data.gov itself from a publishing site into a developer resource. In a White House blog post, VanRoekel wrote, “To make sure there’s no wrong door for accessing government data, we will transform Data.gov into a data and API catalog that in real time pulls directly from agency websites.”

Just what the nature of those APIs will be has yet to be determined, though the approach outlined in the Digital Government Strategy favors XML data formats. The Office of Management and Budget will issue a government-wide policy on web APIs, open data, and content formats within the next six months, after which agencies will have six months to “ensure all new IT systems follow the open data, content, and web API policy,” set up developer pages with API information, and make data from at least two existing “customer-facing” systems available through those APIs.
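Given the strategy’s lean toward XML, consuming one of these agency APIs from Python might look like the following. The payload and element names are invented; a real client would fetch the document from an agency endpoint rather than a string:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML payload of the kind an agency web API might return;
# the schema here is made up for illustration.
SAMPLE_RESPONSE = """<recalls>
  <recall id="R-2012-001">
    <product>Drop-side crib XYZ-100</product>
    <status>recalled</status>
  </recall>
  <recall id="R-2012-002">
    <product>Toddler bed ABC-7</product>
    <status>cleared</status>
  </recall>
</recalls>"""

def recalled_products(xml_text):
    """Return the products whose status is 'recalled'."""
    root = ET.fromstring(xml_text)
    return [
        r.findtext("product")
        for r in root.findall("recall")
        if r.findtext("status") == "recalled"
    ]

print(recalled_products(SAMPLE_RESPONSE))  # ['Drop-side crib XYZ-100']
```

Because the API serves live data rather than a bulk archive, a client like this sees corrections and updates the moment the agency publishes them.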