Motown is trying to take back some swagger from Silicon Valley.

The dominance of software in automobiles these days, in everything from infotainment systems to safety features to the coming autonomous revolution, has made the San Francisco Bay Area a hub for the auto industry. Several leading automakers have set up shop there to be close to universities, research partners like Apple and NASA, and all the young talent. On any given day, there's a good chance you'll spot self-driving cars from the likes of Google, Audi, Nissan, and Delphi roaming the streets.

Michigan is eager to get (back) in on the action. The University of Michigan's Mobility Transformation Center worked with the state DOT and companies like Ford, GM, Honda, Nissan, and Delphi to create a test center where automakers can refine the most advanced technologies without any risk to the public.

"Mcity," which officially opened Monday, is a 32-acre faux metropolis designed specifically to test automated and connected vehicle tech. It's got several miles of two-, three-, and four-lane roads, complete with intersections, traffic signals, and signs. Benches and streetlights line the sidewalks separating building facades from the streets. It's like an elaborate Hollywood set.

Automakers and suppliers plan to test vehicle-to-vehicle communication (which the feds plan to require within a few years), and see how sharing information between cars can prevent accidents. They'll develop automated systems—like automatic braking to avoid collisions—as well as autonomous features that take the driving out of human hands. As important as these technologies are, they're not the sort of thing you want to shake down on city streets.

"We are going to figure out how the incredible potential of connected and automated vehicles can be realized quickly, efficiently and safely," says Peter Sweatman, director of the Mobility Transformation Center.

This is about more than safety, too. Mcity lets engineers test a wide range of conditions that aren't easily created in the wild. They can run vehicles on different surfaces (like brick, dirt, and grass) and see how their systems handle roundabouts and underpasses. They can erect construction barriers, spray graffiti on road signs, and work with faded lane lines to see how autonomous tech reacts to real-world conditions.

Working in Michigan will also expose the vehicles to a wide range of weather, from triple-digit heat and soggy humidity to bone-chilling cold and brutal wind. Such conditions aren't always found in the Bay Area, which has a more Mediterranean climate. Rough weather is a serious challenge for the sensors that make automated features possible—precipitation's bad for lidar, snow can block what a camera sees, fog hampers radar.

Such a site is a great tool, but the technology must also prove itself on public roads. A simulated environment has a fundamental limitation: You can only test situations you think up. Experience—and dash cams—have taught us our roads can be crazy in ways we never think to expect. Sinkholes can appear in the road, tsunamis can rage across the land, roadside buildings can collapse and send debris flying. Humans can be even harder to anticipate. Even everyday actions, the things we do almost subconsciously, can trip up a robot.

Google, for one, has learned lessons only human subjects can provide, thanks to more than a million miles of autonomous driving, mostly on public roads. At first, it programmed its car to handle four-way stops the way driver's ed teachers preach: stay behind the line until it's your turn. The team soon found the car never advanced, because human drivers kept jumping out ahead of it. So last year, it tweaked the software to make the car creep forward, the way humans do, signaling that it wants its turn.

Google has also reported its cars keep getting rear-ended while stopped at intersections (it's recorded only one minor injury so far, and the self-driving system was not at fault in any crash). There's no obvious lesson here for autonomous technology—except that humans are indeed crappy drivers—but it's another human behavior the software must be programmed to work around. Google says its cars can gauge the speed, acceleration, and deceleration of other cars, "so we can make more nuanced judgments about what the safest thing to do is." That judgment has to account for the fact that the driver behind you might not notice you there, because he's looking at his phone.

It's that sort of natural behavior a person might not think to program in, but that the machine needs to imitate or anticipate to get along with human drivers. This would be a lot easier if we started out with a wholesale transition from human to robot drivers, but that's not how this will happen. Humans and machines will be sharing roads for the foreseeable future—and both need to be part of the testing.