Crazed robots running amok used to be the stuff of late-night science fiction movies, but for some Stanford professors, that entertaining fiction is close to becoming reality and people need to start thinking about the legal implications now.

Robots have been an increasingly familiar sight in recent years, disarming explosives in Iraq, delivering mail in industrial complexes or bringing drugs to nurses in hospitals. But the coming generation of robots will be cleaning houses, doing security work, helping in nursing homes and handling a widening variety of tasks as their capabilities grow.

As robots leave the factories and move into homes and businesses, there is going to be more and more interaction between regular people and increasingly competent - and mobile - machines, said M. Ryan Calo, a residential fellow at the Stanford Center for Internet and Society. And more contact always means more problems, so the U.S. legal system had better be prepared, he said.

"These are devices that don't have a predetermined usage; they're not toasters," he said. "There's a growing concern now about robot ethics, but what's missing from those discussions is pragmatic lawyers thinking about what's going to happen in the future."

Fleeing for lawyers

It's not hard to imagine Americans fleeing clanking automatons gone haywire, experts say. Instead of calling for the Army, as in science fiction movies, they probably will be screaming for lawyers, with the battle more likely to include lawsuits than rocket launchers.

"These are machines that may not be intelligent, but they are increasingly autonomous. They do things without being told," said Paul Saffo, a futurist who's a visiting scholar at Stanford Media X, which focuses on how people work with technology.

So if something goes wrong, who's to blame?

There's no easy answer. In 1997, for example, a woman claimed she was injured by Zippy the mail robot at Pacific Bell's San Ramon complex after the robot ran over her foot and then slammed her into a filing cabinet. Pacific Bell later settled the case for an undisclosed amount.

But what about a problem with a much more advanced cleaning robot, Calo suggested, in which a pair of teenagers use a wireless connection to hack into the robot's system and direct it to trash the house?

Who pays for the damage?

The teens are judgment-proof, because they have no money, he said. But what about the manufacturer who built the robot, the developer who designed it or the software engineer who programmed it?

Model: Internet law

That's not only a legal question but a technical and financial one as well, Calo said, because companies that could face multimillion-dollar liability lawsuits in the future aren't going to be eager to sink billions into research and development of that next generation of robots.

"If other countries have a higher bar for litigation, they'll leapfrog right over us" in robotic research, he added.

Calo, who specializes in Internet law and privacy issues, says the way liability has been handled in Internet cases provides a guideline.

Section 230 of the 1996 Communications Decency Act gives "interactive computer services" immunity from liability for information put on their sites, which means Facebook and other Web site hosts can't be sued for what others post on their sites.

"It's no coincidence that most Internet companies are based in the United States, which provides that protection," Calo said, arguing that some similar type of immunity might be needed to protect the robotics industry in the United States.

In 2007, Microsoft's Bill Gates wrote a cover story for Scientific American, arguing that the robotics field was in the same position the computer industry was in back in the mid-1970s. The world-transforming revolution sparked by Gates and others in the personal computer area could be repeated, he suggested.

Those changes are beginning to happen now and bringing problems that have to be dealt with, said Saffo.

"Ready or not, robots are racing into our lives," he said. "But for most people, the first time they're going to really notice those robots ... is when the systems go bad."