The U.S. Senate Committee on Commerce, Science and Transportation has called on Tesla CEO Elon Musk to answer questions about the company’s Autopilot technology, and about what the Silicon Valley automaker is doing to educate drivers on how to use it.

Sen. John Thune, chairman of the committee, issued the letter today seeking a response from Musk and Tesla no later than July 29th.

The inquiry was prompted by a fatal Tesla crash in Florida, which we reported on earlier. At the time of that crash, the driver had Autopilot engaged.

In a second crash in Pennsylvania, which was not fatal, the Detroit Free Press reports that the driver thought he had Autopilot engaged.

Musk, however, said in a public statement today that according to investigators’ findings, Autopilot was not engaged in the Pennsylvania crash, and in fact could have prevented it.

The Senate inquiry follows several federal agencies’ more formal investigations into Tesla’s technology and business practices.

As TechCrunch previously reported, the U.S. Securities and Exchange Commission is trying to determine whether Tesla had an obligation to notify its shareholders of the Florida crash sooner than it did.

The National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) are also looking into what exactly happened during the Florida crash, and whether or not Autopilot features contributed in any way.

Sen. Thune published and publicized the committee’s letter on the same day that Consumer Reports called on Tesla to disable Autopilot’s hands-free mode until the technology is further developed.

Gail Gottehrer, a partner at Connecticut law firm Axinn who specializes in class action defense and management-side labor and employment litigation, said everyone in this market is concerned about liability, and for good reason.

Similar concerns arose when Google’s self-driving car collided with a bus in California in February — a minor accident with no fatalities — she noted.

“This tech is so advanced and so complicated. Everyone wants to figure out how to regulate it so that we can get self-driving vehicles out on the road, where it could be very beneficial for disabled people, seniors, and to prevent thousands of deaths,” she said. “But it’s hard to know where to start and how to do it effectively. Everyone will be watching to see what NHTSA’s guidelines are as a next step.”

The attorney, who does not represent any automotive clients or have any connections to Tesla, said she believes the company adequately warned its customers: “There’s certainly no failure to warn on Tesla’s part.”

Tesla gives its customers more than fine print, she said. The company makes meaningful disclosures that its cars aren’t toys, and that a driver who does not operate a vehicle safely can make even the best-manufactured car unsafe.

Questions remain for regulators and politicians looking out for public safety above profits or rapid advances in technology.

For example, does a tech or automotive company have an obligation to reach out to people who are using its products the wrong way? And how comfortable is the public with some drivers effectively running a beta test out on the road?