U.S. officials said Tesla Inc.’s Autopilot feature contributed to a fatal crash, finding the Silicon Valley company’s semiautonomous technology allowed a driver to go long periods without his hands on the wheel and ignore the company’s warnings.

The U.S. National Transportation Safety Board’s judgment was rendered at a meeting Tuesday in Washington on the probable cause of a collision last year that killed the driver of a Tesla electric car. It represented the first official finding that the auto maker shoulders some responsibility in the crash.

Officials found Autopilot could be used on roads for which it wasn’t designed, and that a hands-on-the-wheel detection system was a poor substitute for measuring driver alertness.

The safety board’s determination is its first involving automated-driving technology and comes as U.S. lawmakers are debating legislation aimed at speeding development of autonomous vehicles. Government officials and industry experts contend self-driving cars will minimize congestion and reduce pollution while aiding the elderly and disabled. Lawmakers and regulators also have said they believe the technology will make transportation safer by cutting traffic fatalities largely caused by human error, including impaired and distracted driving.

U.S. Transportation Secretary Elaine Chao visited Michigan on Tuesday to unveil new voluntary guidelines for companies working on self-driving technology, focusing partly on safety recommendations for testing and development of advanced systems that rely less on human interaction. The policy, begun under the Obama administration, aims to encourage adoption of driverless cars and to discourage states from imposing safety regulations, which have historically been the purview of U.S. officials.