A photo posted on Fresco News' Twitter feed showed a self-driving Uber Volvo SUV on its side. Fresco News/ Mark Beach

The self-driving Uber involved in an accident in Arizona last Friday was driving through a yellow light when it was hit by a Honda CR-V, according to a police report Business Insider obtained through a public records request.

The self-driving Uber was not at fault when the accident occurred, according to a representative for the Tempe, Arizona, police department. But the incident showed how humans may be better equipped to handle complex driving scenarios in which strictly following the rules of the road isn't necessarily the safest choice.

Alexandra Cole, the driver of the Honda CR-V, was attempting to make a left turn across three lanes of traffic when the accident took place. Cole managed to drive across the first two lanes, which were backed up with cars, and thought she was clear to cross the third.

"As far as I could tell, the third lane had no one coming in it so I was clear to make my turn," Cole said, according to the report. "Right as I got to the middle lane about to cross the third I saw a car flying through the intersection, but couldn't brake fast enough to completely avoid collision."

Uber's self-driving Volvo was driving through a yellow light at 38 mph, just below the speed limit, when it was hit by the Honda. The Uber then drove into a traffic pole and flipped onto its side.

No one was seriously injured in the accident.

A diagram of the accident. Vehicle 1 refers to the Honda CR-V while Vehicle 2 refers to the self-driving Uber. Tempe Police

Patrick Murphy, the Uber employee behind the wheel of the self-driving car, said in the report that he saw the Honda turning left but that there was "no time to react" as traffic in the first two lanes had created a blind spot.

In terms of programming, the self-driving Uber did everything by the book. It had the right of way approaching a yellow light, and therefore zipped right through.

But one has to wonder whether a human driver, approaching the busy intersection as the light turned yellow, might have slowed down.

"We saw the [Honda], it was coming fine on her end, but the other person just wanted to beat the light and kept going," a witness said in the police report. "All I want to say is it was good on the end of the [Honda] driving toward us, it was the other driver's fault [Uber] for trying to beat the light and hitting the gas so hard."

An Uber representative told Business Insider the self-driving car did not accelerate while approaching the yellow light but maintained its speed of 38 mph. Uber cars are programmed to always pass through a yellow light at their current speed if there is enough time to make it through the intersection.

Uber vehicle operators are also trained to take over at yellow lights if they don't feel comfortable proceeding through the intersection, the Uber spokesperson said.

Still, the witness' account of the accident raises an interesting point: Would a human driver, seeing someone struggling to make a left turn through bumper-to-bumper traffic, have approached that yellow light differently?

It's a similar question to the one posed during the National Highway Traffic Safety Administration's investigation into the fatal Tesla Autopilot accident. (Uber has compared its cars' self-driving capabilities to Tesla Autopilot.)

Joshua Brown was killed in May 2016 when his Tesla Model S collided with a truck while Autopilot was activated. NHTSA closed the investigation and determined Autopilot was not at fault because Brown had seven seconds to hit the brakes before the car collided with the truck.

The accident raised concerns that people were beginning to over-rely on Level 2 autonomous systems, believing them to be more capable than they actually are.

After that accident, Consumer Reports called on Tesla to rename Autopilot and to disable its hands-free operation to make it clear the system wasn't fully self-driving. A warning will now sound if a Tesla driver takes his or her hands off the wheel while Autopilot is activated.

Both Ford and Waymo, Alphabet's self-driving-car company, have said they are developing fully self-driving cars because they fear drivers will become too complacent with Level 2 systems.

Although Level 2 driving systems come with risks, they also appear to be safer than unassisted driving: crash rates for Tesla vehicles fell 40% after Autopilot was first installed in 2015.

But the Uber accident further showed how human drivers could still be better at handling certain complex driving situations.

We will never know whether the Arizona accident would have been avoided if a person had been controlling the vehicle the entire time, but it highlights how far we still are from fully capable self-driving vehicles.