It turns out machines in the tech industry are just as bad at hiring women as the men who run it.

Since 2014, Amazon’s machine-learning specialists have been working on a computer program to automate the hiring process and pick out top talent, but there was one big problem with an early version of the AI tool: It discriminated against women.

Amazon’s system used resumés submitted to the company over a 10-year period to decide which new candidates were preferable, five sources told Reuters. The automated system scored candidates on a scale of 1 to 5 stars, similar to the one you see when buying something on Amazon’s web marketplace.

“Everyone wanted this holy grail,” one of the sources told Reuters. “They literally wanted it to be an engine where I’m going to give you 100 resumés, it will spit out the top five, and we’ll hire those.”

Sounds like a decent idea! But then the machine taught itself that the word “women’s” was bad, and even docked points from graduates of at least two all-women’s colleges, according to the report.

The problem seemed to stem from the tech industry’s male dominance: because most of the candidates and hires over the 10-year period were men, the machine learned to prefer male applicants. Amazon tried to edit the program to stop disliking women by making “women’s” a neutral term, but the project was ultimately scrapped because even the possibility of discrimination is probably a bad sign, the sources said.
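To see how a model can pick up this kind of bias on its own, here is a hypothetical toy sketch (not Amazon’s actual system, whose details aren’t public): a simple word-scoring model trained on historical hire/reject outcomes. Because the invented training history contains “women’s” only in rejected resumés, the model assigns the word a negative weight, with no one ever programming that rule in.

```python
# Hypothetical illustration of learned bias -- NOT Amazon's real model.
# A word gets a log-odds weight: positive if it appears more often in
# hired resumes, negative if it appears more often in rejected ones.
from collections import Counter
import math

# Invented training history reflecting a male-dominated hiring record.
history = [
    (["engineer", "java", "soccer"], True),           # hired
    (["engineer", "python", "chess"], True),          # hired
    (["developer", "java", "chess"], True),           # hired
    (["engineer", "python", "women's", "chess"], False),  # rejected
    (["developer", "java", "women's", "soccer"], False),  # rejected
]

def word_weights(examples, smoothing=1.0):
    """Smoothed log-odds per token: hired frequency vs. rejected frequency."""
    hired, rejected = Counter(), Counter()
    n_hired = n_rejected = 0
    for tokens, was_hired in examples:
        for t in set(tokens):
            (hired if was_hired else rejected)[t] += 1
        if was_hired:
            n_hired += 1
        else:
            n_rejected += 1
    vocab = set(hired) | set(rejected)
    return {
        t: math.log((hired[t] + smoothing) / (n_hired + smoothing))
           - math.log((rejected[t] + smoothing) / (n_rejected + smoothing))
        for t in vocab
    }

weights = word_weights(history)
print(weights["women's"] < 0)   # the model learned to penalize the word
print(weights["engineer"] > 0)
```

Note that the patch Amazon reportedly tried, forcing the one word to be neutral, only zeroes a single weight; in this toy setup, any other token correlated with the rejected group would keep carrying the same bias.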

Gender bias wasn't even the only issue. Amazon addressed the "bias problem" in 2015, but the program also simply wasn't good at picking out strong candidates: it would, seemingly at random, recommend applicants who were poor fits. The project never left the development phase and was never used to hire any actual candidates.

The tech sector is overwhelmingly white and male — about 76 percent of technical jobs are held by men, according to Bloomberg, and Black and Latinx people make up just 5 percent of the workforce.