Matthew Taylor, the director of the Big Tech documentary The Creepy Line, joined Breitbart News Editor-in-Chief Alex Marlow on Breitbart News Daily on Friday to explain why Google’s denial of bias in its systems is false.

“I’m a huge technology fan, I’ve watched the rise of Google, Facebook. I actually think they did a lot of good things… But the cracks in the veneer have started to show, and all the trust that we’ve built up with these companies over the last fifteen years may not be warranted,” declared Taylor.

“Google and Facebook, especially Google, always had these boilerplate explanations: ‘Oh there’s no bias, oh it just works this way, oh we’re not putting our thumb on the scale.’ So we really wanted to break it down because we don’t view it as a partisan project or anything like that. This affects every single person, everyone with an opinion, everybody who does everything every day, so that was why we made the film, and we think it’s been very successful, since the argument showed up a number of times at the hearing this week with [Google CEO] Sundar Pichai.”

“Whenever you hear there is no bias, that is absolutely false. By design, the machine has to be biased, and we want it to be biased,” Taylor continued. “The example we use is if you’re looking for the best dog food, the engine does two things… particular things. First, it has to look through billions of pages and make a selection: Bias number one. And then it has to put them in an order: Bias number two. And this ordering is very important.”
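The two-step process Taylor describes can be sketched in miniature. This is a toy illustration only, not Google's actual ranking code; the page data, topics, and scoring rule below are all invented for the example:

```python
# Toy illustration of the two "biases" Taylor describes:
# (1) selecting a subset of pages, (2) putting them in an order.
# All data and the scoring rule here are invented for demonstration.
pages = [
    {"url": "a.example/dog-food", "topic": "dog food", "score": 0.62},
    {"url": "b.example/cat-toys", "topic": "cat toys", "score": 0.91},
    {"url": "c.example/dog-food", "topic": "dog food", "score": 0.87},
]

def search(query, pages):
    # Bias #1: selection -- only pages the engine deems relevant survive.
    selected = [p for p in pages if p["topic"] == query]
    # Bias #2: ordering -- a human-chosen scoring rule decides what ranks first.
    return sorted(selected, key=lambda p: p["score"], reverse=True)

results = search("dog food", pages)
print([p["url"] for p in results])
```

Both steps embody choices made by the people who wrote the rules: which pages count as relevant, and which signal decides the order. Change either rule and the "top result" changes with it, which is the point Taylor makes about ordering below.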

“Think about it, folks. If Purina goes to the top of the list, we are conditioned to think what’s on the first page and what’s at the top is true. Because we do this every single day. So if you’re asking about dog food, or music services, or Audible deals, or vacations, that’s one thing, but if you asked it who’s the best candidate, it still has to do those two things that are biased,” Taylor explained. “This is where it comes down to the people who build the algorithm: they code their ethics and decision-making into it… By definition, if it wasn’t biased, it wouldn’t work so well, and we wouldn’t use it.”