As news of the allegations against Toronto Mayor Rob Ford rolls and roils its way across the globe, a contingent of skeptics has cast doubt on the story of his alleged crack smoking by claiming the video purporting to show it could have been doctored or faked. Most prominently, Ford’s Deputy Mayor Doug Holyday cited the well-known faked “eagle picking up a baby” video as an example of how, these days, you just never know if something is real.

Whether or not the allegations are true is something I can’t know. Whether a video of Ford behaving as has been described can be digitally faked, however, is something I do have an answer for: it absolutely, positively cannot be.

The way it’s been related, the proof in question is not a grainy photograph, taken at a distance or at night. It is video, well-lit, and allegedly contains someone looking like Ford moving, speaking and gesticulating. What that means is that in order for it to be digitally faked, the sellers wouldn’t simply have had to “doctor” a video, like they were putting a dead celebrity in a commercial; they would have needed to create a believable digital replica of the man, a realistic video game version of the mayor who walks and talks just like him. Or hire a perfect double to act Ford’s part.

Here’s the thing: if you had George Lucas’s special effects team, the world’s fastest supercomputers, and an unlimited budget, you couldn’t make that happen. Can we make dinosaurs and aliens? Sure. Are we able to make elves and “the One” do very cool, acrobatic things from a distance? Of course. Can we create convincing digital representations of human beings that move and talk and stand up to scrutiny in a video shot from five to seven feet away? Nope. We’re kinda-sorta getting there, but to even approach believability, you need access to the person you’re trying to replicate so you can digitize them. A video of a digital Robert Bruce Ford doing whatever his creators want him to do is, at this point in history, unequivocally a technological impossibility.

Whether or not this means the allegations are true, however, isn’t my concern here. Rather, what interests me are the implications of the seemingly widespread belief that creating such believable alternate realities using technology is not only possible, but easy. After all, it wasn’t just Toronto’s deputy mayor who raised the idea of the video being a fake — everyone from Internet commenters to CBC news anchors has floated the notion.

So what’s going on? For one, it seems our relationship to the image is undergoing yet another change. Beyond specific examples of their use, the mere existence of tools such as Photoshop and other digital video effects means the already shaky assumption that images can be trusted to show the truth is now even further undercut. Technologies that can make the unreal appear true highlight the fact that images and video are never simply “a window onto reality,” but are constructed and framed in certain ways. That was true previously as well, but it was often hidden. Newer technological advancements just mean that critically evaluating everything we see is now more necessary than ever. Score one for your high-school media studies teacher.

But in an era when no one takes a magazine cover at face value, this is hardly a revelation. That said, there is something else going on here. Relying on a relationship between images or videos and reality has been a staple of law and society for some time now. That phenomenon, otherwise known as “referentiality,” has been part of the reason mass media became so vital for combating ignorance and breaking news: broadcasters could actually show you what had happened or what is happening right now. In that sense, documentary technologies like the camera were part of the tradition of the Enlightenment and the scientific revolution. Forget assumptions or belief, they said — instead, focus on fact and evidence.

The trouble is, when fact and evidence are so easily dismissed — as many citations of the “eagle baby” video attest to — the basic structure of evidence leading to rational deduction also gets undercut. Instead of simply showcasing that images and video can be changed, the very existence of digital technologies capable of such manipulations brings to the fore another phenomenon: that how we relate to the world is as much about what we believe as what we see.

The net effect, then: when commenters suggest the Ford video is faked, it’s because this evidence doesn’t fit their ideological agenda — a situation neatly mirrored in those who believe the claims unquestioningly. That old chestnut about being entitled to one’s own opinion but not one’s own facts is a little less clear-cut than it used to be. What happens when, instead of arguing over viewpoints, people start debating the legitimacy of what, just a few years ago, you could safely call proof?

Thus far, Ford and his team have mostly reacted to the accusations with stony silence. I’m not sure they’re smart enough or cynical enough to be relying on what Gawker editor John Cook has called the “epistemological rabbit hole” that comes from trying to ascertain truth in a situation like this.

It doesn’t really matter, though — whether specific examples of media are proven to be authentic or false is, very strangely, irrelevant. The simple possibility that such referential evidence can be fake allows individuals to retreat into ideological bubbles of their own choosing. Because that which we’d once have called evidence is now yet another thing that may or may not be true, agreeing on facts — and forming social or political consensus around them — just gets that much harder. And as long as the mayor of Canada’s largest city stays silent or simply denies that the video exists, we are stuck in a no man’s land where what is true hardly matters at all.

Navneet Alang is a Toronto-based writer and a contributing editor to Hazlitt on technology and culture. This article was originally published on Hazlittmag.com.