Terence Broad’s Blade Runner sometimes looks a lot like the classic 1982 film. Sometimes it looks completely different.

His autoencoded version of Blade Runner is the film as a computer sees it – or, more specifically, as a computer sees it, remembers it, and then regurgitates it.

The film is being shown as part of the Barbican’s science fiction exhibition-meets-festival, Into The Unknown. And it’s perhaps the most cutting-edge of all the work featured there – not only being about science fiction, but being created in a way that sounds like it comes straight out of the work of Philip K Dick.

Broad’s project works by analogy with memory, using cutting-edge artificial intelligence to do so. It relies on an autoencoder – a neural network that compresses a large data sample, in this case individual frames of the film, into a tiny representation of itself, from which the original can later be reconstructed.


When it does so, a great deal has been lost in the shrinking. But strange things can be found in that reconstruction, too – the technology looks to make up for what it can’t remember by filling in the gaps.
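Broad’s actual networks are far larger and convolutional, but the compress-then-reconstruct loop he describes can be sketched with a toy linear autoencoder (all sizes, learning rates and data here are illustrative stand-ins, not taken from his project):

```python
import numpy as np

# Toy linear autoencoder: squeeze 64-pixel "frames" down to an
# 8-number code, then try to rebuild each frame from that code alone.
rng = np.random.default_rng(0)
frames = rng.random((200, 64))        # 200 tiny 8x8 greyscale "frames"

W_enc = rng.normal(0, 0.1, (64, 8))   # encoder weights: frame -> code
W_dec = rng.normal(0, 0.1, (8, 64))   # decoder weights: code -> frame

def reconstruct(x):
    return (x @ W_enc) @ W_dec        # encode, then decode

mse_before = np.mean((reconstruct(frames) - frames) ** 2)

lr = 0.01
for _ in range(2000):                 # plain gradient descent on MSE
    code = frames @ W_enc             # the tiny representation
    err = code @ W_dec - frames       # reconstruction error
    grad_dec = code.T @ err / len(frames)
    grad_enc = frames.T @ (err @ W_dec.T) / len(frames)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse_after = np.mean((reconstruct(frames) - frames) ** 2)
```

The point is only the shape of the loop: every frame is forced through a bottleneck far smaller than itself, so reconstruction improves with training but can never be perfect – some detail is necessarily thrown away.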

“The reconstructions are in no way perfect, but the project was more of a creative exploration of both the capacity and limitations of this approach,” wrote Broad in an introduction to his work that would later go viral.

In that way it seems remarkably and uncannily similar to human memory. It shrinks everything down and stores it away, so that it can be opened back up and relived with the gaps filled in at a later date.

And the idea was inspired by strange experiments with humans, too. One of the first inspirations was a talk by scientists who had managed to make an MRI machine reconstruct things that people were looking at, simply by looking at the patterns showing in their brains. It could literally see through other people’s eyes, by looking right into their heads.

But all of those human inspirations and influences are taken and turned into a work that is undeniably technological. If the MRI experiment showed us what people are watching from inside their heads, the autoencoded Blade Runner almost allows us to watch how a computer sees, peering inside its own brain in the same way.

It definitely remembers in a different way to how humans do. It’s terrible at recalling and reconstituting faces, for instance, and can’t recognise that the same face belongs to the same person and so needs to move in a straight line. It also appears to find it impossible to remember a black frame; because there are so few in the film, it doesn’t bother storing the black, and instead remembers it as an average of all the other parts of the film, throwing out a beautiful but decidedly unblack green image.

For now the limits of the project are where the interest is found, and the imperfections of the reconstruction make it a work of art. But theoretically computers could eventually become perfect at the work – watching, shrinking and then reconstituting the film as it actually is.

It all sounds eerily like a question that would plague the noir world of Blade Runner. But Blade Runner wasn’t always the aim. The film began as a project for a university course, and required learning techniques that are at the very forefront of AI and visual technology.

“Originally it was really just an experiment; the whole thing started out as a research project,” says Broad.

“For a long time I was training it on these videos of really long train journeys. After a couple of months I got really bored of looking at pictures of trains, so I thought it would be interesting to do it with Blade Runner.

“But I didn’t think it was going to work - I thought there’d be too much variety in the images. Normally, with the type of model I trained, you train it with lots of pictures of faces, or of bedrooms – lots of the same kind of thing. But then it handled it pretty well.

“It got into being an art project near the end.”

(Even once it became very much an aesthetic piece, it was still fascinating to some people as a technical exercise. When the work was posted onto Hacker News, a number of people were interested in whether Broad had developed a new kind of compression algorithm, apparently inspired by the TV series Silicon Valley. Broad is clear that’s not the case, since the programme uses a great deal of energy and “only works for Blade Runner at a very low resolution”.)

The choice of film happened by a kind of intentional coincidence, but it fits perfectly because the themes mesh so well. Blade Runner explores the edge of artificial intelligence, the beginnings of humanity and how to know the difference between the two; the autoencoded Blade Runner does the same thing, but with the film itself.

“I’d always had the idea [of Blade Runner] in the back of my mind. But I didn’t think it would work.

“But then as soon as we did it we saw that it obviously should be the sole focus for the project.”

Because the computer processes things over time – and takes a while to do it – the discovery that it would work well revealed itself gradually.

“When you’re training it, you would give it a batch of images of random frames. Then it would start giving you the output. So I was just looking at this output while it was training.

“I saw this image and saw you could recognise some of the scenes. But this was a really small resolution. So we saw this and then it was like, right, we need to kind of do this in order and remake the video.

“Then we did a little 10 minute sample. And it was kind of mind blowing, for me and my supervisor. I’ve got the original 10 minutes - it’s really noisy and really grainy.

“You can see what’s going on and it’s kind of mind-blowing. Then I thought - let’s just remake Blade Runner, the whole thing.”

That decision put the film squarely in the realm of science fiction – a decision that would see it sit among the greatest work of the genre in the Barbican exhibition. Not simply because it took such a seminal science fiction film, but also because it was a kind of science fiction itself, using brand new techniques to reprocess a film in a way that would be unimaginable and inexplicable to people even 10 or 20 years ago.

But Broad doesn’t seem to see his work so much in the history of science fiction as in the history of work using Blade Runner; his piece might be one of the most high-profile reworkings of the film, but it’s far from the first. (One of those was released only this month, putting the sound of the new sequel Blade Runner 2049 over the advert for the Google Home, to neat and chilling effect.)

Broad also likens it to other films that explored the very limits of film and technology as a medium – and of film production studios as copyright owners. Work like 24 Hour Psycho and The Clock both made heavy use of other films and skirted takedowns and copyright claims as they did so.

Despite having worked on a spectacular film about the dangers of AI (and making it even more spectacular), Broad isn’t concerned about the grand predictions of cinema – like those from Terminator.


“I’m not really troubled by Skynet super AIs taking over everything,” he says. “I think what’s more troubling is there’s lots of evidence that neural networks take on biases.”

That isn’t a conscious process – on the part of the AI or the person training it. It’s just a consequence of the fact that computers can only learn from what humans give them, and what humans give them will be only as good or bad as the people providing it.

“It’s just picking up on all the biases of the training data you’re giving it; it’s whatever’s inherent in the people” that have put the data together.

“Until you have some system that was really intelligent, that could empirically understand these things and correct itself,” it doesn’t seem like it would be possible to fix such a problem - “you’re always going to be trying to fit some kind of training data that people have labelled in some sense”.

People are always arguing for the transformative power of film, and its ability to make the people who watch it better. So couldn’t robots like Broad’s – with their ability to watch a film – eventually learn away their bad habits in the way that we hope people can?

Probably not, says Broad. At least not yet.

“The autoencoder thing is just working on images,” he says, talking of his own creation. “It doesn’t know any context about what’s going on in the film; it doesn’t have any capacity to understand that. It just understands patterns in images.”

But artificial intelligence has been developing at a stunning rate; it’s one of the rare fields of technology where predictions tend to be conservative and look small in their scope. So is it possible to imagine that such a computer could be generated in the near future, even if it’s not possible to imagine one based on what we have today?

No. Not really, says Broad. AI might be stunningly advanced and developing shockingly fast, but that shouldn’t be our concern for the time being.

“There’s still a lot of research going on,” he says. “People have been able to develop quite efficient ways of doing particular tasks.