Imagine a world where the narrative, background music, colour grade and feel of a drama are shaped in real time to suit your personality. This is called Visual Perceptive Media, and we are making it now in our lab in MediaCityUK.

What we're doing

Imagine being able to inspire everybody with more personal, unique content which reflects the diversity of the audience.

Broadcasting over IP enables us to create all kinds of new content experiences that would not be possible or scalable on "traditional" TV or radio. Among the new content experiences we're researching is personalised video, tailored to individual users. We're investigating how to create personalised media which feels natural to the audience and exciting for the storyteller, while scaling to millions of individual audience members.

How it works

Visual Perceptive Media is a film which changes based on the person who is watching it. Rather than drawing on sensor data to profile the environment, it focuses on the viewer themselves. A companion phone application builds a profile of the user and their preferences from their music collection and a set of personality questions. That data then informs which media assets are used, in what order, which real-time effects are applied, and when. Cinematic effects twist the story one way or another.
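To make the idea concrete, here is a minimal sketch of how a viewer profile might drive an edit decision. Everything here is invented for illustration: the real system's data model, asset names and selection rules are not described in this post.

```python
# Illustrative sketch only; all names and rules below are hypothetical.
from dataclasses import dataclass

@dataclass
class ViewerProfile:
    favourite_genre: str   # e.g. inferred from the music collection
    openness: float        # 0.0-1.0, from personality questions

# Hypothetical asset pools keyed by mood.
MUSIC_TRACKS = {"calm": "ambient_score.ogg", "tense": "driving_score.ogg"}
COLOUR_GRADES = {"calm": "warm_low_contrast", "tense": "cool_high_contrast"}

def build_edit_decision(profile: ViewerProfile) -> dict:
    """Map a viewer profile to a per-viewer edit decision."""
    # Invented rule: energetic music tastes bias towards a tenser treatment.
    mood = "tense" if profile.favourite_genre in ("electronic", "rock") else "calm"
    # Invented rule: higher openness allows a more non-linear scene order.
    scene_order = [1, 3, 2, 4] if profile.openness > 0.6 else [1, 2, 3, 4]
    return {
        "music": MUSIC_TRACKS[mood],
        "grade": COLOUR_GRADES[mood],
        "scenes": scene_order,
    }

decision = build_edit_decision(ViewerProfile("electronic", 0.8))
```

The point of the sketch is the shape of the pipeline, profile in, per-viewer edit decision out, assembled on the fly from reusable media objects, not any particular rule.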

Edit: To clarify, the "female/male bias" element we refer to in the film above is about presenting variations that encourage audiences to empathise more with one character or another. (The system does not show women one version of the film, and men another.)

Outcomes

We have conducted initial research with a small number of people to assess whether they perceive the film as a coherent whole or whether they notice it is made up of media objects being delivered on the fly. We're now building a public prototype to test our ideas with a wider audience. The project will also test the client-side video capabilities of the browser as a platform. Watch this space!