JOYRIDE: STREET VIEW VIDEO FROM A STOLEN PHONE



My close friend and collaborator, Sue, had her iPhone stolen earlier this month. The thief had it for 5 days, after which he ransomed it back to her. In the meantime, he had it with him as he drove around LA, presumably looking for other opportunities to be an asshole.



Our phones, clearly, are really personal devices. When we talk about personal data, the mobile phone is as physical an embodiment of this as anything, a data-sensory appendage if you will. What does it mean, then, when we’ve been separated from the device? It feels like identity theft as much as the loss of valuable electronics.



So when Sue got it back, she felt a bit estranged from it. We wondered about the life her device had had away from her, which led her to use OpenPaths to take a look at where it had been. Sure enough, the thief’s home and haunts were pretty readily identifiable.



Sue had also seen the last video I'd made with OpenPaths and Google Street View, and we decided to make another one with her data. However, I wanted to take it a bit further. As fun as my first video attempt had been, it's a bit impressionistic — you just get this blitz of unconnected images. Sue's data, by contrast, had a very clear narrative behind it. We had a collection of points that the thief had visited with the phone, so I thought we should be able to get a smooth path between them.



First, I used the Google Directions API to map the likely route the thief would have taken between known locations, as well as filling in some intermediary points, which was @blprnt's idea from our earlier brainstorms. One of the cool things about the Street View panorama data (described by @jaimethompson) is that it includes the linkages between consecutive images taken by the Google car. So by calculating the heading from one point to the next and heuristically choosing links between panoramas headed in the right direction, we can access all the images taken along the way. Using the heading again, we can point the camera in the right direction, download the tiles we want, and stitch a frame together. Applying this to the thief's route, we got a complete reconstructed path that plays back much more like a continuous video than my previous experiment (it evens out after the frantic first 30 seconds).
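The heading-and-link-selection step can be sketched roughly as follows. This is a minimal illustration, not the actual code behind the video: the `pick_link` helper and the dict format for panorama links are my own assumptions (the Street View metadata exposes a heading for each link between adjacent panoramas, but the exact field names here are hypothetical).

```python
import math

def heading(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees (0 = north, 90 = east) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def pick_link(links, target_heading):
    """Heuristic: pick the panorama link whose heading is closest to the
    direction of travel (angular distance, wrapping around 360)."""
    def angle_diff(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return min(links, key=lambda link: angle_diff(link["heading"], target_heading))

# Walking a route would then repeat: compute the heading to the next
# route point, follow the best-matching link to the next panorama,
# and render a frame with the camera pointed along that heading.
```

The same heading value does double duty: it picks which neighboring panorama to step to, and it orients the virtual camera so consecutive frames face down the road.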



It’s a bit like if Google was driving the getaway car, starting downtown where the phone was stolen, and traveling over the city until it’s finally given back. Of course, we’re leaving out the pauses when he wasn’t moving, and the temporal displacement of Street View images makes this a kind of weird frankendata — while the video retains some relationship to the truth of the human interaction behind it, it remains a kind of data fiction.



Oh, and for those who prefer the written word, there’s always the driving directions.



Edit: some press love from Gizmodo and Flowing Data