I thought I’d take the time to update a post I made earlier with an animation of TTC buses and streetcars. The original post had the animation … and that’s pretty much it. The only takeaways were ‘hey, cool, they look like ants’ and ‘hmm… you can still get around at 3 am’. If that’s what you’re looking for, that’s fine. That video is still here, so please enjoy. Otherwise, I’ve added some material describing how the animation was made.

In the animation, red buses are slow (< 5 km/h), blue buses are faster (> 15 km/h), and yellow is in between.

The Inspiration for the Animation of TTC Buses and Streetcars

I’ve always liked visualizations that animate flights or shipping. As an occasional straphanger on the TTC, I thought I’d make something similar for Toronto transit. I know a little Python, so I thought it would be relatively straightforward. Yeah, I was wrong.

Finding the Data

Finding the data turned out to be the easy part. Luckily, it came early in the project. The City of Toronto has a good open data website, and the real-time location data is made available in a series of well-documented XML feeds. So, knowing where the data is, it’s just a matter of collecting it.

Collecting the Data

The real-time location data, by definition, tells you where the buses are at the moment the XML feed is accessed. To make the frames for the animation, I needed vehicle locations over the course of an entire day, which means collecting data for at least a day. To do that I used a Python script that retrieved information from the XML feed at one-minute intervals. The script parsed the XML and extracted the values necessary for the animation: the vehicle ID, latitude, longitude, the route it’s on, and the time at which the data is applicable (sometimes the feed is behind).
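As a rough sketch of that collection step (the function names are mine, the feed URL follows NextBus’s public XML feed, and the attribute names are my reading of its documentation – treat them as assumptions to check against the spec):

```python
import urllib.request
import xml.etree.ElementTree as ET

# NextBus public XML feed for vehicle locations (agency 'ttc');
# t=0 asks for all fixes from the last 15 minutes
FEED = ("http://webservices.nextbus.com/service/publicXMLFeed"
        "?command=vehicleLocations&a=ttc&r={route}&t={t}")

def fetch_locations(route, t=0):
    """Download one route's vehicle-location feed as XML text."""
    with urllib.request.urlopen(FEED.format(route=route, t=t)) as resp:
        return resp.read().decode("utf-8")

def parse_locations(xml_text):
    """Extract (vehicle id, route, lat, lon, report time) from a feed response."""
    root = ET.fromstring(xml_text)
    last_ms = int(root.find("lastTime").get("time"))  # feed's clock, ms since epoch
    records = []
    for v in root.iter("vehicle"):
        # secsSinceReport says how stale the fix is, so the time the data
        # is actually applicable is the feed time minus that lag
        when = last_ms / 1000 - int(v.get("secsSinceReport", 0))
        records.append((v.get("id"), v.get("routeTag"),
                        float(v.get("lat")), float(v.get("lon")), when))
    return records
```

Looping over the route tags with a `time.sleep(60)` between passes and appending the records to a file gives a day’s worth of positions.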

As it turns out, although I requested the real-time data every minute, it took a few minutes to work through the XML feeds for all the routes. As a result, the data is coarser in time than I had originally planned for.

Having figured out how to get the real-time locations, I needed a backdrop – something to plot the positions onto, like the bus routes or a map of Toronto. I settled on the routes because that information was also in the XML feed, and it happened to be presented in a similar format, which meant I could reuse a lot of my existing code. So I collected that from the NextBus API as well.
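For reference, the route geometry comes back from the same API via the `routeConfig` command, as a set of `<path>` elements full of `<point>`s. A minimal parser (function name mine, element structure per the NextBus docs as I understand them) looks much like the location parser:

```python
import xml.etree.ElementTree as ET

def parse_route_paths(xml_text):
    """Collect each <path> in a routeConfig response as a list of (lat, lon) points."""
    root = ET.fromstring(xml_text)
    return [[(float(p.get("lat")), float(p.get("lon")))
             for p in path.iter("point")]
            for path in root.iter("path")]
```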

Making the Frames

The first thing I did was experiment with plotting the backdrop – the routes that the animated buses and streetcars would travel. After experimenting with some line widths it looked pretty good.

I used unprojected longitude and latitude as x and y coordinates. Where Toronto is positioned on our globe the map doesn’t look too distorted – I’ll have to learn basemap later. Anyways, it’s definitely recognizable to anyone familiar with the TTC system map.
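The backdrop plot itself is only a few lines of matplotlib. This is a sketch rather than the exact script – the grey and the line width are just what looked reasonable:

```python
import matplotlib
matplotlib.use("Agg")  # render straight to files, no display needed
import matplotlib.pyplot as plt

def draw_backdrop(paths, outfile="backdrop.png"):
    """Plot every route polyline as a thin line, with lon as x and lat as y."""
    fig, ax = plt.subplots(figsize=(8, 8))
    for pts in paths:
        lats, lons = zip(*pts)
        ax.plot(lons, lats, color="0.6", linewidth=0.5)
    ax.set_aspect("equal")  # keep degrees square-ish; no real projection
    ax.axis("off")
    fig.savefig(outfile, dpi=150, bbox_inches="tight")
    plt.close(fig)
```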

So from here the rest should be simple. It’s just a matter of putting markers where the buses are throughout the day, making the plot generation algorithmic, and putting it all together into an animation. Yup. Real simple… It’s done… Let’s take a look here… aaaand there’s a bus in Lake Ontario. Maybe the driver is just lost?

Scrubbing the Data

Okay, so the driver probably isn’t lost, because something similar happens a few times. It turns out that the GPS data sometimes reports buses as being somewhere they are not. The data needs some scrubbing. To deal with this issue, I need a way of rejecting obviously misreported positions. If I know the latitude and longitude of the reported vehicle location, and the latitude and longitude of the straight-line segments that make up the route it’s supposedly on, then I can exclude vehicle positions that are too far off the route. How far is too far? Well, a few hundred meters sounds reasonable. I ended up using half a kilometer. Luckily, ways of converting latitude and longitude pairs to distances are readily available, and remembering the required linear algebra, while not quite like riding a bike, does come back.
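A sketch of that rejection test, assuming an equirectangular approximation to turn degrees into meters (good enough at this scale) followed by a standard point-to-segment distance, with the half-kilometer threshold mentioned above:

```python
import math

M_PER_DEG = 111_320  # meters per degree of latitude, roughly

def to_xy(lat, lon, lat0):
    """Project a lat/lon pair to local meters (equirectangular approximation)."""
    return (lon * M_PER_DEG * math.cos(math.radians(lat0)), lat * M_PER_DEG)

def dist_to_segment(p, a, b):
    """Euclidean distance from point p to segment a-b (all in meters)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # parameter of the closest point on the infinite line, clamped to [0, 1]
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def on_route(lat, lon, paths, max_m=500):
    """True if the position is within max_m meters of any segment of the route."""
    p = to_xy(lat, lon, lat)
    for pts in paths:
        xy = [to_xy(la, lo, lat) for la, lo in pts]
        for a, b in zip(xy, xy[1:]):
            if dist_to_segment(p, a, b) <= max_m:
                return True
    return False
```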

A test of the rejection process using only one route seems to work. In the GIF, the big markers are buses that are too far from their route and will be excluded; the small markers are buses close enough to their route to be considered, well, on their route. It’s not perfect. In the GIF you can see one bus that is momentarily considered ‘on its route’ even though it isn’t – it’s just crossing a street on its route and doesn’t get excluded when it’s near the intersection. The process is good enough for the purposes of the animation, though. And now that I can readily estimate distances, I can also pick colours for the buses based on their computed speed.
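The colour mapping follows the legend at the top of the post – red under 5 km/h, blue over 15 km/h, yellow in between – with the speed computed from two consecutive fixes. A sketch, using the haversine formula for the distance:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_colour(lat1, lon1, t1, lat2, lon2, t2):
    """Colour a marker by the average speed between two fixes (times in seconds)."""
    km_h = haversine_m(lat1, lon1, lat2, lon2) / 1000 / ((t2 - t1) / 3600)
    if km_h < 5:
        return "red"      # crawling
    if km_h > 15:
        return "blue"     # moving well
    return "yellow"       # in between
```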

Reading Between the Lines

After remaking the frames excluding the supposedly amphibious buses, everything should look good. Crumbs. It doesn’t. I put the frames together with FFmpeg and the result is choppy: the time between the real-time bus locations I collected is too large. But since it takes more than a minute for my script to go through the XML feeds, there isn’t much I can do on the collection end. I have to create information between the times I have measurements for. To make things smoother I decided to interpolate positions. It works pretty well, even for the more serpentine routes where I was worried it would result in markers looking like they’re cutting corners. As a bonus, interpolating the positions lets me set the time that elapses between frames to whatever works best for the animation.
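The interpolation itself is just linear interpolation of latitude and longitude against time, one vehicle at a time. A sketch with NumPy (`np.interp` wants the sample times in increasing order):

```python
import numpy as np

def interpolate_track(times, lats, lons, frame_times):
    """Linearly interpolate a vehicle's position at each frame time.

    times must be increasing; frame_times can have any spacing, which is
    what lets the animation's frame interval be chosen freely.
    """
    return (np.interp(frame_times, times, lats),
            np.interp(frame_times, times, lons))
```

For a frame every five seconds between the first and last fix, `frame_times = np.arange(times[0], times[-1], 5)` does the trick.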

Lesson Learned

I started adding the code for the interpolation, and in the process of debugging I ran into an error I hadn’t seen before. The routes have changed. I was still relying on the XML feed to get the route information; I don’t have it locally, and it’s now inconsistent with the data I collected. I have to start over.

Lesson learned: Don’t rely on the online data remaining unchanged and/or available. Whenever possible get yourself a local copy.

Putting the Frames Together

After I repeated the process of making the frames, it was pretty smooth sailing – just a matter of smashing the frames together with FFmpeg. There was some trial and error when it came to frame rates, total time, and video quality, but it was relatively painless.
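For the record, the stitching step amounts to a single ffmpeg invocation over the numbered frame files. The flags below are the usual ones for turning PNG frames into an H.264 video; the frame pattern and output name are placeholders:

```python
import subprocess

def ffmpeg_cmd(pattern="frames/frame_%05d.png", fps=30, out="ttc.mp4"):
    """Build the ffmpeg invocation that stitches numbered frames into a video."""
    return ["ffmpeg", "-y",
            "-framerate", str(fps),   # input frame rate
            "-i", pattern,            # numbered frame files
            "-c:v", "libx264",        # H.264 output
            "-pix_fmt", "yuv420p",    # widely playable pixel format
            out]

# subprocess.run(ffmpeg_cmd(), check=True)  # run once the frames exist
```

The frame rate here was the main trial-and-error knob: together with the simulated time between frames, it sets how fast a day goes by on screen.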

A Few More

Like most projects, this one took a little longer and was a little harder than I originally thought. That’s okay. I’m hoping the Python code will prove general enough that I can squeeze a little more content out of it. So, next stop: Boston…
