Last year, Flowingdata came up with a dynamic visualization of the average day in the life of Americans. This visualization was based on data from the American Time Use Survey, which asks respondents how much time they spend on various activities throughout their day.

We at Sentiance thought this was a cool idea, so we decided to shamelessly copy it, with a little twist: instead of relying on survey data, i.e. user-declared activities, we decided to use mobile sensor data as a proxy for observed behavior.

We had a few thousand users install our mobile SDK, and used machine learning to automatically detect their home and work locations, their transport mode and real-time context, and the venue types they visited. All of this was learned automatically by fusing mobile sensor data such as accelerometer and gyroscope streams with the location subsystem.

Based on the resulting timelines, we trained a time-varying Markov chain model and simulated a full 24-hour day for 1000 typical users, similar to the approach of Flowingdata. We then fed our data into the Flowingdata visualization and decided to share the result with you.
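To make the simulation step concrete, here is a minimal sketch of how a time-varying Markov chain can generate such synthetic daily timelines. The activity states and the hand-tuned, time-dependent transition probabilities below are illustrative stand-ins; the actual model uses the activity labels and probabilities learned from our users' timelines.

```python
import random

# Hypothetical activity states; the real model uses the venue,
# transport, and context labels detected by the SDK.
STATES = ["sleep", "home", "commute", "work", "leisure"]

def transition_probs(state, minute):
    """Toy time-varying transition probabilities P(next | current, t).

    In a time-varying Markov chain, the transition matrix depends on
    the time of day. These numbers are hand-tuned stand-ins for the
    probabilities estimated from real timelines.
    """
    hour = minute // 60
    if state == "sleep":
        p_wake = 0.02 if 6 <= hour < 9 else 0.001
        return {"sleep": 1 - p_wake, "home": p_wake}
    if state == "home":
        if 7 <= hour < 10:
            return {"home": 0.95, "commute": 0.05}
        if hour >= 22:
            return {"home": 0.90, "sleep": 0.10}
        return {"home": 0.98, "leisure": 0.02}
    if state == "commute":
        destination = "work" if hour < 12 else "home"
        return {"commute": 0.90, destination: 0.10}
    if state == "work":
        if 16 <= hour < 20:
            return {"work": 0.95, "commute": 0.05}
        return {"work": 0.99, "leisure": 0.01}
    # leisure
    return {"leisure": 0.97, "home": 0.03}

def simulate_day(n_users=1000):
    """Simulate each user's activity state for every minute of a day."""
    timelines = []
    for _ in range(n_users):
        state, timeline = "sleep", []
        for minute in range(24 * 60):
            probs = transition_probs(state, minute)
            state = random.choices(list(probs), weights=list(probs.values()))[0]
            timeline.append(state)
        timelines.append(timeline)
    return timelines
```

Each simulated user starts asleep and, minute by minute, draws their next state from a distribution that shifts with the clock, which is what lets the population wake up, commute, and wind down at realistic times.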

In the 24-hour simulation below, you can see how people allocate their time to different activities during the day and night. Basically, we get a peek into people's daily schedules on a typical weekday: when they get up, what they do and where they go throughout the day, and when they check out and go to sleep.

Every dot represents a person; the dots move and change color depending on the activity throughout the day.