To validate this assumption, we filter our data by a single (popular) CPU - as we’re CPU-limited, the GPU usually doesn’t have an effect on overall frame time.
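The kind of filtering described above can be sketched as follows. This is a minimal, illustrative example - the record fields (`cpu`, `cpu_ms`, `gpu_ms`) and values are hypothetical, not our actual telemetry schema:

```python
from collections import Counter
from statistics import mean

# Hypothetical per-game telemetry records; field names are illustrative.
games = [
    {"cpu": "i5-8400",  "cpu_ms": 8.1, "gpu_ms": 6.0},
    {"cpu": "i5-8400",  "cpu_ms": 8.4, "gpu_ms": 6.2},
    {"cpu": "i7-7700K", "cpu_ms": 6.9, "gpu_ms": 5.8},
    {"cpu": "i5-8400",  "cpu_ms": 8.2, "gpu_ms": 6.1},
]

# Keep only games on the single most common CPU in the sample.
most_common_cpu, _ = Counter(g["cpu"] for g in games).most_common(1)[0]
filtered = [g for g in games if g["cpu"] == most_common_cpu]

avg_cpu_ms = mean(g["cpu_ms"] for g in filtered)
```

Holding the CPU constant removes the biggest hardware variable, so any remaining movement in `avg_cpu_ms` over time is more likely to come from code and data changes.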

This is what we saw. The top plot shows the number of players, the middle plot CPU frame time in ms, and the bottom plot GPU frame time in ms, all by date.

Previously we’d measured an improvement in frame time of 4.2ms, but here we’ve measured a much more sensible change of 0.5ms. The CPU graph is also less noisy overall; the variance seems to be around +/- 1.5ms. Note the different colours in the top “Number of Players” plot - each colour indicates a new deploy of code and data to PBE. The GPU time seems even smoother. Looking closer, the 0.5ms dip in CPU time seems to be due to a relatively small sample size on the first day of the transition to the new patch (the deploy happened later in the day), which introduced more noise.
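Measuring a step like this amounts to comparing mean frame times on either side of the deploy boundary. A sketch with made-up daily means (the real data is per-game, per-date telemetry):

```python
from statistics import mean, pstdev

# Illustrative daily mean CPU frame times (ms) on either side of a deploy.
before_deploy = [8.3, 8.5, 8.1, 8.4]
after_deploy  = [7.9, 7.8, 8.0, 7.9]

# Size of the step across the deploy boundary.
delta_ms = mean(before_deploy) - mean(after_deploy)

# Rough spread of the daily means, for judging whether the step is real.
noise_ms = pstdev(before_deploy + after_deploy)
```

A step that is small relative to the day-to-day spread (as on the low-sample first day of a patch) should be treated with suspicion.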

Checking the Next Patch Cycle

This is promising; let’s extend the domain to look at more data. Looking at the next patch cycle, we notice even less of a discontinuity.

This is all looking good now. There’s still some noise in the data, but given that this is PBE with a much smaller number of players, this could be acceptable. But how accurate are these values? How can we verify that these numbers are correct?

As luck would have it, patch 8.13 included a fix for a stall on the CPU. This stall was exactly 2ms - a great test for the accuracy of our data. If we see a 2ms change in our data, then we can be pretty confident that it’s reliable.

The improvement in frame time from patch 8.12 to 8.13 was 1.93ms - which is close enough to 2.0ms for me to finally have confidence in our data.
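One way to judge whether a measured difference like 1.93ms is “close enough” to 2.0ms is to compare it against the sampling noise. A sketch with synthetic per-game numbers (not the real patch data):

```python
from math import sqrt
from statistics import mean, stdev

# Synthetic per-game CPU frame times (ms) on each patch; for illustration only.
patch_812 = [10.1, 10.3, 9.8, 10.2, 10.0, 10.4]
patch_813 = [8.2, 8.1, 8.4, 8.3, 8.0, 8.2]

improvement_ms = mean(patch_812) - mean(patch_813)

# Standard error of the difference of two independent sample means.
se = sqrt(stdev(patch_812) ** 2 / len(patch_812)
          + stdev(patch_813) ** 2 / len(patch_813))
```

If the gap between the measured improvement and the known 2ms fix is within a few standard errors, the measurement is consistent with the fix.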

Deciding What to Measure

As with all good problems, in retrospect the answer is obvious. Different CPUs run League at different speeds. A low-end machine runs slower than a high-end machine, so the distribution of machine specs has an impact on average frame times. If more low-spec machines are playing the game at a given time, the average frame time will be higher; if more high-spec machines are playing, it will be lower. This explains the diurnal behaviour of average frame times: the distribution of CPUs varies by time of day (which is very interesting in itself), so at certain times of day there are more fast machines playing than at others. In order to collect accurate information, we need to filter by CPU (and possibly GPU) power. Given that these factors affect the frame times we measure, we have to ask ourselves, “What else could have an impact on our frame times? What else should we be measuring?”
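The diurnal effect is worth making concrete: even if every machine’s own frame time never changes, the population average moves as the mix of machines changes. A toy model with two hypothetical CPU tiers:

```python
# Two hypothetical CPU tiers with fixed per-tier average frame times (ms).
FAST_MS, SLOW_MS = 7.0, 14.0

def overall_average(fast_share):
    """Population mean frame time for a given share of fast machines online."""
    return fast_share * FAST_MS + (1.0 - fast_share) * SLOW_MS

evening = overall_average(0.7)  # more fast machines playing
morning = overall_average(0.3)  # more slow machines playing
```

No per-machine performance changed between `morning` and `evening`, yet the averages differ - which is exactly why the unfiltered daily curves wobble, and why filtering by CPU is necessary.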

Frame Time Distribution

To answer that, let’s take a step back and look at the distribution of average frame times. A histogram of frame times might show us outliers or unexpected modalities. The graph below is that histogram: fast frames on the left, slow frames on the right - and the higher the bar, the more games had that average frame time.

This graph is surprising - I was hoping for a nice, smooth Gaussian curve, but instead we see spikes in a few places: 7ms, 9ms, 12ms, and 16ms. Those frame times correspond roughly to 144Hz, 120Hz, 80Hz, and 60Hz, and indicate that there are concentrations of players at those frame rates. This is, of course, the result of the game’s frame rate cap setting, which lets players lock their frame rates to 144fps, 120fps, 80fps, or 60fps.
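The bucketing and the spike-to-refresh-rate mapping can be sketched like this, using synthetic frame times (the conversion is just fps = 1000 / frame time in ms):

```python
from collections import Counter

# Illustrative per-game average frame times (ms); real data comes from telemetry.
frame_times = [6.9, 7.0, 7.1, 9.0, 9.2, 12.1, 16.6, 16.7, 16.8, 8.3]

# Bucket into 1 ms bins: bin k counts games with average time in [k, k+1) ms.
histogram = Counter(int(t) for t in frame_times)

def ms_to_fps(ms):
    """A frame time of `ms` milliseconds corresponds to 1000/ms frames per second."""
    return 1000.0 / ms

# In this synthetic sample the tallest bin is the 16 ms spike, i.e. ~60 fps.
peak_bin = max(histogram, key=histogram.get)
peak_fps = ms_to_fps(peak_bin + 0.5)  # use the bin centre
```

Mapping each spike’s bin centre through `ms_to_fps` is how you read off that the clusters sit near common refresh rates and cap values.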

If we colour the histogram for those players who lock their frame rates we see the following:

Here, we can clearly see the number of players that have capped their frame rates. If we remove those players then the histogram becomes smoother, but there’s still a big spike around 60Hz.

Gathering More Data

To try to determine what other factors are at play here, we collected lots of different information that could possibly have an impact on frame times and plotted it.

This mess of coloured boxes shows the frame time histogram (top left) as well as what each player’s settings were - their screen resolution and display mode, what CPU they used, vsync state, antialiasing state, frame cap settings, and all the graphics quality settings, as well as game modes and memory available. In this view we can select any or all of these different settings to filter by them.

The view below shows the frame time distribution for all games played at 1920x1080 with AA on and vsync off, frame capping off, graphics quality set to “Medium High,” for all URF games running on the most common CPUs.
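That kind of multi-setting selection is just a conjunction of filters over per-game settings records. A sketch with hypothetical field names and values:

```python
# Hypothetical per-game settings records; field names and values are illustrative.
games = [
    {"res": "1920x1080", "aa": True,  "vsync": False, "cap": None,
     "quality": "Medium High", "mode": "URF", "avg_ms": 9.1},
    {"res": "1920x1080", "aa": True,  "vsync": True,  "cap": None,
     "quality": "Medium High", "mode": "URF", "avg_ms": 16.6},
    {"res": "1280x720",  "aa": False, "vsync": False, "cap": 60,
     "quality": "Low", "mode": "SR", "avg_ms": 12.0},
]

def keep(g):
    """1080p, AA on, vsync off, no frame cap, Medium High quality, URF games."""
    return (g["res"] == "1920x1080" and g["aa"] and not g["vsync"]
            and g["cap"] is None and g["quality"] == "Medium High"
            and g["mode"] == "URF")

selected = [g for g in games if keep(g)]
```

Each added predicate shrinks the sample but removes a confound - the trade-off behind every filter choice in the dashboard.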

Check out the top left chart - this selection of settings gives us a nice Gaussian curve! Well, almost - there are still some games running at 60fps, but that doesn’t seem to be League-related. It could be hardware-related - for example, the GPU locking to 60Hz due to some external setting (if you have any ideas, please let me know).