This is a guest post from Rob Vollman, notably of Hockey Prospectus, ESPN, and Bleacher Report, and creator of the player usage charts. He worked with Christophe Perreault and me to create and fine-tune this metric, but there's still work to be done. Feel free to give us your feedback and suggestions in the comments.

How Far Does Team Shooting Percentage Regress?

In the analytic world it's typically assumed that a team's shooting percentage will move towards the league average over time, and for well-established reasons, but is that always correct? Perhaps Chicago's natural resting point is a little higher than Florida's, and perhaps not all of a team's excess (or shortfall) over the short term should be chalked up to "luck" or other temporary factors.

It was a great deal of work, but we have devised a way to calculate a team's expected shooting percentage based on the previous shooting percentages of its players. And indeed Chicago could have been expected to score on 9.46% of their shots, third best in the NHL, while Florida was dead last at 7.89%.

We'll explain the methodology and dive into the results in a moment, but first let's address the idea of assuming a team's shooting percentage will gravitate to the league average in the first place. At first glance it just doesn't seem intuitive that every team's shooting percentage will regress towards the same number, but there is a well-documented basis to this practice.

The idea was first popularized by blogger Jlikens of Objective NHL, who found that random variation alone could account for virtually the entire difference between a team's (even strength) shooting percentage and the league average, leaving almost no room for a team's skill or shot quality. It was a surprising result that blogger Vic Ferrari also confirmed on his Irreverent Oiler Fans blog over five years ago.

A lot of work has been done since then that confirms the validity of regressing a team's shooting percentage to the league average. Quite frankly, there are very few analytic concepts that are as well-established. We are very likely going to waste our time, but onwards ho.

What is Expected Shooting Percentage?

The methodology was time-consuming, but really quite simple. We took the past nine seasons, all the way back to the 2005 lockout, and worked out each team's expected shooting percentage based on each of its players' individual shots and previous career shooting percentages.

For example, for Montreal you would multiply Max Pacioretty's 270 shots by his previous career shooting percentage of 9.83% to get roughly 27 expected goals. Do this for all the Canadiens and you get an expected number of goals, their actual number of shots, and therefore an expected shooting percentage for the entire team.

There are a few complexities, however. The biggest challenge was deciding what to do with rookies and other players with fewer than (say) 100 shots. Eventually we arbitrarily settled on assuming that players with limited NHL experience would score at 75% of the league average for their position.
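As a rough sketch, the per-player calculation described above looks something like the following in Python. The roster entries are illustrative placeholders; the 100-shot cutoff and the 75% rookie rule come from the text, and the league average is the 2013-14 figure quoted later in this post.

```python
# Illustrative sketch of the team expected shooting percentage calculation.
# Roster data is made up; the cutoff and rookie rule follow the article.

LEAGUE_AVG = 0.0889        # 2013-14 league-wide shooting percentage
ROOKIE_FACTOR = 0.75       # limited-experience players assumed to shoot at 75% of average
MIN_CAREER_SHOTS = 100     # below this, treat the player as a rookie

# (player, shots this season, career shots, career shooting percentage)
roster = [
    ("Pacioretty", 270, 1100, 0.0983),  # established shooter
    ("Rookie A",    80,   40, None),    # too little history to trust
]

def expected_goals(shots, career_shots, career_sh_pct):
    """Expected goals from a player's shot total and shooting history."""
    if career_sh_pct is None or career_shots < MIN_CAREER_SHOTS:
        return shots * LEAGUE_AVG * ROOKIE_FACTOR
    return shots * career_sh_pct

total_shots = sum(shots for _, shots, _, _ in roster)
total_xg = sum(expected_goals(s, cs, pct) for _, s, cs, pct in roster)
team_expected_sh_pct = total_xg / total_shots
print(f"Expected team Sh%: {team_expected_sh_pct:.2%}")
```

Summing expected goals and dividing by actual shots, rather than averaging each player's percentage, weights every shooter by his shot volume, which is what the team-level method described above requires.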

Rookies weren't the only ones with whom we had issues. Veterans had their expected shooting percentages overestimated for two reasons. First, shooting percentages have dropped over the years, sliding from 10.09% in 2005-06 to 8.89% this year. Consequently only two out of every nine teams actually met expectations, an obvious flaw with our first pass. With our second pass we adjusted everyone's shooting percentages to match the current season's league average. That gave Pacioretty an era-adjusted career shooting percentage of 9.69%, for example.
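The era adjustment amounts to rescaling each career mark by the ratio of the current season's league average to the league average over the player's career. A minimal sketch, where the ~9.02% career-window average is an inferred assumption rather than a number from the article:

```python
def era_adjust(career_sh_pct, career_era_league_avg, current_league_avg):
    """Rescale a career Sh% so it is expressed relative to the
    current season's league average rather than the player's era."""
    return career_sh_pct * (current_league_avg / career_era_league_avg)

# Pacioretty's 9.83% career mark, against an assumed ~9.02% league average
# over his career window and the 8.89% 2013-14 average, lands near the
# 9.69% era-adjusted figure quoted above.
adjusted = era_adjust(0.0983, 0.0902, 0.0889)
```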

The second problem with the veterans is that our approach doesn't factor in the natural drop in shooting percentage as players get older. Teams with lots of veterans will therefore have their expected shooting percentage overestimated. Applying an age curve could be a useful future refinement.

There are other problems of course, such as using each player's overall shooting percentage rather than separating it by manpower situation, but quite frankly it took forever just to get this far. These (and other) shortcomings notwithstanding, this is at the very least a reasonable basis for our first steps.

What Were the Results?

Edmonton surprisingly led the league in expected shooting percentage last year at 10.13%. They were followed by Montreal, Chicago and Pittsburgh. The highest post-lockout expectation was for the 2007-08 Pittsburgh Penguins (11.53%), with only the 2005-06 Avalanche, 2005-06 Red Wings, 2006-07 Senators and 2005-06 Oilers over 11.0%.

And once again the Florida Panthers were dead last in 2013-14 with a 7.89% expected shooting percentage, but it was surprising to see San Jose second last at 8.00%. The only post-lockout teams to fare worse than this year's Panthers were the 2007-08 Panthers (7.72%) and the 2007-08 Coyotes (7.77%).

How big is the gap between expected and actual goals? It can be huge, actually. The 2008-09 Boston Bruins, for example, enjoyed 67.9 more goals than one would have expected based on each player's previous career shooting percentages. That's the year everybody had crazy career seasons. As a team they scored on 10.88% of their shots rather than the expected 8.14%, or league average 9.43%. The following season they scored on only 7.54% of their shots.

Last season the biggest overachievers were the Colorado Avalanche, who scored an extra 41.6 goals by scoring on 10.12% of their shots instead of an expected 8.40% (or league average 8.89%).

On the flip side the 2012-13 San Jose Sharks missed the mark by 55.1 goals, if you extrapolate to an 82-game season, scoring on just 7.60% of their shots instead of an expected 9.71% (or league average 9.11%).

Last year the greatest underachievers were obviously the Buffalo Sabres, who scored on just 6.96% of their shots instead of an expected 9.13%, costing them 46.8 goals.

Is Expected Shooting Percentage Better Than Using the League Average?

While there are a lot of interesting research topics that can spin off from this data, the most interesting ones are those that might challenge our traditional use of the league average as each team's resting point.

For instance, is the relationship with a team's actual shooting percentage closest with the league average, or this new expected shooting percentage? Do those with higher expected shooting percentages achieve better actual results than those with the lowest? And is the expected shooting percentage a better predictor of the following year's shooting percentage than the league average?

This is admittedly a very preliminary and partially arbitrary way of calculating the expected shooting percentage, but if it truly is a superior option then it ought to have at least some advantage over the league shooting average, an advantage that could be further widened with a more sophisticated model that adds age curves, separates manpower situations, or handles rookies more effectively.

So what are the results? Well, the good news is that the correlation between a team's actual shooting percentage and its expected shooting percentage (0.45) is stronger than its correlation with the league average (0.40). The difference is incredibly small, though.
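For the record, the comparison is a pair of simple Pearson correlations over all team-seasons. A sketch of the computation, with placeholder arrays standing in for the real set of 270 post-lockout team-seasons:

```python
import numpy as np

# Each position is one team-season; these values are illustrative
# placeholders, not the real data set.
actual   = np.array([0.1088, 0.0754, 0.1012, 0.0696, 0.0760, 0.0943])
expected = np.array([0.0814, 0.0900, 0.0840, 0.0913, 0.0971, 0.0920])
league   = np.array([0.0943, 0.0911, 0.0889, 0.0889, 0.0911, 0.0943])

# np.corrcoef returns the 2x2 correlation matrix; [0, 1] is the off-diagonal r.
r_expected = np.corrcoef(actual, expected)[0, 1]
r_league   = np.corrcoef(actual, league)[0, 1]
print(f"r(actual, expected) = {r_expected:.2f}, r(actual, league avg) = {r_league:.2f}")
```

Note that the league-average predictor still varies across the sample because each season has its own league average; with a single constant value the correlation would be undefined.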

Check out the chart of all the post-lockout teams, some of which are labeled for reference. The horizontal axis is the expected shooting percentage and the actual shooting percentage is on the vertical axis. Even to the eye there is somewhat of a relationship between the two, with actual shooting percentage rising along with expected shooting percentage. The main difference is that the range of actual results is much wider than the range of expected ones.

I divided all 270 teams into five buckets of 54 teams apiece based on where they ranked in expected shooting percentage. The average expected shooting percentage of each bucket in decreasing order was 10.32%, 9.62%, 9.27%, 8.91% and 8.39% respectively. My hope was that the actual shooting percentage of each bucket would match, or at the very least be in the same order.

The results? 9.96%, 9.45%, 9.21%, 9.02% and 8.89% respectively. They did decrease in order, though each was pulled a little closer to the average. If there was nothing to it, these would all be roughly the same, or possibly all over the place; instead there's a clear downward pattern.
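The bucketing itself is straightforward. A sketch, assuming `teams` is a list of (expected Sh%, actual Sh%) pairs for all 270 team-seasons:

```python
def bucket_averages(teams, n_buckets=5):
    """Rank team-seasons by expected Sh%, split into equal buckets,
    and return (avg expected, avg actual) for each bucket, best first."""
    ranked = sorted(teams, key=lambda t: t[0], reverse=True)
    size = len(ranked) // n_buckets
    averages = []
    for i in range(n_buckets):
        bucket = ranked[i * size:(i + 1) * size]
        avg_expected = sum(e for e, _ in bucket) / len(bucket)
        avg_actual = sum(a for _, a in bucket) / len(bucket)
        averages.append((avg_expected, avg_actual))
    return averages
```

With 270 team-seasons and five buckets, each bucket holds the 54 teams described above.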

Before we hurt our hands slapping high-fives, the bad news is that it's a lousy way to try to predict the following year's team shooting percentage. The following season's shooting percentage for each bucket was 9.39%, 9.20%, 9.34%, 9.06% and 8.97% respectively. The league average shooting percentage has a stronger correlation with a team's shooting percentage the following season (0.26) than the expected shooting percentage does (0.19). D-oh!

Closing Thoughts

This was meant only as an introductory background for the more detailed dives that are to follow, which will be conducted by me, Arik Parnass and Christophe Perreault. We are also happy to make our data available to others who would like to join us in our efforts; simply shoot any one of us an email.

As for these high-level findings, there's some evidence to suggest that teams have very slightly different resting points, and that there may therefore be a way to improve on simply regressing shooting percentages to the league average by blending in this process.

However, it takes a lot of effort, the benefits are of indeterminately small size, and it doesn't seem to offer any predictive advantage for the following season. But that doesn't mean that there isn't any value in this exercise, so stay tuned!