In prior posts, we demonstrated how to download projections from numerous sources, calculate custom projections for your league, and compare the accuracy of different sources of projections (2013, 2014, 2015, 2016). In the latest installment of our annual series, we hold the forecasters accountable and see who had the most and least accurate fantasy football projections over the last five years.

The R Script

You can download the R script for comparing the projections from different sources here. You can download the historical projections and performance using our Projections tool.

To compare the accuracy of the projections, we use the following metrics:

R-squared (R²) – higher is better

Mean absolute scaled error (MASE) – lower is better

For a discussion of these metrics, see here and here.
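The two metrics above are simple to compute. The post's own analysis is done in R, but here is a minimal Python sketch of both, under two assumptions not stated in the post: R² is computed as the squared correlation between projected and actual points, and MASE scales the projection error by the error of a naive benchmark forecast (e.g., last season's points).

```python
import numpy as np

def r_squared(actual, projected):
    # Squared correlation between projected and actual fantasy points.
    # Higher is better; 1.0 means projections perfectly track outcomes.
    r = np.corrcoef(actual, projected)[0, 1]
    return r ** 2

def mase(actual, projected, naive):
    # Mean absolute error of the projections, scaled by the mean
    # absolute error of a naive benchmark forecast (assumed here to be
    # something like last season's points). Lower is better; values
    # below 1 mean the projections beat the naive benchmark.
    return np.mean(np.abs(actual - projected)) / np.mean(np.abs(actual - naive))
```

Because MASE is a ratio of errors, it is comparable across scoring systems, which is one reason it is useful for comparing projection sources.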

Whose Predictions Were the Best?

The results are in the table below. We compared the accuracy for projections of the following positions: QB, RB, WR, and TE. The rows represent the different sources of predictions (e.g., ESPN, CBS) and the columns represent the different measures of accuracy for the last five years and the average across years. The source with the best measure for each metric is in blue.

Source                                         2012        2013        2014        2015        2016        Average
                                               R²   MASE   R²   MASE   R²   MASE   R²   MASE   R²   MASE   R²   MASE
Fantasy Football Analytics: Average            .670 .545   .612 .573   .618 .577   .626 .553   .645 .535   .634 .557
Fantasy Football Analytics: Robust Average     .667 .549   .612 .573   .613 .581   .628 .554   .644 .536   .633 .559
Fantasy Football Analytics: Weighted Average   –    –      –    –      –    –      .626 .553   .645 .535   .636 .544
CBS Average                                    .637 .604   .479 .722   .575 .632   .500 .664   .559 .625   .550 .649
ESPN                                           .576 .669   .500 .705   .498 .723   .615 .585   .630 .551   .564 .647
FantasyData                                    –    –      –    –      –    –      –    –      .531 .639   .531 .639
FantasyFootballNerd                            –    –      .370 .785   .281 .767   –    –      .501 .641   .384 .731
FantasyPros                                    –    –      –    –      .613 .572   .608 .585   .610 .561   .610 .573
FantasySharks                                  –    –      –    –      –    –      .529 .673   .606 .592   .568 .633
FFtoday                                        .661 .551   .550 .646   .530 .659   .546 .626   .574 .618   .572 .620
NFL.com                                        .551 .650   .505 .709   .518 .692   .582 .632   .605 .584   .552 .653
WalterFootball                                 –    –      –    –      .472 .713   .431 .724   .483 .718   .462 .718
Yahoo                                          –    –      –    –      .547 .645   .635 .554   .624 .562   .602 .587

Here is how the projections ranked over the last five years (based on MASE):

1. Fantasy Football Analytics: Average (or Weighted Average)
2. Fantasy Football Analytics: Robust Average
3. FantasyPros
4. Yahoo (ProFootballFocus)
5. FFtoday
6. FantasySharks
7. FantasyData
8. ESPN
9. CBS
10. NFL.com
11. WalterFootball
12. FantasyFootballNerd

Notes:

CBS estimates were averaged across Jamey Eisenberg and Dave Richard.

FantasyFootballNerd projections include only their free projections (not their full subscription projections).

We did not calculate the weighted average prior to 2015.

The accuracy estimates may differ slightly from those provided in prior years because a) we now use standard league scoring settings (you can see the league scoring settings we used here) and b) we are only examining the following positions: QB, RB, WR, and TE.

The weights for the weighted average were based on historical accuracy (1 − MASE). For analysts not included in the accuracy calculations, we calculated the average (1 − MASE) value and subtracted half the standard deviation of (1 − MASE). The weights in the weighted average for 2016 were:

CBS Average: .344

ESPN: .329

FantasyData: .428

FantasySharks: .327

FFToday: .379

NFL.com: .329

WalterFootball: .281

Yahoo Sports: .400

Here is a scatterplot of our average projections in relation to players’ actual fantasy points scored in 2016:
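The weighting rule described in the notes is straightforward to sketch. The snippet below (a Python sketch; the post's own tooling is in R, and the MASE values here are hypothetical) derives a weight of 1 − MASE for each source with a track record, then assigns a penalized weight to a source with no history: the average known weight minus half the standard deviation of the known weights.

```python
import numpy as np

# Hypothetical historical MASE values for sources with a track record.
historical_mase = {
    "CBS Average": 0.649,
    "ESPN": 0.647,
    "NFL.com": 0.653,
}

# Weight each source by its historical accuracy: 1 - MASE.
# Lower MASE (better projections) -> larger weight.
weights = {source: 1.0 - m for source, m in historical_mase.items()}

# A source with no accuracy history gets the average known weight
# minus half the standard deviation of the known weights, so new
# sources start slightly below the typical established source.
known = np.array(list(weights.values()))
weights["NewSource"] = known.mean() - 0.5 * known.std()
```

This penalty is a conservative prior: a new source contributes to the consensus, but less than an established source of average accuracy would.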

Interesting Observations

Projections that combined multiple sources of projections (FFA Average, Robust Average, Weighted Average) were more accurate than every single source of projections (e.g., CBS, NFL.com, ESPN) in every year. This is consistent with the wisdom of the crowd.

FFA projections were more accurate than projections from FantasyPros, possibly because we include more sources of projections.

The simple average (mean) was more accurate than the robust average. The robust average gives extreme values less weight in the calculation of the average. This suggests that outliers may reflect meaningful sources of variance (i.e., they may help capture a player’s ceiling/floor) and may not just be bad projections (i.e., error/noise).

The weighted average was about as accurate as the simple average. Weights were based on historical accuracy. If the best analysts were consistently more accurate than other analysts, the weighted average would likely outperform the mean. If, on the other hand, analysts don’t reliably outperform each other, the mean can be just as accurate. Given that the mean and weighted average were equally accurate each year, the evidence suggests that analysts don’t consistently outperform (or underperform) each other.

The FFA Average explained 57–67% of the variation in players’ actual performance. The projections are therefore somewhat accurate but leave considerable room for improvement: roughly a third to nearly half of the variance in actual points is unexplained by projections. Nevertheless, the projections are likely more accurate than pre-season rankings.

The R-squared of the FFA Average projection was .67 in 2012, .57 in 2013, .62 in 2014, .63 in 2015, and .65 in 2016. This suggests that players are more predictable in some years than others.

There was little consistency in accuracy across years among sites that used single projections (CBS, NFL.com, ESPN). In 2012, CBS was the most accurate single source of projections, but it was the least accurate in 2013. Likewise, ESPN was among the least accurate in 2014 but among the most accurate in 2015. This suggests that no single source reliably outperforms the others. While some sites may do better than others in any given year (because of fairly random variability, i.e., chance), it is unlikely that they will continue to outperform the other sites.

Projections were more accurate for some positions than others: much more accurate for QBs and WRs than for RBs, and least accurate for Ks, DBs, and DSTs. For more info, see here. Here is how positions ranked in accuracy of their projections (from most to least accurate):

QB: R² = .73
WR: R² = .57
TE: R² = .55
LB: R² = .53
RB: R² = .48
DL: R² = .45
K: R² = .38
DB: R² = .37
DST: R² = .24

Projections over-estimated players’ performance by about 4–10 points every year across most positions (based on mean error). It will be interesting to see whether this pattern holds in future seasons. If it does, we could account for this over-expectation in players’ projections. In a future post, I hope to explore the types of players for whom this over-expectation occurs.
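The three consensus methods compared above differ only in how they combine sources. As a concrete Python sketch (with made-up projections for a single player; the post's actual analysis is in R, and its robust average may use a different estimator than the simple trimmed mean shown here):

```python
import numpy as np

# Hypothetical season projections for one player from five sources.
projections = np.array([212.0, 225.0, 198.0, 290.0, 219.0])

# Simple average: every source counts equally.
simple_avg = projections.mean()

# One basic robust average: a trimmed mean that drops the highest and
# lowest projection, reducing the influence of outliers (like 290).
trimmed_avg = np.sort(projections)[1:-1].mean()

# Weighted average: weight each source by historical accuracy
# (1 - MASE); the weights below are hypothetical.
weights = np.array([0.35, 0.33, 0.43, 0.28, 0.40])
weighted_avg = np.average(projections, weights=weights)
```

Note how the trimmed mean pulls the estimate down by discarding the 290 outlier; the finding above that the simple mean beat the robust average suggests such outliers sometimes carry real information about a player's ceiling or floor.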

Conclusion

Fantasy Football Analytics had the most accurate projections over the last five years. Why? We average across sources. Combining sources of projections removes some of their individual judgment biases (error) and gives us a more accurate fantasy projection. No single source (CBS, NFL.com, ESPN) reliably outperformed the others or the crowd, suggesting that differences between them are likely due in large part to chance. In sum, crowd projections are more accurate than individuals’ judgments for fantasy football projections.

People often like to “go with their gut” when picking players. That’s fine: fantasy football is a game, so do what is fun for you. But crowd projections are the most reliably accurate of any source. Do with that what you will! And don’t take my word for it. Examine the accuracy yourself with our Projections tool and see what you find. And let us know if you find something interesting!
