So we've talked about Covariance for a while now. The idea is both simple and intriguing: which teams play their best against the best competition, and which play best against the worst?

The simple 'math' definition of covariance is as follows: covariance provides a measure of the strength of the correlation between two or more sets of random variates. We look at correlations a lot with football stats, and for obvious reasons: they tell us when our eyes are lying to us, and they tell us how seriously to take certain factors (returning talent, recruiting, etc.) when making projections. We're going to use covariance to look at which teams tended to play well against good teams (playing up to their level of competition) and which ones tended to only play well against bad ones. Were Florida State's defensive tendencies mentioned above rare? Let's see.
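For the curious, the idea can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not the article's actual methodology: the numbers below are made up, standing in for a team's game-by-game performance ratings and the quality ratings of each opponent.

```python
def covariance(xs, ys):
    """Sample covariance of two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    return sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (n - 1)

# Hypothetical 12-game season: opponent quality vs. the team's rating that week.
opponent_quality = [0.8, 0.3, 0.6, 0.9, 0.2, 0.5, 0.7, 0.4, 0.9, 0.1, 0.6, 0.7]
team_performance = [0.9, 0.4, 0.7, 0.9, 0.3, 0.5, 0.8, 0.4, 0.9, 0.2, 0.7, 0.8]

# Positive covariance: the team plays up to good competition.
# Negative covariance: the team feasts on weak competition and lets down
# its guard against good teams.
print(covariance(opponent_quality, team_performance))
```

Again, the inputs here are invented for illustration; the point is only the sign and size of the result.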

Within the frame of a given season, it can give you a glimpse at a team's personality. But what can it tell you about a team from year to year?

After the jump, you'll find 2005-11 Covariance rankings for all FBS teams. As you'll see, very few teams are consistently best against good teams or bad ones. Moving forward, I'm going to take a look at experience levels and compare them to covariance; I'm now developing a theory that a decent amount of year-to-year variation could be explained simply by how experienced a team may be. Then again, it could just be the personality of certain groups of personnel, too. Using my own Missouri Tigers as an example, Mizzou ranked 116th and 99th in 2007-08, Chase Daniel's final two seasons in Columbia, and has ranked 29th, 23rd and 12th the last three years.

But we'll look at this at a later date; it's making my head hurt at the moment. For now, you just get rankings, rankings, rankings.

Remember: the higher your Covariance rank number (i.e. the closer to 120th you get), the more you tend to play better against the best competition. The closer your rank is to first, the more you tend to most heavily dominate weaker competition.

Most Consistent

1. Tennessee (Std. Dev.: 13.8)

2. UL-Monroe (16.3)

3. Fresno State (16.4)

4. Central Michigan (16.5)

5. Florida International (16.7)

6. West Virginia (17.0)

7. Texas A&M (18.6)

8. Akron (19.9)

9. LSU (20.2)

10. Eastern Michigan (20.5)

That Tennessee is tops on the consistency list kind of blows my mind considering they have had three coaches in four years. But no matter what, they tend to finish between 14th and 33rd most years.
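For clarity, the "consistency" lists above are just the standard deviation of each team's yearly Covariance ranks from 2005 to 2011: a small number means the team landed in roughly the same spot every year. A quick sketch, using made-up yearly ranks rather than any team's actual figures:

```python
import statistics

# Hypothetical yearly Covariance ranks for one team, 2005-11.
yearly_ranks = [14, 33, 20, 28, 17, 25, 22]

# Smaller standard deviation = more consistent from year to year.
print(statistics.stdev(yearly_ranks))
```

A team like Tennessee, which finished between 14th and 33rd most years, would produce a small number here; a team bouncing between 4th and 99th (hello, Arizona State) would produce a large one.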

Least Consistent

1. Arizona State (Std. Dev.: 50.1)

2. South Florida (47.8)

3. Rutgers (47.0)

4. Pittsburgh (46.0)

5. Miami (44.2)

6. Nevada (43.5)

7. SMU (43.3)

8. Oklahoma State (42.9)

9. Hawaii (42.9)

10. New Mexico (42.8)

A Dennis Erickson team tops the "least consistent" list. That makes quite a bit of sense to me. But seriously ... ASU has been kind of nuts. They ranked fourth twice, between 25th and 30th twice, and 99th or lower three times. In each of the last two years, they played quite well against good teams and let down their guard against lesser ones.

Most Consistently "Best Against Worst"

1. LSU (Average Rank: 22.1)

2. Mississippi State (25.0)

3. Tennessee (26.3)

4. UL-Monroe (27.4)

5. Penn State (29.1)

6. Florida (30.6)

7. Kentucky (33.1)

8. Connecticut (33.3)

9. Troy (33.4)

10. Ohio State (38.1)

I really don't know what to make of the fact that five of the top seven teams on this list are from the SEC.

Most Consistently "Best Against Best"

1. West Virginia (Average Rank: 95.6)

2. Maryland (94.1)

3. San Diego State (86.3)

4. UTEP (85.3)

5. SMU (85.1)

6. Stanford (84.9)

7. Cincinnati (84.3)

8. North Carolina (84.1)

9. USC (83.4)

10. East Carolina (81.4)

I was shocked that giant-killing Iowa State didn't make this list, though in fairness, they did rank 11th; at the same time, I did not expect Maryland to be that high on the list. USC and East Carolina do not surprise me, however.

Anyway, giant data table after the jump. Enjoy.