Michael Romain

Editor, The Village Free Press

A report by District 209 Board of Education member Kevin McDermott shows that there are 160 students at Proviso East and Proviso West with 0.0 GPAs and that even students with GPAs above 3.0 are testing at levels that fall far below what those high grades indicate. At PMSA, the District's flagship school, he found that, although 97 percent of 4th-year students had GPAs above 2.0, only 67 percent of them met or exceeded PSAE testing standards. That makes for a 45 percent disparity between those students' grades and their test performance.

However, Superintendent Nettie Collins-Hart disputes McDermott's findings, saying that his analysis is riddled with flaws and that the basic comparison between GPAs and test scores is inadequate for accurately gauging whether or not students are actually absorbing lessons taught in the classroom.

She noted that other dynamics must be considered as well, such as the fact that the GPAs are weighted and that students take non-academic courses like physical education.

McDermott said he was prompted to create the report after reviewing a student discipline case over a year ago and finding that one student had a GPA of about 0.33, but was ranked above roughly 180 other students--students who had even lower GPAs. He said he encouraged the administration to look into the matter, but they refused, so he decided to investigate it himself.

McDermott said he presented his report at a school board meeting last month, but that his findings have barely registered with fellow board members and the administration.

"The central point I was making in my presentation is that we have a substantial disparity between what we say about the percentage of students who meet our standards versus what the Prairie State Achievement Examination (PSAE) says," he said.

McDermott himself acknowledged the limits of his analysis, even conceding that the sample student population he looked at was less than ideal, but he said his main problem with the administration was its lack of interest in pursuing the matter in the first place.

There are two main elements of the PSAE--the test itself and the college readiness benchmark. McDermott said he had assumed that students with GPAs above 2.0, or a 'C' average--those meeting district standards--were also meeting the state standards reflected in PSAE test results. But he realized that assumption couldn't be further from the truth.

"I thought if you have a 2.0, you're basically getting it," McDermott said. "By our definition, you're meeting standards, so I looked at the number of students with a 2.0 or better and analyzed their performance on the PSAE."

Only about 51 percent of 4th-year PMSA students met or exceeded the PSAE's college readiness benchmark, despite the fact that 97 percent of them earned at least a 2.0 GPA. That puts the disparity between PMSA students' grades and their performance on the test and on the college readiness benchmark at 45 percent and 88 percent, respectively.

If the disparity is that large at the district's flagship school, it grows to a chasm at Proviso East and Proviso West. At Proviso East, the average GPA of 4th-year students was about 2.1, with about 58 percent of those students maintaining GPAs above 2.0. However, the percentages of those students meeting or exceeding PSAE standards and testing at or above the college readiness level were about 14 percent and 9 percent, respectively. That means those students' grades were inflated more than 300 percent above their PSAE test performance level and more than 500 percent above their college readiness level, McDermott asserted.
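
For readers who want to check the arithmetic, the figures above can be reproduced from the percentages cited in this article. The short Python sketch below is purely illustrative, and it assumes the "inflation" numbers are simple ratios of the share of students clearing the GPA bar to the share clearing the test bar; under that assumption, the college readiness comparison at PMSA comes out near 90 percent rather than the 88 percent cited, which suggests rounding in the underlying data.

    # Illustrative only: reproduces the disparity arithmetic from the
    # percentages cited in this article, assuming a simple ratio.
    def inflation(pct_above_gpa_bar, pct_meeting_test):
        """Percent by which grades run above test performance."""
        return (pct_above_gpa_bar / pct_meeting_test - 1) * 100

    # PMSA: 97 percent of 4th-year students at or above a 2.0 GPA
    print(round(inflation(97, 67)))  # vs. PSAE standards: ~45
    print(round(inflation(97, 51)))  # vs. college readiness: ~90 (article cites 88)

    # Proviso East: 58 percent of 4th-year students above a 2.0 GPA
    print(round(inflation(58, 14)))  # vs. PSAE standards: ~314 ("more than 300")
    print(round(inflation(58, 9)))   # vs. college readiness: ~544 ("more than 500")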

In an email response, the Superintendent elaborated, writing that McDermott's analysis "appears to have been based on weighted grade point averages. This would tend to skew the GPAs higher, resulting in more students in the 2.0 [and above] GPA group." She also pointed out that certain factors make it difficult to compare GPAs and test performance, which she said are two "separate data sets" that aren't linked by student, "which means that only a very general attempt at correlation can be expected."

"Student grade point averages (GPAs) include grades from all classes, except physical education. A student's GPA reflects the grades earned throughout the length of his or her high school career. Standardized test scores, however, are based on student performance in discrete subject areas, usually during a single testing event, with scores normed to national data," she wrote.

"When student data is linked, we see that there is a positive correlation between GPAs and standardized test scores. Even with linked data, there remains the issue of whether it is useful to compare GPAs inclusive of all classes, or if it would be better to compare the data on a subject-by-subject basis (e.g., compare students' grades in English with ACT English scores, grades in math with ACT math scores, etc.)," the statement read.

McDermott said that the administration has nonetheless never submitted to the board the type of information in his report. He also claimed that when he asked the administration for the information, they were initially uncooperative.

"I want us to track this information permanently," McDermott said. "Our grades should reflect what our students are learning, that's the purpose of grades. If our grades are reflecting something dramatically different from what the standardized tests say, then that disparity should cause us to reconsider what we're doing differently."

For her part, however, Collins-Hart disputed McDermott's claim of regression by pointing out that graduation rates increased by 10 percent at Proviso East and by 8 percent at Proviso West between 2012 and 2013, despite declines statewide. She also emphasized that the average districtwide score on the reading portion of the ACT increased by one point, from 17 to 18, between 2012 and 2013--an improvement she said was consistent with overall progress students were making on the test.

The Illinois State Board of Education's (ISBE) 2014 Advisory Council has found that the same disparity McDermott discovered is prevalent across the state. The ISBE Advisory Council contacted more than 200 public high schools in Illinois and received 2012-2013 enrollment and honor roll numbers for the junior classes of more than 175 of them. It then compared the percentage of juniors on honor roll to the percentage who met or exceeded PSAE standards in the same year.

The ISBE found that 70 percent of the schools that provided data had a discrepancy of at least 8 percent between their honor roll and PSAE percentages. Not all of the schools studied showed grades outpacing test scores, however. A recent PowerPoint presentation created by the advisory council included a chart of four randomly selected schools, each of which had a higher percentage of juniors meeting or exceeding standards on the PSAE than on the honor roll. One school, for instance, had 44 percent of its juniors on honor roll but 62 percent meeting or exceeding on the PSAE.

According to the ISBE, the reason for the discrepancy is "because standardized test criteria has evolved over the years, while grading has not, thus explaining the lack of consistencies. Under our current system, grades do not serve as a proper indicator; they do not match up with the way we are assessing students."

At the Aug. 12 board meeting, Collins-Hart said that the administration would start issuing quarterly updates on student GPAs, so that the board would be regularly apprised of the matter.