When Debarghya Das’ friends approached him to obtain their final grades from the education board before the official announcement, he gave it a try and got nowhere. Das, a computer science student at Cornell, had just finished his own exams and was to begin an internship at Google in a few days. Having graduated from the same education system as his friends a few years before, he had always suspected that the board was doing something fishy, but had never found any proof of it — he’d only heard rumors.

Intrigued by the prospect of analyzing the data of thousands of students, he began poking around on the results page after the grades were announced and found that the web security was rather lax. In a few hours, he slipped past the education board’s poorly-designed website, scraped together the results of the thousands of students who had written the exam that year and found potential evidence that the grades were being tampered with.

The quality of education in India depends heavily on the syllabus a school follows, which in turn depends on the state the school is located in. The exceptions to this are the Central Board of Secondary Education (CBSE), controlled by the central government of India, and the Indian Certificate of Secondary Education (ICSE or ISC), run by the independent Council for the Indian School Certificate Examinations (CISCE).

The CISCE has no governmental oversight and, compared to the state-run systems, serves a smaller, economically better-off student population. This year over 200,000 students took the CISCE’s tenth and twelfth grade exams. Their results were distributed to external media sources and published on various websites. Security on these websites was almost non-existent, and since the data had been posted in the public domain, Das was able to scrape it easily.

When Das analyzed the “26 megabytes of pure, magnificent data,” as he puts it in his blog post, he found that in six different subjects — English, Hindi, computer application, science, math, and history, civics and geography — fully half of the possible passing scores had never been awarded to a single student. The tests are scored out of 100 and the minimum passing score is 35, so a passing student could in principle receive any of the 66 scores from 35 to 100. Das’ data, however, showed that 33 of those scores were missing entirely.

In other words, of the nearly 150,000 tenth grade students who took the exam, not one received any of 33 of the 66 possible passing marks. Not one student, in all six subjects! The result was the sharp peaks shown in the graph above, instead of the expected smooth bell-shaped curve. Even more strikingly, the same 33 numbers were missing in every subject. Coincidence?
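Das’ basic check is simple to reproduce. The sketch below is a minimal illustration, assuming the scraped results are available as a flat list of integer scores; the sample data here is invented for demonstration, not Das’ actual dataset:

```python
# Find which passing scores (35 through 100, 66 values in all) never
# appear in a list of exam results. The sample data is invented.

def missing_passing_scores(scores, lo=35, hi=100):
    """Return the sorted passing scores in [lo, hi] absent from `scores`."""
    seen = set(scores)
    return [s for s in range(lo, hi + 1) if s not in seen]

# Tiny illustrative run: in this sample, 36, 38 and 39 never occur.
sample = [35, 37, 37, 40]
print(missing_passing_scores(sample, lo=35, hi=40))  # prints [36, 38, 39]
```

In a genuine exam with tens of thousands of takers per subject, every one of the 66 passing values should appear many times over, which is why 33 systematically absent values stands out.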

“It’s statistically impossible,” says Bhubaneswar Mishra, a professor of computer science at NYU. And Das agrees; having heard stories of students complaining about how they expected higher grades, he had always suspected there might be some tinkering going on.

But Gerry Arathoon, the chief executive and secretary of the CISCE, denied Das’ allegations. In a statement to the Times of India, he said, “in keeping with the practice followed by examination conducting bodies, a process of standardization is applied to the results.”

Since the difficulty of exams varies from year to year, many examination systems, like the SAT and GRE, compute a raw score based on the number of correctly answered questions, which is then converted into a standardized score. If the exam is tougher than in previous years, the standardization corrects for it and students receive a higher final grade; if the exam is easier, grades are adjusted downward.
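The CISCE has not published its formula, but a common form of standardization is a linear rescaling of raw scores toward a target mean and spread. The function below is a hypothetical sketch of that general idea, not the council’s actual method, and the target values are arbitrary:

```python
# Hypothetical linear standardization: shift and scale raw scores so the
# cohort matches a chosen target mean and standard deviation. This is a
# generic illustration, not the CISCE's (unpublished) procedure.

def standardize(raw_scores, target_mean=60.0, target_sd=12.0):
    n = len(raw_scores)
    mean = sum(raw_scores) / n
    sd = (sum((x - mean) ** 2 for x in raw_scores) / n) ** 0.5 or 1.0
    return [round(target_mean + (x - mean) / sd * target_sd) for x in raw_scores]

# A tough year with a low raw mean gets lifted toward the target mean.
print(standardize([30, 40, 50]))  # prints [45, 60, 75]
```

Note that any such rescaling followed by rounding can leave certain output values unreachable, which is one innocuous way gaps could arise; what it cannot easily explain is why the identical 33 values would vanish across six different subjects.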

“But the inherent flaw with the council’s standardization is that they’ve been passing off the standardized score as the raw score,” says Das. Arathoon’s statement also doesn’t explain how the exact same scores could be missing in six different subjects’ results.

In addition, it is common practice among standardized testing bodies to release both the raw and the standardized score, a practice the CISCE does not follow. The SAT, for instance, comes with published statistical data, including averages and percentile distributions, that reveal how students did on the whole without violating privacy laws. This makes the SAT far more transparent in its dealings than the CISCE.

Das compared the CISCE’s system to a big black box. “Things go in and things come out but we don’t know what’s happening inside. It’s unfair to the students. They deserve to know more,” he said.

The Indian education system places enormous weight on tenth and twelfth grade marks. They determine which subjects you’ll be allowed to study, which colleges will accept you and how much you’ll pay to enroll in the major you’re interested in. Colleges announce “cut-off marks” for every major, and only students who score above that minimum are considered by the admissions office. Often the cut-offs are as high as 99 percent or, incredibly, even 100 percent.

The pressure to do well is intense during the final years of high school, and a shift of one or two points on an exam can make the difference between studying on a scholarship and paying thousands of dollars in capitation fees, or “donations,” to guarantee a place in a university. In such a competitive environment, the CISCE’s lack of transparency can create enormous psychological and financial stress for students and their families.

Despite the significant coverage of Das’ allegations in the Indian media, it would be unrealistic to expect any major changes to the CISCE’s system. Since Arathoon has openly acknowledged that results are standardized, without addressing the other discrepancies Das’ analysis revealed, it is highly unlikely that the CISCE will change its methodology. At best, it might improve the security of the websites where results are released.

Image credits: IgnitionMind (top) and Debarghya Das (bottom).