You may remember back to a simpler time when the SAT dealt with math, science, and analogies, not with Google searches and Wikipedia. But as information and communication technology (ICT) have become crucial student skills in the last decade, organizations like the Educational Testing Service (the group behind the SAT) have created material to judge student performance in these new areas.

ETS has developed an ICT Literacy Assessment (a demo version is available online) that gives students short tasks (3-5 minutes each, testing one particular skill) and long tasks (15 minutes, testing skills in combination) to complete on a computer. These include things like sifting through e-mail and constructing accurate search queries for academic databases, along with other, more business-oriented projects.

Another group, the National Forum on Information Literacy, has just announced the creation of an "ICT Literacy Policy Council" that will review the ETS exam and issue recommendations for "cut points." These will be used to map exam scores to achievement levels, so that educators can determine "which students are proficient and which may need additional ICT literacy instruction or remediation."

The local level

While such national standards may be a boon for businesses and schools that want to know how to interpret test scores, they are still under development. What can a teacher do today to measure students' ICT literacy in an educational setting?

Michael Lorenzen, a Michigan librarian, had his students visit an eclectic mix of websites and asked them to determine whether each was legitimate, and if so, whether it would make a good source of information. What he found was exactly what you might suspect: being exposed to the Internet for most of their lives does not magically enable students to discern the quality of Internet sources.

"The students had mixed success," he says. "They correctly identified the whale watching, shards of glass, male pregnancy, and tree octopus sites as being hoaxes. They correctly identified the Texas independence and the human extinction movement as real. They mistakenly labeled the Hawaiian and West Florida independence sites as hoaxes. They also believed the Fredericton site [a fake town] and the Feorran site [a fake language] were real."

One social science professor we spoke with confirmed that properly evaluating and using sources remains a challenge even for college students, who might be expected to have mastered the skill. This professor had instructed his students on the importance of using scholarly sources in research papers, even going so far as to show video clips from The Colbert Report highlighting the fact that sites like Wikipedia are not reliable places to get information on (for instance) elections in Estonia. Nevertheless, when grading the papers last week, the professor found that not one but three students had cited Wikipedia, and another had quoted the Encyclopedia Britannica.

Will national, standardized exams help to solve such problems? If the exam is a fair metric of what students know, educators and businesses can use it to identify people who need more ICT knowledge and give them extra assistance. A more likely scenario, especially in business, is the use of such exams to weed out job candidates, which could make an exam like the ICT Literacy Assessment as important for landing a white-collar job as the SAT currently is for getting into college.