A defining moment for the Common Core State Standards movement will come next week, when students nationwide will begin taking standardized tests tailored to the much-maligned academic standards.

Parents, teachers and school leaders have expressed concern that the tests may be too difficult or time-consuming, or that their schools' technology might not be up to snuff for what the tests require. To give stakeholders a better idea of what's coming, the Partnership for Assessment of Readiness for College and Careers (PARCC) – one of two testing consortia developing the tests – has released practice tests on its website.

The beginning of testing season comes in the midst of congressional efforts to reauthorize and update the No Child Left Behind Act, which requires public school students to be tested once annually in math and English in third through eighth grade, and again in each subject once in high school. In total, federal mandates account for 17 tests students take throughout their academic careers: seven for English, seven for math and three grade-span tests (once each in elementary, middle and high school) for science. And states and local school districts have added other tests to comply with requirements that student growth be used as a factor in teacher evaluations.

I took a practice, computer-based PARCC test at the third-grade level in both math and English language arts, and participated in a webinar the consortium hosted for reporters this week to explain how the new assessments differ from tests states have previously administered. For the sake of time, I did not write the two essays required in the English assessment. Completing the two sample tests, one with 13 questions and the other with 17, took me about an hour.

PARCC officials who spoke during the webinar said the English and math tests make fundamental shifts in how they measure what students know and are able to do. In English and literacy, the tests focus on "building knowledge through content-rich nonfiction," they said, as well as assessing reading, writing and speaking skills. The math exams, meanwhile, are meant to contain less content than previous tests, but focus on a deeper understanding of certain concepts, they told reporters. The exams are also meant to connect concepts within and across grades, and to be more rigorous in the sense that they focus equally on understanding concepts, demonstrating skills and applying those skills.

The computer-based format of the test was intuitive enough to understand. I was able to easily click through the questions, drag and drop items to different places on the page when required and use the drop-down menu to select different items.

As for the questions themselves, I can't personally speak to whether their content is grade-level appropriate, as I am neither an educator nor a current third-grade student. Any difficulty I had could also stem from the fact that I haven't taken a math class since 2008 and finished third grade in 1999.

On the math exam (answer key here), some questions focused on fairly basic concepts about addition and subtraction, multiplication and division, fractions, and beginning geometry, such as finding the area of a rectangle. Others asked the test-taker to evaluate a hypothetical student's reasoning and explain why it was or was not correct.

For example, one question reads:

"Cindy is finding the quotient of 27 ÷ 9. She says, 'The answer is 18 because addition is the opposite of division and 9 + 18 = 27.'"

In the first part of the question, the test-taker must explain why Cindy's reasoning is incorrect; in the second, explain how she could correct her reasoning and find the quotient.
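For readers working through the item: a sketch of the intended correction (this reasoning is mine, not quoted from PARCC materials) is that multiplication, not addition, is the inverse of division. Cindy's 18 instead comes from subtraction, since 27 − 9 = 18.

```latex
% Division is undone by multiplication, not addition:
27 \div 9 = q \quad\Longleftrightarrow\quad 9 \times q = 27.
% Since 9 \times 3 = 27, the quotient is
27 \div 9 = 3.
% Cindy's answer of 18 satisfies 9 + 18 = 27 (i.e., 27 - 9 = 18),
% which checks an addition fact, not the division.
```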

In the third-grade English exam (answer key here), test-takers read three short stories, each consisting of about 30 short paragraphs. Some question types are familiar from earlier standardized tests – they ask students to define certain words in the story. But subsequent questions go much deeper, asking students to identify phrases or clauses in the text that support their answers.

Other questions were more summative, asking me to identify which details best showed the story's central theme of "contentment" and to select which specific details quoted from the story best support that answer. The test also asked me to write one essay on two of the stories – explaining how words and actions in each are important to the plots – and a second essay in the form of a journal entry that focuses on the third story.

In Middlesex County, New Jersey, a local teachers union organized an event on Thursday to allow parents and teachers to try the PARCC test, New Jersey 101.5 reported. The participants generally found the tests to be difficult, with one saying there was too much information in the questions and another saying it would be easy for students to lose their focus.

And in one Illinois school district, school board member Tom Brabec took a PARCC math assessment for third-graders, but appeared to give up before moving on to an English exam, according to The Chronicle, a local paper in Homewood, Illinois. A school principal, Cece Coffey, told the paper the format of the test is "just too complex" for students to support their answers.

"We teach them on paper how to outline a story recognizing the highlights and then to discuss those points and then write an essay," Coffey told The Chronicle. "With this test, we’re asking our students to compose an essay in their heads and get it onto a computer screen."

In Ohio, which will be the first state to administer the PARCC exams next week, middle school teacher Jocelyn Weeda said the tests are developmentally inappropriate and is urging parents to opt their children out of taking the test, a local Fox affiliate reported.

"When you look at the reading levels they're way above where students are developmentally at that age," Weeda said after taking a practice test.

In response to concerns about the content of the questions, PARCC officials told reporters during the webinar that each test item is reviewed by more than 30 experts, including teachers. The consortium has also drawn on last spring's field test of more than 1 million students, which showed test developers which items needed to be revised.