Posted at 12:30 PM ET, 09/22/2010

A critical look at a report on standardized tests

By Valerie Strauss

This post was written by Richard Innes, education analyst for the Bluegrass Institute for Public Policy Solutions, who criticizes a report recently issued by the Center on Education Policy about standardized test scores.

Last week, I published this guest post by Jack Jennings, the center’s president and chief operating officer, about that report; in it he argued that fourth- and eighth-grade students are doing better on standardized tests than commonly thought.

This is more than a debate about numbers; with education policy today centered on standardized test scores, these issues wind up affecting millions of students and teachers.

By Richard Innes

The Center on Education Policy (CEP) recently issued a report titled “State Test Score Trends Through 2008-09, Part 1, Rising Scores on State Tests and NAEP,” lauding a supposed discovery that scores on state public school tests and on the National Assessment of Educational Progress (NAEP) are moving in the same direction and look much better than previously reported.

Unfortunately, the report turns the mathematics of statistical sampling on its ear while attempting to claim that the performance of most states is improving on the NAEP reading assessments.

The facts: Between 2005 and 2009, the period of major concern in the CEP report, NAEP’s own Reading Report Card for 2009 clearly shows that a strong majority of the states did not post statistically significant improvements in either fourth- or eighth-grade reading. That general lack of progress is found both for the percentages of students scoring at the level NAEP calls “Basic” and at the level NAEP calls “Proficient.” Trying to claim otherwise, which the CEP report attempts to do, simply is not statistically defensible.
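
To make the statistical point concrete: NAEP results come from a sample of students, so every reported percentage carries a standard error, and a change from one testing year to the next only counts as real if it is large relative to that error. The sketch below (in Python, with made-up numbers rather than actual NAEP figures) shows a simplified version of that kind of comparison: divide the change by the combined standard error and check it against the usual 95 percent threshold.

    import math

    def significant_change(pct_earlier, se_earlier, pct_later, se_later, z_crit=1.96):
        """Return the z statistic for the change and whether it clears the ~95% threshold."""
        diff = pct_later - pct_earlier
        # Standard errors of independent estimates combine in quadrature.
        se_diff = math.sqrt(se_earlier ** 2 + se_later ** 2)
        z = diff / se_diff
        return z, abs(z) >= z_crit

    # Illustrative, hypothetical numbers only: a 1-point gain with roughly
    # 0.9-point standard errors is not distinguishable from no change.
    z, sig = significant_change(pct_earlier=32.0, se_earlier=0.9, pct_later=33.0, se_later=0.9)
    print(f"z = {z:.2f}, statistically significant: {sig}")

In this hypothetical case the z statistic is well under 1.96, so the apparent gain could easily be sampling noise, which is the distinction the CEP report is accused of glossing over.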

There are more issues with the CEP report, such as the selection of an unrealistically low target performance level – NAEP Basic – as a suitable comparison level for state assessment programs.

This all matters because it may affect new tests designed in line with the Common Core standards and the willingness of people to make changes that some experts say are necessary.

You can read more about this in the Bluegrass Policy Blog article, "Report builds mountain out of education ant hill."

Follow my blog every day by bookmarking washingtonpost.com/answersheet. And for admissions advice, college news and links to campus papers, please check out our Higher Education page at washingtonpost.com/higher-ed. Bookmark it!

By Valerie Strauss  | September 22, 2010; 12:30 PM ET
Categories:  Research, Standardized Tests  | Tags:  bluegrass institute, cep report, standardized tests  