Network News


Posted at 2:29 PM ET, 02/20/2010

"The Wilen Index"--a new high school ranking, sort of

By Valerie Strauss

My esteemed colleague Jay Mathews recently published his annual, well-known and controversial Challenge Index, in which he ranks high schools with this formula that he devised years ago: The number of Advanced Placement and/or International Baccalaureate and/or other college-level tests taken by all students at a school, divided by the number of graduating seniors.

The index has its many critics--I have been one of them--who say that it's not fair to judge a school by a single test score. Jay says he isn’t just using test scores--which are generally a measure of average family income--but, rather, showing how much a school is challenging its students with rigorous curriculum.

I received some letters and emails from readers who said they agreed with me, so I decided to challenge them to devise a better way to rank high schools.

(I played around with my own ranking--which would require schools to make mandatory the reading of my favorite books, which include “Anna Karenina” and “Jane Eyre”--but quickly thought better of it.)

Here is one response from Louis Wilen, a Montgomery County parent of two Magruder High School graduates and one Magruder senior.

Said Wilen:

"The problem with all of the high school ranking systems is that none of them establishes a baseline. How can you determine how effective a school is at teaching -- in other words, how 'good' it is -- if you don’t take into account what students know when they enter the school? I’ll offer a solution that solves that problem. Most of the schools in our area administer the PSAT in 10th grade (as well as in 11th grade). The PSAT is supposed to predict how well students will perform on the SAT."

So here it is, The Wilen Index:

- Calculate the mean 10th grade PSAT score (Reading + Math) for each school as a baseline (multiplying by 10 to normalize the scores with the SAT).

- Calculate the mean SAT score of the same students in 12th grade (Reading + Math). (In other words, use only the scores of students who took both the PSAT and SAT, and who were at the school throughout the 10th, 11th and 12th grades.)

- Subtract the normalized PSAT from the SAT score, and you get: The Improvement Delta.
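For readers who think in code, the three steps above can be sketched in a few lines of Python. The record format and field names here are hypothetical illustrations, not anything Wilen specified:

```python
# A minimal sketch of the Wilen Index ("Improvement Delta") calculation.
# Each student is a dict; the field names are invented for illustration.

def wilen_index(students):
    """Return the Improvement Delta for one school, or None if no
    student qualifies.

    Only students who took both the 10th-grade PSAT and the 12th-grade
    SAT, and who attended the school for grades 10 through 12, count.
    """
    eligible = [
        s for s in students
        if s.get("psat_10") is not None
        and s.get("sat_12") is not None
        and s.get("enrolled_grades_10_12", False)
    ]
    if not eligible:
        return None
    # PSAT section scores run 20-80 vs. the SAT's 200-800, so the
    # PSAT mean is multiplied by 10 to put it on the SAT scale.
    mean_psat = sum(s["psat_10"] for s in eligible) / len(eligible) * 10
    mean_sat = sum(s["sat_12"] for s in eligible) / len(eligible)
    return mean_sat - mean_psat
```

Note that the filtering step is exactly where someguy100's objection below bites: every student excluded from `eligible` simply vanishes from the school's score.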

Readers, what do you think?


Follow my blog all day, every day by bookmarking it! And for admissions advice, college news and links to campus papers, please check out our new Higher Education page.



Couple of problems in "low performing" schools

Low-performing schools have notoriously high student mobility and high numbers of students who don't take the SAT. The result is that a large percentage of students would not be counted -- not a realistic picture of school quality, since having a large share of students NOT take the SAT would probably boost your score.

Posted by: someguy100 | February 20, 2010 2:59 PM | Report abuse

The Wilen index would show little more than which schools have a higher percentage of students enrolled in SAT test prep. The SAT is intentionally not linked to any specific high school curriculum, so it cannot measure classroom learning. And the College Board itself explicitly warns that SAT scores should not be used to compare schools (or school systems).
However well intentioned Louis Wilen might be, his proposal is fundamentally unsound. Its predictable fallout would be even more money diverted from real education to test prep courses focused on teaching teenagers techniques to boost their scores on multiple-choice exams.

Bob Schaeffer, Public Education Director
FairTest: National Center for Fair & Open Testing

Posted by: FairTest | February 20, 2010 5:15 PM | Report abuse

Here's a very simple ranking system: don't.

Comparative rankings encourage meaningless comparisons - my school is #8; your school is #15. My school is better - ha! Really? Maybe, maybe not. Maybe for your kid, but not for mine.

Comparative rankings encourage uniformity of outcomes. Kids at your school did better on standardized tests, so the school must be better - ha! Really? Kids at my school spent their time doing activities which resulted in a variety of learning outcomes, which are hard to rank but are excellent preparation for future learning.

Comparative rankings encourage more conformity and social indoctrination. Kids at your school took more AP classes, so they must be smarter/better. Ha! Really? Kids at my school figured out better ways to learn than taking AP courses. Who's better prepared for the future?

Comparative rankings encourage meaningless competition and snobbery - which really is no laughing matter. Huh.

Posted by: jetchs | February 21, 2010 9:46 AM | Report abuse

Bob (and others) -- The point behind The Wilen Index (a name Valerie assigned to it) is that it establishes a *baseline* using a widely administered test, and then goes on to use that baseline in a final calculation of improvement.

I'm no fan of the SAT, and I agree that the PSAT and SAT are not ideal for the purpose of rating schools. But what else do we have that measures baseline skills so that we can tell how much students have *improved* through their time in high school?

Posted by: louiswilen1 | February 21, 2010 10:38 AM | Report abuse

As a former high school teacher, I have to second Mr. Schaeffer's critique. You've invented a useful and worthwhile test, but it is a test measuring the effectiveness of test prep programs. I would have hated for my curriculum to be judged against measures other than its stated purpose.

Please see texts like "Understanding by Design" to see what K-16 teachers are and should be doing.

Posted by: wyoung515 | February 21, 2010 11:36 AM | Report abuse

What about those students who choose to take the ACT for college admissions?

Posted by: poparoni | February 21, 2010 1:30 PM | Report abuse

Also, some colleges allow students to enter without reporting test scores.

The problem with using any standardized test for anything is that the only thing it measures is how good the student is at taking standardized tests. And, given the problems I saw when I worked in preparation of textbooks and standardized tests, the only thing a test really measures is how well the student was able to guess at what answers the people making up that particular test thought were right.

Posted by: sideswiththekids | February 22, 2010 4:25 PM | Report abuse

The comments to this entry are closed.


© 2010 The Washington Post Company