Why Maryland high schools dropped off the U.S. News top 100
I blogged a few days ago on Montgomery County schools superintendent Jerry D. Weast's search for an explanation of why three of his best schools, which had previously appeared in U.S. News & World Report's America's Best High Schools top 100, disappeared from the latest U.S. News list.
Some people consider me hopelessly biased on this issue, since I do a competing list of America's Top High Schools, using a somewhat different formula, every year in Newsweek. But that is not so. I like the U.S. News list makers and their list, and have welcomed them into the School Ranking Scoundrels Club, since we all take abuse for making judgments on which schools are best. When their new list appears, the page views for my list increase, and I assume vice versa.
I promised to blog again when U.S. News responded to Weast's concerns about the disappearance of Wootton, Whitman and Churchill from the top 100. Here is the message Paul Gazzerro, a very smart guy who is the lead analyst for the U.S. News list, sent to Montgomery County. I asked him some follow-up questions (I was curious about how his formula is affected by schools that have few or no disadvantaged students) and will report his answers when I get them.
Here is what he said to the folks in Montgomery County. (I have had to delete some references to tables he included that I cannot reproduce here):
The way in which we evaluate schools for America's Best High Schools is a little different from some other rankings. Instead of a general-purpose formula that simply weights many different variables and allows for trade-offs, we have created a multi-step selection method. These steps are based on the core philosophy that underlies the selection method: that the best schools serve all students well.
The three steps of the method are as follows:
1) identifying high schools that serve all students well by achieving performance levels in the core subjects of reading and math on state accountability tests that exceed statistical expectations given their relative levels of student poverty;
2) identifying high schools that serve the least advantaged student groups well by producing proficiency rates on state tests for black, Hispanic, and economically disadvantaged students that exceed state averages for these groups;
3) identifying high schools that provide students with access to a college-level curriculum, measured through participation and performance on AP and/or IB tests.
Basically, the first two are used as screens to filter down to schools that are eligible to be evaluated for their college readiness.
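The three steps above can be sketched as a simple filter pipeline. This is purely illustrative: the field names and the 1.00 cutoff are my assumptions about how such a screen might look, not the actual U.S. News code.

```python
# Hypothetical sketch of the three-step screen; field names are assumptions.
def best_high_school_candidates(schools):
    """Steps 1 and 2 filter; step 3 ranks the survivors."""
    # Step 1: state-test performance more than one SD above expectation.
    step1 = [s for s in schools if s["risk_adjusted_index"] >= 1.00]
    # Step 2: disadvantaged-group proficiency above state averages.
    step2 = [s for s in step1 if s["disadvantaged_above_state_avg"]]
    # Step 3: rank the remaining schools by AP/IB participation and performance.
    return sorted(step2, key=lambda s: s["college_readiness"], reverse=True)
```

Note that a school strong on step 3 but failing step 1 never reaches the ranking at all, which is exactly what happened to the Montgomery County schools discussed below.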
As it happens, the reason that some of your district's schools dropped from being recognized (or dropped to being Honorable Mention) is that they did not meet step 1, so I will explain in greater detail how this step works.
In step 1, we first create a state test performance index, and then compare schools to one another based on the statistical relationship between these scores and their student poverty rates. The state test performance index takes Maryland HSA data for math and reading, as reported on the state report card, and weights it according to the percentage of students scoring at each level: 0 points for the lowest level, 1.0 points for level 2 (proficient), and 1.5 points for the highest level, level 3 (advanced). The resulting index ranges from 0 to 150, with higher scores being better.
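The index arithmetic works out as follows (the level percentages below are made up for illustration, not actual HSA results):

```python
def performance_index(pct_level1, pct_proficient, pct_advanced):
    # Weights from the description above: 0 for the lowest level,
    # 1.0 for proficient, 1.5 for advanced. Inputs are percentages (0-100).
    return 0.0 * pct_level1 + 1.0 * pct_proficient + 1.5 * pct_advanced

# A hypothetical school with 10% at level 1, 60% proficient and
# 30% advanced scores 0 + 60 + 45 = 105 on the 0-150 scale.
```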
The 2009 list used data from 2006-07, and the 2010 list (just released) used data from 2007-08. All schools improved their state test performance index substantially from one year to the next, but some more so than others. This was true statewide.
We then take these index values and plot them against each school's student poverty rate (typically the percentage of students receiving free or reduced-price lunches). While the statistical correlation is not perfect, poverty is by far the strongest predictor of performance, and using it allows us to create a level playing field by effectively comparing schools to peers with similar poverty rates rather than to the state as a whole. The resulting regression line, or expected performance, can be compared to the actual performance index. To account for measurement error when defining exceptional performance, we then draw a confidence interval of plus or minus one standard deviation around this expected value. Only schools that performed more than one standard deviation better than expected were considered to have met this criterion. To present this, we created a risk-adjusted performance index; only values that equal or exceed 1.00 (that is, one standard deviation above expectation) meet the criterion.
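A minimal sketch of that regression step, assuming ordinary least squares of the index on poverty rate and expressing each school's residual in standard deviations of all residuals (the exact U.S. News computation may differ in detail):

```python
import statistics

def risk_adjusted_indices(poverty_rates, perf_indices):
    """Residual from the poverty regression, in standard deviations.
    A value of 1.00 or more means a school beat its expected
    performance by at least one standard deviation."""
    n = len(poverty_rates)
    mean_x = sum(poverty_rates) / n
    mean_y = sum(perf_indices) / n
    # Ordinary least squares slope and intercept.
    sxx = sum((x - mean_x) ** 2 for x in poverty_rates)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(poverty_rates, perf_indices))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # Actual minus expected performance for each school.
    residuals = [y - (intercept + slope * x)
                 for x, y in zip(poverty_rates, perf_indices)]
    sd = statistics.pstdev(residuals)
    return [r / sd for r in residuals]
```

On this sketch, a school can improve its raw index from one year to the next and still see its risk-adjusted index fall, because the index is relative to how every other school in the state moved.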
The reason that your district's previously recognized schools did not meet this criterion is that, while they improved from year to year, other schools in the state improved more, and the bar got higher. This is not to say that these are not very good schools; it is just that they did not meet the full set of criteria required to be recognized as among America's Best High Schools.
I hope that this is a bit clearer, and offers you some specific details as to how your schools fared using the method. I welcome any additional questions you may have - please feel free to contact me directly.
Director, Analytical Criteria and Research
School Evaluation Services | Standard & Poor's
55 Water Street, 42nd Floor | New York, NY 10041
For more from Jay, go to http://washingtonpost.com/class-struggle
For all the Post's Education coverage, please see http://washingtonpost.com/education
December 16, 2009; 1:36 PM ET