Any value to U.S. News rankings?
It's that time of year again for U.S. News & World Report rankings, this time of graduate schools around the nation.
The reason they are worth writing about is to remind everyone how these rankings are created, so that nobody takes them seriously.
Let’s look at the rankings of education schools. (Also ranked were schools of business, medicine, law and engineering.)
Because you probably want to know, here’s the top 10:
#1 Vanderbilt University's Peabody College
#2 Teachers College at Columbia University
#3 (tie) The schools of education at Harvard and Stanford universities
#5 University of Oregon's College of Education
#6 (tie) The schools of education at Johns Hopkins University and the University of California at Los Angeles
#8 Northwestern University's education school
#9 University of Wisconsin at Madison's education school
#10 (tie) University of California at Berkeley's education school, and the colleges of education at the University of Texas at Austin and the University of Washington in Seattle
Surely these schools offer excellent programs. But you should know that the largest single factor in the rankings, worth 25 percent, is something called “peer assessment.”
That’s done by asking deans of education schools and their deans of graduate studies to rate programs on a scale from marginal to outstanding. Those who didn’t know enough about a school to evaluate it fairly were asked to mark “don’t know,” and those responses weren’t counted. About 47 percent of those surveyed responded.
Need I say more? How much does anyone really have to know before thinking they can rank schools? How carefully do officials at one school investigate competitors to know whether one is better than another, by a hair or by a mile?
Then, superintendents of K-12 systems were asked to rate programs on the same scale and given the same option to say they didn’t know. For this year's rankings, the two most recent years’ superintendents’ survey results were averaged.
These responses were worth 15 percent of the total, meaning that peer assessment was a whopping 40 percent of the total ranking.
About 19 percent of the superintendents surveyed responded. The wonder is that any of them did.
The other 60 percent is divided among multiple factors, including scores on the Graduate Record Examination that graduate students must take. If you believe that a single standardized test score actually speaks to the quality of a student, then you would think this a useful measure. I don’t. The test scores amount to 18 percent of the total.
That leaves 42 percent for other factors: acceptance rate, faculty resources, student-faculty ratio, percentage of faculty with awards or editorships at selected education journals, total research expenditures, and average research expenditures per faculty member.
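The weighting scheme described above can be sketched as a quick arithmetic check, using only the percentages reported in this post. The labels are mine, and the final 42 percent is treated as a single bucket because the article does not break down how it is split among those other factors:

```python
# Weights in the U.S. News education-school rankings, as described above.
# The "other_factors" bucket lumps together acceptance rate, faculty
# resources, student-faculty ratio, research expenditures, and the rest.
weights = {
    "peer_assessment_deans": 25,   # deans' survey
    "superintendent_survey": 15,   # K-12 superintendents' survey
    "gre_scores": 18,              # Graduate Record Examination scores
    "other_factors": 42,           # everything else, combined
}

# The two opinion surveys together drive 40 percent of the ranking.
survey_share = weights["peer_assessment_deans"] + weights["superintendent_survey"]
print(f"Survey-based share: {survey_share}%")

# Sanity check: the pieces should account for the whole ranking.
print(f"Total: {sum(weights.values())}%")
```

Seen this way, the point of the post is plain: two opinion surveys, answered by fewer than half of the deans and fewer than a fifth of the superintendents, carry 40 percent of the final score.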
It’s all very, very complicated. I just can’t find much meaning in it.
Follow my blog all day, every day by bookmarking washingtonpost.com/answersheet. And for admissions advice, college news and links to campus papers, please check out our new Higher Education page at washingtonpost.com/higher-ed. Bookmark it!
| April 16, 2010; 6:30 AM ET