
Posted at 6:30 AM ET, 04/16/2010

Any value to U.S. News rankings?

By Valerie Strauss

It's the time of year for more U.S. News & World Report rankings, this time of graduate schools around the nation.

The reason they are worth writing about is to remind everyone how these rankings are created, so that nobody takes them seriously.

Let’s look at the rankings of education schools. (Also ranked were schools of business, medicine, law and engineering.)

Because you probably want to know, here’s the top 10:

#1 Vanderbilt University’s Peabody School
#2 Teachers College at Columbia University
Tied for #3: the schools of education at Harvard and Stanford universities
#5 University of Oregon's College of Education
Tied for #6: the schools of education at Johns Hopkins University and the University of California at Los Angeles
#8 Northwestern University's education school
#9 University of Wisconsin at Madison's education school
Tied for #10: University of California at Berkeley's education school, and the colleges of education at the University of Texas at Austin and the University of Washington in Seattle

Surely these schools offer excellent programs. But you should know that the largest single factor in the rankings--worth 25 percent--is something called “peer assessment.”

That’s done by asking education school deans and deans of graduate studies to rate programs on a scale from marginal to outstanding. Individuals who didn’t know enough about a school to evaluate it fairly were asked to mark “don’t know,” and those responses weren’t counted. About 47 percent of those surveyed responded.

Need I say more? How much knowledge is enough for someone to think they can actually rank schools? How carefully do officials at one school investigate competitors to know whether one is better than another, by a hair or a mile?

Then, superintendents of K-12 systems were asked to rate programs on the same scale and given the same option to say they didn’t know. For this year's rankings, the two most recent years’ superintendents’ survey results were averaged.

These responses were worth 15 percent of the total, meaning that peer assessment was a whopping 40 percent of the total ranking.

About 19 percent of the superintendents surveyed responded. The wonder is that any of them did.

The other 60 percent included multiple factors, including the results of the Graduate Record Examination that graduate students must take. If you believe that a single standardized test score actually speaks to the quality of a student, then you would think this is a useful measure. I don’t. The test scores amount to 18 percent of the total.

That leaves 42 percent for the other factors, including acceptance rate, faculty resources, student-faculty ratio, percentage of faculty with awards or editorships among selected education journals, total research expenditures, and average expenditures per faculty member.
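The weighting scheme described above can be tallied in a short sketch. This is only an illustration of the arithmetic: the article gives the combined survey weight (25 plus 15 percent), the GRE weight (18 percent), and the remainder (42 percent), but does not break the remainder down into sub-weights, so it is kept as a single lump here.

```python
# Component weights of the U.S. News education-school rankings,
# as described in the article (percent of total score).
weights = {
    "peer assessment by deans": 25,
    "superintendent assessment": 15,
    "GRE scores": 18,
    "other factors (acceptance rate, faculty resources, etc.)": 42,
}

# The two reputation surveys together make up the "whopping 40 percent."
subjective = weights["peer assessment by deans"] + weights["superintendent assessment"]
print(f"Subjective survey share: {subjective}%")
print(f"Total: {sum(weights.values())}%")
```

Run as written, this confirms that the surveys account for 40 percent and that the stated components sum to 100 percent.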

It’s all very, very complicated. I just can’t find much meaning in it.

Follow my blog all day, every day by bookmarking it. And for admissions advice, college news and links to campus papers, please check out our new Higher Education page. Bookmark it!



The power of these rankings is to create a hiring bias in the workplace. As a hiring authority for the federal government, I rely on such ratings to aid in ranking applicants. I have few other measures from which to choose. When there are ten applicants who have taken very similar curricula and all have excellent academic records, yet one has to differentiate among them, these rankings are helpful.

Applicants know this and choose their grad schools accordingly. Is this a self-selection bias? Perhaps. But if those who are most competitive in nature and who seek an edge gravitate to the highest ranked schools, and if those are the sorts of people you want to hire, then the rankings are ideal. And we do use them.

Posted by: LoveIB | April 16, 2010 7:49 AM | Report abuse

But when you apply for a job, the only thing the potential employer knows about your school is its reputation... so how is a school's reputation not immensely important?

Posted by: someguy100 | April 16, 2010 8:03 AM | Report abuse

Bravo for pointing out the uselessness of the peer assessment. There isn't a college president or dean in the nation with enough knowledge to judge fairly their peer institutions and the quality of their education. It becomes a popularity contest or an effort to knock down others so that one's own stock might rise. These rankings aren't going to go away anytime soon, so let's at least remove this biased, subjective, and much too heavily weighted portion.

Posted by: BarkingMad | April 16, 2010 10:01 AM | Report abuse

It is too bad that the federal government "hiring authority" who commented above is unable to distinguish between applicants and must rely on a generic, contrived, and formulaic ranking system to ferret out qualified applicants. Surely if an employer is going to make an investment in an employee it is useful to develop an application system that tells you a little bit more about applicants than the school they went to and the grades they received. It seems as though an employer is going to end up with a very one-dimensional work force if every person in the office comes from only a handful of schools.

Applicants are certainly responsible for distinguishing themselves in some way, but it should be based on more than a publication's flawed rating system.

Posted by: campus411 | April 17, 2010 11:55 PM | Report abuse

You have to remember the typical GPA for a worker in the federal government is a "C". Couple the aforementioned GPA with "Industry Best Practice" and you have the perfect storm of circumstances resulting in someone not thinking for themselves.

Posted by: heretic2 | April 19, 2010 10:58 AM | Report abuse

The comments to this entry are closed.


© 2010 The Washington Post Company