
Daring new high school rating system

[This is my Local Living section column for March 18, 2010.]

On my blog I gush over my many genius ideas, worthy of the Nobel Prize for education writing if there were one. Here is a sample from last month:

“Why not take the Collegiate Learning Assessment, a new essay exam that measures analysis and critical thinking, and apply it to high schools? Some colleges give it to all of their freshmen, and then again to that class when they are seniors, and see how much value their professors at that college have added. We could do the same for high schools, with maybe a somewhat less strenuous version.”

Readers usually ignore these eruptions of ego. But after I posted that idea, a cruel young man named Chris Jackson e-mailed me that his organization had thought of it four years ago and had it up and running. Very cheeky, I thought, but also intriguing.

I never thought anyone would try such a daring concept. If your high school’s seniors didn’t score much better than your freshmen, what would you do? What schools would have the courage to put themselves to that test or, even worse, quantify the level of their failure, as the program does?

It turns out, not many. Jackson, project manager for the Collegiate Learning Assessment and its high school version, the College and Work Readiness Assessment, said only about 50 high schools have signed up. In the Washington area, there is just one: the Severn School, a private, sixth-through-12th-grade school with about 600 students on a 19-acre campus in Anne Arundel County’s Severna Park.

John Turner, an English teacher for 16 years and Severn’s academic dean, said he first heard of the program from John Austin, academic dean of St. Andrew’s School, a private school in Middletown, Del., even smaller than Severn. St. Andrew’s hosted a conference on how to teach and assess critical thinking in 2008. Several of the private high schools there signed up for what is called the CWRA, along with about a dozen public schools in Virginia Beach, several schools in the New Tech public high school network that began in California and an assortment of other private and public schools in 18 states.

The 100-minute exam is done online. There are no multiple choice questions. The student is given a performance task, such as a memo to a company president on whether to buy a certain small private plane shortly after an accident involving that model. The student reviews news articles, a federal accident report, performance charts and other data, then writes a memo justifying a recommendation. Independent scorers grade for clarity, persuasiveness, balance and other factors. The program charges $40 per student, or for Severn, about $8,000 a year.

When Turner explained to his seniors what they were going to be asked to do, for no credit or reward, “there was not a whole lot of applause in the room,” he said. The April weather last year was lovely. The senior class president warned Turner that several people planned to stay home on testing day. What could he do? The exam didn’t count.

To his surprise, everyone showed up. Many told him afterward that they had enjoyed an exam that made them think. The results were encouraging. The school scored in the 92nd percentile, compared with other high school seniors or college freshmen who had taken the test. Last fall, the incoming ninth-graders took it. Next month will be this year’s seniors’ turn. By 2013, the school will know how much value Severn has added to the analytical and critical thinking skills of one class after four years.

Few colleges using the test have released their results. High schools such as Severn have also been careful to keep the numbers to themselves. But Turner said his colleagues are working on teaching these skills. As schools get used to it, who knows what might happen? Someday, we might measure academic worth not with SAT averages but by how well each school’s students thought through a complicated problem. That might force a major change in the way we teach high school, not a minute too soon.

Read Jay's blog every day at http://washingtonpost.com/class-struggle.

Follow all the Post's Education coverage on Twitter, Facebook and our Education web page, http://washingtonpost.com/education.


By Jay Mathews  | March 17, 2010; 10:00 PM ET
Categories:  Local Living  | Tags:  College and Work Readiness Assessment, Collegiate Learning Assessment, John Turner, Severn School, high school rating, judging high schools by analytical ability  

Comments

Jay,
Maybe high schools could learn important things by implementing such a system. However, do you remember your article from last week, "School budget cuts not such big news"? Well, with teacher and staff positions being downsized and existing programs being scaled back, it seems highly doubtful that school systems such as my own, Montgomery County, would be willing and able to add something new like this.
On a lighter note, I can just hear my daughter, who will be a high school freshman in the fall, screaming about having to take still another standardized test.

Posted by: Wmcfam | March 18, 2010 7:23 AM | Report abuse

I'm wondering what gives the CLA credibility?

Is there some independent means of verifying the accuracy and consistency of the CLA? Because if there is, the web site doesn't provide any indication of it. If there isn't, why is it being accorded the credibility it obviously enjoys?

Posted by: allenm1 | March 18, 2010 8:11 AM | Report abuse

Jay,

Interesting column. I recently sat through a rant by some elementary school educators about the Maryland School Assessment, or MSA. My third- and fifth-grade children just finished taking it. The big question seems to be, does having the MSA allow teachers to really teach, or is everybody just robotically teaching to the test? Is there real learning happening, or just a lot of busy work satisfying the requirements? I found the point of view rather cynical, but I have to wonder.

Posted by: mdtay11 | March 18, 2010 12:46 PM | Report abuse

For allenm1--the CLA credibility derives from the high profile people who started it, including some experienced college presidents, and the results they have gotten, which many colleges say have been useful in seeing where they are weak in encouraging analytical skills. But it is still early in its life. Oh, and RAND did most of the work designing it, and they usually get respect.

Posted by: Jay Mathews | March 18, 2010 6:38 PM | Report abuse

Here's what I posted in the Ad101 thread:

Of course not (to the idea of an essay). I mean, good lord. Do you have any idea how many kids can't write at all? A simple essay--like the ones the ACT and SAT use--is fine.

You're a writer. You're showing your bias. Stop insisting that the rest of the world share your preferences. It's absurd to make kids write a long essay which--as Lisa points out--will be scanned for suitable opinions, too.

I swear, one of the things that bothers me about educational policy discussions is that the major players, regardless of the specific opinions, are so frigging ELITIST about your expectations for kids. You can't accept education as a tool. It has to be a moral endeavor in self-improvement. We have to pretend that all kids are intellectually curious, are interested and engaged in academic endeavors.

Education is apolitical. Stop trying to use it to mold people into your personal pet shape.

End of what I posted, but I will add one more thought:

The reason why essays are so popular is NOT because they "allow kids to develop their ideas" or "provide the full range of their knowledge" (quotes not from the post, but indicative of the pro-IB and essay people). That's total nonsense.

The reason why educational experts push for essays? Because at the local and national level, it allows everyone to hide or diminish the achievement gap.

Multiple choice tests are highly effective and extremely predictive (and spare me the crap about the SAT not predicting anything. What do most colleges use to determine remediation needs? The SAT or ACT. Why? Because grades are a fraud, at both the high school and college level.) If there were a nationwide multiple choice test--by all means, include an essay per subject--the breadth and depth of the achievement gap would make the country curl up in a fetal position and give up on education entirely.

Anyone who protests otherwise is in denial at best or flatly lying. (Don't worry, Jay, you're just in denial.)

Posted by: Cal_Lanier | March 18, 2010 8:35 PM | Report abuse

Sorry Jay but the height of a profile only equates to the depth of the credibility when accompanied by evidence. A good deal of what's wrong with the public education system derives from the assumption that credibility derives from profile and not evidence.

The endless stream of edu-crap that flows from schools of education, all the dreary pedagogies du jour that plague teachers, confuse parents, delight administrators and boards, and result in uneducated students, are based on an assumption of credibility which is based on "profile". I would say that in no small part the glacial movement towards educational accountability is driven by a widespread perception that assumed expertise is no substitute for proof of success.

Posted by: allenm1 | March 19, 2010 9:59 AM | Report abuse

The comments to this entry are closed.


© 2010 The Washington Post Company