
Judging high schools by students' college success

[This is my Local Living section column for March 25, 2010.]

Montgomery County School Superintendent Jerry D. Weast loves numbers like most human beings love steak, and he shares what he loves. He dumps on me stacks of graphs and flow charts. They follow a familiar theme, the rise of student achievement in his district. But sometimes he surprises me.

Among the pieces of paper he unloaded during a recent visit was a blue, green, orange and yellow bar graph titled “MCPS Graduates Who Earned a 4-Year College Degree, 2001-2004.”

Huh? High schools usually don’t have that information. They can only guess how their students do in college. “Where’d you get that, Jerry?” I asked.

“National Student Clearinghouse,” he said.

I knew what that was. I knew what the clearinghouse was trying to do. But I didn’t know it had gotten that far.

The National Student Clearinghouse began to build its database of more than 93 million students in more than 3,300 colleges and universities to verify enrollment of students for loan companies. Now it is focused on informing high schools how their graduates are doing.

The clearinghouse gave me a sample report. By the 2008-09 school year, 23.8 percent of the Class of 2004 at a sample school had graduated from college, 15 percent were still in college, 28.5 percent were no longer enrolled and 29.4 percent were not in the database, and so had probably never attended college. About 3 percent had returned to college after dropping out.

These are vital data for someone such as Weast, the energetic, media-savvy leader of one of the largest and most successful school districts in the country. What he is doing with this information, others will soon do. Using the clearinghouse’s data could change the way we assess and run high schools — public and private — in significant ways.

Weast used the chart to justify his efforts to get more students to take Advanced Placement courses and tests. More than 76 percent of those in the classes of 2001 to 2004 who had gotten a score of 3 or higher on an AP exam had graduated from college, the chart showed. Even among students who got a failing grade on an AP test, the college graduation rate was pretty good, 59.4 percent, compared with 24.7 percent of those who did not take AP.

Similar data are being used in different ways by the D.C. College Success Foundation. With funds from the Bill and Melinda Gates Foundation, D.C. College Success has begun a startlingly ambitious, $116 million program to ensure a college education for about a third of the graduates of six public high schools in wards 7 and 8 in the next decade. Up to 250 students a year are getting four-year scholarships valued at up to $50,000, plus mentoring while they adjust to the demands of higher education.

Members of the D.C. College Success staff, led by Executive Director Herbert R. Tillery, work to raise their students to the level of successful college students revealed in the clearinghouse data. Louis Josey, a student at Maya Angelou Public Charter School, is taking the AP courses that the numbers indicate are helpful. The foundation helps Louis find tutors to help him over academic rough spots, mentors to guide his college application process and a four-week summer program at a college campus to boost his language and math skills and acquaint him with undergraduate culture.

Two other public charters, Friendship Collegiate and Thurgood Marshall, and three traditional high schools, Anacostia, Ballou and H.D. Woodson, also participate. The Gates foundation also supports improvements in state data collection and the work of the clearinghouse. (Bias alert: Washington Post Co. Chairman Donald E. Graham is on the D.C. College Success board.)

It will be intriguing to see whether the clearinghouse data inspire more high schools to focus more intently on what leads to college success. Even private school leaders, who rarely share student data, say they might release their college success rates. If that happens, Weast will be calculating how his public schools compare and telling me all about it.

Read Jay's blog every day at http://washingtonpost.com/class-struggle.

Follow all the Post's Education coverage on Twitter, Facebook and our Education web page, http://washingtonpost.com/education.


By Jay Mathews  | March 24, 2010; 10:00 PM ET
Categories:  Local Living  | Tags:  D.C. College Success Foundation, Gates Foundation, Herbert R. Tillery, Jerry D. Weast, Louis Josey, Maya Angelou Public Charter School, National Student Clearinghouse, judging high schools by college success  

Comments

Isn't it a bit early to judge schools based on the class of '04? That would seem to be biased in favor of wealthier schools where more students can afford to go directly on to college rather than spending time working or serving in the military first.

My alma mater would presumably rank very highly but I would say that's less due to the school itself than the demographics of its students.

Posted by: CrimsonWife | March 24, 2010 10:54 PM | Report abuse

Yes, these longitudinal data should be useful. But, you should be careful to distinguish among failures. There's a difference between students who get a "1" for showing up for the AP exam, signing in and staying, as required by the HS to get course credit; and those who earn a "2", which most AP teachers would agree indicates high-school course level learning of the material (or the smarts to study the material and the practice tests well, without bothering to do the classwork.) That difference between a 1 and 2 is one we should care very much about before making further claims for the value of AP experience.
You elsewhere blogged recently about cheating. Many HS's are doing just that, mandating and subsidizing AP courses and exam-taking. (The universities can and do discriminate among the passing grades, many schools giving no credit for any "3"s and requiring a "4" or "5"; they don't matter to your Challenge Index.)

Posted by: incredulous | March 25, 2010 2:23 AM | Report abuse

Crimsonwife, you've got to start somewhere and Jay's reaction to the idea is illustrative of the degree to which accountability is still a somewhat exotic notion in the public education arena.

I mean, my gosh, tracking students to see how they do after they graduate? What a concept! That implies that how students do later in life is related to the quality of the education they receive, or at least introduces the idea that follow-up is worthwhile. It's certainly worthwhile to imply that relationship, but to the extent of measuring it?

Next thing you know people will want to see if there's a correlation between funding levels and education quality, and while those of us who are wonky enough to comment on blogs like this know there isn't, it's not the sort of correlation, or lack thereof, that ought to be more widely appreciated.

Posted by: allenm1 | March 25, 2010 6:52 AM | Report abuse

Thanks for the support, allenm1. You are right, but CrimsonWife is also right. Looking at the classes of 2003 or 2002 would provide an even more useful look at what is happening to grads. I sent out some emails this morning to try to do that very thing.

Incredulous is right about the difference between 1s and 2s as shown by research, but those 1s are not blank exams. Many kids try very hard and still get 1s. I have met enough of those 1 recipients to conclude that their efforts strengthen their preparation for further academic challenges and are in many cases better for them than what they would get in the limp standard course at their high school. I would love another comment from you that tells me precisely what is cheating about what schools like Columbia Heights are doing to make sure every student, even those below grade level, takes an AP course and an AP exam. The students themselves and their families don't think they are being cheated, and having spoken to them, neither do I. Nor have I ever even READ an interview, much less conducted one myself, with a student who thought such programs were a cheat. So please tell us more.

Posted by: Jay Mathews | March 25, 2010 11:33 AM | Report abuse

Is there a relationship between students who take foreign language courses and college?
(What about for ethnic groups?)

Posted by: celestun100 | March 25, 2010 12:38 PM | Report abuse

Of course, one must remember the cardinal rule of interpreting study data: CORRELATION DOES NOT EQUAL CAUSATION. Is it not possible that kids who are more likely to go to, and succeed, in college, are also more likely to take AP courses? One cannot extrapolate from this statistical information, as Mr. Weast is doing, that getting more students to take AP courses will lead to greater enrollment or more success in college. In fact, I could argue that getting students who are unprepared or unsuited for the rigors of AP might discourage them from considering higher education as they become convinced that they "aren't cut out for college."

Posted by: lazarus100 | March 25, 2010 1:06 PM | Report abuse

Jay:
The significant incidence of hard work by some failed students doesn't conflict with an assertion that many students are being pushed through AP exams following courses in which they have scarcely participated. The AP is designed to most powerfully discriminate at the cut-point of 3. So, yes, it discriminates less well below and above that. But, I have asserted frequent incidence of a different phenomenon, what used to be (and still can be) "spoiled" tests, ones which are only pro forma attempted. If 5-10% of exams involve the traditional cheating you've elsewhere written about, you don't discount it as a serious problem. I am sure that level is exceeded at some schools, and as a matter of school policy.
One of your expert contributors here has not been able to reconcile the number of exam takers with the number enrolled in AP classes at one or more DC high schools. I cannot understand how so many Spanish language-background students supposedly benefited by a Spanish Language AP test can have created NO demand for a course or a test in so rich a literature as Spanish. I look at the number of "1"s on the Language test from the school and understand.
A cheating school will do whatever is legal to boost its Challenge Index scores. What do you think is going on when students who are absent for 30% of the school year remain enrolled in AP courses, and then join others to take the exam? You must have had your own concerns when you created your "Challenged" group following extraordinary growth in AP test taking at some DCPS schools.
Our difference here is over the epidemiological ideas of sensitivity vs. specificity and false positives vs. false negatives. What does not get enough attention in most educational experimentation are harmful effects; as though, besides the opportunity costs, there are not also toxic effects on some, and harm to the system by overloading it.

A popular t-shirt at some age-group swim meets shows a lump of coal compressed under the heat and pressure of competition into a hard diamond. Yes, but do it wrong and the lumps are crushed to dust. Further, false valuation of diamonds over coal can generate demand for large subsidies for the special machinery to create them.

Andrew Hacker has pointed out that the US locations BMW and other advanced engineering auto manufacturers have chosen for plants have not been those with outstanding numbers of college graduates or high schools sending most everyone to college. Please, tie this back to the good blog discussion you hosted on universal university education. (And if not, then why universal AP?)


Posted by: incredulous | March 25, 2010 2:31 PM | Report abuse

My point, Jay, is that your reaction to the idea of longitudinal tracking is interesting because you think the idea's so terrific. It's the reaction to the idea for its novelty that I'm trying to highlight.

In what other area of human endeavor would measuring outcomes be considered sufficiently novel to be noteworthy?

Is there the slightest doubt that there wasn't a single relaxed person associated with NASA after the "Challenger" disaster? Every last person who worked for NASA or whose paycheck depended on NASA was wondering how the disaster would affect them. Bad outcome and everyone's on the edge of their seats.

With one or two clicks I can tell you who the highest rated ice dancer in the world is today. They measure that stuff. They even measure artistic expression and style. The outcome's important so it gets measured.

But in public education the outcomes that have been measured traditionally apply only to the kids. A kid gets an ACT score but a school doesn't.

Seems odd, doesn't it? After all, the school must make some contribution to the score even if the specific kid, and the specific kid's family circumstances, makes a contribution to the score.

Yet here we are at the beginning of the twenty-first century and the concept of measuring the value of a school by measuring the quality of its product is still sufficiently novel to merit enthusiasm and kick off controversy. What the heck's that about?

What it's about is that until fairly recently the notion of explicitly measuring the quality of a school, or a school district, wasn't just novel, it simply wasn't done. Educational officialdom was simply unaware of the concept of measuring the educational quality of schools or districts. The only quantity of sufficient importance to merit careful scrutiny was the funding level. Whether the kids were learning or not was their lookout. That's changing and the change is signaled by the exciting novelty of measuring student outcomes to determine the quality of the school.

Oh, lazarus100, correlation does quite often equal causation, so the proper formulation would be CORRELATION DOESN'T *NECESSARILY* EQUAL CAUSATION. If correlation simply didn't equal causation, fishermen wouldn't use bait.

Posted by: allenm1 | March 26, 2010 9:02 AM | Report abuse

Jay,

Where is this data available to parents on the MCPS website?

Glad you have the inside track, but how about sharing with the parents whose kids are actually in the schools. Thanks.

Posted by: jzsartucci | March 26, 2010 11:59 AM | Report abuse

The comments to this entry are closed.

 
 

© 2010 The Washington Post Company