L.A. Times testing series raises more questions
Few education stories have excited me as much as the series on teacher assessment being done by reporters Jason Song, Jason Felch and Doug Smith of the Los Angeles Times. They have dug up a goldmine of data on the student test score gains of 6,000 individual elementary school teachers in the Los Angeles Unified School District, information that the district has refused to show to parents despite pleas from its staff to do so.
The latest story in the series, "L.A.'s leaders in learning," does many things that I think are crucial to improving American education, and it fits what I have been trying to do for the last 12 years in calculating the level of challenge in high schools, nationally and in the Washington area.
The latest Times story focuses on how schools as a whole, not individual teachers, are doing in raising achievement. That emphasis encourages schools to create team-like cultures in which everyone works to make everyone else better. The story buttresses the central point of the series--that schools that seem similar to parents trying to choose where to send their children look very different when unreported data like relative test score gains are revealed. It also shows in a dramatic way the uselessness of our usual means of rating schools. Those that have the highest test scores are considered the best, even though achievement measured that way reflects the average incomes of the parents far more than it does the quality of the teaching.
But I found reading the series, particularly this latest part, frustrating because it often fails to answer questions raised by the deep digging its reporters have done. Also, the stories seem to me to mischaracterize, in some spots, the data they present.
A prime example is their reporting on the focus of the latest story, Wilbur Avenue Elementary School in an affluent part of the San Fernando Valley. Wilbur is highly sought after even by families outside the neighborhood, who line up days in advance to register for a few empty places. The third paragraph of the story prepares the reader for surprisingly awful results from the Times value-added analysis, compared to the school's high test score averages as reported by the state each year. "Wilbur's record was among the worst in Los Angeles for boosting student performance in math and English," the story said.
But a reader eager to see the details backing up this statement has to go 76 more paragraphs before finding it, in this sentence: "On average, students started the third grade in the 77th percentile in math, but by the end of the fifth grade were in the 67th. In English, they slid from 79th to 76th."
You see my problem? Not only did I have to go almost to the end of the story to get this information, but the numbers do not seem that bad to me. These are, after all, percentiles, not actual scores. They depend on how wide the scale of measurement from the first to the 99th percentile is--something we are not told in the story. A ten-point percentile drop in math is likely significant, but this is an average for the entire school and leaves Wilbur still considerably above average. I doubt that the three-point percentile drop in English is statistically significant at all. Again, we are not told. To say that the English achievement level "slid" seems wrong to me; like candidates within three points of each other in a poll, this seems the equivalent of no difference at all.
The reporters probably sensed that some readers would react that way, so four paragraphs later they say that other elementary schools with similarly affluent families do much better. Wonderland Avenue Elementary in the Hollywood Hills, for instance, made "some of the biggest gains in the district, particularly in math," the story says. But, sadly, it does not tell us what those gains were so we can decide for ourselves, as they let us do with the Wilbur statistics.
This story also fails to address adequately what might be statistical peculiarities when judging schools and teachers whose students are performing far above average. The Times is analyzing data based on how much students improve compared to their past performances. But if the students being examined are near the top of the scale in preparation because of their affluent and well-educated parents, might this mean their past performances were unusually high, and that a drop in test scores in a subsequent year would mean less than it seems?
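The worry in that paragraph is what statisticians call regression toward the mean. A toy simulation (my own sketch, not the Times's actual value-added model, and with made-up numbers) shows how it works: if each year's score is a student's true ability plus some luck, the students who scored highest in year one will, on average, score lower in year two even though nobody's ability changed at all.

```python
import random

# Toy illustration only: each student has a fixed "true ability," and each
# year's observed test score is that ability plus random measurement noise.
random.seed(1)

ability = [random.gauss(0, 1) for _ in range(100_000)]
year1 = [a + random.gauss(0, 0.5) for a in ability]  # observed score, year 1
year2 = [a + random.gauss(0, 0.5) for a in ability]  # observed score, year 2

# Take the students who scored in the top 10 percent in year one...
cutoff = sorted(year1)[int(0.9 * len(year1))]
top = [(y1, y2) for y1, y2 in zip(year1, year2) if y1 >= cutoff]

# ...and compare their average scores across the two years.
avg1 = sum(y1 for y1, _ in top) / len(top)
avg2 = sum(y2 for _, y2 in top) / len(top)
print(f"year 1 average: {avg1:.2f}, year 2 average: {avg2:.2f}")
# The year-two average comes out lower, yet no student's true ability
# changed: the "decline" is luck washing out, not worse teaching.
```

A careful value-added model is supposed to correct for this, which is why the story's one-line assurance that "research has shown no significant 'ceiling effect'" deserves more explanation than it gets.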
The story says only that "research has shown no significant 'ceiling effect.'" I think we need more than that. It says other high-income schools "made great strides" compared to Wilbur, but it doesn't tell us which schools those were or what their results were. In discussing another school in an affluent area, Topeka Elementary, it says high achievers at the school were "essentially flat in English and steadily falling behind in math," but again doesn't give the actual scores. That is a problem because I think they mislabeled the slide in English scores at Wilbur. I would prefer to make my own assessment of the changes in scores at other places.
The story suggests that the growth in Wilbur's average test scores as reported by the state may be the result of better-educated families moving into the area. But it does not say whether anyone determined if the education levels of the parents of the students taking the tests were higher in the most recent year tested than before, since a large portion of students at Wilbur did not take the tests.
I can hear the mutters of the terrific reporters who are doing this series. "Is Mathews serious? Give a reporter a blog or a column and he is ruined. Has he forgotten what it is like to do a long news story with space limits and stubborn editors?" I confess that if I had $10 for every time, during my 35 years as a reporter, an editor told me to take out some details I thought were essential to a story, I would be writing this blog on the Riviera, not in an upstairs bedroom with our cats' litter box a few feet away.
Newspaper editors fear that readers will drown in too much detail. Most of the time they are right. But the Times editors should realize that this is the rare series that is being read not only by Times subscribers in southern California, but by people all over the country interested in the issue of how to assess teachers. The portion of readers in this case who know something about such assessments, and have questions about what the Times is finding, is much higher than usual.
So please give us a break and provide a few more facts. I know that the data will all be there when you release the full results, but you haven't told us exactly when that will be. We don't want to wait to know more about the schools you are writing about now.
I will send this post to the Times reporters and ask them to fill in the blanks for me, if they have the time. Whatever they send me I will put here as a special addendum to this blog post.
| August 23, 2010; 10:40 AM ET
Categories: Jay on the Web | Tags: LA Times series on teacher assessments key to education policy future, how much did schools really decline, series so far leaves some questions unanswered