Question raised about new KIPP study (with update)
UPDATED AT BOTTOM
No sooner had a new study been released saying that middle school students in the charter school network called Knowledge Is Power Program significantly outperform their public school peers on reading and math tests than critics raised an important question about the report.
Mathematica Policy Research issued the study yesterday, my colleague Bill Turque reported, about the KIPP network, which now has 82 schools that serve children from low-income households. Seven are in the District, including three middle schools -- KEY, WILL and AIM academies -- which are among the highest performing on the DC-CAS standardized tests.
The study looked at 22 KIPP middle schools, including AIM and KEY, and found that by seventh grade, half of them showed growth in math scores equal to an additional 1.2 years of school. Reading gains for KIPP were not as dramatic but still significant, the researchers reported, reflecting an additional three-quarters of a year of growth.
However, an initial analysis of the study conducted by Professor Gary Miron of Western Michigan University found that attrition data appears to have been misrepresented.
Miron conducted his initial analysis at the request of the Think Tank Review Project, a collaboration of the Education and Public Interest Center (EPIC) at the University of Colorado at Boulder and the Education Policy Research Unit (EPRU) at Arizona State University. Here’s a post from the project on Miron’s conclusions:
A key finding of the study is that attrition at KIPP schools is not much different from attrition at comparable conventional public schools. This finding is important because past research about KIPP suggests that selective attrition -- struggling students disproportionately leaving, with more successful students staying and then scoring well on tests -- may give KIPP a substantial boost.
However, an initial analysis of the report by Professor Gary Miron of Western Michigan University concludes that this initial study report misrepresents the attrition data.
According to Miron, "While it may be true that attrition rates for KIPP schools and surrounding districts are similar, there is a big difference: KIPP does not generally fill empty places with the weaker students who are moving from school to school. Traditional public schools must receive all students who wish to attend, so the lower-performing students leaving KIPP schools receive a place in those schools."
In contrast, Miron explains, "The lower performing, transient students coming from traditional public schools are not given a place in KIPP, since those schools generally only take students in during the initial intake grade, whether this be 5th or 6th grade."
The KIPP study’s description of attrition considers only half the equation when comparing KIPP schools to matched traditional public schools. The researchers examined attrition rates -- the number of students departing each school -- and found them to be similar. But they never considered the receiving, or intake, rate. Even though the researchers agree that mobile students are lower performing, they do not take into account the reality that KIPP schools generally do not receive these students.
Professor Miron conducted his own quick analysis, using the federal Common Core of Data, and concluded that there is a 19% drop in enrollment in KIPP schools between grades 6 and 7 and a 24% drop between grades 7 and 8. (This analysis included only KIPP schools that had enrollments in all three grades.) In comparison, traditional public schools in these grades maintain the same enrollment from year to year.
While Miron’s review questions the validity of this particular finding, it does so solely because of this single problem. In other respects, he found the study to be rigorous and of high quality, and likely to be even better in subsequent years of the evaluation. Those future reports can, and Miron hopes will, address the questions raised here, as well as questions about students retained in grade.
Importantly, Miron is also not saying that the KIPP schools do poorly. Those schools provide about 50% more instructional time and place rigorous demands on students and their families.
"We have every reason to believe that KIPP likely does a great job with the low-income students of color who wish to attend and who have relatively supportive parents who can do things like drive them to Saturday school," Miron says.
But he does question whether this is a viable model for larger numbers of students, and he also wonders whether the different departure and receiving policies may make matters worse for students who are left behind or who later leave KIPP schools. How would the KIPP model work if students who cannot handle the rigorous KIPP demands could not move to conventional public schools?
In a separate e-mail, Professor Kevin Welner, director of the Education and Public Interest Center (EPIC) at the University of Colorado at Boulder, told me:
I hope Gary’s raising of this important issue is received as it was intended. This was a difficult study to undertake, and in many ways the Mathematica researchers did an excellent job. Gary’s comments below make that clear. There seems to be a tendency among advocates for school models such as this to react very defensively instead of welcoming criticism and finding ways to address genuine issues. In this case, Gary focused on the validity of a key claim -- an attrition claim that has been trumpeted by KIPP’s advocates -- which is substantially undermined by the researchers’ failure to account for the differences in "receiving" policies. This was almost surely a source of bias, resulting in a more positive result for KIPP.
This doesn’t mean that KIPP educators are not doing a good job. It doesn’t mean that parents shouldn’t choose to send their children there. It doesn’t mean that donors should not support KIPP. It just means that this interim evaluation report has a weakness that should be addressed in later reports, and it means that the tentative findings presented in the report should be taken with a grain of salt.
Welner sent this update today:
I want to correct something I wrote yesterday. "This was almost surely a source of bias, resulting in a more positive result for KIPP." I had misunderstood Prof. Miron, so "bias" was the wrong word to use here. As your commenters "amr11" and "jane100000" correctly noted, the measurements were not directly affected by the "receiving" differences described by Prof. Miron. His contentions are, instead, that:
(1) Researchers should be aware of the substantial indirect effect of these receiving differences -- on a school's ability to create a healthy learning environment. Selective attrition has a net impact on the capacity of sending and receiving schools. The research should account for the apparent situation whereby KIPP schools are only on the 'benefiting' end of that impact.
(2) Grade retention differences may also have played an important role, and the researchers would be wise to consider their impact on test scores and on selective attrition. A strong finding from education research is that grade retention is strongly associated with later dropout rates. Among comparable students, holding one back in grade will, on average, increase his relative likelihood of later dropping out. Also, grade retention changes the makeup of who takes a given test (e.g., the 7th grade state standardized exam) in a given year.
Regarding the attrition issue, one suggestion that Prof. Miron noted to me was that the researchers may want to analyze the variance in test score data for the different grade levels at the sending (KIPP) and receiving (traditional public) schools. If selective attrition is occurring in the way he suspects, the KIPP variance will be substantially reduced (with the lower-achieving students largely disappearing), while the traditional public school variance will hold basically constant (with students who leave generally being replaced by comparable entering students).
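The logic behind that variance check can be illustrated with a small simulation. To be clear, everything below is hypothetical -- the score distribution, attrition rate and school sizes are invented for illustration and come from no actual study data. The sketch models a cohort whose lowest scorers depart: a KIPP-like school leaves those seats empty (admitting only at the intake grade), while a traditional public school refills them with comparably low-scoring transient entrants.

```python
import random
import statistics

random.seed(0)

MU, SIGMA = 50.0, 10.0  # hypothetical test-score distribution


def low_scorers(k, cutoff):
    """Draw k scores from below `cutoff` by rejection sampling,
    modeling transient entrants who are themselves lower performing."""
    out = []
    while len(out) < k:
        s = random.gauss(MU, SIGMA)
        if s < cutoff:
            out.append(s)
    return out


def simulate(replace_leavers, n=2000, attrition=0.2):
    """Return (variance before, variance after) selective attrition.

    The lowest-scoring `attrition` fraction departs. If replace_leavers
    is True (a traditional public school), the empty seats are filled by
    comparably low-scoring entrants; if False (a KIPP-like school that
    admits only at the intake grade), the seats stay empty.
    """
    scores = sorted(random.gauss(MU, SIGMA) for _ in range(n))
    before = statistics.pvariance(scores)
    cut = int(n * attrition)
    remaining = scores[cut:]  # the lowest scorers leave
    if replace_leavers:
        remaining += low_scorers(cut, cutoff=scores[cut])
    return before, statistics.pvariance(remaining)


kipp_before, kipp_after = simulate(replace_leavers=False)
tps_before, tps_after = simulate(replace_leavers=True)

print(f"KIPP-like school:   variance {kipp_before:.1f} -> {kipp_after:.1f}")
print(f"Traditional school: variance {tps_before:.1f} -> {tps_after:.1f}")
```

Under these assumptions the KIPP-like cohort's variance shrinks markedly (the low tail of the distribution simply vanishes), while the traditional school's variance stays roughly where it started -- exactly the signature Prof. Miron suggests looking for in the real grade-level test data.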
A final suggestion: Almost all high-quality studies (and this one certainly qualifies as that) include a section clearly explaining the study's limitations. For some reason, this one does not include such a section. If it had, these sorts of concerns could be expressed in the study/report itself. The Mathematica team is working with a good, if not great, dataset and has the talent to answer all these questions or clearly explain when their ability to answer them comes up short.
| June 23, 2010; 12:30 PM ET
Categories: Charter schools, Research | Tags: KIPP network, KIPP schools, Knowledge Is Power, charter schools