
Your school's AP secrets

[This is my Local Living column for Jan. 21, 2010.]

Ever seen the Advanced Placement Grade Report for your high school? I thought not. Most people don’t know it exists. That is why I take so much pleasure in going over the reports. It is like reading the principal’s e-mails, full of intriguing innuendo and secrets that parents and students aren’t supposed to know.

Although these subject-by-subject reports rarely appear on public Web sites, some schools will show them to me if I ask, for the following reasons: 1. I am very polite; 2. no reporter has ever asked for them before, so there are no rules against it; and 3. they don’t think anyone will care.

They are wrong on that last count. The AP Grade Report allows the public to see which AP courses at a school produce the most high grades, and the most low grades, on AP exams. You can gauge the skill of the teachers and the nature of the students who take various AP subjects.

This region’s schools have made AP (and the similar International Baccalaureate, which provides comparable reports) the most challenging and influential courses they have. On Feb. 1, The Post will publish my annual rankings of Washington area schools based on participation in these tests, written and scored by outside experts. Students who do well on them can earn college credit. Many people would be interested in the actual results (different from the participation figures I use in the rankings) if they were readily available. To my surprise, that is beginning to happen.

The AP Grade Report is a multi-page document that a high school receives each summer from the College Board. It reveals student scores on the May AP exams. I don’t ask for the individual names and results. That’s private. But the page at the end summarizing the results for each subject is fair game.

The reports reveal how many students in each course received fives, fours, threes, twos and ones, the rough equivalent of college A’s, B’s, C’s, D’s and F’s. Figuring out what the numbers mean is fun. Why, for instance, did 36 of the 52 students taking AP European History at Lackey High School in Charles County get ones (the worst grade), but only 10 of the 30 students taking AP World History? Was the world history teacher better?

Remember, classroom teachers do not write or mark these exams. They cannot hide their flaws with easy questions and generous grades. Students at the school who know the teachers will have clues to what happened, and some questions of their own.

I have long thought that families should be able to see this stuff. I e-mailed the spokespeople of the local school districts asking if and how outsiders could get the information. I assumed that most would say it is not public or that they haven’t thought about it, or that they would simply not reply. I was wrong. A few districts have the data on their Web sites. Others seem willing to give it to anyone who asks.

Stafford, Carroll and Frederick county school officials say they will give the subject-by-subject results to anyone who contacts them. Officials from Fairfax and Fauquier counties say they will do the same, except for very small AP classes where the results would make it too easy to figure out who did well and who didn’t. Falls Church says it can’t give out the numbers from its small high school because the IB classes are also small and make it possible to identify individuals.

Other districts are further along. Arlington County testing director Kathy Wills put the subject-by-subject results online before she retired last year. Charles and Montgomery counties have their AP subject results for each school on their Web sites. Alexandria posted not only its results, but a detailed analysis. Prince William and Howard counties say their numbers should be online soon. Other districts haven’t replied.

The reports take you beyond the school course guide. They suggest which courses might be best for you and your family members. They are puzzles that, for the first time, everyone has a chance to solve.


Read Jay's blog every day at http://washingtonpost.com/class-struggle.

Follow all the Post's Education coverage on Twitter, Facebook and our Education web page, http://washingtonpost.com/education.

By Jay Mathews  | January 20, 2010; 10:00 PM ET
Categories:  Extra Credit  | Tags:  AP subject results, Advanced Placement, International Baccalaureate, Washington area schools, rating teachers  

Comments

Jay,
This is good information to provide folks, but it should come with caveats.
It would be helpful to have national pass rates for the various subjects along with national score means. The temptation will be for people to take the pass rates at face value without asking some additional questions. For example, the national pass rate in 2009 for European History was about 67% whereas for World History it was only about 50% (the examples you used). The mean for EH was 2.92 and for WH was 2.63. So in the example you gave, it MIGHT mean that the difference between the two teachers was even greater than what the raw pass rates suggest.
OTOH interpreting scores across subjects can get tricky for a variety of other reasons. For example, if some subjects at a school are open enrollment whereas others have GPA requirements or other sorts of restrictions, pass rates are really not all that comparable. At my last school, AP US History was open enrollment whereas AP Psych was not. Not surprisingly, we had much higher pass rates in AP Psych because (a) the national rates are much higher anyway and (b) we had restricted the range of students taking AP psych.
Also, there may be a bit of self-fulfilling prophecy going on because "everyone knows" Mrs. Rabbit is a fabulous AP English Lit teacher and certain types of kids sign up for her class but avoid Mr. Turtle's AP English Language class. So it's helpful to realize that the students who enter the classes (even at the same school) may be very different.
Again, I think it's great that you are publicizing this data, but unfortunately as you know from your rankings, people tend to look only at the bottom line and ignore grain of salt advice.

Posted by: patrickmattimore1 | January 20, 2010 11:20 PM | Report abuse

Good Post patrickmattimore1. Just to add - don't forget the afterschool / weekend AP prep classes as well as the private tutors for all the BCC, Blair magnet, Churchill, RM magnet, Whitman, Wootton, and WJ students paid by their MD JD MBA CPA PhD six figure Mothers and Fathers. Yo Gburg, Crimestein, El Wheatón eat your heart out.

Posted by: motherseton | January 21, 2010 2:14 AM | Report abuse

"...don't forget the afterschool / weekend AP prep classes as well as the private tutors for all the BCC, Blair magnet, Churchill, RM magnet, Whitman, Wootton..."

Hmmm...my kid goes to one of these schools and has taken several AP and now all IB classes and I don't know anyone in an AP prep class or who gets private tutoring for the tests (I didn't even know you could get tutoring for AP tests). Am I missing something? By the way, my kid has all 5s on the AP tests. I think it's the quality of the schools and the students.

Posted by: commentator3 | January 21, 2010 6:56 AM | Report abuse

The reason that World History gets higher AP grades than European History is that the European History test is very broad and tough. It is a more concentrated and specialized course. Also, European History is often taught in the early years of high school, and thus grades are lower.
Also, a five is very difficult to get. The proportion of 5s is significantly lower than the proportion of A's in a typical four-year college course. A five is also very tough to get in AP Calculus.

Posted by: peterroach | January 21, 2010 8:18 AM | Report abuse

“Having trouble finding Montgomery's” AP results? You will also have “trouble finding Montgomery's” 2009 results on gifted and talented screening--after it has been convincingly demonstrated that the program is unevenly implemented, and the identification process seemingly flawed (see http://www.examiner.com/x-29782-DC-Gifted-Education-Examiner).

Educational outcomes must be measured by meaningful indicators, preferably ones that are outside the influence of the school system. Now, if I could only convince you to drop the easily manipulated “AP participation” statistic in your rating system, I can boast you have joined me on the “dark side!”

Posted by: DC_Gifted_Education_Examiner | January 21, 2010 8:29 AM | Report abuse

AP results for Montgomery County can be found here:
http://sharedaccountability.mcpsprimetime.org/reports/list.php?selection=833.

Posted by: susan507 | January 21, 2010 8:31 AM | Report abuse

Thank you for this tip. We move frequently and it is sometimes difficult to judge the quality of the education offered and the caliber of the students at a particular school. This is a great tool to assist in the analysis.

Posted by: mpmiles | January 21, 2010 9:14 AM | Report abuse

PGCPS has an old report:
http://www1.pgcps.org/WorkArea/showcontent.aspx?id=26000

Posted by: edlharris | January 21, 2010 10:01 AM | Report abuse

While it would be good to use this information for future students and also to assess teacher effectiveness, it's easy to lose sight of the fact that a 3 for a lot of schools is not going to get the student credit. The AP tests provide a great assessment of the student's understanding of the subject matter, but at a school like UVa, to receive credit for a course a student must score a 4 or 5 on the exam. If available, a better alternative in Virginia is Dual Enrollment courses. The Commonwealth of Virginia has an agreement between the state universities and the community colleges whereby those courses are counted toward credit (since they are college-level courses and most often taught at the local CC) if the student passes the class.

If the idea of an AP class is to get college credit, it helps to understand the scoring requirements of the colleges your students are looking at.

My son was able to take 8 AP and 2 DE classes in his 4 years of high school. He ended up with college credits for 4 of the AP classes (he got 3's on the other 2 exams and didn't get the credit) and received credits for both DE courses.

While the 2 AP classes weren't wasted, he is now sitting in two classes that he has basically already taken. It's great for the GPA, but does have its drawbacks.

Posted by: thensell | January 21, 2010 10:44 AM | Report abuse

Peterroach's comments are entirely false: "The reason that World History gets higher AP grades than European History is that the European History test is very broad and tough. It is a more concentrated and specialized course. Also European History is often taught in the early years of high schools and thus lower grades." Typically, national AP European History scores are 15-20% higher. The material covered in European History spans a much shorter time period, and the European History course is typically offered only to 11th- and 12th-grade students. In contrast, AP World History national pass rates are typically only 50%, and this course is most often taught at the 10th-grade level (Fairfax, Loudoun, Arlington, Alexandria, etc.).

MORE IMPORTANTLY... Jay has done the AP program and AP courses a great disservice with this article. Jay claims that ALL students will benefit by taking AP courses, regardless of whether or not they pass the exam. He, through his "Challenge Index," ranks schools based only on the NUMBER of AP exams that they give, NOT the scores that students receive, for this very reason. Then, in this article, Jay pushes parents to investigate how each school's AP programs are performing. While I don't disagree that teachers should be accountable for their scores, there are many factors that make up how students perform on AP exams. As patrickmattimore1 explained, an outsider has no knowledge of national averages, of how students are selected to be in the classroom, or of how long the teacher has been teaching the course (as would be expected, a first-year teacher may have lower scores than a teacher who has taught the course for many years). Jay's article encourages parents to look at the bottom line of pass rates and to teacher- or school-shop in an attempt to get their children into a school or class that has the highest pass rates. As a teacher with over 10 years of experience both teaching AP courses and grading the national exams for ETS, I feel that Jay has once again written an article simply to "stir the pot" and gain attention, while failing to look at the broader effects of what he is encouraging. My courses' pass rates are consistently well over 90% (40% higher than the national average). I have nothing to fear from people researching my students' scores. However, I think what Jay is proposing will only lead to less effective teachers and less effective AP courses.

Posted by: edspecialist | January 21, 2010 10:48 AM | Report abuse

Good points. All of the additional points made by wise commenters here have merit, and I have discussed them in the past, but it wasn't possible to get them all into a 700-word column. For edspecialist, I am trying to inform people. All new information, as you say, stirs the pot, but my experience with parents and students looking at these programs is that they are more than smart enough to figure out what information works for them and what information is unimportant. The broad effects of increased participation in AP and IB courses and tests seem pretty clear, both from the messages I get from parents and students and from the data available. We have more kids being challenged in high school, one of the least challenging parts of our education system, and more of them getting passing marks on AP and IB exams. You notice I was not emphasizing passing rates so much in this piece as I was emphasizing the number of 1s, the bottom score, in AP. We have data showing that even students who get 2s tend to do better in college than students who do not take AP. But 1s don't give you any advantage, and it is unusual to see so many 1s, as I noted in this piece, on an AP test at a suburban school. The reasons for this can be, as you and Patrick and other smart readers say, very numerous. I have run across every variety mentioned, and several more. But the poor benighted parent will have no chance to even look into the possible reasons if she doesn't know that the course her son wants to take has had an unusual number of 1s, thus the importance of writing the column so people know how to get the info.
For thensell, keep in mind that the vast majority of first year college students who might get some benefit from credits for AP courses do not attend UVA or schools like it. They attend community colleges and big state universities, where the rules for AP credit are usually more generous.
I also share commentator3's puzzlement at the notion of widespread tutoring and prep sessions for AP tests. I have not run into much of that in suburban schools. (Some low-income urban schools do it with very beneficial effect for kids who come into these courses not nearly as well prepared as suburban kids.) The tutoring and prep in the burbs is most often directed at making sure the kid gets a good grade on his report card (over which AP tests usually have no influence) or a good score on the SAT or ACT.
My thanks to susan507 for that link to the Moco web site. If it checks out I will add it to the piece. Same to edlharris for that PG link.

Posted by: Jay Mathews | January 21, 2010 11:14 AM | Report abuse

The Montgomery County link provided by Susan507 works great. I have added it to the piece and deleted a phrase I had had about my trouble in finding a link for subject by subject results per school in Montgomery. The PG link provided by edlharris was, as he said, an old one, for 2007, and does not provide subject by subject results. The PG public information people have not responded to my request for information on this, and did not respond when I sent them a draft of the column.

Posted by: Jay Mathews | January 21, 2010 11:36 AM | Report abuse

Jay, you report on test scores that have high "interest value" to readers. Please be careful in how you set readers up to interpret the scores.

To lead with the question of why AP scores in one subject are higher than in another subject you need to consider many things.

1) Are the scores from different tests equated? (Probably not. Different content-area tests -- even within the same general subject area of history -- probably differ with respect to the likelihood of passing or failing.)

2) were the children in each course "comparable" -- meaning, did they come in with similar levels or variability in prior knowledge?

Etc.

To ask the provocative question: "Is teacher A better than teacher B" can set teachers up for very unfair comparisons.

Please be careful!

Posted by: Magoo1 | January 21, 2010 11:46 AM | Report abuse

As an AP English Lit teacher, I wholeheartedly support publicizing AP scores to parents and members of the community. Mr. Matthews is correct when he states that a student who achieves a score of "2", which is not considered a "passing" score, is considered an AP success in the sense that they are statistically far more likely to do well in college than a student who did not attempt the AP challenge. Parents deserve to know both their child's school's scores on various exams and what those scores mean.

Posted by: cmerry | January 21, 2010 11:56 AM | Report abuse

Jay,

So Montgomery County told you their AP scores were on their website? Well, not exactly. "SOME AP score percentages are in a report" is more like it. MCPS has redacted the information and is only showing % scores for the "15 most popular" AP exams.

What about the rest of the exams??

Want to see ALL the AP results for your local high school?
We have as many as we could obtain on the Parents' Coalition website.

http://parentscoalitionmc.com/Schools.html

AP score information for all exams given at a school is sent to colleges each year in your student's MCPS transcript package, as part of the college application, printed on a SCHOOL PROFILE SHEET.

Colleges are routinely given this detailed information.

Ask MCPS' Public Information Office for it? They have said in the past that it doesn't exist in response to public information requests.

I believe this is the first year that limited AP score information by 5,4,3,2,1 has been released by MCPS. And even then, there is still a lot that is being withheld unless you take a look at your school's College Profile Sheet.

The Parent's Coalition pulled these sheets in the summer of 2008. Ask your local school for their most recent School Profile Sheet for the 2009 scores for the complete score report.

Posted by: jzsartucci | January 21, 2010 12:02 PM | Report abuse

The article and comments already posted are both helpful. Knowing that the information is available allows parents to ask informed questions when considering schools. Other good questions may be:

How long has the teacher been teaching the AP course?
Has the teacher been an AP reader?
Has the teacher been to AP workshops/institutes relevant to the particular course?
Is there actually an AP course at the school, or do students take the exam at the end of a non-AP course?

Posted by: Thinking123 | January 21, 2010 12:10 PM | Report abuse

I am sorry to see that the Post (and Newsweek) continue to publish this list - based on the number of tests taken at a school divided by the number of seniors.

As this has been around for a while, you would think someone would have analyzed how schools that rank high on this list do in terms of what the list purports to rank: success in college!

How about a study of how students from these "best high schools" do compared with others *in actual college work*?

You see, the AP number used here is an input -- the output is college performance. The inference -- that students who took AP tests did better in college -- was drawn from a time when only qualified students took the tests.

Today, with such lists, you have schools eliminating creative honors classes and often forcing large numbers of kids, ill-prepared as they may be, into AP courses so that their schools get better rankings on this list.

What SHOULD be done, of course, is a study to determine if all this tracking of many students into AP actually created increased college success -- that is the output.

Posted by: rayburt456 | January 21, 2010 12:14 PM | Report abuse

For a good example of the information that your local high school has on AP scores, look at the Blair High School Profile Sheet on the Parents' Coalition website. Blair is probably putting out the most AP exam information to colleges of any MCPS high school. (Scores by class, scholar info)

http://parentscoalitionmc.com/Schools.html

Note Blair shows scores for 31 AP exams.

MCPS is only putting out data in a report for 15 AP exam subject areas.

Posted by: jzsartucci | January 21, 2010 12:14 PM | Report abuse

Why do we have to protect individuals' scores so much that we can't release data? Everyone can see who the fastest or slowest runner is in gym class, and the top times will get put in the paper, but we have to hide who got the 1 or the 5 on the AP test? Maybe part of the problem with a lack of respect for academics is that you get public recognition for being a good athlete, but a good test score is a secret.

On the other hand, just the final score is a poor way to evaluate the effectiveness of a class. If you had data on how well prepared students were coming into the course, you'd have a better idea of how much they gained from the course. Pre-test, post-test. Simple concept, not applied often enough.

Posted by: staticvars | January 21, 2010 12:16 PM | Report abuse

@Jay

"I have added it to the piece and deleted a phrase I had had about my trouble in finding a link for subject by subject results per school in Montgomery."

Please ask the MCPS Public Relations office directly where the "link" is for subject by subject for ALL exams for all high schools. We'd all like to know. Thanks!

You will have to ask them as our Superintendent is in Kentucky this week.

Posted by: jzsartucci | January 21, 2010 12:17 PM | Report abuse

It's been several years since I took AP courses and exams (nearing two decades, to be sure), but I do recall that the difficulty varied greatly across several of the exams regardless of the subject matter and teacher effectiveness.

For example, the AP History exam in 1993 focused on topics which my basic history classes had covered in great detail--no need for the AP class itself on that one, and if anything, that teacher was pretty ineffective in her role--yet students from our school did very well on it. The AP Biology exam was very well-covered by the materials in our AP Biology course (and we had a very effective teacher, too); again, many 5s all around. Yet the AP Chemistry exam had large sections which were covered in Chem I, and which AP Chemistry students hadn't really seen for more than 2 years (a lot of questions on freezing & boiling point elevation/depression due to different solutes, for example, and very little on topics like enthalpy). I think that of the few at our school who took that exam, I got the highest score, a 4, for that reason. AP Calculus using the "AB" version of the exam (vs. the "BC" one--no clue if they still break it down that way, but none of us took the BC exam) was a piece of cake despite our teacher having taught herself the material, often only the day before presenting it to us in class.

And AP English... well, the capstone essay question dealt with works of comedy, which we simply didn't study in a single course, as our teacher focused largely on tragedies. Yet we as a school did well nonetheless (I got a 5, as did several of my classmates), largely because of that teacher's effectiveness even though the subject matter didn't well-match the exam.

I suspect the exams, though covering the same broad topics every year, do vary significantly enough that that class-to-exam meshing could be thrown off, as it was for our AP Chemistry exam, and that the gestalt of subject material covered in similar courses can make up for poor teaching (as it did in our AP History exam). And, of course, one year might simply have a "harder" exam than another.

So simply looking at scores doesn't give a great picture of the school's ability to prepare students for those particular exams... just how well the students did that particular year on those particular exams.

Posted by: exerda | January 21, 2010 1:13 PM | Report abuse

Mr. Matthews,
You're contradicting your core message when you ask schools to display AP and IB test results. Matthewsian philosophy suggests that high school students expose themselves to advanced-level academics, regardless of how they score on the test. Teachers have lowered and in some cases removed restrictions to entering these rigorous classes. Opening the floodgates has, at least for now, led to lower overall score reports at some schools. Schools are rightfully fearful of releasing their scores because they’ve been following your philosophy. Please understand, sir, that you can’t have it both ways.

Posted by: hokiematt10 | January 21, 2010 1:20 PM | Report abuse

Jay:
Huzzah! Huzzah! Thank you for this. Please share all of the other secret, semi-secret or just not well known information like this. It's all valuable and useful.
Thank you so much!

Posted by: LoveIB | January 21, 2010 1:25 PM | Report abuse

Just looked through the MCPS data and it made one of my worries about data vs. facts real. If you compare the scores on AB and BC calc, what conclusion can you draw? You have to make the assumption that the students taking the AB test aren't as good...but they don't have data on that. Need some on the ground facts to make decisions- or more data on the pre-coursework knowledge of the kids going in.

Posted by: staticvars | January 21, 2010 1:48 PM | Report abuse

I love hokiematt10's post, but he or she should be warned that true Mathewsians spell it with only one T. That mistake could get you thrown out of the cult. I know it seems like I am contradicting myself, but I am not. I am a journalist, hoping to get as much useful information to readers as possible. I thought it was very useful to tell readers which schools were working hardest to get average kids into challenging college-level courses, which research showed helped them do well in college. Thus the Challenge Index. But it was also clear to me as I got deeper into this that although in general an AP or IB course was better for a student than a regular course, even AP and IB had some disappointing teachers who dumbed down their assignments and tried not to let anyone know that most of their kids were getting 1s on the exam. In a suburban school with well-prepared kids, that was a red flag I thought people should see. In the urban schools that interest me most, new AP programs often have most of the kids getting 1s, and in those cases it is best to focus on which schools are reducing the number of 1s and increasing the 2s and 3s and so on over time, as their teachers improve and as their lower grades work harder to prepare kids for AP. It's complicated, but I discovered long ago that most readers are smart and can handle the complexities, as the posts above prove once again.
Let me assure Rayburt456 that the latest data on the effect of AP course and test taking on college success is very recent, including thousands of students who entered college in the 2000s, long after the movement to open AP to all gathered steam in the 1990s. The best data is from Texas. Here is a column I did on it:
http://www.washingtonpost.com/wp-dyn/content/article/2009/01/09/AR2009010901085.html

Posted by: Jay Mathews | January 21, 2010 1:59 PM | Report abuse

Some classes are better than others. Scores fluctuate some from year to year when you look at AP scoring data. Also, depending on your district -- some districts decide consciously to limit who is allowed to get into an AP course, while some allow a wider array of kids because they believe being in that class is beneficial even if the kid does not score high on the exam. It shows up in the overall spread of scores.
There is actually good research to back that idea up when you look at success rates in college. Students who take the course and the exam -- even if they score a 1 -- do better overall than kids who do not take the class, or who take it and not the exam.

Also, an interesting correlation to look at is the student's scores across other exams -- if the student got a 1 on this exam, what did they get on others?

ETS makes correlations between AP and PSAT scores as well. There are predictors of performance.

And a last thing - ETS instituted about two years ago a requirement that any school that wants to use "AP" in a course title and have their blessing has to undergo a review process. AP reviews your school and if it meets their criteria it gets AP certification.

Posted by: John1263 | January 21, 2010 2:26 PM | Report abuse

Jay,

You do not need to know individual student AP scores, only the average score for any particular AP Test and number of students taking the test. You keep dancing around this error in your "Challenge Index." Without incorporating the average AP score and number of students taking the AP test in your Index the only challenge is believing its validity.

Posted by: waxtraxs | January 21, 2010 6:34 PM | Report abuse

Jay said, "...Montgomery counties have their AP subject results for each school on their Web sites..."

Still no link to substantiate that statement. Has the $10 million Montgomery County Public Schools PR department been unable to supply the link to validate what has been printed in this Washington Post article?

Posted by: jzsartucci | January 21, 2010 6:47 PM | Report abuse

Gee, that's funny; my daughter's private school took pains to publish its AP course results, by teacher and class. I sent her to private school for social, not academic reasons, but there was a feeling that one had more input into these sorts of things. Ask for them, parents!

Posted by: bhrgarden1 | January 21, 2010 6:49 PM | Report abuse

Jay...the earliest list of yours I could find was published in 2003 -- did you produce a national list before then?

Why do I ask? Because your response above suggests newer data than the famous DOE study and you point to one with 2002 grads. If your first list came in 2003, then it's fair to say schools didn't have a chance yet to work to get higher placement on your list by removing all barriers to AP classes and encouraging (requiring??) kids to take more and more of them.

Bottom line...if your first list did come out in 2003...the real answer is about how all kids with AP backgrounds do...not just the ones qualified in the good ol' pre-list days.

Posted by: rayburt456 | January 21, 2010 7:38 PM | Report abuse

I teach AP. Some of my best students don't take the test. The tests are very expensive, especially if the student is taking more than one. Also, there is some gamesmanship going on: retaking the entry-level college course after high school AP gives a real benefit in a very hard major, and some colleges may not accept the AP course in their major.

Posted by: altaego60 | January 21, 2010 7:50 PM | Report abuse

In our experience at Blair and Richard Montgomery, we were given the previous year's AP and IB results at Back-to-School night. These were *department-wide* numbers for all students who took AP Basketweaving (or BC Calc, IB HL European History, etc.) the previous year, broken down by # of students at each score level and the % of students scoring 3 or higher.

We could see that in one class, 93% passed with a 3 or better, but we could also see that only 15% got a 5, 26% got a 4, and 52% got a 3. It gave us an idea as to where the class was being pitched -- and that if a kid wanted a 5, he would have to really work to maximize what he got out of the class (or buy/borrow a study guide to fill in the gaps).

We couldn't tell if one AP Calc class did better than another, which for better or worse preserved some anonymity for students and teachers. (This is fine by me.)

You betcha MCPS knows how those scores break down among each high school. They pay the College Board good money for that data!

Posted by: rlwasserman | January 21, 2010 8:43 PM | Report abuse

What about Loudoun County?

Posted by: pfallsgirl | January 21, 2010 9:15 PM | Report abuse

AP teachers do their best to fit it all in, often at a breakneck pace, but once the kids sit down for the test, the teachers can only wash their hands and hope for the best.

No college 101 course would test students on material not presented in class.

AP tests are an abnormality in educational assessment.

In what other realm would parents or students honestly accept being tested on material that is a mystery all year long?

Posted by: hatchlaw | January 21, 2010 9:15 PM | Report abuse

Hokiematt is correct that Jay is contradicting himself, but I imagine he doesn't concern himself with keeping the same stance from day to day.

He gave a talk at our school, and he was well received... and then he kept talking and the crowd turned on him.

I think his goal is just to get people talking and asking questions, which is never wrong, but the answers to those questions are never simple either.

Posted by: hatchlaw | January 21, 2010 9:29 PM | Report abuse

When I taught Calculus BC, my pass rate was 95%. For my AP English Literature class, my pass rate is 88%. We have an open-door policy where even freshmen and sophomores sign up for the class. I don't think I am any better at teaching math than I am at teaching literature.

Posted by: ericpollock | January 22, 2010 5:36 AM | Report abuse

I suspect it's been said before, but it is a huge (and unwarranted) leap from knowing test results to concluding that one teacher is more effective than another. First, test results reflect many factors, only some of which relate to the teacher. Second, test results reveal something about teaching effectiveness only when one can be certain that all the other factors are more or less the same. If teacher A and teacher B taught the same pupils under the same conditions and so on, one might have some basis for using test results to compare their teaching effectiveness. Otherwise, any differences are just as likely to result from some other variable.

Posted by: Jphubba | January 22, 2010 10:27 AM | Report abuse

I would love to know the scores for individual students. Of course, I need not know their names. Perhaps each student could be given an individual ID number (different from their student ID).

Why do I want this data? Because I think it would more accurately reflect the "success" of the AP program.

MCPS currently reports data based on the number of tests taken (28,575) and the number of tests scored at 3 or above (20,648). That can be (and is) easily misconstrued as the number of STUDENTS taking AP tests and the number of STUDENTS earning 3 or above on a particular test.

In fact, many students take multiple exams, so ONE student might score 3 or higher on THREE or more exams, for example. That means fewer STUDENTS (perhaps many fewer) are actually passing AP exams.
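[Editor's note: the exams-versus-students distinction above is easy to see with a toy example. The numbers below are hypothetical, not MCPS data.]

```python
# Toy illustration: counting passing *exams* can overstate the number
# of passing *students*, because one student may pass several exams.

# Hypothetical students, each mapped to a list of AP exam scores (1-5).
students = {
    "A": [5, 4, 3],   # one student, three passing exams
    "B": [3],
    "C": [2],
    "D": [1, 2],
}

exams_taken = sum(len(scores) for scores in students.values())
exams_passed = sum(1 for scores in students.values()
                   for s in scores if s >= 3)
students_passed = sum(1 for scores in students.values()
                      if any(s >= 3 for s in scores))

print(exams_taken)      # 7 exams taken
print(exams_passed)     # 4 exams passed (57% of exams)
print(students_passed)  # 2 students passed (50% of 4 students)
```

Here student A alone accounts for three of the four passing exams, so the exam-level pass rate looks better than the student-level one.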

In addition, while MCPS touts the fact that a "record-setting" number of AP exams were taken in 2009, it ignores the fact that the percentage of exams scored at 3 or higher has declined fairly steadily since 2005.

Overall, the percentage has dropped from 77.0% in 2005 to 72.3% in 2009.

By subgroup, the results (2005 to 2009) are:

African-American: 53.7% to 47.6%
Hispanic: 65.8% to 55.3%
White: 81.3% to 77.9%
Asian: 76.8% to 76.6% (the only group that was fairly consistent over the years)

Why are the percentages falling? Are the kids taking the tests unprepared? Are the teachers teaching the courses qualified to teach a "college-level" class? These types of questions aren't asked if MCPS manipulates the numbers in order to tout only "good" news.

I understand that it is the MCPS PR department's job to put a positive spin on all developments, but it is a disservice to the community to ignore data that don't "fit" the spin.

Posted by: daveairozo | January 23, 2010 10:56 AM | Report abuse

Jay said, "...Montgomery counties have their AP subject results for each school on their Web sites..."

And...still no link to substantiate that statement.

Has the $10 million Montgomery County Public Schools PR department been unable to supply the link to validate what has been printed in this Washington Post article?

Jay, why are you misleading readers when MCPS has put out information on less than half of the AP exams given? Does that really serve your readers?

Posted by: jzsartucci | January 23, 2010 1:40 PM | Report abuse

Jay,

Would love your response to the issues raised in this video from yesterday's New York Times. http://video.nytimes.com/video/2010/01/24/opinion/1247466680941/op-ed-advanced-pressure.html

Posted by: rayburt456 | January 26, 2010 8:48 AM | Report abuse

Did Howard County provide any indication of a date by which they planned to post the data? I am very interested in seeing it.

Posted by: wpreader11231 | January 27, 2010 4:55 PM | Report abuse

The comments to this entry are closed.

© 2010 The Washington Post Company