Problems with the Use of Student Test Scores to Evaluate Teachers

Value-Added Measures (VAMs) are the newest craze in economists’ and politicians’ attempts to create a “scientific” way to measure teacher effectiveness. A VAM takes a student’s achievement level upon entering a teacher’s class, as measured by standardized tests, and compares it to the student’s achievement level after a year with that teacher. The difference is taken to be that teacher’s effect on the student’s learning.
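To make that arithmetic concrete, here is a minimal sketch of the pre/post calculation in Python. The class, the scores, and the predicted gain are all hypothetical, and real VAM systems (including New York’s Teacher Data Reports) layer elaborate regression adjustments on top of this basic idea; this is only an illustration of the core logic.

```python
# Toy illustration of the value-added idea described above: a teacher's "effect"
# is the average gain of her students relative to the gain a model predicted.
# All names and numbers are hypothetical; actual VAM systems use far more
# elaborate statistical models with demographic and classroom controls.

def naive_value_added(pre_scores, post_scores, predicted_gain):
    """Average actual gain minus the gain the model predicted for these students."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains) - predicted_gain

# One class of five students, scored on some standardized scale (invented data).
pre  = [620, 655, 640, 600, 675]
post = [648, 660, 671, 622, 690]

print(naive_value_added(pre, post, predicted_gain=20.0))  # about +0.2 points of "value added"
```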

This summer, the Los Angeles Times published VAMs for every teacher in the Los Angeles Unified School District (in testing grades), with the teachers’ names. The list led, predictably, to sensationalist media stories about “10 best” and “10 worst” teachers. One teacher committed suicide.

Now New York City’s Chancellor Klein is going back on his department’s October 2008 written commitment to fight any Freedom of Information requests for Teacher Data Reports, the VAM measures of New York’s 12,000 4th to 8th grade English and math teachers. The City is submitting briefs in court arguing that it should be permitted to release the reports, with teachers’ names, to the media. The UFT is suing to stop them; arguments will be heard just before Thanksgiving.

The problem is…VAMs do not in fact measure teacher effectiveness. They have an enormous margin of error. Teachers shown to be “effective” according to their VAM one year very often show up as “ineffective” the next, leading to the inescapable conclusion that VAMs measure factors other than teacher effectiveness. In a memo obtained via a Freedom of Information request by Class Size Matters’ Leonie Haimson, the consultants who designed New York’s Teacher Data Reports warned that the reports should not be used to evaluate teachers ( http://www.classsizematters.org/FOILed-info-re-tacher-data-report.html ).
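A back-of-the-envelope simulation shows why that year-to-year instability is exactly what you would expect when measurement noise is large relative to the real differences between teachers. Everything below is invented for illustration; none of the numbers come from the actual New York model.

```python
# Toy simulation: each teacher has a stable "true" effect, but the measured
# score gain adds a large noise term (small classes, test error, who happened
# to be in the room that year). With noise this large, a teacher rated in the
# top fifth one year often lands far lower the next.

import random

random.seed(1)
n_teachers = 1000
true_effect = [random.gauss(0, 1) for _ in range(n_teachers)]

def measured(effects, noise_sd=2.0):
    """One year's VAM estimate: true effect plus measurement noise."""
    return [e + random.gauss(0, noise_sd) for e in effects]

year1 = measured(true_effect)
year2 = measured(true_effect)

# How many of year 1's top 20% are still in the top 20% in year 2?
cutoff1 = sorted(year1, reverse=True)[n_teachers // 5]
cutoff2 = sorted(year2, reverse=True)[n_teachers // 5]
top1 = {i for i, v in enumerate(year1) if v > cutoff1}
top2 = {i for i, v in enumerate(year2) if v > cutoff2}
print(len(top1 & top2) / len(top1))  # typically around 0.3 -- far from consistent
```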

In August, the deans of American educational research (Diane Ravitch, Linda Darling-Hammond, and eight others) published an Economic Policy Institute (EPI) briefing paper entitled Problems with the Use of Student Test Scores to Evaluate Teachers. In the paper, the researchers demolish the claim that VAMs measure teacher effectiveness, as well as the broader claim that private-sector workplaces rely on this kind of data to evaluate employees. A PDF of the briefing paper can be downloaded from EPI.