Fired teacher reinstated with back pay after VAM mess was exposed... 'The 'errors' cannot be corrected because the method itself is the problem. The errors and flaws are integral to the method. VAM is Junk Science...' VAM D.C. Testing Scandal ... This time it's 'Technical Errors' in IMPACT

Over the past few days, best-selling author Diane Ravitch released a scoop. The Washington, D.C. public schools -- again -- have a scandal, and this time it's about their program for improving test scores (again). This time, it's not Michelle Rhee, but Rhee's protégée Kaya Henderson.

HERE IS RAVITCH'S REPORT:

Following the scandals that erupted because of the "reforms" of Michelle Rhee, Kaya Henderson, Rhee's successor as D.C. Schools Chancellor, continued trying to create fraudulent "Value Added..." measures to elevate data to the primary position in their teacher-bashing versions of reality. Last Friday [December 20, 2013], before the winter break, D.C. officials quietly released the news that the D.C. IMPACT evaluation system contained technical errors. It was the perfect time to reveal an embarrassing event, hoping no one would notice. Spokesmen minimized the importance of the errors, saying they affected "only" 44 teachers, one of whom was wrongfully terminated.

But Professor Audrey Amrein-Beardsley explains that what happened was "a major glitch," not a "minor glitch." It was not a one-time issue, but an integral part of a deeply flawed method of evaluating teachers. No amount of tinkering can overcome the fundamental flaws built into value-added measurement of teacher quality.

Amrein-Beardsley writes:

"VAM formulas are certainly “subject to error,” and they are subject to error always, across the board, for teachers in general as well as the 470 DC public school teachers with value-added scores based on student test scores. Put more accurately, just over 10% (n=470) of all DC teachers (n=4,000) were evaluated using their students’ test scores, which is even less than the 83% mentioned above. And for about 10% of these teachers (n=44), calculation errors were found."

This is not a “minor glitch” as written into a recent Huffington Post article covering the same story, which positions the teachers’ unions as almost irrational for “slamming the school system for the mistake and raising broader questions about the system.” It is a major glitch caused both by inappropriate “weightings” of teachers’ administrators’ and master educators’ observational scores, as well as “a small technical error” that directly impacted the teachers’ value-added calculations. It is a major glitch with major implications about which others, including not just those from the unions but many (e.g., 90%) from the research community, are concerned.

It is a major glitch that does warrant additional concern about this AND all of the other statistical and other errors not mentioned but prevalent in all value-added scores (e.g., the errors always found in large-scale standardized tests particularly given their non-equivalent scales, the errors caused by missing data, the errors caused by small class sizes, the errors caused by summer learning loss/gains, the errors caused by other teachers’ simultaneous and carry over effects, the errors caused by parental and peer effects [see also this recent post about these], etc.).

The "errors" cannot be corrected because the method itself is the problem. The errors and flaws are integral to the method. VAM is Junk Science, the use of numbers to intimidate the innumerate, the use of data to quantify the unmeasurable.

DECEMBER 23, 2013 FROM THE WASHINGTON POST:

Errors found in D.C. teacher evaluations (2nd update). BY VALERIE STRAUSS. December 23 at 5:20

More than 40 teachers in D.C. public schools received incorrect evaluations for 2012-2013 because of errors in the way the scores were calculated, and one was fired as a result.

The president of the Washington Teachers’ Union, Elizabeth A. Davis, has asked for details from D.C. Schools Chancellor Kaya Henderson in a letter (text below) that says that the problems were found by Mathematica Policy Research, a partner of the school system’s. The mistakes were found in the individual “value added” scores for teachers, which are calculated through a complicated formula that includes student standardized test scores.

This “VAM” formula is part of the evaluation system called IMPACT, begun under former chancellor Michelle Rhee in 2009. Henderson, Rhee’s successor, continued with IMPACT, though this year she reduced the amount of weight given to test scores from a mandatory 50 percent to at least 35 percent. (See below for IMPACT chart).

Testing experts have long warned that using test scores to evaluate teachers is a bad idea, and that these formulas are subject to error, but such evaluation has become a central part of modern school reform. In the District, the evaluation of adults in the school system by test scores included everybody in a school building; until this year, that even included custodians. In some places around the country, teachers received evaluations based on test scores of students they never had. (It sounds incredible but it’s true.)

My colleague Nick Anderson reported in this story that 44 teachers were involved, and one was fired as a result of an evaluation that was too low. Half of the evaluations for 44 teachers were too high and half too low, according to Jason Kamras, chief of human capital for the school district. The teachers with bad evaluations represent about 1 percent of the system’s 4,000 teachers but nearly 10 percent of those whose evaluations are based in part on student standardized test scores. Kamras said the fired teacher will be reinstated with back pay, that those teachers with ratings that were too high will not be lowered but those with too-low evaluations will be raised.

Randi Weingarten, president of the American Federation of Teachers, the national union to which the Washington local belongs, said in a statement that it is “very troubling when the district continues to reduce everything about students, educators and schools to a nameless, faceless algorithm and test score.”

Here’s the letter from Davis to Henderson, and following that is a statement from Weingarten as well as an IMPACT chart:

Dear Chancellor Henderson,

In an email message from Jason Kamras on Friday, December 20, 2013, I was informed that Mathematica Policy Research, a DCPS external partner, recently found a technical error that affected some teachers’ 2012-2013 Individual Value-Added (IVA) and Teaching & Learning Framework (TLF) evaluation scores. Needless to say, I was deeply disturbed by this preliminary information as well as the time at which it was provided—the day before the winter break.

In our continuing effort to be open, transparent and in accordance with our current collective bargaining agreement, the Washington Teachers’ Union (WTU) requests all information regarding the miscalculation of teachers’ SY-2012-‘13 Individual Value-Added (IVA) scores and changes made to teachers’ (IMPACT) evaluation scores for that time period. In addition, the WTU requests all information regarding errors made in any other year during the implementation of the IMPACT evaluation system.

Our collective bargaining agreement (CBA) allows the WTU to receive all information relevant to the enforcement of the agreement (10.1.1). Therefore, the WTU immediately requests the following information from DCPS regarding the miscalculation of teachers’ 2012-13 IVA and TLF IMPACT scores:

• a list of all teachers affected by these errors

• copies of all correspondence sent to teachers regarding the miscalculation of their scores

• the impact of the miscalculation on each teacher

• the SY-2012-13 evaluation score of every DCPS teacher

• the list of teachers who submitted challenges to their SY-2012-13 evaluations

• the list of all evaluations that were changed as a result of an evaluation challenge

• the list of teachers whose challenges were denied

• the DCPS letters of response to each teacher whose challenge was changed or denied

• a full description of the Mathematica error, the cause of the error, how and when the error was brought to the attention of DCPS officials, and the number of teacher scores that were affected by the error

• a copy of all DCPS communications sent to affected teachers

• each teacher’s SY-2012-13 final evaluation: the initial evaluation that was sent to teachers, any changes to those evaluations that were subsequently made, and documents explaining the reasons for the changes

• a copy of the DCPS database reports showing each teacher’s name, school, area of certification, actual assignment, IMPACT category, and component scores (Teach, CSC, etc.). For teachers whose scores are based in part on value-added calculations, provide the formula and how each teacher’s score was generated

• a listing of all errors in any aspect of the 2012-13 IMPACT score calculations, and the weight of those errors in IMPACT points, including the notification of the errors and the DCPS response to those notifications

• a list of each teacher’s final IMPACT score for 2012-13 and the new score that results from correction of the “error.” This includes teachers whose score changes would not move them into a new evaluation category, e.g. from “effective” to “highly effective.”

• all errors, and the weight of those errors in IMPACT points, that were previously discovered since the implementation of IMPACT, and how those errors were corrected

• the names, addresses, email addresses and school assignments of teachers whose 2012-13 scores would have been lowered if changes resulting from the correction of the error had been implemented, thereby moving the teacher into a lower evaluation category.

Please provide answers to the following questions:

1. To what extent were specific teachers affected, i.e., loss of pay, loss of step increase or bonus, separation from duty, change in IMPACT category, etc.?

2. What are the specific groups of teachers affected by these or other errors? Elementary? Teachers with individual student test scores? Teachers in a specific Ward of D.C.?

3. What are the specifics of the proposed remedies for each affected teacher?

4. What is DCPS’s plan for remedying the impact of the error on each affected teacher?

5. When will these remedies be instituted?

6. Was the error the result of actions taken by one or more employees of DCPS? If so, was the person or persons responsible for evaluating or reviewing teachers’ evaluations?

7. Were personnel actions taken against any administrator or private contractor due to these errors?

8. Have affected teachers been notified? If so, how and when?

9. What are the future communications plans regarding these errors?

Finally, the WTU requests copies of all communications between the school district and all third-party contractors regarding these errors and the IMPACT system. Along with general communications, this should include all technical and general reports related to IMPACT, the errors and teaching quality in DCPS. Thank you for your anticipated cooperation.

Sincerely,

Elizabeth A. Davis

President, Washington Teachers’ Union

Statement from Randi Weingarten, president of the American Federation of Teachers:

We believe in D.C. public schools, and have worked with our local union and this mayor in many constructive ways, including on its very successful pre-K program. But there’s something very troubling when the district continues to reduce everything about students, educators and schools to a nameless, faceless algorithm and test score. This was clear in the Rhee era and led to widespread allegations of cheating. And now we see it with the troubling news that teachers’ evaluation scores were miscalculated—with a tremendous impact on the employment and wages of teachers and on our schools and students.

You can’t simply take a bunch of data, apply an algorithm, and use whatever pops out of a black box to judge teachers, students and our schools. And now, we have the disclosure that even the number was miscalculated, affecting dozens, if not hundreds, of educators. Our children deserve better.

HUFFINGTON POST ARTICLE BELOW HERE:

Minor Glitch Leads To Major Criticism Of Michelle Rhee's Signature Initiative

Posted: 12/23/2013 8:17 pm EST | Updated: 12/24/2013 12:07 am EST

The evaluations of 44 Washington public school teachers -- out of 4,000 -- were compromised by an outside contractor, an email from a D.C. official to the city's teachers union reveals.

The union is slamming the school system for the mistake and raising broader questions about the system.

D.C. Public Schools' chief of human capital, Jason Kamras, wrote to Elizabeth Davis, president of the Washington Teachers' Union, on Friday. In the email, which the school system provided to The Huffington Post, he told her of two errors in the evaluation of teachers for the 2012-13 school year.

First, Kamras wrote, the policy on appropriate weighting of administrator and master educator observations of teachers under the evaluation formula, known as IMPACT, was "not clearly communicated." IMPACT scores can affect teachers' bonuses and job security. So the District has recalculated all observation scores that might have been affected by that miscommunication, according to Kamras.

Second, he wrote, the outside contractor, Mathematica Policy Research, "found a small technical error" that affected some teachers' Individual Value-Added scores. Those scores have been recalculated as well. Kamras assured Davis that teachers who would have had a lower IVA score as a result "will be held harmless."

The teachers unions, both local and national, are calling attention to the errors, saying that they highlight inherent problems with these methods for sorting and evaluating teachers.

"These errors make clear that this evaluation system is flawed," Davis said in a statement late Monday. "Teachers, parents and students deserve full transparency and accountability." Davis also wrote to D.C. Public Schools Chancellor Kaya Henderson seeking more details on the extent of the errors.

Randi Weingarten, president of the American Federation of Teachers, chimed in with her own condemnation of the system. "There’s something very troubling when the district continues to reduce everything about students, educators and schools to a nameless, faceless algorithm and test score," Weingarten wrote. "You can’t simply take a bunch of data, apply an algorithm, and use whatever pops out of a black box to judge teachers, students and our schools. And now, we have the disclosure that even the number was miscalculated, affecting dozens, if not hundreds, of educators. Our children deserve better."

Michelle Rhee, then D.C. schools chancellor, instituted the use of IMPACT in 2009, making it one of the first teacher evaluation systems to treat students' test scores as a significantly influential factor. The system uses "value-added measurement," a complex algorithm that aims to remove the statistical effects of factors like students' socioeconomic status to uncover how much teachers truly affect their students' test scores. During its first year, IMPACT used value-added measurement to account for a full 50 percent of a teacher's evaluation; that has since been reduced to 35 percent. IMPACT also considers administrators' and master educators' observations of the teachers.
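The "value-added" idea described above can be sketched in toy form. The following is an illustration of the general statistical technique only, with made-up numbers; it is not DCPS's or Mathematica's actual model, whose formula the union's letter specifically requests. The idea: regress students' current test scores on prior scores and background controls, then treat each teacher's average residual as that teacher's "value added."

```python
import numpy as np

# Toy illustration of value-added measurement (NOT the actual IMPACT model).
rng = np.random.default_rng(0)
n = 200
prior = rng.normal(50, 10, n)          # hypothetical prior-year test scores
poverty = rng.binomial(1, 0.4, n)      # hypothetical socioeconomic control
teacher = rng.integers(0, 5, n)        # students assigned to 5 fake teachers
true_effect = np.array([2.0, 0.0, -1.0, 0.5, -0.5])

# Simulated current-year scores: mostly prior achievement plus noise.
current = (5 + 0.9 * prior - 3 * poverty
           + true_effect[teacher] + rng.normal(0, 5, n))

# Fit the control model by ordinary least squares (intercept, prior, poverty).
X = np.column_stack([np.ones(n), prior, poverty])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
residual = current - X @ beta

# A teacher's "value-added" score is the mean residual of their students.
vam = np.array([residual[teacher == t].mean() for t in range(5)])
```

Even in this clean simulation, each teacher's score rests on only ~40 students, so the noise term alone moves estimates by several tenths of a point -- a small-sample instability of the kind critics cited above raise about real value-added scores.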

When Rhee first announced that D.C. would be using IMPACT, it was a controversial decision: Many statisticians question the reliability of value-added metrics, and the use of standardized testing to separate good teachers from bad has always received vocal opposition from skeptics such as teachers unions. But as Rhee's signature reform, the use of IMPACT has gained a broader symbolism for the so-called education reform movement, which has sought to emulate the program across the country.

A recent study of the IMPACT system found that it did help D.C. Public Schools keep successful teachers while losing its laggards. D.C. Public Schools advocates have also cited the city's higher national test scores as evidence of the program's success. But sociologist Matthew di Carlo of the Albert Shanker Institute, a think tank affiliated with the American Federation of Teachers, warned that the study's conclusions shouldn't be interpreted as an "overall assessment of IMPACT" because they only pertained to certain groups of D.C. teachers.

See Kamras' full letter below:

Hi Liz,

I hope this message finds you well. I am writing to communicate two important updates regarding last year's final IMPACT scores for teachers.

First, during the 2012-2013 school year, IMPACT policy was to calculate final TLF scores using a weighted average wherein administrator observations counted for 60% and master educator observations counted for 40%. Though this was the policy, it has been brought to my attention by several employees and by our legal counsel that the policy was not clearly communicated. Given our commitment to transparency, all final TLF scores that were lower because of the weighted average policy have been recalculated using a straight average. Later today, the IMPACT team will issue 2012-2013 revised reports for all teachers with higher final TLF scores as a result of the recalculation. For the 2013-2014 school year, all final TLF scores will be calculated using a straight average.

Second, our external partner Mathematica Policy Research recently found a small technical error that affected some teachers’ 2012-2013 Individual Value-Added (IVA) scores. Mathematica corrected the error and recalculated the value-added results. Later today, the IMPACT team will issue 2012-2013 revised reports for all teachers with higher IVA scores as a result of the recalculation. Teachers who would have had a lower IVA score as a result of the recalculation will be held harmless and will not be informed of the IVA recalculation.

If you or any of your teachers have any questions, please contact the IMPACT team ...

As always, we thank you for your collaboration and partnership!

With appreciation,

Jason
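The recalculation Kamras describes is simple arithmetic: under the 2012-2013 policy a teacher's final TLF score was a 60/40 weighted average of administrator and master educator observation scores, and the correction replaces it with a straight (50/50) average. A minimal sketch with hypothetical scores (the function names and the example ratings are illustrative, not DCPS's):

```python
def tlf_weighted(admin_score, master_score):
    """2012-2013 policy: administrator observations weighted 60%,
    master educator observations weighted 40%."""
    return 0.6 * admin_score + 0.4 * master_score

def tlf_straight(admin_score, master_score):
    """Corrected policy: a straight average of the two scores."""
    return (admin_score + master_score) / 2

# Hypothetical teacher: rated 3.0 by administrators, 3.6 by master educators.
weighted = tlf_weighted(3.0, 3.6)   # 0.6*3.0 + 0.4*3.6 = 3.24
straight = tlf_straight(3.0, 3.6)   # (3.0 + 3.6) / 2   = 3.30

# Per the letter, only teachers whose straight average comes out HIGHER
# than their weighted average receive a revised (raised) report.
final = max(weighted, straight)
```

Note the asymmetry the letter announces: a teacher rated higher by master educators than by administrators gains from the switch to a straight average, while a teacher in the opposite position is "held harmless" and keeps the higher weighted score.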


