Faulty teacher evals blamed on bad data from districts

Copyright © 2014 Albuquerque Journal

State Public Education Department officials say insufficient and inaccurate information reported by some school districts led to mistakes in a still undetermined number of teacher evaluations, as the controversial state system for rating teacher performance was rolled out for the first time this spring.

The errors came to light after teachers received their one-page evaluations last month. Although 50 percent of the evaluations were to be based on their students’ achievement, some teachers received no marks at all in that category.

Others who were graded on student achievement received “minimally effective” scores, even when their students had made clear progress and the teacher scored well in the other evaluation areas. Some data on absences also were incorrect.

Teachers who received erroneous evaluations will be given new ones with corrected scores, according to PED.

PED relied on local school districts to report much of the data and then compiled the state’s new teacher evaluations.

Education Secretary-designate Hanna Skandera said this week there was little PED could do when districts failed to report data or reported inaccurate data. “We can only have a good evaluation if we have good data,” she said.

Skandera said most of the evaluation errors brought to the PED’s attention so far have come from Albuquerque Public Schools, the state’s largest district, which enrolls about a third of the state’s students.

The district is working to correct any errors it made, APS chief academic officer Shelly Green said in a statement to the Journal.

“APS is working with PED to better understand how results of the new teacher evaluations were calculated and to make necessary corrections,” Green wrote. “This is a complicated system that we acknowledge we don’t completely understand yet, but we’re making every effort to address the many questions and concerns raised by our teachers and administrators.”

The district does not have an estimate of how many teachers received evaluations with errors, spokeswoman Monica Armenta said.

“The real issue here is that teachers deserve to be evaluated fairly and accurately, and principals should be able to explain the calculation behind their rating,” APS Superintendent Winston Brooks said in a telephone interview.


A trio of APS principals who spoke to Journal reporters and editors last week said they were baffled by the evaluations and unable to explain them to their teachers, despite having attended PED training sessions.

They also said some teachers were docked for being absent more days than they were actually gone from school, and some appeared to be marked down for days when their absences should have been excused. APS officials acknowledge there was a problem with some of those numbers the district reported to the state.

APS is not alone in having issues.

Nine other districts and six charter schools have notified PED that some of their teachers have had problems with evaluations, PED officials said. Skandera said in these districts the problems were not as widespread as they were with APS and that 75 percent of the inquiries from other districts are related to individual teachers who did not receive an evaluation at all.

The Rio Rancho Public School District, however, has told the Journal that about 10 percent of the district’s teachers were given evaluations that contained errors.

PED said it couldn’t comment until it received more specifics from Rio Rancho.

Student achievement

Many of the errors found in APS teacher evaluations were connected to the “student achievement” portion of the evaluations, said Matt Montaño, PED director of educator quality.

Under the evaluation plan put in place this year, in cases where it was possible to do so, 50 percent of a teacher’s score was to be based on student achievement data from standardized tests for the past three years.

Twenty-five percent was to be based on classroom observations, and 15 percent was to be based on a review of the teacher’s lesson plans and his or her professionalism. The remaining 10 percent was to be based on “multiple measures,” like teacher attendance.
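Under those stated weights, a teacher's overall score amounts to a weighted sum of the four components. A minimal sketch of that arithmetic, assuming each component is scored on a common 0-100 scale (the article does not specify the scale, and the sub-scores below are hypothetical):

```python
# Component weights as described in the article: 50% student achievement,
# 25% classroom observations, 15% lesson plans/professionalism,
# 10% "multiple measures" such as teacher attendance.
WEIGHTS = {
    "student_achievement": 0.50,
    "classroom_observation": 0.25,
    "plans_professionalism": 0.15,
    "multiple_measures": 0.10,
}

def composite_score(scores):
    """Weighted sum of the four component scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical sub-scores for illustration only.
example = {
    "student_achievement": 70,
    "classroom_observation": 85,
    "plans_professionalism": 90,
    "multiple_measures": 95,
}
print(composite_score(example))  # 0.5*70 + 0.25*85 + 0.15*90 + 0.1*95 = 79.25
```

The sketch also makes clear why a missing student-achievement figure is so consequential: it governs half of the total, so any reweighting of the remaining components changes the result materially.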

Montaño said PED keeps records of student scores on the state’s standardized test, called the Standards Based Assessment, and on end-of-course exams, two of the tests used to track student achievement.

But it does not keep records of who a teacher’s students were. That information is necessary to give teachers a score, and the department relies on school districts to provide that data.

Some teachers received faulty student achievement scores because the district reported incorrect or incomplete data showing which students they taught, Montaño said.

He said if teachers are found to have inaccurate evaluations, they will receive new scores.

PED also relies on districts to provide data for the student achievement tests used in kindergarten and first and second grades. Different tests are given in those grades because students don’t take the SBA until third grade. Most New Mexico districts use a test called DIBELS, although APS uses the Developmental Reading Assessment.

In some cases, Montaño said, schools hadn’t kept or didn’t report three years of test data, which meant those teachers were given no mark for student achievement. Instead, classroom observations and measures like teacher attendance were weighted more heavily, Montaño said.

This is the first year the state has issued its new teacher evaluations, which have been heavily criticized by teachers unions and some local education officials.

The PED has said it started the new system because previously 99 percent of teachers were considered effective and there was little accountability. Under the new method, 76 percent of the teachers who received scores were rated “effective” or better in the first year.