More questions on evals’ accuracy

Copyright © 2014 Albuquerque Journal

Principal Robin Hoberg had no answers when teachers at Double Eagle Elementary School asked her to explain what they believed were errors on their new teacher evaluations.

Nor could she explain inconsistencies in how teachers were graded on attendance and student performance.

Despite the controversy surrounding the state’s new teacher evaluations and many teachers’ strong objections, Hoberg said change was needed and she had hoped some good would come from the new system unveiled this year.

But that was before the state Public Education Department released the evaluations, which were given to Albuquerque Public Schools teachers during the last week of school.

Hoberg said she had attended every meeting available with PED officials through the year to learn about the system.

Yet she and two other elementary school principals said they are frustrated and confused by the evaluations, and have been unable to get answers to their questions.

“When those last reports came out, it was like a kick in our guts,” Hoberg told the Journal during a meeting this week with two other veteran Albuquerque elementary school principals – Jill Vice of John Baker Elementary and Jack Vermillion of S.Y. Jackson Elementary.

Both PED and APS now say there were some errors in data on which the evaluations were based, but neither could say how widespread the problems are.

They said they are working to fix the errors and then correct the affected evaluations.

Matt Montaño, PED director of educator quality, said Friday some of the errors were due to information submitted by districts.


Under the evaluation plan adopted for Albuquerque Public Schools, 50 percent of a teacher’s score was to be based on student achievement data from standardized tests. Twenty-five percent was to be based on classroom observations, and 15 percent was to be based on a review of the teacher’s lesson plans and his or her professionalism. The remaining 10 percent was to be based on teacher attendance.

One of the principals’ main concerns was that the evaluations did not always follow this breakdown, and that the basis for teachers’ scores was inconsistent.

Another concern was a lack of transparency. They said they’ve received no clear explanation backing up their teachers’ student achievement scores, which in many cases did not appear to match the data made available to teachers and principals.

Vice said the evaluations have left her and her staff baffled, and they don’t know what to make of the scores.

“What good is a tool … if it doesn’t give us information on how to improve?” Vice said.

The principals stressed they could only address evaluation issues at the elementary school level, and not issues facing other levels.

Among the examples they cited:

  • Some veteran teachers at S.Y. Jackson were not graded at all on student achievement scores from standardized tests, while others were.
  • In certain cases, teachers who were graded on student achievement received “minimally effective” scores, even when their students had made clear progress and the teacher scored well in the other evaluation areas.
  • Some teachers were docked for being absent more days than they were actually gone from school, and some appeared to be marked down for days when their absences should have been excused. PED and APS officials acknowledge there was a problem with some of these numbers and they are working to correct them.
  • Some teachers were graded more heavily on their attendance than others, even though PED had said attendance would count as only 10 percent of an APS teacher’s score. At Double Eagle, for example, attendance counted for 20 percent in many cases and ranged as high as 29 percent in others.

The PED has said it started the new system because previously 99 percent of teachers were considered effective and there was little accountability.

Results released last month under the new method found that 76 percent of the teachers scored were rated “effective” or better.


Montaño said Friday there are explanations for the questions the principals raised.

He said some APS teachers were inaccurately marked down for being absent because the district made errors when reporting attendance to the state. In those cases, PED and APS have corrected evaluations, he said.

APS has sent a letter to those teachers, Shelly Green, APS chief academic officer, said in a statement Friday.

Green acknowledged that it appears some teachers were given faulty student achievement scores.

“We are working with PED to identify any teachers who may have been given scores in the student achievement category based on insufficient data,” Green said, adding she could not give an estimate of how many teachers are affected.

Rio Rancho and Las Cruces schools said they also are working with PED to identify flaws in teacher evaluations.

In some cases, the district didn’t provide enough information, Montaño said. In other instances, he said, districts and PED are working to determine whether the state made mistakes on student achievement data.

Montaño said if teachers are found to have inaccurate evaluations, they will receive new scores.

An inconsistent approach?

Vermillion said what most troubled him was that PED appeared to change the way it scored evaluations after the school year started.

“Regardless of the individual cases – because we all had cases where people got docked when they shouldn’t have – but the reality is still we were told all year long and our staffs were told that attendance would count as 10 percent,” Vermillion said.

Vermillion said it’s not fair to score some teachers more heavily on attendance than others.

Hoberg said another inconsistency was that kindergarten teachers at Double Eagle were graded on student achievement scores, while kindergarten teachers at S.Y. Jackson were not.

Montaño said that could be caused by a school or district failing to report the test data.

He said some teachers were more heavily graded on attendance and classroom observations than their peers because there was not sufficient test data to give them a student achievement score. Student achievement scores are based on three years of test data.

When three years of data were not available, Montaño said, classroom observations and teacher attendance carried more weight.

He said the benefit of the new evaluation is it gives principals and teachers better insight into a teacher’s classroom performance.

“The principals have more information than they had before,” Montaño said.

Hoberg said she agreed with much of the new observation piece of the evaluation, adding that it’s more detailed and provides more information to the teacher. But she said that didn’t negate the seriousness of the other problems.