NEWS

Study assails N.Y. teacher-evaluation system

Gary Stern
gstern@lohud.com
  • Study: Formula forces school districts to inflate key teacher grades
  • Study: District evaluation results cannot be compared
  • Study: Teachers scored lower if formula included student test scores
  • Superintendents want state Education Department to replace evaluation system

Leaders of a regional school superintendents group are calling on the state to scrap its much-debated teacher-evaluation system, contending that a new study proves that the system is irreparably flawed.

The study, released Friday, found that the state formula for calculating evaluations forces school districts to inflate classroom-observation ratings so teachers do not get poor overall scores.


If districts were to give more accurate grades to teachers after classroom visits, the study found, many teachers would "unjustly" receive overall ratings of "developing" or "ineffective." Such districts would "end up looking like they have an underperforming workforce," the report said.

"This is not something that can be fixed; the state Education Department needs to start over," said Louis Wool, Harrison schools superintendent, who was president of the Lower Hudson Council of School Superintendents when the group commissioned the study last year.

The study reviewed 2012-13 evaluation results for 1,400 teachers in 32 districts in Westchester, Rockland, Putnam and Dutchess counties. The superintendents group provided the data to Education Analytics, a non-profit organization in Madison, Wisconsin, which did the study.

Researchers credited New York state with improving its methods of measuring teacher effectiveness. In fact, the report called New York "a pioneer" in developing a modern evaluation system. But researchers said there are few examples nationally of effective implementation and that strong use of data may not necessarily translate into good policy.

Under the state evaluation system, overall teacher grades are based 60 percent on classroom observations, 20 percent on how students progress on state tests or other measures, and 20 percent on locally determined assessments.

Classroom observations are supposed to be graded across the full 0-to-60 scale, but teachers in the study received an average score of 58.1, and none received a grade below 40.

Because of a quirk in the state's scoring formula, the study found, if districts gave more scores in the 40s or 30s for classroom observations, many teachers would get overall ratings of "ineffective" or "developing" rather than "effective" or "highly effective."

"Districts are faced with a tough tradeoff," the report said, and "lose the ability to use the ratings as a helpful tool."

Pleasantville Superintendent Mary Fox-Alter called the release of the study a "watershed moment." She said the study should convince state officials to work with the superintendents group to create new evaluation models to pilot.

"I hope the state will work collaboratively with us and not against us," she said.

A spokesman for the state Education Department released a statement Friday evening saying the study had a lot of praise for the state.

"Judging from the comments from Superintendents Wool and Fox-Alter, it's apparent they did not read the report closely," said Dennis Tompkins, the spokesman. "Most of the report praises New York's methodology for determining growth scores."

He did say that the state Board of Regents and education commissioner, which set education policy in New York, will continue to review the data from the evaluation system over time.

"The real message of the report is that while New York's evaluation system should continue to evolve, overall our system is built on a solid foundation," he said.

The study found that, if a teacher receives average "effective" ratings on the two 20-percent components of the system, that teacher would need a score of 56 out of 60 on the observation component to get an overall average rating of "effective."
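The arithmetic behind that finding can be sketched in a few lines. The rating bands and the per-component "effective" point values below are illustrative assumptions chosen to reproduce the article's numbers, not figures taken from the report itself:

```python
# Sketch of the composite-score arithmetic described in the article.
# A teacher's composite = 60-point observation score + two 20-point
# components. The band cutoffs below are assumed for illustration.

BANDS = [  # (minimum composite score, overall rating) -- assumed cutoffs
    (91, "highly effective"),
    (75, "effective"),
    (65, "developing"),
    (0, "ineffective"),
]

def overall_rating(observation, state_growth, local_measure):
    """Map the three component scores to an overall rating."""
    composite = observation + state_growth + local_measure
    for cutoff, rating in BANDS:
        if composite >= cutoff:
            return rating

# Assuming mid-"effective" marks of roughly 10 and 9 points on the two
# 20-point components, only a near-perfect observation score keeps the
# teacher in the "effective" band:
print(overall_rating(56, 10, 9))  # composite 75 -> "effective"
print(overall_rating(45, 10, 9))  # composite 64 -> "ineffective"
```

Under these assumed cutoffs, dropping the observation score from 56 to 45 — still a solid mark on a 60-point scale — pushes the composite below the "effective" band entirely, which is the tradeoff the report describes.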

Local districts did what they had to: 94 percent of teachers and principals in Westchester County, 98 percent in Rockland, 99 percent in Putnam, and 92 percent in Dutchess received "effective" or "highly effective" ratings for 2012-13.

The main variation in scoring, the study found, came within the 20 percent measure of how a teacher's students progressed during the school year.

For a minority of teachers, the state uses student scores on standardized tests to determine a teacher's grade on a 0-20 scale. For most teachers, districts must devise an alternative way to measure student growth.

But the study found that teachers whose students do not take standardized tests received an average score about 3 points higher than teachers whose ratings were based on test scores. There are "substantial differences," researchers found, in how districts grade their teachers.

Tompkins, the Education Department spokesman, pointed to that portion of the report.

"Any critique of those measures has to be directed locally — and resolved locally," he said.

In June, lawmakers and NYSUT, the state teachers union, negotiated a two-year "safety net" that will temporarily prevent educators from being dismissed because of student results on state tests.

"We put a lot of money into this and for what?" Valhalla Superintendent Brenda Myers said.

An initial study commissioned by the Superintendents Council, released a year ago, found that the evaluation formula in 2011-12 gave less credit to teachers with disadvantaged students and failed to measure other variables. But the new report praised the state for addressing those issues.

Twitter: @garysternNY