CHARLOTTE, N.C. -- As Charlotte-Mecklenburg Schools continues working on data-accuracy issues, people who care about public education are debating the best path to restore confidence in school measurements.

Consider the clash of two active CMS parents. Pamela Grundy, who opposed last year's push to use test scores for teacher ratings, says the errors that caused CMS to withdraw its school progress reports should prompt the district to hit the brakes on data-crunching.

"I think the key is fewer numbers and more care given to generating and using them," Grundy said.

But Bo Boylan, who has been involved in school construction and finance advisory groups, worries that critics will use the errors to set aside data they don't like.

The real danger, he said, lies in "blindly making decisions absent credible data to inform such decision making."

On Jan. 20, interim Superintendent Hugh Hattabaugh withdrew reports listing school-by-school data on such things as test scores, graduation rates, school spending and teacher evaluations, citing several data accuracy issues.

Among them: A new rating of students "anticipated to graduate on time" had artificially high numbers, the origin of which CMS has not yet explained.

Hattabaugh originally planned to issue cleaned-up reports Feb. 3, but has since delayed indefinitely, saying that quality checking is taking longer than anticipated.

CMS has built a national reputation as a leader in using data to chart action. That push helped the district win the 2011 Broad Prize for Urban Education, but also created backlash among teachers and parents who say CMS overtests children to create scores for value-added teacher ratings.

Daria Hall, an official with the Washington, D.C., research and advocacy group Education Trust, said CMS has "a pretty strong track record not just on generating numbers but balancing them with good judgment," and that it would be a mistake "to devalue all of the data because of this screw-up."

"Data is an incredibly valuable tool," she said. "What they have to be focused on is making the tool better."

First, fix mistakes

The immediate challenge is identifying what went wrong and how CMS plans to fix it, virtually everyone agrees. Hattabaugh has said he'll provide that information once he has the answers.

Chris Cobitz, an administrator who played a key role in producing the school progress reports and who initially defended the wrong graduation-track numbers, resigned in January, saying he took responsibility for inaccurate data.

Three people who have worked closely with CMS on data - Bill Anderson of MeckEd, Cheryl Pulliam of the Queen's University Public Education Research Institute and Howard Haworth, a former state school board chairman - said they don't believe anyone intentionally fudged the report.

Even though the false numbers made all schools look good, they were too high to be believable and brought no benefit, they said.

"CMS, like a lot of other organizations, will accentuate the positive," Pulliam said. "But I do not think they intentionally released inaccurate data."

So far, Hattabaugh has said only that the district's quality control process broke down in allowing the wrong numbers to be published.

Haworth and Pulliam said CMS needs to create a system in which principals take a stronger role in ensuring their school data is accurate. Principals saw the school progress reports before they were posted publicly, but apparently did not catch the errors.

Good data is costly

Much of the number-crunching CMS does, including graduation rates and proficiency on state exams, is required by the state or federal government. Reporting numbers to the state can provide an extra level of fact-checking and context, but the mandates can also fuel confusion.

For instance, the rules determining which schools meet the federal No Child Left Behind academic goals are so complex that the Adequate Yearly Progress, or AYP, results mean little to the public.

This year the state also started reporting schoolwide results of teachers' job-evaluation ratings. But the 2010-11 totals did not cover the full faculty at most schools. And the CMS school progress reports used a different method of tallying the schoolwide results than the state used, creating two different pictures of the same data.

Meanwhile, CMS has kept layering on new data of its own, even as it has reduced staff in the accountability department because of budget cuts.

Lindalyn Kakadelis, a former CMS board member who now runs the John Locke Foundation's North Carolina Education Alliance, said the district should hire an outside firm "to make sure we're getting clean data we can trust."

Project LIFT, a coalition of local philanthropists partnering with CMS to run nine west Charlotte schools, will require independent third-party evaluation of its efforts - a decision reached before the CMS data problems came to light, said Executive Director Denise Watts.

Watts, a former principal who will run the Project LIFT Zone for CMS, said independent evaluation assures donors they're getting their money's worth, but also diverts money from schools.

"Evaluation is expensive. A gold-standard evaluation for a project like Project LIFT is 5 percent," she said of the $55 million, five-year project. "Are you willing to take millions away from kids?"

LIFT's solution is to require groups that get grants to set aside 5 percent for outside evaluation.

Make it clear

Even when numbers are accurate, their meaning can be murky. The formula used to calculate value-added teacher ratings last year was so complex it fueled skepticism.

Some teachers complained the ratings didn't reflect what they knew about who was weak and strong - and that a low rating provided no guidance for improvement.

CMS is working on a better measure.

This year's school report cards showed dramatic changes in achievement gaps - an important measure of efforts to help black and Hispanic students catch up to white and Asian peers - that turned out to be a quirk of labeling.

A newly classified and very small group of Pacific Islanders displaced blacks as the lowest-scoring group, making it look like gaps had widened.

And even once the "anticipated to graduate" numbers are corrected, Grundy and board member Tom Tate have questioned whether they will be meaningful.

The measure, created by CMS, counts the percent of students who have never been held back a grade.

Grundy and Tate say that tells little about whether students are at risk of dropping out, especially in the younger grades, when retentions are rare.

For instance, CMS calculated that 100 percent of the 4-year-olds in its preschools for children with disabilities were "anticipated to graduate on time."

Although CMS has very low graduation rates for students with disabilities - 42 percent last year - the children had never been held back.

Mary McCray, a former teacher and vice chairwoman of the school board, said CMS needs to focus on producing numbers that make sense.

"If people don't understand them," she said, "it doesn't mean a thing."
